Compare commits

228 Commits

Author SHA1 Message Date
Mark Paluch
c8846d3d1c DATAMONGO-2034 - Release version 2.0.10 (Kay SR10). 2018-09-10 12:52:18 +02:00
Mark Paluch
a4835c8fcf DATAMONGO-2034 - Prepare 2.0.10 (Kay SR10). 2018-09-10 12:51:27 +02:00
Mark Paluch
7875c8399f DATAMONGO-2034 - Updated changelog. 2018-09-10 12:51:22 +02:00
Mark Paluch
9046857721 DATAMONGO-2035 - Updated changelog. 2018-09-10 10:20:56 +02:00
Oliver Gierke
e8bb63c9f7 DATAMONGO-2076 - Fixed attribute substitution in reactive MongoDB section.
We now redeclare the Asciidoctor Maven plugin to register the store-specific attributes. Apparently they must not contain dots, so we replaced them with dashes.
2018-08-30 11:45:08 +02:00
Oliver Gierke
b431a56a95 DATAMONGO-2076 - Fixed attribute substitution in getting started section. 2018-08-30 09:32:05 +02:00
Oliver Gierke
dc820017e0 DATAMONGO-2033 - Updated changelog. 2018-08-20 11:07:56 +02:00
Oliver Gierke
34ce87b80c DATAMONGO-2046 - Performance improvements in mapping and conversion subsystem.
In MappingMongoConverter, we now avoid the creation of a ParameterValueProvider for parameter-less constructors. We also skip property population if the entity can be constructed entirely through its constructor. Replaced the lambda in MappingMongoConverter.readAndPopulateIdentifier(…) with a direct call to ….readIdValue(…). ObjectPath now uses decomposed ObjectPathItems to avoid array copying and creation. It now stores a reference to its parent, and the ObjectPathItem fields have been merged into ObjectPath, which reduces the number of objects created during reads.

Extended CachingMongoPersistentProperty with DBRef caching. Turned key access in DocumentAccessor into an optimistic lookup. DbRefResolverCallbacks are now created lazily.

Related tickets: DATACMNS-1366.
Original pull request: #602.
2018-08-15 16:12:22 +02:00
Mark Paluch
9098d509a5 DATAMONGO-2055 - Polishing.
Move test to UpdateMapperUnitTests.

Original pull request: #600.
2018-08-15 15:59:55 +02:00
Christoph Strobl
861c8279a3 DATAMONGO-2055 - Allow position modifier to be negative using push at position on Update.
Original pull request: #600.
2018-08-15 15:53:59 +02:00
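A minimal usage sketch of the change above, assuming a hypothetical Player document with a "scores" array and an injected MongoTemplate named template; negative positions are resolved by MongoDB relative to the end of the array:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.query.Update;

// Push the value 42 into "scores", two elements before the end of the array.
Update update = new Update().push("scores").atPosition(-2).each(42);
template.updateFirst(query(where("_id").is(playerId)), update, Player.class);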
Mark Paluch
e545787e7e DATAMONGO-2050 - Polishing.
Tweak Javadoc.

Original pull request: #596.
2018-08-15 15:18:19 +02:00
Christoph Strobl
38ccdc5dfc DATAMONGO-2050 - Polishing.
Move to AssertJ.

Original pull request: #596.
2018-08-15 15:05:06 +02:00
Christoph Strobl
7a34cc73d8 DATAMONGO-2050 - Allow to specify the index to use for $geoNear aggregation operation.
Original pull request: #596.
2018-08-15 15:05:04 +02:00
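A hedged sketch of the new option, assuming the index name is supplied via a useIndex(…) method on GeoNearOperation and a hypothetical "distance" output field:

import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.GeoNearOperation;
import org.springframework.data.mongodb.core.query.NearQuery;

// Pick the geo index to use explicitly when the collection carries more than one.
GeoNearOperation geoNear = Aggregation.geoNear(NearQuery.near(new Point(-73.99, 40.73)), "distance")
        .useIndex("location_2dsphere");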
Mark Paluch
ba6fa834e5 DATAMONGO-2051 - Polishing.
Use method argument types to avoid false positives with different method signatures.

Original pull request: #597.
Related pull request: #598.
2018-08-14 16:37:32 +02:00
Christoph Strobl
7100cd17be DATAMONGO-2051 - Add support for SCRAM-SHA-256 authentication mechanism to MongoCredentialPropertyEditor.
Original pull request: #597.
Related pull request: #598.
2018-08-14 16:33:39 +02:00
Mark Paluch
7c65472e2d DATAMONGO-2049 - Polishing.
Add static import for assertThat(…).

Original pull request: #594.
2018-08-14 10:51:41 +02:00
Christoph Strobl
f98f586a23 DATAMONGO-2049 - Add support for $ltrim, $rtrim, and $trim.
Original pull request: #594.
2018-08-14 10:51:41 +02:00
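A hedged sketch of the new operators, assuming they are exposed through StringOperators.valueOf(…) and a hypothetical "comment" field:

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;
import org.springframework.data.mongodb.core.aggregation.StringOperators;

// Render a $project stage that strips surrounding whitespace from "comment" via $trim.
ProjectionOperation projection = Aggregation.project()
        .and(StringOperators.valueOf("comment").trim()).as("comment");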
Mark Paluch
19b5b6b6f0 DATAMONGO-2048 - Polishing.
Javadoc tweaks.

Original pull request: #595.
2018-08-13 16:00:40 +02:00
Christoph Strobl
b9ffa9b89d DATAMONGO-2048 - Add support for MongoDB 4.0 $convert aggregation operator.
We now support the following type conversion aggregation operators:

* $convert
* $toBool
* $toDate
* $toDecimal
* $toDouble
* $toInt
* $toLong
* $toObjectId
* $toString

Original pull request: #595.
2018-08-13 16:00:40 +02:00
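A hedged sketch, assuming the conversion operators are exposed through a ConvertOperators factory as shown, with a hypothetical "price" field stored as a String:

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ConvertOperators;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

// Expose "price" as an int via $toInt (a shortcut for $convert with to: "int").
ProjectionOperation projection = Aggregation.project()
        .and(ConvertOperators.valueOf("price").convertToInt()).as("priceAsInt");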
Mark Paluch
3ba589072f DATAMONGO-2047 - Polishing.
Retain previous options when calling withTimezone(…)/onNull…(…). Add tests. Javadoc.

Original pull request: #593.
2018-08-13 13:27:27 +02:00
Christoph Strobl
e237c5dfc4 DATAMONGO-2047 - Update $dateToString and $dateFromString aggregation operators to match MongoDB 4.0 changes.
We added the format and onNull options to DateFromString and changed format to an optional parameter.

Original pull request: #593.
2018-08-13 13:27:26 +02:00
Mark Paluch
ecb560cdbc DATAMONGO-2045 - Polishing.
Use diamond syntax where possible. Add an initial size to HashMap instances with a known number of elements. Fix typos in private constant names. Fix duplicate error code ids.

Original pull request: #592.
2018-08-13 10:31:28 +02:00
Mark Paluch
fc4a21775a DATAMONGO-2043 - Polishing.
Slightly tweak Javadoc.

Original pull request: #589.
2018-08-08 11:01:14 +02:00
Christoph Strobl
ae62e70c52 DATAMONGO-2043 - Omit type hint when mapping simple types.
Original pull request: #589.
2018-08-08 11:01:09 +02:00
Christoph Strobl
f83622709d DATAMONGO-2027 - Polishing.
Remove duplicate tests and fix assertions on existing ones. Move tests over to AssertJ and fix the output database not being applied correctly.

Original Pull Request: #588
2018-08-07 13:37:22 +02:00
Mark Paluch
83d218081c DATAMONGO-2027 - Consider MapReduce output type.
We now consider the output type (collection output) when rendering the MapReduce command. Previously, all output was returned inline without storing the results in the configured collection.

Original Pull Request: #588

# Conflicts:
#	spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/ReactiveMongoTemplate.java
#	spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/mapreduce/ReactiveMapReduceTests.java
2018-08-07 13:20:46 +02:00
Mark Paluch
70fe406602 DATAMONGO-2006 - Updated changelog. 2018-07-27 11:45:23 +02:00
Mark Paluch
18046e9040 DATAMONGO-2007 - After release cleanups. 2018-07-26 15:23:24 +02:00
Mark Paluch
69310552e3 DATAMONGO-2007 - Prepare next development iteration. 2018-07-26 15:23:22 +02:00
Mark Paluch
b8f093269d DATAMONGO-2007 - Release version 2.0.9 (Kay SR9). 2018-07-26 14:44:00 +02:00
Mark Paluch
172db96fea DATAMONGO-2007 - Prepare 2.0.9 (Kay SR9). 2018-07-26 14:43:06 +02:00
Mark Paluch
c8381c734b DATAMONGO-2007 - Updated changelog. 2018-07-26 14:42:54 +02:00
Mark Paluch
bf82964474 DATAMONGO-1982 - Updated changelog. 2018-07-26 14:03:20 +02:00
Mark Paluch
2d0495874f DATAMONGO-2029 - Encode collections of UUID and byte array query method arguments to their binary form.
We now convert collections that contain only UUID or byte array items to a BSON list that contains the encoded form of these items. Previously, we only converted single UUIDs and byte arrays into $binary, so lists rendered to e.g. $uuid, which does not work for queries.

Encoding is now encapsulated in strategy objects that implement the encoding only for their type. This allows us to break up the conditional flow and improves the organization of responsibilities.
2018-07-25 15:16:15 +02:00
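A sketch of the scenario this fixes, using a hypothetical Account document with a UUID externalId property; the collection argument is now rendered as a list of $binary values:

import java.util.Collection;
import java.util.List;
import java.util.UUID;

import org.springframework.data.repository.Repository;

interface AccountRepository extends Repository<Account, String> {

    // Each UUID in the collection is encoded to its binary form inside the resulting $in criteria.
    List<Account> findByExternalIdIn(Collection<UUID> externalIds);
}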
Mark Paluch
82c91cbb71 DATAMONGO-2030 - Reinstate existsBy queries for reactive repositories.
We now support existsBy queries for reactive repositories to align with the blocking repository support. ExistsBy support was lost during a merge and is now back in place.

Extract boolean flag counting into BooleanUtil.
2018-07-23 16:34:12 +02:00
Christoph Strobl
4d309bd7f0 DATAMONGO-2011 - Relax type check when mapping collections.
Original pull request: #587.
2018-07-13 12:55:07 +02:00
Mark Paluch
6f011b0fa1 DATAMONGO-2021 - Polishing.
Adapt getResources(…) to use the file id and no longer the file name when opening a download stream. Add author tag.

Original pull request: #581.
2018-07-06 13:12:36 +02:00
Niklas Helge Hanft
1a3b9e3c42 DATAMONGO-2021 - Use getObjectId() instead of getFilename() for opening the GridFS download stream.
Using the file name can lead to duplicate resource streams, as file names are not unique; we therefore use the file's ObjectId to look up the file content.

Original pull request: #581.
2018-07-06 13:12:36 +02:00
Mark Paluch
5a37468103 DATAMONGO-2016 - Polishing.
Fail gracefully if query string parameter has no value. Reformat test. Convert assertions to AssertJ.

Original pull request: #578.
2018-07-04 11:25:38 +02:00
Stephen Tyler Conrad
d4b0963550 DATAMONGO-2016 - Fix username/password extraction in MongoCredentialPropertyEditor.
MongoCredentialPropertyEditor now inspects the connection URI for the appropriate delimiter tokens. Previously, the question mark character was used for username/password delimiter inspection.

Original pull request: #578.
2018-07-04 11:25:35 +02:00
Mark Paluch
468c497525 DATAMONGO-1969 - After release cleanups. 2018-06-13 21:24:35 +02:00
Mark Paluch
4562f39d7a DATAMONGO-1969 - Prepare next development iteration. 2018-06-13 21:24:33 +02:00
Mark Paluch
49957e8c6e DATAMONGO-1969 - Release version 2.0.8 (Kay SR8). 2018-06-13 15:13:01 +02:00
Mark Paluch
b462b35284 DATAMONGO-1969 - Prepare 2.0.8 (Kay SR8). 2018-06-13 15:12:06 +02:00
Mark Paluch
445388bb5f DATAMONGO-1969 - Updated changelog. 2018-06-13 15:12:00 +02:00
Mark Paluch
61e9eac49b DATAMONGO-1967 - Updated changelog. 2018-06-13 15:01:58 +02:00
Mark Paluch
c219f6e7f2 DATAMONGO-2003 - Polishing.
Add nullability annotation to MongoParameterAccessor.getPoint(). Remove superfluous casts.

Convert MongoQueryCreatorUnitTests to use AssertJ assertions.

Original pull request: #570.
2018-06-11 14:19:02 +02:00
Christoph Strobl
1ab130ffca DATAMONGO-2003 - Fix derived query using regex pattern with options.
We now consider regex pattern options when using the pattern as a derived finder argument.

Original pull request: #570.
2018-06-11 14:19:01 +02:00
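A sketch with a hypothetical Person repository; the flags of the Pattern argument are now carried over into the options of the resulting $regex criteria:

import java.util.List;
import java.util.regex.Pattern;

import org.springframework.data.repository.Repository;

interface PersonRepository extends Repository<Person, String> {

    // Derived finder taking the pattern as argument.
    List<Person> findByLastnameRegex(Pattern pattern);
}

// A CASE_INSENSITIVE pattern now results in a case-insensitive $regex query.
List<Person> result = repository.findByLastnameRegex(Pattern.compile("^mat", Pattern.CASE_INSENSITIVE));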
Oliver Gierke
a4d6a0cf8a DATAMONGO-2002 - Fixed Criteria.equals(…) for usage with Pattern instances.
For Criteria instances that use regular expressions, we now properly compare the two Pattern instances by also including the pattern flags in the comparison.
2018-06-07 19:12:44 +02:00
Mark Paluch
c28f725f48 DATAMONGO-1979 - Polishing.
Convert ReactiveMongoRepositoryTests to AssertJ. Add missing verifyComplete() steps to StepVerifier.

Original pull request: #566.
2018-06-07 10:06:37 +02:00
Mark Paluch
a71f50f15c DATAMONGO-1998 - Polishing.
Switch the id field name check to an exact equals comparison or a match on the last property path segment.

Original pull request: #567.
2018-06-06 11:35:22 +02:00
Christoph Strobl
0ad715f806 DATAMONGO-1998 - Fix Querydsl id handling for nested property references using ObjectId hex String representation.
We now follow the conversion rules for id properties with a valid ObjectId representation when parsing Querydsl queries.

Original pull request: #567.
2018-06-06 11:35:22 +02:00
Mark Paluch
ba559c223a DATAMONGO-1986 - Polishing.
Refactor duplicated code into AggregationUtil.

Original pull request: #564.
2018-06-06 10:37:20 +02:00
Christoph Strobl
5f3ad68114 DATAMONGO-1986 - Always provide a typed AggregationOperationContext for TypedAggregation.
We now initialize a TypeBasedAggregationOperationContext for TypedAggregations if no context is provided. This makes sure that potential Criteria objects are run through the QueryMapper.
In case the default context is used, we now also make sure to at least run the aggregation pipeline through the QueryMapper to avoid passing non-MongoDB simple types on to the driver.

Original pull request: #564.
2018-06-06 10:37:20 +02:00
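A sketch of the effect, assuming a hypothetical Person type, a cutoffDate value, and a MongoTemplate named template; the Criteria below is now mapped against Person's metadata even though no explicit AggregationOperationContext is supplied:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.match;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;

// Field names and values in the match Criteria are run through the QueryMapper for Person.
TypedAggregation<Person> aggregation = newAggregation(Person.class, match(where("birthDate").lt(cutoffDate)));
AggregationResults<Document> results = template.aggregate(aggregation, Document.class);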
Mark Paluch
28b18d25cb DATAMONGO-1988 - Polishing.
Match exactly on either top-level properties or leaf properties instead of accepting the property/field name suffix.

Original pull request: #565.
2018-06-05 11:14:11 +02:00
Christoph Strobl
22c0e5029c DATAMONGO-1988 - Fix query creation for id property references using ObjectId hex String representation.
We now follow the conversion rules for id properties with a valid ObjectId representation when creating queries. Prior to this change, e.g. String values would have been turned into ObjectIds when saving a document, but not when querying for it.

Original pull request: #565.
2018-06-05 11:13:01 +02:00
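A sketch assuming a hypothetical Person type whose String id is backed by an ObjectId and a MongoTemplate named template; the 24-character hex value is now converted to an ObjectId for the query, matching the behavior on save:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

// The hex String is mapped to ObjectId("5b16fbeec1f8ae70b2f5d0a1") before the query is sent.
Person person = template.findOne(query(where("id").is("5b16fbeec1f8ae70b2f5d0a1")), Person.class);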
Christoph Strobl
4582d3152c DATAMONGO-1927 - Updated changelog. 2018-05-17 10:32:57 +02:00
Sébastien Deleuze
d219e8ed7c DATAMONGO-1980 - Fix a typo in CriteriaExtensions.kt.
Original pull request: #563.
2018-05-16 09:43:34 +02:00
Victor
ab7740faf5 DATAMONGO-1978 - Fix minor typo in Field.positionKey field name.
Original pull request: #558.
2018-05-15 12:30:06 +02:00
Mark Paluch
0fba00311d DATAMONGO-1466 - Polishing.
Switch conditionals to Map-based Function registry to pick the appropriate converter. Fix typos in method names.

Original pull request: #561.
2018-05-15 11:30:21 +02:00
Christoph Strobl
33863999e6 DATAMONGO-1466 - Polishing.
Just some minor code style improvements.

Original pull request: #561.
2018-05-15 11:30:21 +02:00
Christoph Strobl
ae18958955 DATAMONGO-1466 - Add embedded typeinformation-based reading GeoJSON converter.
Original pull request: #561.
2018-05-15 11:30:21 +02:00
Mark Paluch
489d637a00 DATAMONGO-1974 - Polishing.
Fix typos, links, and code fences.

Original pull request: #559.
2018-05-11 15:29:57 +02:00
Jay Bryant
1b5ce651be DATAMONGO-1974 - Full editing pass for Spring Data MongoDB.
Full editing pass of the Spring Data MongoDB reference guide. I also adjusted index.adoc to work with the changes I made to the build project, so that we get Epub and PDF as well as HTML.

Original pull request: #559.
2018-05-11 15:29:18 +02:00
Mark Paluch
e035210917 DATAMONGO-1971 - Polishing.
Remove outdated profiles.

Original pull request: #554.
2018-05-09 16:35:07 +02:00
Mark Paluch
57fc260c43 DATAMONGO-1971 - Install MongoDB 3.7.9 on TravisCI.
We now download and unpack MongoDB directly instead of using TravisCI's outdated MongoDB version.

Original pull request: #554.
2018-05-09 16:35:04 +02:00
Mark Paluch
f9ec63425e DATAMONGO-1918 - After release cleanups. 2018-05-08 15:04:28 +02:00
Mark Paluch
aedb50d728 DATAMONGO-1918 - Prepare next development iteration. 2018-05-08 15:04:26 +02:00
Mark Paluch
dbf4990f60 DATAMONGO-1918 - Release version 2.0.7 (Kay SR7). 2018-05-08 14:15:27 +02:00
Mark Paluch
c5c43158c2 DATAMONGO-1918 - Prepare 2.0.7 (Kay SR7). 2018-05-08 14:14:32 +02:00
Mark Paluch
56ffe7913d DATAMONGO-1918 - Updated changelog. 2018-05-08 14:14:23 +02:00
Mark Paluch
eae263eebc DATAMONGO-1917 - Updated changelog. 2018-05-08 12:22:53 +02:00
Mark Paluch
0dd2fa3dce DATAMONGO-1943 - Polishing.
Reduce visibility. Use List interface instead of concrete type.

Original pull request: #556.
2018-05-07 16:20:52 +02:00
Christoph Strobl
e648ea5903 DATAMONGO-1943 - Fix ClassCastException caused by SpringDataMongodbSerializer.
We now convert List-typed predicates to BasicDBList to meet MongodbSerializer's expectations for top-level lists used for the $and operator.

Original pull request: #556.
2018-05-07 16:20:52 +02:00
Mark Paluch
f389812b7c DATAMONGO-1869 - Updated changelog. 2018-04-13 15:11:29 +02:00
Mark Paluch
2127ddcbb8 DATAMONGO-1893 - Polishing.
Inherit fields from previous operation if at least one field is excluded. Extend FieldsExposingAggregationOperation to conditionally inherit fields.

Original pull request: #538.
2018-04-06 10:45:52 +02:00
Christoph Strobl
7f9ab3bb44 DATAMONGO-1893 - Allow exclusion of other fields than _id in aggregation $project.
As of MongoDB 3.4, exclusion of fields other than _id is allowed, so we removed the limitation in our code.

Original pull request: #538.
2018-04-06 10:45:52 +02:00
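A sketch with hypothetical field names, requiring MongoDB 3.4 or newer on the server:

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

// Exclude fields other than _id in the $project stage: { $project: { largePayload: 0, internalNotes: 0 } }.
ProjectionOperation projection = Aggregation.project().andExclude("largePayload", "internalNotes");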
Mark Paluch
aea40ca490 DATAMONGO-1888 - After release cleanups. 2018-04-04 16:42:33 +02:00
Mark Paluch
fb8d03db31 DATAMONGO-1888 - Prepare next development iteration. 2018-04-04 16:42:30 +02:00
Mark Paluch
890f08f19a DATAMONGO-1888 - Release version 2.0.6 (Kay SR6). 2018-04-04 15:53:22 +02:00
Mark Paluch
be58472777 DATAMONGO-1888 - Prepare 2.0.6 (Kay SR6). 2018-04-04 15:52:31 +02:00
Mark Paluch
b082d4ad98 DATAMONGO-1888 - Updated changelog. 2018-04-04 15:52:22 +02:00
Mark Paluch
e80b031f54 DATAMONGO-1857 - Updated changelog. 2018-04-04 15:16:20 +02:00
Mark Paluch
50b017c08b DATAMONGO-1903 - Polishing.
Remove the client-side operating system check, as operating-system-dependent constraints depend on the server. Add a check for whitespace. Add author tags. Extend tests.

Adapt check in SimpleReactiveMongoDatabaseFactory accordingly. Remove superfluous UnknownHostException declaration in reactive database factory. Replace references to legacy types in Javadoc with references to current ones.

Original pull request: #546.
2018-04-03 13:44:19 +02:00
George Moraitis
78429eb33d DATAMONGO-1903 - Align database name check in SimpleMongoDbFactory with MongoDB limitations.
We now test database names against the current (3.6) MongoDB specifications for database names.

Original pull request: #546.
2018-04-03 13:44:16 +02:00
Mark Paluch
3ed0bd7a18 DATAMONGO-1916 - Polishing.
Remove unused final keywords from method parameters and unused variables. Add nullable annotations to parameters that can be null. Fix generics.

Original pull request: #547.
2018-04-03 11:33:13 +02:00
Christoph Strobl
cbc923c727 DATAMONGO-1916 - Fix potential ClassCastException in MappingMongoConverter#writeInternal when writing collections.
Original pull request: #547.
2018-04-03 11:32:53 +02:00
Mark Paluch
f6ca0049b6 DATAMONGO-1834 - Polishing.
Increase visibility of Timezone factory methods. Add missing nullable annotation. Tweaked Javadoc. Add tests for Timezone using expressions/field references.

Original Pull Request: #539
2018-03-28 11:40:14 +02:00
Christoph Strobl
82c9b0c662 DATAMONGO-1834 - Polishing.
Remove DateFactory and split up tests.
Introduce dedicated Timezone abstraction and update existing factories to apply the timezone if appropriate. Update builders and align code style.

Original Pull Request: #539
2018-03-28 11:39:58 +02:00
Matt Morrissette
3ca2349ce3 DATAMONGO-1834 - Add support for MongoDB 3.6 DateOperators $dateFromString, $dateFromParts and $dateToParts including timezones.
Original Pull Request: #539
2018-03-28 11:34:57 +02:00
Oliver Gierke
a76f157457 DATAMONGO-1915 - Removed explicit declaration of Jackson library versions. 2018-03-27 19:35:24 +02:00
Christoph Strobl
560a6a5bc2 DATAMONGO-1911 - Polishing.
Use native MongoDB Codec facilities to render binary and uuid.

Original Pull Request: #544
2018-03-27 14:17:46 +02:00
Mark Paluch
51d5c52193 DATAMONGO-1911 - Fix UUID serialization in String-based queries.
We now render the correct UUID representation in String-based queries. Unquoted values render to the $binary representation; quoted UUIDs are rendered with their toString() value.

Previously we used JSON.serialize() to encode values to JSON. The com.mongodb.util.JSON serializer does not produce JSON that is compatible with Document.parse. It uses an older JSON format that preceded the MongoDB Extended JSON specification.

Original Pull Request: #544
2018-03-27 14:04:34 +02:00
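A sketch using a hypothetical Account repository; per the rendering rules described above, the unquoted placeholder yields the $binary form while the quoted one yields the toString() value:

import java.util.List;
import java.util.UUID;

import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.Repository;

interface AccountRepository extends Repository<Account, String> {

    // Unquoted placeholder: rendered as the UUID's $binary representation.
    @Query("{ 'externalId' : ?0 }")
    List<Account> findByExternalUuid(UUID uuid);

    // Quoted placeholder: rendered as the UUID's toString() value.
    @Query("{ 'externalIdAsString' : '?0' }")
    List<Account> findByExternalUuidString(UUID uuid);
}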
Mark Paluch
56b6748068 DATAMONGO-1913 - Add missing nullable annotations to GridFsTemplate. 2018-03-26 14:10:53 +02:00
Felipe Zanardo Affonso
1e19f405cc DATAMONGO-1909 - Fix typo on return statement.
Original pull request: #523.
2018-03-21 16:05:25 +01:00
Mark Paluch
54d2c122eb DATAMONGO-1907 - Polishing.
Rename test method to reflect test subject.

Switch from flatMap(…) to map(…) to avoid overhead of Mono creation.

Original pull request: #541.
2018-03-21 09:54:07 +01:00
Ruben J Garcia
b47c5704e7 DATAMONGO-1907 - Adjust SimpleReactiveMongoRepository.findOne(…) to complete without exception on empty result
We no longer emit an exception via SimpleReactiveMongoRepository.findOne(Example) if the query completes without yielding a result. Previously, findOne(Example) emitted a NoSuchElementException if the query returned no result.

Original pull request: #541.
2018-03-21 09:51:32 +01:00
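A sketch with a hypothetical reactive Person repository, probe, and fallback instance; the Mono now simply completes empty when nothing matches, so callers can attach their own fallback:

import org.springframework.data.domain.Example;

import reactor.core.publisher.Mono;

Mono<Person> match = personRepository.findOne(Example.of(probe));
// No NoSuchElementException anymore: fall back to a default instance if nothing was found.
Mono<Person> withFallback = match.defaultIfEmpty(defaultPerson);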
Oliver Gierke
6b0b1cd97d DATAMONGO-1904 - Optimizations in MappingMongoConverter.readCollectionOrArray(…).
Switched to ClassUtils.isAssignableValue(…) in getPotentiallyConvertedSimpleRead(…) as it transparently handles primitives and their wrapper types so that we can avoid the superfluous invocation of the converter infrastructure.
2018-03-15 15:03:53 +01:00
Oliver Gierke
35bbc604aa DATAMONGO-1904 - Fixed handling of nested arrays on reads in MappingMongoConverter.
We now properly forward the component type information into recursive calls to MappingMongoConverter.readCollectionOrArray(…).
2018-03-15 15:03:51 +01:00
Oliver Gierke
9ade830a10 DATAMONGO-1901 - Added project.root configuration to make JavaDoc generation work again.
Related ticket: https://github.com/spring-projects/spring-data-build/issues/527.
2018-03-14 09:37:46 +01:00
Oliver Gierke
8fbff50f4f DATAMONGO-1898 - Added unit tests for the conversion handling of enums implementing interfaces.
Related tickets: DATACMNS-1278.
2018-03-12 11:07:40 +01:00
Oliver Gierke
14b49638a0 DATAMONGO-1896 - SimpleMongoRepository.saveAll(…) now consistently uses aggregate collection for inserts.
We previously used MongoTemplate.insertAll(…), which determines the collection into which to insert each element based on its type; in cases of entity inheritance, this uses dedicated collections for sub-types of the aggregate root. Subsequent lookups of the entities then fail, as those are executed against the collection the aggregate root is mapped to.

We now use ….insert(Collection, String) instead, handing over the collection of the aggregate root explicitly.
2018-03-09 00:03:44 +01:00
Mark Paluch
dc31f4f32f DATAMONGO-1882 - After release cleanups. 2018-02-28 10:43:35 +01:00
Mark Paluch
708f9ac7b3 DATAMONGO-1882 - Prepare next development iteration. 2018-02-28 10:43:34 +01:00
Mark Paluch
17d6100426 DATAMONGO-1882 - Release version 2.0.5 (Kay SR5). 2018-02-28 10:14:58 +01:00
Mark Paluch
27a4e25880 DATAMONGO-1882 - Prepare 2.0.5 (Kay SR5). 2018-02-28 10:14:05 +01:00
Mark Paluch
d378bcb442 DATAMONGO-1882 - Updated changelog. 2018-02-28 10:13:57 +01:00
Mark Paluch
f6505c7758 DATAMONGO-1859 - After release cleanups. 2018-02-19 20:29:08 +01:00
Mark Paluch
d25f88c70e DATAMONGO-1859 - Prepare next development iteration. 2018-02-19 20:29:06 +01:00
Mark Paluch
cec6edfa26 DATAMONGO-1859 - Release version 2.0.4 (Kay SR4). 2018-02-19 19:46:53 +01:00
Mark Paluch
3261936e8a DATAMONGO-1859 - Prepare 2.0.4 (Kay SR4). 2018-02-19 19:46:05 +01:00
Mark Paluch
d2d471d135 DATAMONGO-1859 - Updated changelog. 2018-02-19 19:45:58 +01:00
Mark Paluch
bcd2de000c DATAMONGO-1870 - Polishing.
Extend copyright license years. Slightly reword documentation. Use IntStream and insertAll to create test fixture.

Original pull request: #532.
Related pull request: #531.
2018-02-15 10:56:14 +01:00
Christoph Strobl
c873e49d71 DATAMONGO-1870 - Consider skip/limit on MongoOperations.remove(Query, Class).
We now use an _id lookup for remove operations that query with limit or skip parameters. This allows more fine-grained control over the documents removed.

Original pull request: #532.
Related pull request: #531.
2018-02-15 10:56:05 +01:00
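A sketch with a hypothetical Order document and a MongoTemplate named template; the template now resolves the _ids of the first 100 matches and removes exactly those documents:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

// Remove at most 100 expired orders instead of every matching document.
template.remove(query(where("state").is("EXPIRED")).limit(100), Order.class);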
Christoph Strobl
4ebcac19bc DATAMONGO-1860 - Polishing.
Fix references to QuerydslPredicateExecutor.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
Mark Paluch
78212948bc DATAMONGO-1860 - Polishing.
Fix type references in Javadoc. Change lambdas to method references where applicable.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
Mark Paluch
38575baec1 DATAMONGO-1860 - Retrieve result count via QuerydslMongoPredicateExecutor only for paging.
We now use AbstractMongodbQuery.fetch() instead of AbstractMongodbQuery.fetchResults() to execute MongoDB queries. fetchResults() executes both a find(…) and a count(…) query. Retrieving the record count is an expensive operation in MongoDB, and the count is not always required. For regular find(…) methods the count is ignored; for paging, the count(…) is only required in certain result/request scenarios.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
Mark Paluch
f1a3c37a79 DATAMONGO-1865 - Polishing.
Adapt to collection name retrieval during query execution. Slightly reword documentation and JavaDoc.

Original pull request: #530.
2018-02-14 12:01:44 +01:00
Christoph Strobl
c668a47243 DATAMONGO-1865 - Avoid IncorrectResultSizeDataAccessException for derived findFirst/findTop queries.
We now return the first result when executing findFirst/findTop queries. This fixes a glitch introduced in the Kay release that threw IncorrectResultSizeDataAccessException for single-entity executions returning more than one result, which is explicitly not the desired behavior in this case.

Original pull request: #530.
2018-02-14 12:01:44 +01:00
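A sketch using a hypothetical Person repository; the query below returns the first match even when several documents match:

import org.springframework.data.repository.Repository;

interface PersonRepository extends Repository<Person, String> {

    // Single-entity findFirst execution: no IncorrectResultSizeDataAccessException for multiple matches.
    Person findFirstByLastnameOrderByAgeDesc(String lastname);
}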
Mark Paluch
6a20ddf5a2 DATAMONGO-1871 - Polishing.
Migrate test to AssertJ.

Original pull request: #533.
2018-02-14 11:05:20 +01:00
Christoph Strobl
cec6526543 DATAMONGO-1871 - Fix AggregationExpression aliasing.
We now make sure to allow a nested property alias by setting the target.

Original pull request: #533.
2018-02-14 11:05:17 +01:00
Oliver Gierke
46ea58f3b9 DATAMONGO-1872 - Polishing.
Fixed @since tag for newly introduced method in MongoEntityMetadata.
2018-02-13 12:24:09 +01:00
Oliver Gierke
ebaea8d22f DATAMONGO-1872 - Repository query execution no longer prematurely fixes the collection to be queried.
We now avoid calling ….inCollection(…) with a fixed, one-time-calculated collection name to make sure we resolve the collection dynamically. That's necessary to make sure SpEL expressions in @Document are evaluated for every query execution.
2018-02-13 12:18:19 +01:00
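A sketch of the kind of mapping this affects, using a hypothetical Order entity and an illustrative SpEL expression; the collection name is now resolved on every query execution instead of once:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// The expression is re-evaluated for each repository query, so the resolved collection can change at runtime.
@Document(collection = "#{T(java.lang.System).getProperty('order.collection', 'orders')}")
class Order {

    @Id String id;
    String state;
}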
Christoph Strobl
ed6aaeed25 DATAMONGO-1794 - Updated changelog. 2018-02-06 11:14:01 +01:00
Mark Paluch
89b1b6fbb2 DATAMONGO-1830 - After release cleanups. 2018-01-24 13:46:10 +01:00
Mark Paluch
23769301b5 DATAMONGO-1830 - Prepare next development iteration. 2018-01-24 13:46:08 +01:00
Mark Paluch
3399160acf DATAMONGO-1830 - Release version 2.0.3 (Kay SR3). 2018-01-24 13:21:24 +01:00
Mark Paluch
32a8ee9b31 DATAMONGO-1830 - Prepare 2.0.3 (Kay SR3). 2018-01-24 13:20:39 +01:00
Mark Paluch
17cea70abc DATAMONGO-1830 - Updated changelog. 2018-01-24 13:20:34 +01:00
Mark Paluch
07731c39ba DATAMONGO-1858 - Fix line endings to LF. 2018-01-24 12:57:24 +01:00
Mark Paluch
c5b580b82b DATAMONGO-1829 - Updated changelog. 2018-01-24 12:22:10 +01:00
Mark Paluch
9a1385186e DATAMONGO-1843 - Polishing.
Convert anonymous classes to lambdas. Typo fixes. Migrate test to AssertJ.

Original pull request: #526.
2018-01-23 10:34:46 +01:00
Christoph Strobl
704524d7f4 DATAMONGO-1843 - Fix parameter shadowing in ArrayOperators reduce.
Original pull request: #526.
2018-01-23 10:34:35 +01:00
Christoph Strobl
cc9a3ac8da DATAMONGO-1850 - Polishing.
Remove blank line, add tests and migrate to AssertJ.

Original Pull Request: #527
2018-01-22 16:16:52 +01:00
Mark Paluch
acb68f3ca4 DATAMONGO-1850 - Guard GridFsResource.getContentType() against absent file metadata.
We now fall back to GridFS.getContentType() if GridFS metadata is absent to prevent a null dereference.

Original Pull Request: #527
2018-01-22 15:14:35 +01:00
Mark Paluch
3088f0469e DATAMONGO-1824 - Polishing.
Move methods from AggregationCommandPreparer and AggregationResultPostProcessor to BatchAggregationLoader. Extract field names into constants. Minor variable renames. Add a unit test for an aggregation response without cursor use. Migrate tests to AssertJ.

Original pull request: #521.
2017-12-15 14:30:24 +01:00
Christoph Strobl
a1ae04881d DATAMONGO-1824 - Skip tests no longer applicable for MongoDB 3.6.
Original pull request: #521.
2017-12-15 14:25:31 +01:00
Christoph Strobl
6f55c66060 DATAMONGO-1824 - Fix aggregation execution for MongoDB 3.6.
We now send aggregation commands along with a cursor batch size for compatibility with MongoDB 3.6, which no longer supports aggregations without a cursor. We consume the whole cursor before returning and converting results, thereby avoiding the 16MB aggregation result limit. For MongoDB versions that do not support aggregation cursors we return results directly.

Original pull request: #521.
2017-12-15 14:25:20 +01:00
Christoph Strobl
f86447bd04 DATAMONGO-1831 - Fix array type conversion for empty source.
We now make sure that we convert empty sources to the corresponding target type. This prevents entity instantiation from failing due to incorrect argument types when invoking the constructor.

Original pull request: #520.
2017-12-02 12:18:55 -08:00
Mark Paluch
1bb4324b2e DATAMONGO-1816 - After release cleanups. 2017-11-27 16:42:54 +01:00
Mark Paluch
856506f121 DATAMONGO-1816 - Prepare next development iteration. 2017-11-27 16:42:52 +01:00
Mark Paluch
2a81dc75a8 DATAMONGO-1816 - Release version 2.0.2 (Kay SR2). 2017-11-27 16:12:34 +01:00
Mark Paluch
58cd4c08ca DATAMONGO-1816 - Prepare 2.0.2 (Kay SR2). 2017-11-27 16:11:21 +01:00
Mark Paluch
344e019143 DATAMONGO-1816 - Updated changelog. 2017-11-27 16:11:15 +01:00
Mark Paluch
918b7e96bb DATAMONGO-1799 - Updated changelog. 2017-11-27 15:58:45 +01:00
Christoph Strobl
fce7a5c1cb DATAMONGO-1818 - Polishing.
Move overlapping/duplicate documentation into one place.

Original Pull Request: #512
2017-11-27 07:53:22 +01:00
Mark Paluch
dbd2de8e0f DATAMONGO-1818 - Reword tailable cursors documentation.
Fix reference to @Tailable annotation. Slightly reword documentation.

Original Pull Request: #512
2017-11-27 07:53:08 +01:00
Mark Paluch
0dbe331ab0 DATAMONGO-1823 - Polishing.
Replace constructor with lombok's RequiredArgsConstructor. Add Nullable annotation. Tiny reformatting. Align license header. Migrate test to AssertJ.

Original pull request: #517.
2017-11-22 14:33:20 +01:00
Christoph Strobl
846ebcd91d DATAMONGO-1823 - Emit ApplicationEvents using projecting find methods.
We now again emit application events when using finder methods that apply projection.

Original pull request: #517.
2017-11-22 14:33:20 +01:00
Oliver Gierke
9e0b5caeac DATAMONGO-1737 - BasicMongoPersistentEntity now correctly initializes comparator.
In BasicMongoPersistentEntity.verify() we now properly call the super method to make sure the comparators that honor the @Field's order value are initialized properly.
2017-11-17 14:55:00 +01:00
Mark Paluch
cf70f5e5eb DATAMONGO-1819 - Polishing.
Use native field names for NamedMongoScript query instead of relying on metadata-based mapping as NamedMongoScript is considered a simple top-level type.

Related pull request: #513.
2017-11-17 13:49:06 +01:00
Mark Paluch
331dc6df6f DATAMONGO-1821 - Fix method ambiguity in tests when compiling against MongoDB 3.6 2017-11-07 12:47:51 +01:00
Mark Paluch
a51dce2c90 DATAMONGO-1820 - Set Mongo's Feature Compatibility flag for TravisCI build to 3.4.
Apply setFeatureCompatibilityVersion to upgrade MongoDB to 3.4 features.
2017-11-06 10:28:10 +01:00
Mark Paluch
c0cf1aa95b DATAMONGO-1817 - Polishing.
Remove blank line.

Original pull request: #510.
2017-11-06 10:02:35 +01:00
Sola
7104ffa543 DATAMONGO-1817 - Align nullability in Kotlin MongoOperationsExtensions with Java API.
Return types in MongoOperationsExtensions are now aligned to the nullability of MongoOperations.

Original pull request: #510.
2017-11-06 10:02:35 +01:00
Oliver Gierke
28d2fb6680 DATAMONGO-1793 - After release cleanups. 2017-10-27 15:50:48 +02:00
Oliver Gierke
140e26946f DATAMONGO-1793 - Prepare next development iteration. 2017-10-27 15:50:45 +02:00
Oliver Gierke
f4e730ce87 DATAMONGO-1793 - Release version 2.0.1 (Kay SR1). 2017-10-27 15:25:11 +02:00
Oliver Gierke
e3a83ebc42 DATAMONGO-1793 - Prepare 2.0.1 (Kay SR1). 2017-10-27 15:24:24 +02:00
Oliver Gierke
f65c1e324e DATAMONGO-1793 - Updated changelog. 2017-10-27 15:24:14 +02:00
Oliver Gierke
1dd0061f03 DATAMONGO-1815 - Adapt API changes in Property in test cases. 2017-10-27 11:13:31 +02:00
Mark Paluch
5ea860700c DATAMONGO-1814 - Update reference documentation for faceted classification.
Original pull request: #426.
Original ticket: DATAMONGO-1552.
2017-10-26 09:44:50 +02:00
Christoph Strobl
3dd653a702 DATAMONGO-1811 - Update documentation of MongoOperations.executeCommand.
Update Javadoc and reference documentation.
2017-10-24 14:59:47 +02:00
Christoph Strobl
f87847407b DATAMONGO-1805 - Update GridFsOperations documentation.
Fix return type in reference documentation and update Javadoc.
2017-10-24 14:59:40 +02:00
Christoph Strobl
433a125c9e DATAMONGO-1806 - Polishing.
Remove unused import, trailing whitespaces and update Javadoc.

Original Pull Request: #506
2017-10-24 14:59:33 +02:00
hartmut
5827cb0971 DATAMONGO-1806 - Fix Javadoc for GridFsResource.
Original Pull Request: #506
2017-10-24 14:59:24 +02:00
Mark Paluch
0109bf6858 DATAMONGO-1809 - Introduce AssertJ assertions for Document.
Original pull request: #508.
2017-10-24 14:45:03 +02:00
Christoph Strobl
49d1555576 DATAMONGO-1809 - Polishing.
Move tests to AssertJ.

Original pull request: #508.
2017-10-24 14:45:03 +02:00
Christoph Strobl
fdbb305b8e DATAMONGO-1809 - Fix positional parameter detection for PropertyPaths.
We now make sure to capture all digits for positional parameters.

Original pull request: #508.
2017-10-24 14:45:03 +02:00
Mark Paluch
49dd03311a DATAMONGO-1696 - Mention appropriate EnableMongoAuditing annotation in reference documentation. 2017-10-20 08:45:33 +02:00
Mark Paluch
a86a3210e1 DATAMONGO-1802 - Polishing.
Reduce converter visibility to MongoConverters' package scope. Tiny alignment in Javadoc wording. Update copyright year. Create empty byte arrays with an element count instead of an initializer.

Original pull request: #505.
2017-10-17 14:52:11 +02:00
Christoph Strobl
4b655abfb6 DATAMONGO-1802 - Add Binary to byte array converter.
We now provide and register a Binary-to-byte[] converter to support conversion of binary data to a byte array. MongoDB deserializes binary data through the document API to its Binary type. With this converter, we reinstated the previous capability to use byte arrays for binary data within domain types.

Original pull request: #505.
2017-10-17 14:52:11 +02:00
Oliver Gierke
0963e6cf77 DATAMONGO-1775 - Updated changelog. 2017-10-11 19:03:29 +02:00
Oliver Gierke
3e1b2c4bdb DATAMONGO-1795 - Removed obsolete Kotlin build setup. 2017-10-04 11:05:27 +02:00
Mark Paluch
03e0e0c431 DATAMONGO-1776 - After release cleanups. 2017-10-02 11:38:04 +02:00
Mark Paluch
51900021a1 DATAMONGO-1776 - Prepare next development iteration. 2017-10-02 11:38:03 +02:00
Mark Paluch
e5e8fa45c2 DATAMONGO-1776 - Release version 2.0 GA (Kay). 2017-10-02 11:10:22 +02:00
Mark Paluch
f5ad4e42f9 DATAMONGO-1776 - Prepare 2.0 GA (Kay). 2017-10-02 11:09:17 +02:00
Mark Paluch
e6b7d2ffd0 DATAMONGO-1776 - Updated changelog. 2017-10-02 11:09:09 +02:00
Mark Paluch
5b24d3fd0b DATAMONGO-1778 - Polishing.
Javadoc, modifiers.

Original pull request: #503.
2017-10-02 10:38:20 +02:00
Christoph Strobl
10f13c8f37 DATAMONGO-1778 - Polishing.
Migrate UpdateTests to AssertJ and adjust constructor visibility.

Original pull request: #503.
2017-10-02 10:38:20 +02:00
Christoph Strobl
c05f8f056c DATAMONGO-1778 - Fix equals() and hashCode() for Update.
We now include the entire update document with its modifiers and the isolation flag when computing the hash code and comparing for object equality.

Original pull request: #503.
2017-10-02 10:37:34 +02:00
Mark Paluch
dbd38a8e82 DATAMONGO-1791 - Polishing.
Replace RxJava 1 repositories with RxJava 2 repositories. Fix broken links. Fix duplicate section ids.
2017-09-27 12:19:11 +02:00
Mark Paluch
77b1f3cb37 DATAMONGO-1791 - Adapt to changed Spring Framework 5 documentation structure.
Update links in the reference docs to their new locations.
2017-09-27 12:13:51 +02:00
Mark Paluch
5444ac39b5 DATAMONGO-1785 - Downgrade to CDI 1.0.
We now build against CDI 1.0 again while using CDI 2.0 for testing.
2017-09-21 13:50:39 +02:00
Mark Paluch
cf476b9bc8 DATAMONGO-1787 - Added explicit automatic module name for JDK 9. 2017-09-21 13:50:39 +02:00
Christoph Strobl
f28d47b01b DATAMONGO-1777 - Polishing. 2017-09-21 11:32:52 +02:00
Christoph Strobl
5bf03cfa70 DATAMONGO-1777 - Pretty print Modifiers when calling Update.toString().
We now make sure to pretty-print Update modifiers by safely rendering them to a JSON representation, including an optional $isolated operator if applicable.
Along the way we also fix a flaw in PushOperationBuilder that ignored e.g. $position when pushing single values.
2017-09-21 11:32:50 +02:00
Oliver Gierke
98e893636b DATAMONGO-1779 - Polishing.
Fixed imports.
2017-09-21 11:31:45 +02:00
Oliver Gierke
4b552b051e DATAMONGO-1779 - Fixed handling of empty queries in MongoTemplate.find(…).
Calls to MongoTemplate.find(…) were routed to ….findAll(…) in case no criteria definition or sort was defined on the query. This, however, neglected that cursor preparation aspects (limits, skips) are defined on the query as well, which caused them not to be applied correctly. Removed the over-optimistic re-routing so that normal query execution is now always applied.
2017-09-21 11:31:44 +02:00
Mark Paluch
1c295b62c6 DATAMONGO-1786 - Adapt tests to nullability validation in Spring Data Commons.
Related issue: DATACMNS-1157.
2017-09-21 11:27:41 +02:00
Christoph Strobl
0a8458a045 DATAMONGO-1784 - Polishing.
Update JavaDoc, enforce nullability constraints and add tests.

Original Pull Request: #501
2017-09-20 12:47:29 +02:00
Sergey Shcherbakov
a3b9fb33ea DATAMONGO-1784 - Add expression support to GroupOperation#sum().
We now allow passing an AggregationExpression to GroupOperation.sum(…), which allows the construction of more complex expressions.

Original Pull Request: #501
2017-09-20 12:46:14 +02:00
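A sketch with hypothetical line-item fields, summing a computed per-document expression within each group:

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ArithmeticOperators;
import org.springframework.data.mongodb.core.aggregation.GroupOperation;

// $group by storeId with revenue: { $sum: { $multiply: [ "$price", "$quantity" ] } }.
GroupOperation group = Aggregation.group("storeId")
        .sum(ArithmeticOperators.valueOf("price").multiplyBy("quantity")).as("revenue");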
Christoph Strobl
3d651b72ad DATAMONGO-1782 - Polishing.
toCyclePath now returns an empty String when the Path does not cycle.
Also split up and add tests, and move the code to Java 8.

Original Pull Request: #500
2017-09-19 11:07:04 +02:00
Mark Paluch
187c25bcc0 DATAMONGO-1782 - Detect type cycles using PersistentProperty paths.
We now rely on PersistentProperty paths to detect cycles between types. Cycles are detected while building up the path object, and PersistentProperty traversal stops after the cycle has been hit for the second time, so that indexes are generated for at least one hierarchy level.

Previously, we used String-based property dot paths and compared whether a path to a particular property had already been found using a substring search, which caused false positives if a property was reachable via multiple paths.

Original Pull Request: #500
2017-09-19 10:03:40 +02:00
Mark Paluch
087482d82e DATAMONGO-1785 - Upgrade to OpenWebBeans 2.0.1.
Also upgrade to EqualsVerifier 1.7.8 to resolve an ASM version conflict.
2017-09-18 15:21:57 +02:00
Mark Paluch
e80d1df571 DATAMONGO-1781 - Update what's new in reference documentation. 2017-09-14 14:09:08 +02:00
Oliver Gierke
a9b1b640c0 DATAMONGO-1754 - After release cleanups. 2017-09-11 17:40:21 +02:00
Oliver Gierke
b888864407 DATAMONGO-1754 - Prepare next development iteration. 2017-09-11 17:40:18 +02:00
Oliver Gierke
3e672e4563 DATAMONGO-1754 - Release version 2.0 RC3 (Kay). 2017-09-11 17:24:45 +02:00
Oliver Gierke
0fecd8bed9 DATAMONGO-1754 - Prepare 2.0 RC3 (Kay). 2017-09-11 17:23:51 +02:00
Oliver Gierke
4b6ff36724 DATAMONGO-1754 - Updated changelog. 2017-09-11 17:23:43 +02:00
Mark Paluch
ab31b24f99 DATAMONGO-1755 - Updated changelog. 2017-09-11 12:43:54 +02:00
Christoph Strobl
ba81caffe4 DATAMONGO-1631 - Polishing
Update Javadoc and test classes.

Original Pull Request: #490
2017-09-08 20:01:28 +02:00
Mark Paluch
3ab3dab2b4 DATAMONGO-1631 - Better method names for reactive configuration support.
This commit renames the methods in AbstractReactiveMongoConfiguration that expose MongoClient and ReactiveMongoDatabaseFactory instances. The renaming prevents possible clashes with beans created via AbstractMongoConfiguration (blocking driver), as bean names default to the producing method name.

Original Pull Request: #490
2017-09-08 20:00:42 +02:00
Mark Paluch
2d21b04a12 DATAMONGO-1774 - Polishing.
Synchronize javadoc between reactive and imperative operations interfaces. Add trailing punctuation. Consistent naming for collection name parameter.

Use Object.class instead of null in ReactiveMongoTemplate.findAllAndRemove to prevent null-pointer exceptions.

Original pull request: #498.
2017-09-06 11:47:23 +02:00
Christoph Strobl
301dd51560 DATAMONGO-1774 - Enforce null constraint checks.
And remove code paths returning null where a reactive wrapper type would be expected.

Original pull request: #498.
2017-09-06 11:44:36 +02:00
Christoph Strobl
d3d6242a16 DATAMONGO-1774 - Fix infinite loop in ReactiveMongoOperations#remove(Mono, String).
Original pull request: #498.
2017-09-06 11:44:20 +02:00
Christoph Strobl
7cf69c5b1a DATAMONGO-1772 - Fix UpdateMapper type key rendering for abstract list elements contained in concrete typed ones.
Original pull request: #497.
2017-09-05 10:59:02 +02:00
Christoph Strobl
995e5bf830 DATAMONGO-1770 - Upgrade to MongoDB Java driver 3.5 and reactive driver 1.6.
Fix test failures due to changed JSON rendering.
2017-08-31 09:00:05 +02:00
Christoph Strobl
f281ab6c56 DATAMONGO-1771 - Fix Travis CI setup. 2017-08-29 15:19:23 +02:00
Mark Paluch
3012bcd575 DATAMONGO-1762 - Fix line endings.
Convert line endings from CRLF to LF.
2017-08-29 14:51:59 +02:00
Mark Paluch
0be4d1345e DATAMONGO-1762 - Polishing.
Add missing Nullable annotations. Provide getRequired…() methods for values known to exist. Update javadoc according to null-allowance/non-null requirements. Remove superfluous null-checks for values known to be non-null. Remove license from package Javadoc. Update license headers, remove trailing whitespaces.
2017-08-29 14:51:39 +02:00
Christoph Strobl
bdd5c9dec7 DATAMONGO-1762 - Add @Nullable and @NonNullApi annotations.
Marked all packages with Spring Framework's @NonNullApi. Added Spring's @Nullable to methods, parameters and fields that take or produce null values. Adapted using code to make sure the IDE can evaluate the null flow properly. Fixed Javadoc in places where an invalid null-handling policy was advertised. Strengthened null requirements for types that expose null instances.
2017-08-29 14:48:15 +02:00
Mark Paluch
e28bede416 DATAMONGO-1768 - Polishing.
Use Lombok to generate constructors. Extend javadocs. Make methods static/reorder methods where possible. Use diamond syntax where possible. Formatting.

Original pull request: #496.
2017-08-25 10:47:57 +02:00
Christoph Strobl
2d825bed41 DATAMONGO-1768 - Introduce UntypedExampleMatcher.
Introducing UntypedExampleMatcher allows the creation of QBE criteria that do not infer a strict type match.

Original pull request: #496.
2017-08-25 10:45:58 +02:00
Christoph Strobl
5fedbe9598 DATAMONGO-1768 - Allow ignoring type restriction when issuing QBE.
We now allow removing the type restriction inferred by the QBE mapping via an ignored path expression on the ExampleMatcher. This allows creating untyped QBE expressions that return all entities matching the query without limiting the result to types assignable to the probe itself.

Original pull request: #496.
2017-08-25 10:45:38 +02:00
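A sketch using hypothetical probe and result types plus a MongoTemplate named template; with UntypedExampleMatcher the derived query no longer carries the _class restriction, so matching documents are returned regardless of the probe's type:

import static org.springframework.data.mongodb.core.query.Criteria.byExample;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.List;

import org.springframework.data.domain.Example;
import org.springframework.data.mongodb.core.query.UntypedExampleMatcher;

// Match on the probe's populated fields only, without restricting documents to the probe's type.
Example<LegacyPerson> example = Example.of(new LegacyPerson("Matthews"), UntypedExampleMatcher.matching());
List<Person> result = template.find(query(byExample(example)), Person.class);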
Oliver Gierke
faf7e36311 DATAMONGO-1765 - Polishing.
Lambdas and AssertJ in test cases.
2017-08-07 18:23:10 +02:00
Oliver Gierke
a95f77245e DATAMONGO-1765 - DefaultDbRefResolver now maps duplicate references correctly.
On bulk resolution of a DBRef array, we now map the resulting documents back to their ids to make sure that recurring identifiers are mapped to the corresponding documents.
2017-08-07 17:48:36 +02:00
Mark Paluch
d956c8cbf2 DATAMONGO-1756 - Polishing.
Remove redundant casts. Add author tag.

Original pull request: #491.
2017-08-02 08:39:36 +02:00
Christoph Strobl
e8ae928e74 DATAMONGO-1756 - Fix nested field name resolution for arithmetic aggregation ops.
Original pull request: #491.
2017-08-02 08:39:23 +02:00
Oliver Gierke
f2e72fe931 DATAMONGO-1757 - Polishing.
Formatting.
2017-07-27 18:09:47 +02:00
Oliver Gierke
c35ea14c4f DATAMONGO-1757 - Improved exception being thrown if Document is supposed to be read into unsuitable type.
In case we run into a situation where we're supposed to read a Document into a type that's not a PersistentEntity, we previously only exposed the latter in the exception. This has now been changed to add more context to the exception, including the source value to be read and the target type that we were supposed to read into.

This should give users a better clue about where the problem is.
2017-07-27 18:09:47 +02:00
Christoph Strobl
fc65bffc21 DATAMONGO-1758 - Remove spring-data-mongodb-log4j module from benchmarks profile. 2017-07-27 14:06:39 +02:00
Oliver Gierke
190f21fe46 DATAMONGO-1750 - Updated changelog. 2017-07-27 00:48:58 +02:00
Mark Paluch
5245cd4b55 DATAMONGO-1706 - Adapt to deprecated RxJava 1 CRUD repositories.
Move off RxJava1CrudRepository as base repository in tests by extending just Repository and declaring query methods/overloads using RxJava1 types.
2017-07-26 14:18:17 +02:00
Oliver Gierke
b72b8c5e09 DATAMONGO-1751 - After release cleanups. 2017-07-25 16:10:38 +02:00
Oliver Gierke
6652279189 DATAMONGO-1751 - Prepare next development iteration. 2017-07-25 16:10:34 +02:00
329 changed files with 16971 additions and 8617 deletions

.travis.yml

@@ -3,33 +3,33 @@ language: java
 jdk:
   - oraclejdk8
-before_script:
-  - mongod --version
+before_install:
+  - mkdir -p downloads
+  - mkdir -p var/db var/log
+  - if [[ ! -d downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION} ]] ; then cd downloads && wget https://fastdl.mongodb.org/linux/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}.tgz && tar xzf mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}.tgz && cd ..; fi
+  - downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongod --version
+  - downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongod --dbpath var/db --replSet rs0 --fork --logpath var/log/mongod.log
+  - sleep 10
+  - |-
+    downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongo --eval "rs.initiate({_id: 'rs0', members:[{_id: 0, host: '127.0.0.1:27017'}]});"
+    sleep 15
 env:
   matrix:
     - PROFILE=ci
-    - PROFILE=mongo35-next
-# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
-# apt-get starts a MongoDB instance so it's not started using before_script
+  global:
+    - MONGO_VERSION=3.7.9
 addons:
   apt:
-    sources:
-      - mongodb-upstart
-      - sourceline: 'deb [arch=amd64] http://repo.mongodb.org/apt/ubuntu precise/mongodb-org/3.4 multiverse'
-        key_url: 'https://www.mongodb.org/static/pgp/server-3.4.asc'
     packages:
-      - mongodb-org-server
-      - mongodb-org-shell
       - oracle-java8-installer
 sudo: false
 cache:
   directories:
     - $HOME/.m2
+    - downloads
+install: true
 script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

pom.xml

@@ -5,7 +5,7 @@
   <groupId>org.springframework.data</groupId>
   <artifactId>spring-data-mongodb-parent</artifactId>
-  <version>2.0.0.RC2</version>
+  <version>2.0.10.RELEASE</version>
   <packaging>pom</packaging>
   <name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
   <parent>
     <groupId>org.springframework.data.build</groupId>
     <artifactId>spring-data-parent</artifactId>
-    <version>2.0.0.RC2</version>
+    <version>2.0.10.RELEASE</version>
   </parent>
   <modules>
@@ -27,9 +27,9 @@
   <properties>
     <project.type>multi</project.type>
     <dist.id>spring-data-mongodb</dist.id>
-    <springdata.commons>2.0.0.RC2</springdata.commons>
-    <mongo>3.4.2</mongo>
-    <mongo.reactivestreams>1.5.0</mongo.reactivestreams>
+    <springdata.commons>2.0.10.RELEASE</springdata.commons>
+    <mongo>3.5.0</mongo>
+    <mongo.reactivestreams>1.6.0</mongo.reactivestreams>
     <jmh.version>1.19</jmh.version>
   </properties>
@@ -115,38 +115,6 @@
   <profiles>
-    <profile>
-      <id>mongo34-next</id>
-      <properties>
-        <mongo>3.4.3-SNAPSHOT</mongo>
-      </properties>
-      <repositories>
-        <repository>
-          <id>mongo-snapshots</id>
-          <url>https://oss.sonatype.org/content/repositories/snapshots</url>
-        </repository>
-      </repositories>
-    </profile>
-    <profile>
-      <id>mongo35-next</id>
-      <properties>
-        <mongo>3.5.0-SNAPSHOT</mongo>
-      </properties>
-      <repositories>
-        <repository>
-          <id>mongo-snapshots</id>
-          <url>https://oss.sonatype.org/content/repositories/snapshots</url>
-        </repository>
-      </repositories>
-    </profile>
     <profile>
       <id>release</id>
       <build>
@@ -165,12 +133,29 @@
       <modules>
         <module>spring-data-mongodb</module>
        <module>spring-data-mongodb-cross-store</module>
-        <module>spring-data-mongodb-log4j</module>
         <module>spring-data-mongodb-distribution</module>
         <module>spring-data-mongodb-benchmarks</module>
       </modules>
     </profile>
+    <profile>
+      <id>distribute</id>
+      <build>
+        <plugins>
+          <plugin>
+            <groupId>org.asciidoctor</groupId>
+            <artifactId>asciidoctor-maven-plugin</artifactId>
+            <configuration>
+              <attributes>
+                <mongo-reactivestreams>${mongo.reactivestreams}</mongo-reactivestreams>
+                <reactor>${reactor}</reactor>
+              </attributes>
+            </configuration>
+          </plugin>
+        </plugins>
+      </build>
+    </profile>
   </profiles>
   <dependencies>
@@ -184,8 +169,8 @@
   <repositories>
     <repository>
-      <id>spring-libs-milestone</id>
-      <url>https://repo.spring.io/libs-milestone</url>
+      <id>spring-libs-release</id>
+      <url>https://repo.spring.io/libs-release</url>
     </repository>
   </repositories>

View File

@@ -7,7 +7,7 @@
   <parent>
     <groupId>org.springframework.data</groupId>
     <artifactId>spring-data-mongodb-parent</artifactId>
-    <version>2.0.0.RC2</version>
+    <version>2.0.10.RELEASE</version>
     <relativePath>../pom.xml</relativePath>
   </parent>

View File

@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.springframework.data</groupId>
     <artifactId>spring-data-mongodb-parent</artifactId>
-    <version>2.0.0.RC2</version>
+    <version>2.0.10.RELEASE</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
@@ -16,6 +16,8 @@
   <properties>
     <jpa>2.1.1</jpa>
     <hibernate>5.2.1.Final</hibernate>
+    <java-module-name>spring.data.mongodb.cross.store</java-module-name>
+    <project.root>${basedir}/..</project.root>
   </properties>
   <dependencies>
@@ -48,7 +50,7 @@
     <dependency>
       <groupId>org.springframework.data</groupId>
       <artifactId>spring-data-mongodb</artifactId>
-      <version>2.0.0.RC2</version>
+      <version>2.0.10.RELEASE</version>
     </dependency>
     <!-- reactive -->

MongoChangeSetPersister.java

@@ -1,214 +1,214 @@
/*
 * Copyright 2011-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.crossstore;

import javax.persistence.EntityManagerFactory;

import org.bson.Document;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;

import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
import com.mongodb.client.result.DeleteResult;

/**
 * @author Thomas Risberg
 * @author Oliver Gierke
 * @author Alex Vengrovsk
 * @author Mark Paluch
 * @deprecated will be removed without replacement.
 */
@Deprecated
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {

    private static final String ENTITY_CLASS = "_entity_class";
    private static final String ENTITY_ID = "_entity_id";
    private static final String ENTITY_FIELD_NAME = "_entity_field_name";
    private static final String ENTITY_FIELD_CLASS = "_entity_field_class";

    private final Logger log = LoggerFactory.getLogger(getClass());

    private MongoTemplate mongoTemplate;
    private EntityManagerFactory entityManagerFactory;

    public void setMongoTemplate(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
        this.entityManagerFactory = entityManagerFactory;
    }

    /*
     * (non-Javadoc)
     * @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
     */
    public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
            throws DataAccessException, NotFoundException {

        if (id == null) {
            log.debug("Unable to load MongoDB data for null id");
            return;
        }

        String collName = getCollectionNameForEntity(entityClass);

        final Document dbk = new Document();
        dbk.put(ENTITY_ID, id);
        dbk.put(ENTITY_CLASS, entityClass.getName());

        if (log.isDebugEnabled()) {
            log.debug("Loading MongoDB data for {}", dbk);
        }

        mongoTemplate.execute(collName, new CollectionCallback<Object>() {
            public Object doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {

                for (Document dbo : collection.find(dbk)) {

                    String key = (String) dbo.get(ENTITY_FIELD_NAME);

                    if (log.isDebugEnabled()) {
                        log.debug("Processing key: {}", key);
                    }

                    if (!changeSet.getValues().containsKey(key)) {

                        String className = (String) dbo.get(ENTITY_FIELD_CLASS);

                        if (className == null) {
                            throw new DataIntegrityViolationException(
                                    "Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
                        }

                        Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
                        Object value = mongoTemplate.getConverter().read(clazz, dbo);

                        if (log.isDebugEnabled()) {
                            log.debug("Adding to ChangeSet: {}", key);
                        }

                        changeSet.set(key, value);
                    }
                }

                return null;
            }
        });
    }

    /*
     * (non-Javadoc)
     * @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
     */
    public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {

        if (log.isDebugEnabled()) {
            log.debug("getPersistentId called on {}", entity);
        }

        if (entityManagerFactory == null) {
            throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
        }

        return entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
    }

    /*
     * (non-Javadoc)
     * @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/ */
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException { public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) { if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush."); log.debug("Flush: changeset was null, nothing to flush.");
return 0L; return 0L;
} }
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Flush: changeset: {}", cs.getValues()); log.debug("Flush: changeset: {}", cs.getValues());
} }
String collName = getCollectionNameForEntity(entity.getClass()); String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) { if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName); mongoTemplate.createCollection(collName);
} }
for (String key : cs.getValues().keySet()) { for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) { if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key); Object value = cs.getValues().get(key);
final Document dbQuery = new Document(); final Document dbQuery = new Document();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs)); dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName()); dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key); dbQuery.put(ENTITY_FIELD_NAME, key);
final Document dbId = mongoTemplate.execute(collName, new CollectionCallback<Document>() { final Document dbId = mongoTemplate.execute(collName, new CollectionCallback<Document>() {
public Document doInCollection(MongoCollection<Document> collection) public Document doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException { throws MongoException, DataAccessException {
Document id = collection.find(dbQuery).first(); Document id = collection.find(dbQuery).first();
return id; return id;
} }
}); });
if (value == null) { if (value == null) {
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Flush: removing: {}", dbQuery); log.debug("Flush: removing: {}", dbQuery);
} }
mongoTemplate.execute(collName, new CollectionCallback<Object>() { mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection) public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException { throws MongoException, DataAccessException {
DeleteResult dr = collection.deleteMany(dbQuery); DeleteResult dr = collection.deleteMany(dbQuery);
return null; return null;
} }
}); });
} else { } else {
final Document dbDoc = new Document(); final Document dbDoc = new Document();
dbDoc.putAll(dbQuery); dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Flush: saving: {}", dbQuery); log.debug("Flush: saving: {}", dbQuery);
} }
mongoTemplate.getConverter().write(value, dbDoc); mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName()); dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) { if (dbId != null) {
dbDoc.put("_id", dbId.get("_id")); dbDoc.put("_id", dbId.get("_id"));
} }
mongoTemplate.execute(collName, new CollectionCallback<Object>() { mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection) public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException { throws MongoException, DataAccessException {
if (dbId != null) { if (dbId != null) {
collection.replaceOne(Filters.eq("_id", dbId.get("_id")), dbDoc); collection.replaceOne(Filters.eq("_id", dbId.get("_id")), dbDoc);
} else { } else {
if (dbDoc.containsKey("_id") && dbDoc.get("_id") == null) { if (dbDoc.containsKey("_id") && dbDoc.get("_id") == null) {
dbDoc.remove("_id"); dbDoc.remove("_id");
} }
collection.insertOne(dbDoc); collection.insertOne(dbDoc);
} }
return null; return null;
} }
}); });
} }
} }
} }
return 0L; return 0L;
} }
/** /**
* Returns the collection the given entity type shall be persisted to. * Returns the collection the given entity type shall be persisted to.
* *
* @param entityClass must not be {@literal null}. * @param entityClass must not be {@literal null}.
* @return * @return
*/ */
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) { private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return mongoTemplate.getCollectionName(entityClass); return mongoTemplate.getCollectionName(entityClass);
} }
} }
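For orientation, a minimal wiring sketch for this (deprecated) persister. The configuration class name is made up, and the package of MongoChangeSetPersister is assumed from the cross-store module; only the two setters shown above are exercised.

import javax.persistence.EntityManagerFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.MongoChangeSetPersister;

@Configuration
class CrossStoreConfig {

  @Bean
  MongoChangeSetPersister changeSetPersister(MongoTemplate mongoTemplate,
      EntityManagerFactory entityManagerFactory) {

    // Wire the two collaborators the class exposes via setters.
    MongoChangeSetPersister persister = new MongoChangeSetPersister();
    persister.setMongoTemplate(mongoTemplate);
    persister.setEntityManagerFactory(entityManagerFactory);
    return persister;
  }
}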


@@ -13,7 +13,7 @@
 	<parent>
 		<groupId>org.springframework.data</groupId>
 		<artifactId>spring-data-mongodb-parent</artifactId>
-		<version>2.0.0.RC2</version>
+		<version>2.0.10.RELEASE</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>


@@ -11,13 +11,15 @@
 	<parent>
 		<groupId>org.springframework.data</groupId>
 		<artifactId>spring-data-mongodb-parent</artifactId>
-		<version>2.0.0.RC2</version>
+		<version>2.0.10.RELEASE</version>
 		<relativePath>../pom.xml</relativePath>
 	</parent>
 	<properties>
 		<objenesis>1.3</objenesis>
-		<equalsverifier>1.5</equalsverifier>
+		<equalsverifier>1.7.8</equalsverifier>
+		<java-module-name>spring.data.mongodb</java-module-name>
+		<project.root>${basedir}/..</project.root>
 	</properties>
 	<dependencies>
@@ -137,6 +139,21 @@
 		</dependency>
 		<!-- CDI -->
+		<!-- Dependency order required to build against CDI 1.0 and test with CDI 2.0 -->
+		<dependency>
+			<groupId>org.apache.geronimo.specs</groupId>
+			<artifactId>geronimo-jcdi_2.0_spec</artifactId>
+			<version>1.0.1</version>
+			<scope>test</scope>
+		</dependency>
+		<dependency>
+			<groupId>javax.interceptor</groupId>
+			<artifactId>javax.interceptor-api</artifactId>
+			<version>1.2.1</version>
+			<scope>test</scope>
+		</dependency>
 		<dependency>
 			<groupId>javax.enterprise</groupId>
 			<artifactId>cdi-api</artifactId>
@@ -146,26 +163,19 @@
 		</dependency>
 		<dependency>
-			<groupId>javax.el</groupId>
-			<artifactId>el-api</artifactId>
-			<version>${cdi}</version>
+			<groupId>javax.annotation</groupId>
+			<artifactId>javax.annotation-api</artifactId>
+			<version>${javax-annotation-api}</version>
 			<scope>test</scope>
 		</dependency>
 		<dependency>
-			<groupId>org.apache.openwebbeans.test</groupId>
-			<artifactId>cditest-owb</artifactId>
+			<groupId>org.apache.openwebbeans</groupId>
+			<artifactId>openwebbeans-se</artifactId>
 			<version>${webbeans}</version>
 			<scope>test</scope>
 		</dependency>
-		<dependency>
-			<groupId>javax.servlet</groupId>
-			<artifactId>servlet-api</artifactId>
-			<version>2.5</version>
-			<scope>test</scope>
-		</dependency>
 		<!-- JSR 303 Validation -->
 		<dependency>
 			<groupId>javax.validation</groupId>
@@ -205,7 +215,6 @@
 		<dependency>
 			<groupId>com.fasterxml.jackson.core</groupId>
 			<artifactId>jackson-databind</artifactId>
-			<version>${jackson}</version>
 			<optional>true</optional>
 		</dependency>
@@ -279,74 +288,9 @@
 	</dependencies>
 	<build>
 		<plugins>
-			<plugin>
-				<artifactId>kotlin-maven-plugin</artifactId>
-				<groupId>org.jetbrains.kotlin</groupId>
-				<version>${kotlin}</version>
-				<configuration>
-					<jvmTarget>${source.level}</jvmTarget>
-				</configuration>
-				<executions>
-					<execution>
-						<id>compile</id>
-						<phase>compile</phase>
-						<goals>
-							<goal>compile</goal>
-						</goals>
-						<configuration>
-							<sourceDirs>
-								<sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
-								<sourceDir>${project.basedir}/src/main/java</sourceDir>
-							</sourceDirs>
-						</configuration>
-					</execution>
-					<execution>
-						<id>test-compile</id>
-						<phase>test-compile</phase>
-						<goals>
-							<goal>test-compile</goal>
-						</goals>
-						<configuration>
-							<sourceDirs>
-								<sourceDir>${project.basedir}/src/test/kotlin</sourceDir>
-								<sourceDir>${project.basedir}/src/test/java</sourceDir>
-							</sourceDirs>
-						</configuration>
-					</execution>
-				</executions>
-			</plugin>
-			<plugin>
-				<groupId>org.apache.maven.plugins</groupId>
-				<artifactId>maven-compiler-plugin</artifactId>
-				<executions>
-					<execution>
-						<id>default-compile</id>
-						<phase>none</phase>
-					</execution>
-					<execution>
-						<id>default-testCompile</id>
-						<phase>none</phase>
-					</execution>
-					<execution>
-						<id>java-compile</id>
-						<phase>compile</phase>
-						<goals>
-							<goal>compile</goal>
-						</goals>
-					</execution>
-					<execution>
-						<id>java-test-compile</id>
-						<phase>test-compile</phase>
-						<goals>
-							<goal>testCompile</goal>
-						</goals>
-					</execution>
-				</executions>
-			</plugin>
 			<plugin>
 				<groupId>com.mysema.maven</groupId>
 				<artifactId>apt-maven-plugin</artifactId>
@@ -375,7 +319,6 @@
 			<plugin>
 				<groupId>org.apache.maven.plugins</groupId>
 				<artifactId>maven-surefire-plugin</artifactId>
-				<version>2.12</version>
 				<configuration>
 					<useFile>false</useFile>
 					<includes>
@@ -397,6 +340,8 @@
 					</properties>
 				</configuration>
 			</plugin>
 		</plugins>
 	</build>
 </project>


@@ -1,5 +1,5 @@
 /*
- * Copyright 2010-2012 the original author or authors.
+ * Copyright 2010-2017 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -17,16 +17,18 @@ package org.springframework.data.mongodb;
 import org.springframework.dao.DataAccessResourceFailureException;
 import org.springframework.data.authentication.UserCredentials;
+import org.springframework.lang.Nullable;
 /**
  * Exception being thrown in case we cannot connect to a MongoDB instance.
  *
  * @author Oliver Gierke
+ * @author Mark Paluch
  */
 public class CannotGetMongoDbConnectionException extends DataAccessResourceFailureException {
 	private final UserCredentials credentials;
-	private final String database;
+	private final @Nullable String database;
 	private static final long serialVersionUID = 1172099106475265589L;
@@ -40,7 +42,7 @@ public class CannotGetMongoDbConnectionException extends DataAccessResourceFailu
 		this(msg, null, UserCredentials.NO_CREDENTIALS);
 	}
-	public CannotGetMongoDbConnectionException(String msg, String database, UserCredentials credentials) {
+	public CannotGetMongoDbConnectionException(String msg, @Nullable String database, UserCredentials credentials) {
 		super(msg);
 		this.database = database;
 		this.credentials = credentials;
@@ -48,7 +50,7 @@ public class CannotGetMongoDbConnectionException extends DataAccessResourceFailu
 	/**
 	 * Returns the {@link UserCredentials} that were used when trying to connect to the MongoDB instance.
 	 *
 	 * @return
 	 */
 	public UserCredentials getCredentials() {
@@ -57,9 +59,10 @@ public class CannotGetMongoDbConnectionException extends DataAccessResourceFailu
 	/**
 	 * Returns the name of the database trying to be accessed.
 	 *
 	 * @return
 	 */
+	@Nullable
 	public String getDatabase() {
 		return database;
 	}
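Strictly as an illustration of the @Nullable change above (the message and credentials are made up), callers should now guard against a null database name:

import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;

class NullableDatabaseExample {

  public static void main(String[] args) {

    // The three-argument constructor shown in the diff accepts a @Nullable database.
    CannotGetMongoDbConnectionException ex = new CannotGetMongoDbConnectionException(
        "Unable to connect", null, UserCredentials.NO_CREDENTIALS);

    // getDatabase() is now annotated @Nullable, so guard against null.
    String database = ex.getDatabase();
    System.out.println(database != null ? database : "<no database given>");
  }
}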


@@ -23,16 +23,16 @@ import com.mongodb.DB;
 import com.mongodb.client.MongoDatabase;
 /**
- * Interface for factories creating {@link DB} instances.
+ * Interface for factories creating {@link MongoDatabase} instances.
  *
  * @author Mark Pollack
  * @author Thomas Darimont
  */
 public interface MongoDbFactory {
 	/**
-	 * Creates a default {@link DB} instance.
+	 * Creates a default {@link MongoDatabase} instance.
 	 *
 	 * @return
 	 * @throws DataAccessException
 	 */
@@ -40,7 +40,7 @@ public interface MongoDbFactory {
 	/**
 	 * Creates a {@link DB} instance to access the database with the given name.
 	 *
 	 * @param dbName must not be {@literal null} or empty.
 	 * @return
 	 * @throws DataAccessException
@@ -49,7 +49,7 @@ public interface MongoDbFactory {
 	/**
 	 * Exposes a shared {@link MongoExceptionTranslator}.
 	 *
 	 * @return will never be {@literal null}.
 	 */
 	PersistenceExceptionTranslator getExceptionTranslator();
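A minimal usage sketch for the factory contract above, assuming a MongoDB instance on localhost; the database names are placeholders:

import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;

class MongoDbFactoryExample {

  public static void main(String[] args) {

    MongoClient client = new MongoClient("localhost");                 // assumption: local mongod on the default port
    MongoDbFactory factory = new SimpleMongoDbFactory(client, "test"); // "test" is the default database here

    MongoDatabase defaultDb = factory.getDb();                         // default database
    MongoDatabase other = factory.getDb("analytics");                  // database selected by name

    System.out.println(defaultDb.getName() + ", " + other.getName());
  }
}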


@@ -1,112 +1,114 @@
 /*
  * Copyright 2011-2017 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
  * You may obtain a copy of the License at
  *
  *      http://www.apache.org/licenses/LICENSE-2.0
  *
  * Unless required by applicable law or agreed to in writing, software
  * distributed under the License is distributed on an "AS IS" BASIS,
  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
 package org.springframework.data.mongodb.config;
 import org.springframework.context.annotation.Bean;
 import org.springframework.context.annotation.Configuration;
 import org.springframework.data.mongodb.MongoDbFactory;
 import org.springframework.data.mongodb.core.MongoTemplate;
 import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
 import org.springframework.data.mongodb.core.convert.DbRefResolver;
 import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
 import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
 import org.springframework.data.mongodb.core.mapping.Document;
+import org.springframework.lang.Nullable;
 import com.mongodb.MongoClient;
 /**
  * Base class for Spring Data MongoDB configuration using JavaConfig.
  *
  * @author Mark Pollack
  * @author Oliver Gierke
  * @author Thomas Darimont
  * @author Ryan Tenney
  * @author Christoph Strobl
  * @author Mark Paluch
  * @see MongoConfigurationSupport
  */
 @Configuration
 public abstract class AbstractMongoConfiguration extends MongoConfigurationSupport {
 	/**
 	 * Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
 	 * {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
 	 *
 	 * @return
 	 */
 	public abstract MongoClient mongoClient();
 	/**
 	 * Creates a {@link MongoTemplate}.
 	 *
 	 * @return
 	 */
 	@Bean
 	public MongoTemplate mongoTemplate() throws Exception {
 		return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
 	}
 	/**
 	 * Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link MongoClient}
 	 * instance configured in {@link #mongo()}.
 	 *
 	 * @see #mongoClient()
 	 * @see #mongoTemplate()
 	 * @return
 	 */
 	@Bean
 	public MongoDbFactory mongoDbFactory() {
 		return new SimpleMongoDbFactory(mongoClient(), getDatabaseName());
 	}
 	/**
 	 * Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
 	 * class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
 	 * {@link AbstractMongoConfiguration} the base package will be considered {@code com.acme} unless the method is
 	 * overridden to implement alternate behavior.
 	 *
 	 * @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
 	 *         entities.
 	 * @deprecated use {@link #getMappingBasePackages()} instead.
 	 */
 	@Deprecated
+	@Nullable
 	protected String getMappingBasePackage() {
 		Package mappingBasePackage = getClass().getPackage();
 		return mappingBasePackage == null ? null : mappingBasePackage.getName();
 	}
 	/**
 	 * Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
 	 * {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
 	 *
 	 * @see #customConversions()
 	 * @see #mongoMappingContext()
 	 * @see #mongoDbFactory()
 	 * @return
 	 * @throws Exception
 	 */
 	@Bean
 	public MappingMongoConverter mappingMongoConverter() throws Exception {
 		DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
 		MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
 		converter.setCustomConversions(customConversions());
 		return converter;
 	}
 }
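To show how the base class above is typically consumed, a small sketch of a subclass; the class and database names are placeholders, and a local MongoDB instance is assumed:

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

import com.mongodb.MongoClient;

@Configuration
class ApplicationConfig extends AbstractMongoConfiguration {

  @Override
  public MongoClient mongoClient() {
    return new MongoClient("localhost");   // assumption: MongoDB running locally
  }

  @Override
  protected String getDatabaseName() {
    return "example";                      // placeholder database name
  }
}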


@@ -1,5 +1,5 @@
 /*
- * Copyright 2016 the original author or authors.
+ * Copyright 2016-2017 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -13,17 +13,14 @@
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
 package org.springframework.data.mongodb.config;
 import org.springframework.context.annotation.Bean;
 import org.springframework.context.annotation.Configuration;
 import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
-import org.springframework.data.mongodb.core.MongoTemplate;
 import org.springframework.data.mongodb.core.ReactiveMongoOperations;
-import org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory;
 import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
-import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
+import org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory;
 import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
 import com.mongodb.reactivestreams.client.MongoClient;
@@ -32,6 +29,7 @@ import com.mongodb.reactivestreams.client.MongoClient;
  * Base class for reactive Spring Data MongoDB configuration using JavaConfig.
  *
  * @author Mark Paluch
+ * @author Christoph Strobl
  * @since 2.0
  * @see MongoConfigurationSupport
  */
@@ -39,45 +37,44 @@ import com.mongodb.reactivestreams.client.MongoClient;
 public abstract class AbstractReactiveMongoConfiguration extends MongoConfigurationSupport {
 	/**
-	 * Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
-	 * {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
+	 * Return the Reactive Streams {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want
+	 * to expose a {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
 	 *
-	 * @return
+	 * @return never {@literal null}.
 	 */
-	public abstract MongoClient mongoClient();
+	public abstract MongoClient reactiveMongoClient();
 	/**
-	 * Creates a {@link ReactiveMongoTemplate}.
+	 * Creates {@link ReactiveMongoOperations}.
 	 *
-	 * @return
+	 * @return never {@literal null}.
 	 */
 	@Bean
 	public ReactiveMongoOperations reactiveMongoTemplate() throws Exception {
-		return new ReactiveMongoTemplate(mongoDbFactory(), mappingMongoConverter());
+		return new ReactiveMongoTemplate(reactiveMongoDbFactory(), mappingMongoConverter());
 	}
 	/**
-	 * Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link Mongo} instance
-	 * configured in {@link #mongoClient()}.
+	 * Creates a {@link ReactiveMongoDatabaseFactory} to be used by the {@link ReactiveMongoOperations}. Will use the
+	 * {@link MongoClient} instance configured in {@link #reactiveMongoClient()}.
 	 *
-	 * @see #mongoClient()
+	 * @see #reactiveMongoClient()
 	 * @see #reactiveMongoTemplate()
-	 * @return
-	 * @throws Exception
+	 * @return never {@literal null}.
 	 */
 	@Bean
-	public ReactiveMongoDatabaseFactory mongoDbFactory() {
-		return new SimpleReactiveMongoDatabaseFactory(mongoClient(), getDatabaseName());
+	public ReactiveMongoDatabaseFactory reactiveMongoDbFactory() {
+		return new SimpleReactiveMongoDatabaseFactory(reactiveMongoClient(), getDatabaseName());
 	}
 	/**
-	 * Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
+	 * Creates a {@link MappingMongoConverter} using the configured {@link #reactiveMongoDbFactory()} and
 	 * {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
 	 *
 	 * @see #customConversions()
 	 * @see #mongoMappingContext()
-	 * @see #mongoDbFactory()
-	 * @return
+	 * @see #reactiveMongoDbFactory()
+	 * @return never {@literal null}.
 	 * @throws Exception
 	 */
 	@Bean
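The reactive counterpart follows the same pattern; a sketch using the renamed reactiveMongoClient() and reactiveMongoDbFactory() methods from the diff (the subclass and database names are placeholders, and a local MongoDB instance is assumed):

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractReactiveMongoConfiguration;

import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;

@Configuration
class ReactiveApplicationConfig extends AbstractReactiveMongoConfiguration {

  @Override
  public MongoClient reactiveMongoClient() {
    return MongoClients.create("mongodb://localhost");  // Reactive Streams driver client
  }

  @Override
  protected String getDatabaseName() {
    return "example";                                   // placeholder database name
  }
}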


@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.config;
 import java.beans.PropertyEditorSupport;
 import java.io.UnsupportedEncodingException;
+import java.lang.reflect.Method;
 import java.net.URLDecoder;
 import java.util.ArrayList;
 import java.util.Arrays;
@@ -25,6 +26,8 @@ import java.util.Properties;
 import java.util.regex.Matcher;
 import java.util.regex.Pattern;
+import org.springframework.lang.Nullable;
+import org.springframework.util.ReflectionUtils;
 import org.springframework.util.StringUtils;
 import com.mongodb.MongoCredential;
@@ -34,6 +37,8 @@ import com.mongodb.MongoCredential;
  *
  * @author Christoph Strobl
  * @author Oliver Gierke
+ * @author Stephen Tyler Conrad
+ * @author Mark Paluch
  * @since 1.7
 */
 public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
@@ -51,7 +56,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
 	 * @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
 	 */
 	@Override
-	public void setAsText(String text) throws IllegalArgumentException {
+	public void setAsText(@Nullable String text) throws IllegalArgumentException {
 		if (!StringUtils.hasText(text)) {
 			return;
@@ -97,6 +102,20 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
 					verifyDatabasePresent(database);
 					credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
 							userNameAndPassword[1].toCharArray()));
+				} else if ("SCRAM-SHA-256".equals(authMechanism)) {
+					Method createScramSha256Credential = ReflectionUtils.findMethod(MongoCredential.class,
+							"createScramSha256Credential", String.class, String.class, char[].class);
+					if (createScramSha256Credential == null) {
+						throw new IllegalArgumentException(
+								"SCRAM-SHA-256 auth mechanism is available as of MongoDB 4 and MongoDB Java Driver 3.8! Please make sure to use at least those versions.");
+					}
+					verifyUsernameAndPasswordPresent(userNameAndPassword);
+					verifyDatabasePresent(database);
+					credentials.add(MongoCredential.class.cast(ReflectionUtils.invokeMethod(createScramSha256Credential, null,
+							userNameAndPassword[0], database, userNameAndPassword[1].toCharArray())));
 				} else {
 					throw new IllegalArgumentException(
 							String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
@@ -163,7 +182,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
 	private static Properties extractOptions(String text) {
 		int optionsSeparationIndex = text.lastIndexOf(OPTIONS_DELIMITER);
-		int dbSeparationIndex = text.lastIndexOf(OPTIONS_DELIMITER);
+		int dbSeparationIndex = text.lastIndexOf(DATABASE_DELIMITER);
 		if (optionsSeparationIndex == -1 || dbSeparationIndex > optionsSeparationIndex) {
 			return new Properties();
@@ -172,7 +191,13 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
 		Properties properties = new Properties();
 		for (String option : text.substring(optionsSeparationIndex + 1).split(OPTION_VALUE_DELIMITER)) {
 			String[] optionArgs = option.split("=");
+			if (optionArgs.length == 1) {
+				throw new IllegalArgumentException(String.format("Query parameter '%s' has no value!", optionArgs[0]));
+			}
 			properties.put(optionArgs[0], optionArgs[1]);
 		}
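A short sketch of how the editor might be driven with a SCRAM-SHA-256 credential; the credential string syntax ("user:password@database?uri.authMechanism=...") and the cast of getValue() to a credential list are assumptions based on how the editor is used elsewhere, not shown in this diff, and MongoDB Java Driver 3.8+ is required per the error message above:

import java.util.List;

import org.springframework.data.mongodb.config.MongoCredentialPropertyEditor;

import com.mongodb.MongoCredential;

class CredentialEditorExample {

  @SuppressWarnings("unchecked")
  public static void main(String[] args) {

    MongoCredentialPropertyEditor editor = new MongoCredentialPropertyEditor();

    // Assumed format: username:password@database?uri.authMechanism=SCRAM-SHA-256
    editor.setAsText("jon:warpspeed@admin?uri.authMechanism=SCRAM-SHA-256");

    // PropertyEditorSupport#getValue() exposes whatever the editor parsed.
    List<MongoCredential> credentials = (List<MongoCredential>) editor.getValue();
    System.out.println(credentials);
  }
}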


@@ -1,11 +1,11 @@
 /*
- * Copyright 2011-2017 by the original author(s).
+ * Copyright 2011-2017 the original author or authors.
  *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
@@ -33,6 +33,7 @@ import org.springframework.beans.factory.xml.ParserContext;
 import org.springframework.data.config.BeanComponentDefinitionBuilder;
 import org.springframework.data.mongodb.core.MongoClientFactoryBean;
 import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
+import org.springframework.lang.Nullable;
 import org.springframework.util.StringUtils;
 import org.w3c.dom.Element;
@@ -42,12 +43,13 @@ import com.mongodb.MongoURI;
 /**
 * {@link BeanDefinitionParser} to parse {@code db-factory} elements into {@link BeanDefinition}s.
 *
 * @author Jon Brisbin
 * @author Oliver Gierke
 * @author Thomas Darimont
 * @author Christoph Strobl
 * @author Viktor Khoroshko
+ * @author Mark Paluch
 */
 public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@@ -62,7 +64,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
 		MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES = Collections.unmodifiableSet(mongoUriAllowedAdditionalAttributes);
 	}
 	/*
 	 * (non-Javadoc)
 	 * @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
 	 */
@@ -74,7 +76,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
 		return StringUtils.hasText(id) ? id : BeanNames.DB_FACTORY_BEAN_NAME;
 	}
 	/*
 	 * (non-Javadoc)
 	 * @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#parseInternal(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
 	 */
@@ -119,7 +121,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
 	/**
 	 * Registers a default {@link BeanDefinition} of a {@link Mongo} instance and returns the name under which the
 	 * {@link Mongo} instance was registered under.
 	 *
 	 * @param element must not be {@literal null}.
 	 * @param parserContext must not be {@literal null}.
 	 * @return
@@ -138,11 +140,12 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
 	 * attributes. <br />
 	 * Errors when configured element contains {@literal uri} or {@literal client-uri} along with other attributes except
 	 * {@literal write-concern} and/or {@literal id}.
 	 *
 	 * @param element must not be {@literal null}.
 	 * @param parserContext
 	 * @return {@literal null} in case no client-/uri defined.
 	 */
+	@Nullable
 	private BeanDefinition getMongoUri(Element element, ParserContext parserContext) {
 		boolean hasClientUri = element.hasAttribute("client-uri");


@@ -1,76 +1,76 @@
/*
 * Copyright 2011-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.config;

import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoAdmin;
import org.springframework.data.mongodb.monitor.*;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;

/**
 * @author Mark Pollack
 * @author Thomas Risberg
 * @author John Brisbin
 * @author Oliver Gierke
 * @author Christoph Strobl
 */
public class MongoJmxParser implements BeanDefinitionParser {

  public BeanDefinition parse(Element element, ParserContext parserContext) {
    String name = element.getAttribute("mongo-ref");
    if (!StringUtils.hasText(name)) {
      name = BeanNames.MONGO_BEAN_NAME;
    }
    registerJmxComponents(name, element, parserContext);
    return null;
  }

  protected void registerJmxComponents(String mongoRefName, Element element, ParserContext parserContext) {
    Object eleSource = parserContext.extractSource(element);

    CompositeComponentDefinition compositeDef = new CompositeComponentDefinition(element.getTagName(), eleSource);

    createBeanDefEntry(AssertMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
    createBeanDefEntry(BackgroundFlushingMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
    createBeanDefEntry(BtreeIndexCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
    createBeanDefEntry(ConnectionMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
    createBeanDefEntry(GlobalLockMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
    createBeanDefEntry(MemoryMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
    createBeanDefEntry(OperationCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
    createBeanDefEntry(ServerInfo.class, compositeDef, mongoRefName, eleSource, parserContext);
    createBeanDefEntry(MongoAdmin.class, compositeDef, mongoRefName, eleSource, parserContext);

    parserContext.registerComponent(compositeDef);
  }

  protected void createBeanDefEntry(Class<?> clazz, CompositeComponentDefinition compositeDef, String mongoRefName,
      Object eleSource, ParserContext parserContext) {
    BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(clazz);
    builder.getRawBeanDefinition().setSource(eleSource);
    builder.addConstructorArgReference(mongoRefName);
    BeanDefinition assertDef = builder.getBeanDefinition();
    String assertName = parserContext.getReaderContext().registerWithGeneratedName(assertDef);
    compositeDef.addNestedComponent(new BeanComponentDefinition(assertDef, assertName));
  }
}


@@ -1,170 +1,170 @@
/* /*
* Copyright 2011-2017 the original author or authors. * Copyright 2011-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* http://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and * See the License for the specific language governing permissions and
* limitations under the License. * limitations under the License.
*/ */
package org.springframework.data.mongodb.config; package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*; import static org.springframework.data.config.ParsingUtils.*;
import java.util.Map; import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition; import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer; import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder; import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap; import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser; import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean; import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
import org.springframework.util.xml.DomUtils; import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element; import org.w3c.dom.Element;
/** /**
* Utility methods for {@link BeanDefinitionParser} implementations for MongoDB. * Utility methods for {@link BeanDefinitionParser} implementations for MongoDB.
* *
* @author Mark Pollack * @author Mark Pollack
* @author Oliver Gierke * @author Oliver Gierke
* @author Thomas Darimont * @author Thomas Darimont
* @author Christoph Strobl * @author Christoph Strobl
*/ */
@SuppressWarnings("deprecation") @SuppressWarnings("deprecation")
abstract class MongoParsingUtils { abstract class MongoParsingUtils {
private MongoParsingUtils() {} private MongoParsingUtils() {}
/** /**
* Parses the mongo replica-set element. * Parses the mongo replica-set element.
* *
* @param parserContext the parser context * @param parserContext the parser context
* @param element the mongo element * @param element the mongo element
* @param mongoBuilder the bean definition builder to populate * @param mongoBuilder the bean definition builder to populate
* @return * @return
*/ */
static void parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) { static void parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
setPropertyValue(mongoBuilder, element, "replica-set", "replicaSetSeeds"); setPropertyValue(mongoBuilder, element, "replica-set", "replicaSetSeeds");
} }
/** /**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper * Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes. * attributes.
* *
* @param element must not be {@literal null}. * @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}. * @param mongoClientBuilder must not be {@literal null}.
* @return * @return
* @since 1.7 * @since 1.7
*/ */
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) { public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options"); Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) { if (optionsElement == null) {
return false; return false;
} }
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientOptionsFactoryBean.class); .genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier", setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier"); "threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory"); setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "server-selection-timeout", "serverSelectionTimeout"); setPropertyValue(clientOptionsDefBuilder, optionsElement, "server-selection-timeout", "serverSelectionTimeout");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition()); mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true; return true;
} }
/** /**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a * Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}. * {@link WriteConcernPropertyEditor}.
* *
* @return * @return
*/ */
static BeanDefinitionBuilder getWriteConcernPropertyEditorBuilder() { static BeanDefinitionBuilder getWriteConcernPropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>(); Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.WriteConcern", WriteConcernPropertyEditor.class); customEditors.put("com.mongodb.WriteConcern", WriteConcernPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class); BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors); builder.addPropertyValue("customEditors", customEditors);
return builder; return builder;
} }
    /**
     * Returns the {@link BeanDefinitionBuilder} that registers a 'default' {@link ServerAddressPropertyEditor} with the
     * container. Only a single bean definition should be registered here, but we want to keep the convenience of
     * AbstractSingleBeanDefinitionParser, so registering the property editor is an intended side effect.
     */
    static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {

        Map<String, String> customEditors = new ManagedMap<String, String>();
        customEditors.put("com.mongodb.ServerAddress[]",
                "org.springframework.data.mongodb.config.ServerAddressPropertyEditor");

        BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
        builder.addPropertyValue("customEditors", customEditors);

        return builder;
    }

    /**
     * Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
     * {@link ReadPreferencePropertyEditor}.
     *
     * @return
     * @since 1.7
     */
    static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {

        Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
        customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);

        BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
        builder.addPropertyValue("customEditors", customEditors);

        return builder;
    }

    /**
     * Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
     * {@link MongoCredentialPropertyEditor}.
     *
     * @return
     * @since 1.7
     */
    static BeanDefinitionBuilder getMongoCredentialPropertyEditor() {

        Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
        customEditors.put("com.mongodb.MongoCredential[]", MongoCredentialPropertyEditor.class);

        BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
        builder.addPropertyValue("customEditors", customEditors);

        return builder;
    }
}
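The three builders above all follow the same pattern: they describe a CustomEditorConfigurer bean whose customEditors map binds a MongoDB driver type (by class name) to the PropertyEditor that parses its XML attribute text. A minimal sketch of the equivalent wiring in plain Java, for orientation only (the class name WriteConcernEditorRegistration is made up; the change set itself only touches the BeanDefinition-based variant above):

import java.beans.PropertyEditor;
import java.util.HashMap;
import java.util.Map;

import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.data.mongodb.config.WriteConcernPropertyEditor;

import com.mongodb.WriteConcern;

// Sketch only: builds the same kind of CustomEditorConfigurer that
// getWriteConcernPropertyEditorBuilder() describes as a BeanDefinition.
class WriteConcernEditorRegistration {

    static CustomEditorConfigurer writeConcernEditorConfigurer() {

        Map<Class<?>, Class<? extends PropertyEditor>> editors = new HashMap<>();
        editors.put(WriteConcern.class, WriteConcernPropertyEditor.class);

        CustomEditorConfigurer configurer = new CustomEditorConfigurer();
        configurer.setCustomEditors(editors);
        return configurer;
    }
}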

View File

@@ -1,5 +1,5 @@
/*
- * Copyright 2015 the original author or authors.
+ * Copyright 2015-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -17,6 +17,8 @@ package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;

+ import org.springframework.lang.Nullable;
+
import com.mongodb.ReadPreference;

/**
@@ -32,7 +34,7 @@ public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
     * @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
     */
    @Override
-   public void setAsText(String readPreferenceString) throws IllegalArgumentException {
+   public void setAsText(@Nullable String readPreferenceString) throws IllegalArgumentException {

        if (readPreferenceString == null) {
            return;
@@ -59,8 +61,8 @@ public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
        } else if ("NEAREST".equalsIgnoreCase(readPreferenceString)) {
            setValue(ReadPreference.nearest());
        } else {
-           throw new IllegalArgumentException(String.format("Cannot find matching ReadPreference for %s",
-                   readPreferenceString));
+           throw new IllegalArgumentException(
+                   String.format("Cannot find matching ReadPreference for %s", readPreferenceString));
        }
    }
}
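For reference, the conversion the editor performs is a straight string-to-constant mapping. A minimal sketch of how it behaves once registered (standalone use and the class name are only for illustration):

import org.springframework.data.mongodb.config.ReadPreferencePropertyEditor;

import com.mongodb.ReadPreference;

class ReadPreferenceEditorExample {

    public static void main(String[] args) {

        // "NEAREST" (case-insensitive) resolves to ReadPreference.nearest();
        // unknown values fail with the IllegalArgumentException reformatted above.
        ReadPreferencePropertyEditor editor = new ReadPreferencePropertyEditor();
        editor.setAsText("NEAREST");
        ReadPreference preference = (ReadPreference) editor.getValue();
        System.out.println(preference);
    }
}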

View File

@@ -1,5 +1,5 @@
/*
- * Copyright 2011-2013 the original author or authors.
+ * Copyright 2011-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -23,6 +23,7 @@ import java.util.Set;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;

@@ -30,10 +31,11 @@ import com.mongodb.ServerAddress;
/**
 * Parse a {@link String} to a {@link ServerAddress} array. The format is host1:port1,host2:port2,host3:port3.
 *
 * @author Mark Pollack
 * @author Oliver Gierke
 * @author Thomas Darimont
+ * @author Christoph Strobl
 */
public class ServerAddressPropertyEditor extends PropertyEditorSupport {

@@ -49,7 +51,7 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
     * @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
     */
    @Override
-   public void setAsText(String replicaSetString) {
+   public void setAsText(@Nullable String replicaSetString) {

        if (!StringUtils.hasText(replicaSetString)) {
            setValue(null);
@@ -78,10 +80,11 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
    /**
     * Parses the given source into a {@link ServerAddress}.
     *
     * @param source
     * @return the
     */
+   @Nullable
    private ServerAddress parseServerAddress(String source) {

        if (!StringUtils.hasText(source)) {
@@ -112,7 +115,7 @@ public class ServerAddressPropertyEditor extends PropertyEditorSupport {
    /**
     * Extract the host and port from the given {@link String}.
     *
     * @param addressAndPortSource must not be {@literal null}.
     * @return
     */
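The class Javadoc documents the accepted format (host1:port1,host2:port2,...). A small sketch of that contract in use (the class name and values are illustrative only):

import java.beans.PropertyEditor;

import org.springframework.data.mongodb.config.ServerAddressPropertyEditor;

import com.mongodb.ServerAddress;

class ServerAddressEditorExample {

    public static void main(String[] args) {

        // Parses the documented host1:port1,host2:port2 format into a ServerAddress[].
        PropertyEditor editor = new ServerAddressPropertyEditor();
        editor.setAsText("localhost:27017,localhost:27018");
        ServerAddress[] addresses = (ServerAddress[]) editor.getValue();
        System.out.println(addresses.length); // 2
    }
}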

View File

@@ -1,5 +1,5 @@
/*
- * Copyright 2011 the original author or authors.
+ * Copyright 2011-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -17,6 +17,9 @@ package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;

+ import org.springframework.lang.Nullable;
+ import org.springframework.util.StringUtils;
+
import com.mongodb.WriteConcern;

/**
@@ -26,6 +29,7 @@ import com.mongodb.WriteConcern;
 * string value.
 *
 * @author Mark Pollack
+ * @author Christoph Strobl
 */
public class WriteConcernPropertyEditor extends PropertyEditorSupport {

@@ -33,7 +37,11 @@ public class WriteConcernPropertyEditor extends PropertyEditorSupport {
     * Parse a string to a List<ServerAddress>
     */
    @Override
-   public void setAsText(String writeConcernString) {
+   public void setAsText(@Nullable String writeConcernString) {
+
+       if (!StringUtils.hasText(writeConcernString)) {
+           return;
+       }

        WriteConcern writeConcern = WriteConcern.valueOf(writeConcernString);
        if (writeConcern != null) {
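With the guard added above, blank or whitespace-only attribute text is now ignored instead of being handed to WriteConcern.valueOf(…). A minimal sketch of the resulting behaviour (class name and values are illustrative):

import org.springframework.data.mongodb.config.WriteConcernPropertyEditor;

class WriteConcernEditorExample {

    public static void main(String[] args) {

        WriteConcernPropertyEditor editor = new WriteConcernPropertyEditor();

        // Named concerns resolve via WriteConcern.valueOf(..), e.g. "MAJORITY".
        editor.setAsText("MAJORITY");
        System.out.println(editor.getValue());

        // Blank input is simply skipped; the previously set value is retained.
        editor.setAsText("");
        System.out.println(editor.getValue());
    }
}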

View File

@@ -1,5 +1,6 @@
/**
 * Spring XML namespace configuration for MongoDB specific repositories.
 */
+ @org.springframework.lang.NonNullApi
package org.springframework.data.mongodb.config;

View File

@@ -0,0 +1,118 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.AllArgsConstructor;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
/**
* Utility methods to map {@link org.springframework.data.mongodb.core.aggregation.Aggregation} pipeline definitions and
* create type-bound {@link AggregationOperationContext}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.0.8
*/
@AllArgsConstructor
class AggregationUtil {
QueryMapper queryMapper;
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Prepare the {@link AggregationOperationContext} for a given aggregation: return the given context if it is not
* {@literal null}, create a {@link TypeBasedAggregationOperationContext} if the aggregation carries type information
* (is a {@link TypedAggregation}), or fall back to the {@link Aggregation#DEFAULT_CONTEXT}.
*
* @param aggregation must not be {@literal null}.
* @param context can be {@literal null}.
* @return the root {@link AggregationOperationContext} to use.
*/
AggregationOperationContext prepareAggregationContext(Aggregation aggregation,
@Nullable AggregationOperationContext context) {
if (context != null) {
return context;
}
if (aggregation instanceof TypedAggregation) {
return new TypeBasedAggregationOperationContext(((TypedAggregation) aggregation).getInputType(), mappingContext,
queryMapper);
}
return Aggregation.DEFAULT_CONTEXT;
}
/**
* Extract and map the aggregation pipeline into a {@link List} of {@link Document}.
*
* @param aggregation
* @param context
* @return
*/
Document createPipeline(String collectionName, Aggregation aggregation, AggregationOperationContext context) {
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return aggregation.toDocument(collectionName, context);
}
Document command = aggregation.toDocument(collectionName, context);
command.put("pipeline", mapAggregationPipeline(command.get("pipeline", List.class)));
return command;
}
/**
* Extract the command and map the aggregation pipeline.
*
* @param aggregation
* @param context
* @return
*/
Document createCommand(String collection, Aggregation aggregation, AggregationOperationContext context) {
Document command = aggregation.toDocument(collection, context);
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return command;
}
command.put("pipeline", mapAggregationPipeline(command.get("pipeline", List.class)));
return command;
}
private List<Document> mapAggregationPipeline(List<Document> pipeline) {
return pipeline.stream().map(val -> queryMapper.getMappedObject(val, Optional.empty()))
.collect(Collectors.toList());
}
}
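To make the context selection above concrete, here is a small sketch (the Person type, the surrounding class and the variable names are made up for illustration; the example would have to live in the same package since AggregationUtil is package-private): a TypedAggregation carries its input type and therefore gets a TypeBasedAggregationOperationContext, while an untyped Aggregation ends up with Aggregation.DEFAULT_CONTEXT.

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.query.Criteria;

class AggregationContextExample {

    static void illustrate(AggregationUtil util) {

        // Typed: the input type Person is known, so a TypeBasedAggregationOperationContext is created.
        TypedAggregation<Person> typed = Aggregation.newAggregation(Person.class,
                Aggregation.match(Criteria.where("lastname").is("Smith")));

        // Untyped: no type information, so Aggregation.DEFAULT_CONTEXT is used.
        Aggregation untyped = Aggregation.newAggregation(
                Aggregation.match(Criteria.where("lastname").is("Smith")));

        util.prepareAggregationContext(typed, null);   // type-based context
        util.prepareAggregationContext(untyped, null); // DEFAULT_CONTEXT
    }

    static class Person {
        String lastname;
    }
}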

View File

@@ -1,44 +1,46 @@
/*
- * Copyright 2010-2016 the original author or authors.
+ * Copyright 2010-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core;

import org.bson.Document;

import org.springframework.dao.DataAccessException;
+ import org.springframework.lang.Nullable;

import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;

/**
- * Callback interface for executing actions against a {@link MongoCollection}
+ * Callback interface for executing actions against a {@link MongoCollection}.
 *
 * @author Mark Pollack
 * @author Graeme Rocher
 * @author Oliver Gierke
 * @author John Brisbin
 * @author Christoph Strobl
 * @since 1.0
 */
public interface CollectionCallback<T> {

    /**
     * @param collection never {@literal null}.
-    * @return
+    * @return can be {@literal null}.
     * @throws MongoException
     * @throws DataAccessException
     */
+   @Nullable
    T doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException;
}

View File

@@ -1,164 +1,166 @@
/*
 * Copyright 2010-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core;

import java.util.Optional;

import org.springframework.data.mongodb.core.query.Collation;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;

/**
 * Provides a simple wrapper to encapsulate the variety of settings you can use when creating a collection.
 *
 * @author Thomas Risberg
 * @author Christoph Strobl
 * @author Mark Paluch
 */
public class CollectionOptions {

-   private Long maxDocuments;
-   private Long size;
-   private Boolean capped;
-   private Collation collation;
+   private @Nullable Long maxDocuments;
+   private @Nullable Long size;
+   private @Nullable Boolean capped;
+   private @Nullable Collation collation;

    /**
     * Constructs a new <code>CollectionOptions</code> instance.
     *
-    * @param size the collection size in bytes, this data space is preallocated.
-    * @param maxDocuments the maximum number of documents in the collection.
+    * @param size the collection size in bytes, this data space is preallocated. Can be {@literal null}.
+    * @param maxDocuments the maximum number of documents in the collection. Can be {@literal null}.
     * @param capped true to created a "capped" collection (fixed size with auto-FIFO behavior based on insertion order),
-    *          false otherwise.
+    *          false otherwise. Can be {@literal null}.
     * @deprecated since 2.0 please use {@link CollectionOptions#empty()} as entry point.
     */
    @Deprecated
-   public CollectionOptions(Long size, Long maxDocuments, Boolean capped) {
+   public CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped) {
        this(size, maxDocuments, capped, null);
    }

-   private CollectionOptions(Long size, Long maxDocuments, Boolean capped, Collation collation) {
+   private CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped,
+           @Nullable Collation collation) {

        this.maxDocuments = maxDocuments;
        this.size = size;
        this.capped = capped;
        this.collation = collation;
    }

    /**
     * Create new {@link CollectionOptions} by just providing the {@link Collation} to use.
     *
     * @param collation must not be {@literal null}.
     * @return new {@link CollectionOptions}.
     * @since 2.0
     */
    public static CollectionOptions just(Collation collation) {

        Assert.notNull(collation, "Collation must not be null!");
        return new CollectionOptions(null, null, null, collation);
    }

    /**
     * Create new empty {@link CollectionOptions}.
     *
     * @return new {@link CollectionOptions}.
     * @since 2.0
     */
    public static CollectionOptions empty() {
        return new CollectionOptions(null, null, null, null);
    }

    /**
     * Create new {@link CollectionOptions} with already given settings and capped set to {@literal true}. <br />
     * <strong>NOTE</strong> Using capped collections requires defining {@link #size(int)}.
     *
     * @return new {@link CollectionOptions}.
     * @since 2.0
     */
    public CollectionOptions capped() {
        return new CollectionOptions(size, maxDocuments, true, collation);
    }

    /**
     * Create new {@link CollectionOptions} with already given settings and {@code maxDocuments} set to given value.
     *
     * @param maxDocuments can be {@literal null}.
     * @return new {@link CollectionOptions}.
     * @since 2.0
     */
    public CollectionOptions maxDocuments(long maxDocuments) {
        return new CollectionOptions(size, maxDocuments, capped, collation);
    }

    /**
     * Create new {@link CollectionOptions} with already given settings and {@code size} set to given value.
     *
     * @param size can be {@literal null}.
     * @return new {@link CollectionOptions}.
     * @since 2.0
     */
    public CollectionOptions size(long size) {
        return new CollectionOptions(size, maxDocuments, capped, collation);
    }

    /**
     * Create new {@link CollectionOptions} with already given settings and {@code collation} set to given value.
     *
     * @param collation can be {@literal null}.
     * @return new {@link CollectionOptions}.
     * @since 2.0
     */
-   public CollectionOptions collation(Collation collation) {
+   public CollectionOptions collation(@Nullable Collation collation) {
        return new CollectionOptions(size, maxDocuments, capped, collation);
    }

    /**
     * Get the max number of documents the collection should be limited to.
     *
     * @return {@link Optional#empty()} if not set.
     */
    public Optional<Long> getMaxDocuments() {
        return Optional.ofNullable(maxDocuments);
    }

    /**
     * Get the {@literal size} in bytes the collection should be limited to.
     *
     * @return {@link Optional#empty()} if not set.
     */
    public Optional<Long> getSize() {
        return Optional.ofNullable(size);
    }

    /**
     * Get if the collection should be capped.
     *
     * @return {@link Optional#empty()} if not set.
     * @since 2.0
     */
    public Optional<Boolean> getCapped() {
        return Optional.ofNullable(capped);
    }

    /**
     * Get the {@link Collation} settings.
     *
     * @return {@link Optional#empty()} if not set.
     * @since 2.0
     */
    public Optional<Collation> getCollation() {
        return Optional.ofNullable(collation);
    }
}
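The fluent entry points shown above (empty(), just(…), capped(), size(…), maxDocuments(…), collation(…)) replace the deprecated constructor. A short usage sketch with arbitrary values (such options are typically handed to MongoTemplate.createCollection(…)):

import org.springframework.data.mongodb.core.CollectionOptions;

class CollectionOptionsExample {

    // Sketch: options for a capped 256 KB collection holding at most 1000 documents.
    // Per the NOTE in the Javadoc above, capped() requires size(..) to be set as well.
    static final CollectionOptions CAPPED_OPTIONS = CollectionOptions.empty()
            .capped()
            .size(256 * 1024)
            .maxDocuments(1000);
}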

View File

@@ -1,30 +1,44 @@
/*
- * Copyright 2010-2016 the original author or authors.
+ * Copyright 2010-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.springframework.data.mongodb.core;

import org.springframework.dao.DataAccessException;
+ import org.springframework.lang.Nullable;

import com.mongodb.MongoException;
import com.mongodb.client.MongoDatabase;

/**
- *
- * @param <T>
+ * Callback interface for executing actions against a {@link MongoDatabase}.
+ *
+ * @author Mark Pollack
+ * @author Graeme Rocher
+ * @author Thomas Risberg
+ * @author Oliver Gierke
+ * @author John Brisbin
+ * @author Christoph Strobl
 */
public interface DbCallback<T> {

+   /**
+    * @param db must not be {@literal null}.
+    * @return can be {@literal null}.
+    * @throws MongoException
+    * @throws DataAccessException
+    */
+   @Nullable
    T doInDB(MongoDatabase db) throws MongoException, DataAccessException;
}
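The new @Nullable on doInDB(…) makes explicit that a callback may legitimately return null. A minimal usage sketch against MongoTemplate.execute(…) (the class name and the command are illustrative; an existing MongoTemplate instance is assumed):

import org.bson.Document;

import org.springframework.data.mongodb.core.MongoTemplate;

class DbCallbackExample {

    // Sketch: run a database-level command through the DbCallback contract.
    // With the new @Nullable return, callbacks that find nothing may return null.
    static Document buildInfo(MongoTemplate template) {
        return template.execute(db -> db.runCommand(new Document("buildInfo", 1)));
    }
}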

View File

@@ -1,66 +0,0 @@
package org.springframework.data.mongodb.core;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import com.mongodb.DB;
import org.springframework.transaction.support.ResourceHolderSupport;
import org.springframework.util.Assert;
class DbHolder extends ResourceHolderSupport {
private static final Object DEFAULT_KEY = new Object();
private final Map<Object, DB> dbMap = new ConcurrentHashMap<Object, DB>();
public DbHolder(DB db) {
addDB(db);
}
public DbHolder(Object key, DB db) {
addDB(key, db);
}
public DB getDB() {
return getDB(DEFAULT_KEY);
}
public DB getDB(Object key) {
return this.dbMap.get(key);
}
public DB getAnyDB() {
if (!this.dbMap.isEmpty()) {
return this.dbMap.values().iterator().next();
}
return null;
}
public void addDB(DB session) {
addDB(DEFAULT_KEY, session);
}
public void addDB(Object key, DB session) {
Assert.notNull(key, "Key must not be null");
Assert.notNull(session, "DB must not be null");
this.dbMap.put(key, session);
}
public DB removeDB(Object key) {
return this.dbMap.remove(key);
}
public boolean containsDB(DB session) {
return this.dbMap.containsValue(session);
}
public boolean isEmpty() {
return this.dbMap.isEmpty();
}
public boolean doesNotHoldNonDefaultDB() {
synchronized (this.dbMap) {
return this.dbMap.isEmpty() || (this.dbMap.size() == 1 && this.dbMap.containsKey(DEFAULT_KEY));
}
}
}

View File

@@ -35,6 +35,7 @@ import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;

import com.mongodb.BulkWriteException;
@@ -67,7 +68,7 @@ class DefaultBulkOperations implements BulkOperations {
    private final List<WriteModel<Document>> models = new ArrayList<>();

    private PersistenceExceptionTranslator exceptionTranslator;
-   private WriteConcern defaultWriteConcern;
+   private @Nullable WriteConcern defaultWriteConcern;
    private BulkWriteOptions bulkOptions;
@@ -99,7 +100,7 @@ class DefaultBulkOperations implements BulkOperations {
     *
     * @param exceptionTranslator can be {@literal null}.
     */
-   public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
+   public void setExceptionTranslator(@Nullable PersistenceExceptionTranslator exceptionTranslator) {
        this.exceptionTranslator = exceptionTranslator == null ? new MongoExceptionTranslator() : exceptionTranslator;
    }
@@ -108,7 +109,7 @@ class DefaultBulkOperations implements BulkOperations {
     *
     * @param defaultWriteConcern can be {@literal null}.
     */
-   void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
+   void setDefaultWriteConcern(@Nullable WriteConcern defaultWriteConcern) {
        this.defaultWriteConcern = defaultWriteConcern;
    }

View File

@@ -29,6 +29,7 @@ import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;

import com.mongodb.MongoException;
@@ -52,7 +53,7 @@ public class DefaultIndexOperations implements IndexOperations {
    private final MongoDbFactory mongoDbFactory;
    private final String collectionName;
    private final QueryMapper mapper;
-   private final Class<?> type;
+   private final @Nullable Class<?> type;

    /**
     * Creates a new {@link DefaultIndexOperations}.
@@ -62,7 +63,6 @@ public class DefaultIndexOperations implements IndexOperations {
     * @param queryMapper must not be {@literal null}.
     */
    public DefaultIndexOperations(MongoDbFactory mongoDbFactory, String collectionName, QueryMapper queryMapper) {
        this(mongoDbFactory, collectionName, queryMapper, null);
    }
@@ -76,7 +76,7 @@ public class DefaultIndexOperations implements IndexOperations {
     * @since 1.10
     */
    public DefaultIndexOperations(MongoDbFactory mongoDbFactory, String collectionName, QueryMapper queryMapper,
-           Class<?> type) {
+           @Nullable Class<?> type) {

        Assert.notNull(mongoDbFactory, "MongoDbFactory must not be null!");
        Assert.notNull(collectionName, "Collection name can not be null!");
@@ -98,10 +98,6 @@ public class DefaultIndexOperations implements IndexOperations {
        Document indexOptions = indexDefinition.getIndexOptions();

-       if (indexOptions == null) {
-           return collection.createIndex(indexDefinition.getIndexKeys());
-       }
-
        IndexOptions ops = IndexConverters.indexDefinitionToIndexOptionsConverter().convert(indexDefinition);

        if (indexOptions.containsKey(PARTIAL_FILTER_EXPRESSION_KEY)) {
@@ -116,7 +112,8 @@ public class DefaultIndexOperations implements IndexOperations {
        });
    }

-   private MongoPersistentEntity<?> lookupPersistentEntity(Class<?> entityType, String collection) {
+   @Nullable
+   private MongoPersistentEntity<?> lookupPersistentEntity(@Nullable Class<?> entityType, String collection) {

        if (entityType != null) {
            return mapper.getMappingContext().getRequiredPersistentEntity(entityType);
@@ -185,6 +182,7 @@ public class DefaultIndexOperations implements IndexOperations {
        });
    }

+   @Nullable
    public <T> T execute(CollectionCallback<T> callback) {

        Assert.notNull(callback, "CollectionCallback must not be null!");

View File

@@ -1,5 +1,5 @@
/*
- * Copyright 2016 the original author or authors.
+ * Copyright 2016-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -21,10 +21,10 @@ import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.index.IndexOperationsProvider;

/**
- * {@link IndexOperationsProvider} to obtain {@link IndexOperations} from a given {@link MongoDbFactory}. TODO: Review
- * me
+ * {@link IndexOperationsProvider} to obtain {@link IndexOperations} from a given {@link MongoDbFactory}.
 *
 * @author Mark Paluch
+ * @author Christoph Strobl
 * @since 2.0
 */
class DefaultIndexOperationsProvider implements IndexOperationsProvider {
@@ -34,12 +34,16 @@ class DefaultIndexOperationsProvider implements IndexOperationsProvider {
    /**
     * @param mongoDbFactory must not be {@literal null}.
+    * @param mapper must not be {@literal null}.
     */
    DefaultIndexOperationsProvider(MongoDbFactory mongoDbFactory, QueryMapper mapper) {

-       this.mongoDbFactory = mongoDbFactory;
-       this.mapper = mapper;
+       this.mapper = mapper;
+       this.mongoDbFactory = mongoDbFactory;
    }

-   /* (non-Javadoc)
+   /*
+    * (non-Javadoc)
     * @see org.springframework.data.mongodb.core.index.IndexOperationsProvider#reactiveIndexOps(java.lang.String)
     */
    @Override

View File

@@ -15,7 +15,6 @@
 */
package org.springframework.data.mongodb.core;

- import org.springframework.data.mongodb.core.index.ReactiveIndexOperations;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
@@ -26,7 +25,9 @@ import org.bson.Document;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
+ import org.springframework.data.mongodb.core.index.ReactiveIndexOperations;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;

import com.mongodb.client.model.IndexOptions;
@@ -85,7 +86,8 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
        this.type = type;
    }

-   /* (non-Javadoc)
+   /*
+    * (non-Javadoc)
     * @see org.springframework.data.mongodb.core.index.ReactiveIndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
     */
    public Mono<String> ensureIndex(final IndexDefinition indexDefinition) {
@@ -94,10 +96,6 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
        Document indexOptions = indexDefinition.getIndexOptions();

-       if (indexOptions == null) {
-           return collection.createIndex(indexDefinition.getIndexKeys());
-       }
-
        IndexOptions ops = IndexConverters.indexDefinitionToIndexOptionsConverter().convert(indexDefinition);

        if (indexOptions.containsKey(PARTIAL_FILTER_EXPRESSION_KEY)) {
@@ -117,6 +115,7 @@ public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
        }).next();
    }

+   @Nullable
    private MongoPersistentEntity<?> lookupPersistentEntity(String collection) {

        Collection<? extends MongoPersistentEntity<?>> entities = queryMapper.getMappingContext().getPersistentEntities();

View File

@@ -1,5 +1,5 @@
/*
- * Copyright 2014-2016 the original author or authors.
+ * Copyright 2014-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -31,6 +31,7 @@ import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
@@ -42,9 +43,10 @@ import com.mongodb.client.MongoDatabase;
/**
 * Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
 *
 * @author Christoph Strobl
 * @author Oliver Gierke
+ * @author Mark Paluch
 * @since 1.7
 */
class DefaultScriptOperations implements ScriptOperations {
@@ -56,7 +58,7 @@ class DefaultScriptOperations implements ScriptOperations {
    /**
     * Creates new {@link DefaultScriptOperations} using given {@link MongoOperations}.
     *
     * @param mongoOperations must not be {@literal null}.
     */
    public DefaultScriptOperations(MongoOperations mongoOperations) {
@@ -140,7 +142,7 @@ class DefaultScriptOperations implements ScriptOperations {
        Assert.hasText(scriptName, "ScriptName must not be null or empty!");

-       return mongoOperations.exists(query(where("name").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
+       return mongoOperations.exists(query(where("_id").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
    }

    /*
@@ -189,7 +191,7 @@ class DefaultScriptOperations implements ScriptOperations {
     * Generate a valid name for the {@literal JavaScript}. MongoDB requires an id of type String for scripts. Calling
     * scripts having {@link ObjectId} as id fails. Therefore we create a random UUID without {@code -} (as this won't
     * work) and prefix the result with {@link #SCRIPT_NAME_PREFIX}.
     *
     * @return
     */
    private static String generateScriptName() {
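The naming scheme the Javadoc above describes boils down to the following sketch (SCRIPT_NAME_PREFIX is a constant of the class whose actual value is not shown in this diff, so a placeholder is used here):

import java.util.UUID;

class ScriptNameSketch {

    // Placeholder: the real prefix lives in DefaultScriptOperations.SCRIPT_NAME_PREFIX.
    static final String SCRIPT_NAME_PREFIX = "script_";

    // A String id MongoDB accepts for server-side JavaScript: random UUID, dashes removed, prefixed.
    static String generateScriptName() {
        return SCRIPT_NAME_PREFIX + UUID.randomUUID().toString().replaceAll("-", "");
    }
}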

View File

@@ -24,6 +24,7 @@ import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.util.CloseableIterator;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -74,8 +75,8 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
    @NonNull MongoTemplate template;
    @NonNull Class<T> domainType;
-   Aggregation aggregation;
-   String collection;
+   @Nullable Aggregation aggregation;
+   @Nullable String collection;

    /*
     * (non-Javadoc)

View File

@@ -22,6 +22,7 @@ import java.util.stream.Stream;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
+ import org.springframework.lang.Nullable;

/**
 * {@link ExecutableFindOperation} allows creation and execution of MongoDB find operations in a fluent API style.
@@ -83,6 +84,7 @@ public interface ExecutableFindOperation {
     * @return {@literal null} if no match found.
     * @throws org.springframework.dao.IncorrectResultSizeDataAccessException if more than one match found.
     */
+   @Nullable
    T oneValue();

    /**
@@ -99,6 +101,7 @@ public interface ExecutableFindOperation {
     *
     * @return {@literal null} if no match found.
     */
+   @Nullable
    T firstValue();

    /**

View File

@@ -31,6 +31,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.data.util.CloseableIterator;
import org.springframework.data.util.StreamUtils;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
@@ -88,7 +89,7 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
    @NonNull MongoTemplate template;
    @NonNull Class<?> domainType;
    Class<T> returnType;
-   String collection;
+   @Nullable String collection;
    Query query;

    /*
@@ -204,7 +205,7 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
        return template.exists(query, domainType, getCollectionName());
    }

-   private List<T> doFind(CursorPreparer preparer) {
+   private List<T> doFind(@Nullable CursorPreparer preparer) {

        Document queryObject = query.getQueryObject();
        Document fieldsObject = query.getFieldsObject();
@@ -217,8 +218,8 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
        return template.doStream(query, domainType, getCollectionName(), returnType);
    }

-   private CursorPreparer getCursorPreparer(Query query, CursorPreparer preparer) {
-       return query == null || preparer != null ? preparer : template.new QueryCursorPreparer(query, domainType);
+   private CursorPreparer getCursorPreparer(Query query, @Nullable CursorPreparer preparer) {
+       return preparer != null ? preparer : template.new QueryCursorPreparer(query, domainType);
    }

    private String getCollectionName() {
@@ -236,10 +237,10 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
     */
    static class DelegatingQueryCursorPreparer implements CursorPreparer {

-       private final CursorPreparer delegate;
+       private final @Nullable CursorPreparer delegate;
        private Optional<Integer> limit = Optional.empty();

-       DelegatingQueryCursorPreparer(CursorPreparer delegate) {
+       DelegatingQueryCursorPreparer(@Nullable CursorPreparer delegate) {
            this.delegate = delegate;
        }

View File

@@ -24,6 +24,7 @@ import java.util.ArrayList;
import java.util.Collection;

import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -75,8 +76,8 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
    @NonNull MongoTemplate template;
    @NonNull Class<T> domainType;
-   String collection;
-   BulkMode bulkMode;
+   @Nullable String collection;
+   @Nullable BulkMode bulkMode;

    /*
     * (non-Javadoc)

View File

@@ -23,6 +23,7 @@ import lombok.experimental.FieldDefaults;
import java.util.List;

import org.springframework.data.mongodb.core.query.Query;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -77,7 +78,7 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
    @NonNull MongoTemplate template;
    @NonNull Class<T> domainType;
    Query query;
-   String collection;
+   @Nullable String collection;

    /*
     * (non-Javadoc)

View File

@@ -21,6 +21,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

import com.mongodb.client.result.UpdateResult;
+ import org.springframework.lang.Nullable;

/**
 * {@link ExecutableUpdateOperation} allows creation and execution of MongoDB update / findAndModify operations in a
@@ -148,6 +149,7 @@ public interface ExecutableUpdateOperation {
     *
     * @return {@literal null} if nothing found.
     */
+   @Nullable
    T findAndModifyValue();
}
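As a usage sketch of the fluent update API this contract belongs to (Person, the query and the update are illustrative only): findAndModifyValue() may now return null, exactly as the new annotation states.

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.lang.Nullable;

class FluentUpdateExample {

    // Sketch: returns the found-and-modified document, or null if nothing matched.
    @Nullable
    static Person promote(MongoTemplate template) {
        return template.update(Person.class)
                .matching(query(where("lastname").is("Smith")))
                .apply(new Update().set("rank", "captain"))
                .findAndModifyValue();
    }

    static class Person {
        String lastname;
        String rank;
    }
}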

View File

@@ -22,6 +22,7 @@ import lombok.experimental.FieldDefaults;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -76,9 +77,9 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
    @NonNull MongoTemplate template;
    @NonNull Class<T> domainType;
    Query query;
-   Update update;
-   String collection;
-   FindAndModifyOptions options;
+   @Nullable Update update;
+   @Nullable String collection;
+   @Nullable FindAndModifyOptions options;

    /*
     * (non-Javadoc)
@@ -160,8 +161,8 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
     * @see org.springframework.data.mongodb.core.ExecutableUpdateOperation.TerminatingFindAndModify#findAndModifyValue()
     */
    @Override
-   public T findAndModifyValue() {
-       return template.findAndModify(query, update, options, domainType, getCollectionName());
+   public @Nullable T findAndModifyValue() {
+       return template.findAndModify(query, update, options != null ? options : new FindAndModifyOptions(), domainType, getCollectionName());
    }

    private UpdateResult doUpdate(boolean multi, boolean upsert) {

View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core;
import java.util.Optional;

import org.springframework.data.mongodb.core.query.Collation;
+ import org.springframework.lang.Nullable;

/**
 * @author Mark Pollack
@@ -30,7 +31,7 @@ public class FindAndModifyOptions {
    private boolean upsert;
    private boolean remove;
-   private Collation collation;
+   private @Nullable Collation collation;

    /**
     * Static factory method to create a FindAndModifyOptions instance
@@ -46,7 +47,7 @@ public class FindAndModifyOptions {
     * @return
     * @since 2.0
     */
-   public static FindAndModifyOptions of(FindAndModifyOptions source) {
+   public static FindAndModifyOptions of(@Nullable FindAndModifyOptions source) {

        FindAndModifyOptions options = new FindAndModifyOptions();
        if (source == null) {
@@ -83,7 +84,7 @@ public class FindAndModifyOptions {
     * @return
     * @since 2.0
     */
-   public FindAndModifyOptions collation(Collation collation) {
+   public FindAndModifyOptions collation(@Nullable Collation collation) {

        this.collation = collation;
        return this;
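For context, the options object is usually assembled through its fluent API and then handed to a findAndModify call. A minimal sketch with arbitrary option values (the class name is made up for illustration):

import org.springframework.data.mongodb.core.FindAndModifyOptions;

class FindAndModifyOptionsExample {

    // Sketch: return the modified document and create it if it is missing.
    static final FindAndModifyOptions OPTIONS = FindAndModifyOptions.options()
            .returnNew(true)
            .upsert(true);

    // of(..) now explicitly accepts null and then behaves like options().
    static final FindAndModifyOptions COPY = FindAndModifyOptions.of(OPTIONS);
}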

View File

@@ -1,5 +1,5 @@
/* /*
* Copyright 2016 the original author or authors. * Copyright 2016-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and * See the License for the specific language governing permissions and
* limitations under the License. * limitations under the License.
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import java.util.concurrent.TimeUnit; import java.util.concurrent.TimeUnit;
@@ -22,6 +21,7 @@ import org.bson.Document;
import org.springframework.core.convert.converter.Converter; import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.index.IndexDefinition; import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo; import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils; import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.Collation; import com.mongodb.client.model.Collation;
@@ -119,7 +119,8 @@ abstract class IndexConverters {
}; };
} }
public static Collation fromDocument(Document source) {
@Nullable
public static Collation fromDocument(@Nullable Document source) {
if (source == null) { if (source == null) {
return null; return null;

MongoAction.java

@@ -1,5 +1,5 @@
/* /*
* Copyright 2011-2016 the original author or authors. * Copyright 2011-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import org.bson.Document; import org.bson.Document;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import com.mongodb.WriteConcern; import com.mongodb.WriteConcern;
@@ -36,26 +37,28 @@ import com.mongodb.WriteConcern;
public class MongoAction { public class MongoAction {
private final String collectionName; private final String collectionName;
private final WriteConcern defaultWriteConcern;
private final Class<?> entityType;
private final MongoActionOperation mongoActionOperation; private final MongoActionOperation mongoActionOperation;
private final Document query;
private final Document document;
private final @Nullable WriteConcern defaultWriteConcern;
private final @Nullable Class<?> entityType;
private final @Nullable Document query;
private final @Nullable Document document;
/** /**
* Create an instance of a {@link MongoAction}. * Create an instance of a {@link MongoAction}.
* *
* @param defaultWriteConcern the default write concern. * @param defaultWriteConcern the default write concern. Can be {@literal null}.
* @param mongoActionOperation action being taken against the collection * @param mongoActionOperation action being taken against the collection. Must not be {@literal null}.
* @param collectionName the collection name, must not be {@literal null} or empty. * @param collectionName the collection name, must not be {@literal null} or empty.
* @param entityType the POJO that is being operated against * @param entityType the POJO that is being operated against. Can be {@literal null}.
* @param document the converted Document from the POJO or Spring Update object * @param document the converted Document from the POJO or Spring Update object. Can be {@literal null}.
* @param query the converted Document from the Spring Query object * @param query the converted Document from the Spring Query object. Can be {@literal null}.
*/ */
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation, String collectionName, public MongoAction(@Nullable WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
Class<?> entityType, Document document, Document query) { String collectionName, @Nullable Class<?> entityType, @Nullable Document document, @Nullable Document query) {
Assert.hasText(collectionName, "Collection name must not be null or empty!"); Assert.hasText(collectionName, "Collection name must not be null or empty!");
Assert.notNull(mongoActionOperation, "MongoActionOperation must not be null!");
this.defaultWriteConcern = defaultWriteConcern; this.defaultWriteConcern = defaultWriteConcern;
this.mongoActionOperation = mongoActionOperation; this.mongoActionOperation = mongoActionOperation;
@@ -69,22 +72,27 @@ public class MongoAction {
return collectionName; return collectionName;
} }
@Nullable
public WriteConcern getDefaultWriteConcern() { public WriteConcern getDefaultWriteConcern() {
return defaultWriteConcern; return defaultWriteConcern;
} }
@Nullable
public Class<?> getEntityType() { public Class<?> getEntityType() {
return entityType; return entityType;
} }
@Nullable
public MongoActionOperation getMongoActionOperation() { public MongoActionOperation getMongoActionOperation() {
return mongoActionOperation; return mongoActionOperation;
} }
@Nullable
public Document getQuery() { public Document getQuery() {
return query; return query;
} }
@Nullable
public Document getDocument() { public Document getDocument() {
return document; return document;
} }

MongoAdmin.java

@@ -1,77 +1,78 @@
/* /*
* Copyright 2011-2017 the original author or authors. * Copyright 2011-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* http://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and * See the License for the specific language governing permissions and
* limitations under the License. * limitations under the License.
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;
import org.bson.Document; import org.bson.Document;
import org.springframework.jmx.export.annotation.ManagedOperation; import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource; import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;
/** /**
* Mongo server administration exposed via JMX annotations * Mongo server administration exposed via JMX annotations
* *
* @author Mark Pollack * @author Mark Pollack
* @author Thomas Darimont * @author Thomas Darimont
* @author Mark Paluch * @author Mark Paluch
* @author Christoph Strobl * @author Christoph Strobl
*/ */
@ManagedResource(description = "Mongo Admin Operations") @ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations { public class MongoAdmin implements MongoAdminOperations {
private final MongoClient mongoClient; private final MongoClient mongoClient;
public MongoAdmin(MongoClient mongoClient) { public MongoAdmin(MongoClient mongoClient) {
Assert.notNull(mongoClient, "MongoClient must not be null!"); Assert.notNull(mongoClient, "MongoClient must not be null!");
this.mongoClient = mongoClient; this.mongoClient = mongoClient;
} }
/* (non-Javadoc) /* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#dropDatabase(java.lang.String) * @see org.springframework.data.mongodb.core.core.MongoAdminOperations#dropDatabase(java.lang.String)
*/ */
@ManagedOperation @ManagedOperation
public void dropDatabase(String databaseName) { public void dropDatabase(String databaseName) {
getDB(databaseName).drop(); getDB(databaseName).drop();
} }
/* (non-Javadoc) /* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#createDatabase(java.lang.String) * @see org.springframework.data.mongodb.core.core.MongoAdminOperations#createDatabase(java.lang.String)
*/ */
@ManagedOperation @ManagedOperation
public void createDatabase(String databaseName) { public void createDatabase(String databaseName) {
getDB(databaseName); getDB(databaseName);
} }
/* (non-Javadoc) /* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#getDatabaseStats(java.lang.String) * @see org.springframework.data.mongodb.core.core.MongoAdminOperations#getDatabaseStats(java.lang.String)
*/ */
@ManagedOperation @ManagedOperation
public String getDatabaseStats(String databaseName) { public String getDatabaseStats(String databaseName) {
return getDB(databaseName).runCommand(new Document("dbStats", 1).append("scale" , 1024)).toJson(); return getDB(databaseName).runCommand(new Document("dbStats", 1).append("scale", 1024)).toJson();
} }
@ManagedOperation @ManagedOperation
public String getServerStatus() { public String getServerStatus() {
return getDB("admin").runCommand(new Document("serverStatus", 1).append("rangeDeleter", 1).append("repl", 1)).toJson(); return getDB("admin").runCommand(new Document("serverStatus", 1).append("rangeDeleter", 1).append("repl", 1)).toJson();
} }
MongoDatabase getDB(String databaseName) { MongoDatabase getDB(String databaseName) {
return mongoClient.getDatabase(databaseName); return mongoClient.getDatabase(databaseName);
} }
} }
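Functionally the class is unchanged (only import order and formatting differ); a minimal usage sketch of the JMX-exposed operations, assuming a local MongoClient:
MongoClient mongoClient = new MongoClient("localhost");
MongoAdmin admin = new MongoAdmin(mongoClient);
admin.createDatabase("demo");                          // merely obtains the database handle
System.out.println(admin.getDatabaseStats("demo"));    // dbStats, scaled to kilobytes
admin.dropDatabase("demo");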

MongoAdminOperations.java

@@ -1,34 +1,34 @@
/* /*
* Copyright 2011-2014 the original author or authors. * Copyright 2011-2014 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* http://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and * See the License for the specific language governing permissions and
* limitations under the License. * limitations under the License.
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import org.springframework.jmx.export.annotation.ManagedOperation; import org.springframework.jmx.export.annotation.ManagedOperation;
/** /**
* @author Mark Pollack * @author Mark Pollack
* @author Oliver Gierke * @author Oliver Gierke
*/ */
public interface MongoAdminOperations { public interface MongoAdminOperations {
@ManagedOperation @ManagedOperation
void dropDatabase(String databaseName); void dropDatabase(String databaseName);
@ManagedOperation @ManagedOperation
void createDatabase(String databaseName); void createDatabase(String databaseName);
@ManagedOperation @ManagedOperation
String getDatabaseStats(String databaseName); String getDatabaseStats(String databaseName);
} }

MongoClientFactoryBean.java

@@ -23,6 +23,7 @@ import java.util.List;
import org.springframework.beans.factory.config.AbstractFactoryBean; import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException; import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator; import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.lang.Nullable;
import org.springframework.util.CollectionUtils; import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils; import org.springframework.util.StringUtils;
@@ -33,61 +34,62 @@ import com.mongodb.ServerAddress;
/** /**
* Convenient factory for configuring MongoDB. * Convenient factory for configuring MongoDB.
* *
* @author Christoph Strobl * @author Christoph Strobl
* @author Mark Paluch
* @since 1.7 * @since 1.7
*/ */
public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> implements PersistenceExceptionTranslator { public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> implements PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator(); private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private MongoClientOptions mongoClientOptions; private @Nullable MongoClientOptions mongoClientOptions;
private String host; private @Nullable String host;
private Integer port; private @Nullable Integer port;
private List<ServerAddress> replicaSetSeeds; private List<ServerAddress> replicaSetSeeds = Collections.emptyList();
private List<MongoCredential> credentials; private List<MongoCredential> credentials = Collections.emptyList();
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR; private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/** /**
* Set the {@link MongoClientOptions} to be used when creating {@link MongoClient}. * Set the {@link MongoClientOptions} to be used when creating {@link MongoClient}.
* *
* @param mongoClientOptions * @param mongoClientOptions
*/ */
public void setMongoClientOptions(MongoClientOptions mongoClientOptions) { public void setMongoClientOptions(@Nullable MongoClientOptions mongoClientOptions) {
this.mongoClientOptions = mongoClientOptions; this.mongoClientOptions = mongoClientOptions;
} }
/** /**
* Set the list of credentials to be used when creating {@link MongoClient}. * Set the list of credentials to be used when creating {@link MongoClient}.
* *
* @param credentials can be {@literal null}. * @param credentials can be {@literal null}.
*/ */
public void setCredentials(MongoCredential[] credentials) { public void setCredentials(@Nullable MongoCredential[] credentials) {
this.credentials = filterNonNullElementsAsList(credentials); this.credentials = filterNonNullElementsAsList(credentials);
} }
/** /**
* Set the list of {@link ServerAddress} to build up a replica set for. * Set the list of {@link ServerAddress} to build up a replica set for.
* *
* @param replicaSetSeeds can be {@literal null}. * @param replicaSetSeeds can be {@literal null}.
*/ */
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) { public void setReplicaSetSeeds(@Nullable ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds); this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
} }
/** /**
* Configures the host to connect to. * Configures the host to connect to.
* *
* @param host * @param host
*/ */
public void setHost(String host) { public void setHost(@Nullable String host) {
this.host = host; this.host = host;
} }
/** /**
* Configures the port to connect to. * Configures the port to connect to.
* *
* @param port * @param port
*/ */
public void setPort(int port) { public void setPort(int port) {
@@ -96,10 +98,10 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
/** /**
* Configures the {@link PersistenceExceptionTranslator} to use. * Configures the {@link PersistenceExceptionTranslator} to use.
* *
* @param exceptionTranslator * @param exceptionTranslator
*/ */
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) { public void setExceptionTranslator(@Nullable PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator; this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
} }
@@ -115,11 +117,12 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException) * @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/ */
@Nullable
public DataAccessException translateExceptionIfPossible(RuntimeException ex) { public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex); return exceptionTranslator.translateExceptionIfPossible(ex);
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance() * @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/ */
@@ -130,20 +133,19 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
mongoClientOptions = MongoClientOptions.builder().build(); mongoClientOptions = MongoClientOptions.builder().build();
} }
if (credentials == null) {
credentials = Collections.emptyList();
}
return createMongoClient(); return createMongoClient();
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object) * @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/ */
@Override @Override
protected void destroyInstance(MongoClient instance) throws Exception { protected void destroyInstance(@Nullable MongoClient instance) throws Exception {
instance.close();
if (instance != null) {
instance.close();
}
} }
private MongoClient createMongoClient() throws UnknownHostException { private MongoClient createMongoClient() throws UnknownHostException {
@@ -165,11 +167,11 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> imp
/** /**
* Returns the given array as {@link List} with all {@literal null} elements removed. * Returns the given array as {@link List} with all {@literal null} elements removed.
* *
* @param elements the elements to filter <T>, can be {@literal null}. * @param elements the elements to filter <T>, can be {@literal null}.
* @return a new unmodifiable {@link List#} from the given elements without {@literal null}s. * @return a new unmodifiable {@link List#} from the given elements without {@literal null}s.
*/ */
private static <T> List<T> filterNonNullElementsAsList(T[] elements) { private static <T> List<T> filterNonNullElementsAsList(@Nullable T[] elements) {
if (elements == null) { if (elements == null) {
return Collections.emptyList(); return Collections.emptyList();
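With the empty-list defaults above, a factory bean configured programmatically no longer has to guard against null credentials or seeds itself. A sketch (localhost values are illustrative, exception handling omitted):
MongoClientFactoryBean factoryBean = new MongoClientFactoryBean();
factoryBean.setHost("localhost");
factoryBean.setPort(27017);
factoryBean.setCredentials(null);       // filtered into an empty, unmodifiable list
factoryBean.setReplicaSetSeeds(null);   // likewise
factoryBean.afterPropertiesSet();
MongoClient client = factoryBean.getObject();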

MongoClientOptionsFactoryBean.java

@@ -20,6 +20,7 @@ import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.config.AbstractFactoryBean; import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.MongoDbFactory; import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.lang.Nullable;
import com.mongodb.DBDecoderFactory; import com.mongodb.DBDecoderFactory;
import com.mongodb.DBEncoderFactory; import com.mongodb.DBEncoderFactory;
@@ -40,7 +41,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private static final MongoClientOptions DEFAULT_MONGO_OPTIONS = MongoClientOptions.builder().build(); private static final MongoClientOptions DEFAULT_MONGO_OPTIONS = MongoClientOptions.builder().build();
private String description = DEFAULT_MONGO_OPTIONS.getDescription(); private @Nullable String description = DEFAULT_MONGO_OPTIONS.getDescription();
private int minConnectionsPerHost = DEFAULT_MONGO_OPTIONS.getMinConnectionsPerHost(); private int minConnectionsPerHost = DEFAULT_MONGO_OPTIONS.getMinConnectionsPerHost();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost(); private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
@@ -51,11 +52,11 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout(); private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout(); private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive(); private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private ReadPreference readPreference = DEFAULT_MONGO_OPTIONS.getReadPreference(); private @Nullable ReadPreference readPreference = DEFAULT_MONGO_OPTIONS.getReadPreference();
private DBDecoderFactory dbDecoderFactory = DEFAULT_MONGO_OPTIONS.getDbDecoderFactory(); private DBDecoderFactory dbDecoderFactory = DEFAULT_MONGO_OPTIONS.getDbDecoderFactory();
private DBEncoderFactory dbEncoderFactory = DEFAULT_MONGO_OPTIONS.getDbEncoderFactory(); private DBEncoderFactory dbEncoderFactory = DEFAULT_MONGO_OPTIONS.getDbEncoderFactory();
private WriteConcern writeConcern = DEFAULT_MONGO_OPTIONS.getWriteConcern(); private @Nullable WriteConcern writeConcern = DEFAULT_MONGO_OPTIONS.getWriteConcern();
private SocketFactory socketFactory = DEFAULT_MONGO_OPTIONS.getSocketFactory(); private @Nullable SocketFactory socketFactory = DEFAULT_MONGO_OPTIONS.getSocketFactory();
private boolean cursorFinalizerEnabled = DEFAULT_MONGO_OPTIONS.isCursorFinalizerEnabled(); private boolean cursorFinalizerEnabled = DEFAULT_MONGO_OPTIONS.isCursorFinalizerEnabled();
private boolean alwaysUseMBeans = DEFAULT_MONGO_OPTIONS.isAlwaysUseMBeans(); private boolean alwaysUseMBeans = DEFAULT_MONGO_OPTIONS.isAlwaysUseMBeans();
private int heartbeatFrequency = DEFAULT_MONGO_OPTIONS.getHeartbeatFrequency(); private int heartbeatFrequency = DEFAULT_MONGO_OPTIONS.getHeartbeatFrequency();
@@ -66,14 +67,14 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private int serverSelectionTimeout = DEFAULT_MONGO_OPTIONS.getServerSelectionTimeout(); private int serverSelectionTimeout = DEFAULT_MONGO_OPTIONS.getServerSelectionTimeout();
private boolean ssl; private boolean ssl;
private SSLSocketFactory sslSocketFactory; private @Nullable SSLSocketFactory sslSocketFactory;
/** /**
* Set the {@link MongoClient} description. * Set the {@link MongoClient} description.
* *
* @param description * @param description
*/ */
public void setDescription(String description) { public void setDescription(@Nullable String description) {
this.description = description; this.description = description;
} }
@@ -166,7 +167,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
* *
* @param readPreference * @param readPreference
*/ */
public void setReadPreference(ReadPreference readPreference) { public void setReadPreference(@Nullable ReadPreference readPreference) {
this.readPreference = readPreference; this.readPreference = readPreference;
} }
@@ -176,14 +177,14 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
* *
* @param writeConcern * @param writeConcern
*/ */
public void setWriteConcern(WriteConcern writeConcern) { public void setWriteConcern(@Nullable WriteConcern writeConcern) {
this.writeConcern = writeConcern; this.writeConcern = writeConcern;
} }
/** /**
* @param socketFactory * @param socketFactory
*/ */
public void setSocketFactory(SocketFactory socketFactory) { public void setSocketFactory(@Nullable SocketFactory socketFactory) {
this.socketFactory = socketFactory; this.socketFactory = socketFactory;
} }
@@ -248,7 +249,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
* *
* @param sslSocketFactory * @param sslSocketFactory
*/ */
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) { public void setSslSocketFactory(@Nullable SSLSocketFactory sslSocketFactory) {
this.sslSocketFactory = sslSocketFactory; this.sslSocketFactory = sslSocketFactory;
this.ssl = sslSocketFactory != null; this.ssl = sslSocketFactory != null;
@@ -269,11 +270,13 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance() * @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/ */
@SuppressWarnings("ConstantConditions")
@Override @Override
protected MongoClientOptions createInstance() throws Exception { protected MongoClientOptions createInstance() throws Exception {
SocketFactory socketFactoryToUse = ssl SocketFactory socketFactoryToUse = ssl
? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault()) : this.socketFactory; ? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault())
: this.socketFactory;
return MongoClientOptions.builder() // return MongoClientOptions.builder() //
.alwaysUseMBeans(this.alwaysUseMBeans) // .alwaysUseMBeans(this.alwaysUseMBeans) //
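A brief configuration sketch for the options factory bean touched above (values are illustrative, exception handling omitted); note that supplying an SSL socket factory implicitly switches ssl on:
MongoClientOptionsFactoryBean optionsFactoryBean = new MongoClientOptionsFactoryBean();
optionsFactoryBean.setWriteConcern(WriteConcern.MAJORITY);
optionsFactoryBean.setReadPreference(ReadPreference.secondaryPreferred());
optionsFactoryBean.setSslSocketFactory((SSLSocketFactory) SSLSocketFactory.getDefault());  // also sets ssl = true
optionsFactoryBean.afterPropertiesSet();
MongoClientOptions options = optionsFactoryBean.getObject();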

MongoDataIntegrityViolationException.java

@@ -1,5 +1,5 @@
/* /*
* Copyright 2013 the original author or authors. * Copyright 2013-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import org.springframework.dao.DataIntegrityViolationException; import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import com.mongodb.WriteResult; import com.mongodb.WriteResult;
@@ -40,7 +41,7 @@ public class MongoDataIntegrityViolationException extends DataIntegrityViolation
* @param actionOperation the {@link MongoActionOperation} that caused the exception, must not be {@literal null}. * @param actionOperation the {@link MongoActionOperation} that caused the exception, must not be {@literal null}.
*/ */
public MongoDataIntegrityViolationException(String message, WriteResult writeResult, public MongoDataIntegrityViolationException(String message, WriteResult writeResult,
MongoActionOperation actionOperation) { MongoActionOperation actionOperation) {
super(message); super(message);

MongoExceptionTranslator.java

@@ -1,5 +1,5 @@
/* /*
* Copyright 2010-2016 the original author or authors. * Copyright 2010-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import java.util.Arrays; import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet; import java.util.HashSet;
import java.util.Set; import java.util.Set;
@@ -31,6 +32,7 @@ import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.BulkOperationException; import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.UncategorizedMongoDbException; import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.data.mongodb.util.MongoDbErrorCodes; import org.springframework.data.mongodb.util.MongoDbErrorCodes;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils; import org.springframework.util.ClassUtils;
import com.mongodb.BulkWriteException; import com.mongodb.BulkWriteException;
@@ -50,23 +52,24 @@ import com.mongodb.bulk.BulkWriteError;
*/ */
public class MongoExceptionTranslator implements PersistenceExceptionTranslator { public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>( private static final Set<String> DUPLICATE_KEY_EXCEPTIONS = new HashSet<>(
Arrays.asList("MongoException.DuplicateKey", "DuplicateKeyException")); Arrays.asList("MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>( private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<>(
Arrays.asList("MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound", Arrays.asList("MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException")); "MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<String>( private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<>(
Arrays.asList("MongoInternalException")); Collections.singletonList("MongoInternalException"));
private static final Set<String> DATA_INTEGRETY_EXCEPTIONS = new HashSet<String>( private static final Set<String> DATA_INTEGRITY_EXCEPTIONS = new HashSet<>(
Arrays.asList("WriteConcernException", "MongoWriteException", "MongoBulkWriteException")); Arrays.asList("WriteConcernException", "MongoWriteException", "MongoBulkWriteException"));
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException) * @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/ */
@Nullable
public DataAccessException translateExceptionIfPossible(RuntimeException ex) { public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
// Check for well-known MongoException subclasses. // Check for well-known MongoException subclasses.
@@ -77,7 +80,7 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass())); String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
if (DULICATE_KEY_EXCEPTIONS.contains(exception)) { if (DUPLICATE_KEY_EXCEPTIONS.contains(exception)) {
return new DuplicateKeyException(ex.getMessage(), ex); return new DuplicateKeyException(ex.getMessage(), ex);
} }
@@ -89,7 +92,7 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex); return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
} }
if (DATA_INTEGRETY_EXCEPTIONS.contains(exception)) { if (DATA_INTEGRITY_EXCEPTIONS.contains(exception)) {
if (ex instanceof MongoServerException) { if (ex instanceof MongoServerException) {
if (((MongoServerException) ex).getCode() == 11000) { if (((MongoServerException) ex).getCode() == 11000) {
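A small sketch of the translator contract the renamed constants feed into: "MongoInternalException" sits in RESOURCE_USAGE_EXCEPTIONS, while runtime exceptions the translator cannot map come back as null, hence the @Nullable return type.
MongoExceptionTranslator translator = new MongoExceptionTranslator();
DataAccessException translated =
        translator.translateExceptionIfPossible(new MongoInternalException("boom"));
// translated is an InvalidDataAccessResourceUsageException wrapping the original exception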

MongoSynchronization.java (deleted)

@@ -1,29 +0,0 @@
/*
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.transaction.support.ResourceHolder;
import org.springframework.transaction.support.ResourceHolderSynchronization;
/**
* @author Oliver Gierke
*/
class MongoSynchronization extends ResourceHolderSynchronization<ResourceHolder, Object> {
public MongoSynchronization(ResourceHolder resourceHolder, Object resourceKey) {
super(resourceHolder, resourceKey);
}
}

ReactiveCollectionCallback.java

@@ -1,34 +1,34 @@
/* /*
* Copyright 2016 the original author or authors. * Copyright 2016-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* http://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and * See the License for the specific language governing permissions and
* limitations under the License. * limitations under the License.
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import org.springframework.dao.DataAccessException;
import com.mongodb.MongoException;
import com.mongodb.reactivestreams.client.MongoCollection;
import org.bson.Document; import org.bson.Document;
import org.reactivestreams.Publisher; import org.reactivestreams.Publisher;
import org.springframework.dao.DataAccessException;
import com.mongodb.MongoException;
import com.mongodb.reactivestreams.client.MongoCollection;
/** /**
* @author Mark Paluch * @author Mark Paluch
* @param <T> * @param <T>
* @since 2.0 * @since 2.0
*/ */
public interface ReactiveCollectionCallback<T> { public interface ReactiveCollectionCallback<T> {
Publisher<T> doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException; Publisher<T> doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException;
} }
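Since the interface has a single method, it can be written as a lambda and handed to the reactive template's execute(…); reactiveOperations below stands for any ReactiveMongoOperations instance and the collection name is illustrative:
ReactiveCollectionCallback<Document> firstDocument = collection -> collection.find().first();
Flux<Document> result = reactiveOperations.execute("people", firstDocument);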

ReactiveFindOperationSupport.java

@@ -19,6 +19,7 @@ import lombok.AccessLevel;
import lombok.NonNull; import lombok.NonNull;
import lombok.RequiredArgsConstructor; import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults; import lombok.experimental.FieldDefaults;
import org.springframework.lang.Nullable;
import reactor.core.publisher.Flux; import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono; import reactor.core.publisher.Mono;
@@ -195,7 +196,7 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return template.exists(query, domainType, getCollectionName()); return template.exists(query, domainType, getCollectionName());
} }
private Flux<T> doFind(FindPublisherPreparer preparer) { private Flux<T> doFind(@Nullable FindPublisherPreparer preparer) {
Document queryObject = query.getQueryObject(); Document queryObject = query.getQueryObject();
Document fieldsObject = query.getFieldsObject(); Document fieldsObject = query.getFieldsObject();

ReactiveMongoClientFactoryBean.java

@@ -1,5 +1,5 @@
/* /*
* Copyright 2016 the original author or authors. * Copyright 2016-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -19,6 +19,7 @@ package org.springframework.data.mongodb.core;
import org.springframework.beans.factory.config.AbstractFactoryBean; import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException; import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator; import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.lang.Nullable;
import org.springframework.util.StringUtils; import org.springframework.util.StringUtils;
import com.mongodb.async.client.MongoClientSettings; import com.mongodb.async.client.MongoClientSettings;
@@ -37,10 +38,10 @@ public class ReactiveMongoClientFactoryBean extends AbstractFactoryBean<MongoCli
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator(); private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private String connectionString; private @Nullable String connectionString;
private String host; private @Nullable String host;
private Integer port; private @Nullable Integer port;
private MongoClientSettings mongoClientSettings; private @Nullable MongoClientSettings mongoClientSettings;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR; private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/** /**
@@ -48,7 +49,7 @@ public class ReactiveMongoClientFactoryBean extends AbstractFactoryBean<MongoCli
* *
* @param host * @param host
*/ */
public void setHost(String host) { public void setHost(@Nullable String host) {
this.host = host; this.host = host;
} }
@@ -66,7 +67,7 @@ public class ReactiveMongoClientFactoryBean extends AbstractFactoryBean<MongoCli
* *
* @param connectionString * @param connectionString
*/ */
public void setConnectionString(String connectionString) { public void setConnectionString(@Nullable String connectionString) {
this.connectionString = connectionString; this.connectionString = connectionString;
} }
@@ -75,7 +76,7 @@ public class ReactiveMongoClientFactoryBean extends AbstractFactoryBean<MongoCli
* *
* @param mongoClientSettings * @param mongoClientSettings
*/ */
public void setMongoClientSettings(MongoClientSettings mongoClientSettings) { public void setMongoClientSettings(@Nullable MongoClientSettings mongoClientSettings) {
this.mongoClientSettings = mongoClientSettings; this.mongoClientSettings = mongoClientSettings;
} }
@@ -84,7 +85,7 @@ public class ReactiveMongoClientFactoryBean extends AbstractFactoryBean<MongoCli
* *
* @param exceptionTranslator * @param exceptionTranslator
*/ */
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) { public void setExceptionTranslator(@Nullable PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator; this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
} }
@@ -118,7 +119,7 @@ public class ReactiveMongoClientFactoryBean extends AbstractFactoryBean<MongoCli
} }
@Override @Override
protected void destroyInstance(MongoClient instance) throws Exception { protected void destroyInstance(@Nullable MongoClient instance) throws Exception {
instance.close(); instance.close();
} }
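A minimal sketch of wiring the reactive client factory bean programmatically (connection string is illustrative, exception handling omitted):
ReactiveMongoClientFactoryBean factoryBean = new ReactiveMongoClientFactoryBean();
factoryBean.setConnectionString("mongodb://localhost:27017");
factoryBean.afterPropertiesSet();
MongoClient reactiveClient = factoryBean.getObject();   // com.mongodb.reactivestreams.client.MongoClient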

ReactiveMongoOperations.java

@@ -35,6 +35,7 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery; import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query; import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update; import org.springframework.data.mongodb.core.query.Update;
import org.springframework.lang.Nullable;
import com.mongodb.ReadPreference; import com.mongodb.ReadPreference;
import com.mongodb.client.result.DeleteResult; import com.mongodb.client.result.DeleteResult;
@@ -60,6 +61,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/** /**
* Returns the reactive operations that can be performed on indexes * Returns the reactive operations that can be performed on indexes
* *
* @param collectionName must not be {@literal null}.
* @return index operations on the named collection * @return index operations on the named collection
*/ */
ReactiveIndexOperations indexOps(String collectionName); ReactiveIndexOperations indexOps(String collectionName);
@@ -67,13 +69,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/** /**
* Returns the reactive operations that can be performed on indexes * Returns the reactive operations that can be performed on indexes
* *
* @param entityClass must not be {@literal null}.
* @return index operations on the named collection associated with the given entity class * @return index operations on the named collection associated with the given entity class
*/ */
ReactiveIndexOperations indexOps(Class<?> entityClass); ReactiveIndexOperations indexOps(Class<?> entityClass);
/** /**
* Execute the a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the * Execute the a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* MongoDB driver to convert the JSON string to a DBObject. Any errors that result from executing this command will be * MongoDB driver to convert the JSON string to a Document. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy. * converted into Spring's DAO exception hierarchy.
* *
* @param jsonCommand a MongoDB command expressed as a JSON string. * @param jsonCommand a MongoDB command expressed as a JSON string.
@@ -85,7 +88,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO * Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy. * exception hierarchy.
* *
* @param command a MongoDB command * @param command a MongoDB command.
* @return a result object returned by the action * @return a result object returned by the action
*/ */
Mono<Document> executeCommand(Document command); Mono<Document> executeCommand(Document command);
@@ -96,17 +99,18 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* *
* @param command a MongoDB command, must not be {@literal null}. * @param command a MongoDB command, must not be {@literal null}.
* @param readPreference read preferences to use, can be {@literal null}. * @param readPreference read preferences to use, can be {@literal null}.
* @return a result object returned by the action * @return a result object returned by the action.
*/ */
Mono<Document> executeCommand(Document command, ReadPreference readPreference); Mono<Document> executeCommand(Document command, @Nullable ReadPreference readPreference);
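A usage sketch of the two command variants documented above; operations stands for any ReactiveMongoOperations instance:
Mono<Document> buildInfo = operations.executeCommand("{ buildInfo: 1 }");               // JSON string variant
Mono<Document> dbStats = operations.executeCommand(new Document("dbStats", 1), null);   // read preference may be null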
/** /**
* Executes a {@link ReactiveDatabaseCallback} translating any exceptions as necessary. * Executes a {@link ReactiveDatabaseCallback} translating any exceptions as necessary.
* <p/> * <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects. * Allows for returning a result object, that is a domain object or a collection of domain objects.
* *
* @param <T> return type * @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance. Must not
* @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance. * be {@literal null}.
* @param <T> return type.
* @return a result object returned by the action * @return a result object returned by the action
*/ */
<T> Flux<T> execute(ReactiveDatabaseCallback<T> action); <T> Flux<T> execute(ReactiveDatabaseCallback<T> action);
@@ -116,10 +120,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <p/> * <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects. * Allows for returning a result object, that is a domain object or a collection of domain objects.
* *
* @param entityClass class that determines the collection to use * @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @param <T> return type * @param action callback object that specifies the MongoDB action. Must not be {@literal null}.
* @param action callback object that specifies the MongoDB action * @param <T> return type.
* @return a result object returned by the action or <tt>null</tt> * @return a result object returned by the action or {@literal null}.
*/ */
<T> Flux<T> execute(Class<?> entityClass, ReactiveCollectionCallback<T> action); <T> Flux<T> execute(Class<?> entityClass, ReactiveCollectionCallback<T> action);
@@ -128,51 +132,53 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <p/> * <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects. * Allows for returning a result object, that is a domain object or a collection of domain objects.
* *
* @param <T> return type * @param collectionName the name of the collection that specifies which {@link MongoCollection} instance will be
* @param collectionName the name of the collection that specifies which DBCollection instance will be passed into * passed into. Must not be {@literal null} or empty.
* @param action callback object that specifies the MongoDB action the callback action. * @param action callback object that specifies the MongoDB action the callback action. Must not be {@literal null}.
* @return a result object returned by the action or <tt>null</tt> * @param <T> return type.
* @return a result object returned by the action or {@literal null}.
*/ */
<T> Flux<T> execute(String collectionName, ReactiveCollectionCallback<T> action); <T> Flux<T> execute(String collectionName, ReactiveCollectionCallback<T> action);
/** /**
* Create an uncapped collection with a name based on the provided entity class. * Create an uncapped collection with a name based on the provided entity class.
* *
* @param entityClass class that determines the collection to create * @param entityClass class that determines the collection to create.
* @return the created collection * @return the created collection.
*/ */
<T> Mono<MongoCollection<Document>> createCollection(Class<T> entityClass); <T> Mono<MongoCollection<Document>> createCollection(Class<T> entityClass);
/** /**
* Create a collection with a name based on the provided entity class using the options. * Create a collection with a name based on the provided entity class using the options.
* *
* @param entityClass class that determines the collection to create * @param entityClass class that determines the collection to create. Must not be {@literal null}.
* @param collectionOptions options to use when creating the collection. * @param collectionOptions options to use when creating the collection.
* @return the created collection * @return the created collection.
*/ */
<T> Mono<MongoCollection<Document>> createCollection(Class<T> entityClass, CollectionOptions collectionOptions); <T> Mono<MongoCollection<Document>> createCollection(Class<T> entityClass,
@Nullable CollectionOptions collectionOptions);
/** /**
* Create an uncapped collection with the provided name. * Create an uncapped collection with the provided name.
* *
* @param collectionName name of the collection * @param collectionName name of the collection.
* @return the created collection * @return the created collection.
*/ */
Mono<MongoCollection<Document>> createCollection(String collectionName); Mono<MongoCollection<Document>> createCollection(String collectionName);
/** /**
* Create a collection with the provided name and options. * Create a collection with the provided name and options.
* *
* @param collectionName name of the collection * @param collectionName name of the collection. Must not be {@literal null} nor empty.
* @param collectionOptions options to use when creating the collection. * @param collectionOptions options to use when creating the collection.
* @return the created collection * @return the created collection.
*/ */
Mono<MongoCollection<Document>> createCollection(String collectionName, CollectionOptions collectionOptions); Mono<MongoCollection<Document>> createCollection(String collectionName, CollectionOptions collectionOptions);
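For illustration, a hedged sketch of creating a capped collection through the options variant, assuming the fluent CollectionOptions API (empty()/capped()/size(…)) and the same operations instance as before:
Mono<MongoCollection<Document>> capped =
        operations.createCollection("traces", CollectionOptions.empty().capped().size(1024 * 1024));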
/** /**
* A set of collection names. * A set of collection names.
* *
* @return Flux of collection names * @return Flux of collection names.
*/ */
Flux<String> getCollectionNames(); Flux<String> getCollectionNames();
@@ -181,7 +187,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <p/> * <p/>
* Translate any exceptions as necessary. * Translate any exceptions as necessary.
* *
* @param collectionName name of the collection * @param collectionName name of the collection.
* @return an existing collection or a newly created one. * @return an existing collection or a newly created one.
*/ */
MongoCollection<Document> getCollection(String collectionName); MongoCollection<Document> getCollection(String collectionName);
@@ -191,7 +197,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <p/> * <p/>
* Translate any exceptions as necessary. * Translate any exceptions as necessary.
* *
* @param entityClass class that determines the name of the collection * @param entityClass class that determines the name of the collection. Must not be {@literal null}.
* @return true if a collection with the given name is found, false otherwise. * @return true if a collection with the given name is found, false otherwise.
*/ */
<T> Mono<Boolean> collectionExists(Class<T> entityClass); <T> Mono<Boolean> collectionExists(Class<T> entityClass);
@@ -201,7 +207,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <p/> * <p/>
* Translate any exceptions as necessary. * Translate any exceptions as necessary.
* *
* @param collectionName name of the collection * @param collectionName name of the collection. Must not be {@literal null}.
* @return true if a collection with the given name is found, false otherwise. * @return true if a collection with the given name is found, false otherwise.
*/ */
Mono<Boolean> collectionExists(String collectionName); Mono<Boolean> collectionExists(String collectionName);
@@ -211,7 +217,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <p/> * <p/>
* Translate any exceptions as necessary. * Translate any exceptions as necessary.
* *
* @param entityClass class that determines the collection to drop/delete. * @param entityClass class that determines the collection to drop/delete. Must not be {@literal null}.
*/ */
<T> Mono<Void> dropCollection(Class<T> entityClass); <T> Mono<Void> dropCollection(Class<T> entityClass);
@@ -234,7 +240,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* to map objects since the test for class type is done in the client and not on the server. * to map objects since the test for class type is done in the client and not on the server.
* *
* @param entityClass the parametrized type of the returned {@link Flux}. * @param entityClass the parametrized type of the returned {@link Flux}.
* @return the converted collection * @return the converted collection.
*/ */
<T> Flux<T> findAll(Class<T> entityClass); <T> Flux<T> findAll(Class<T> entityClass);
@@ -248,8 +254,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* to map objects since the test for class type is done in the client and not on the server. * to map objects since the test for class type is done in the client and not on the server.
* *
* @param entityClass the parametrized type of the returned {@link Flux}. * @param entityClass the parametrized type of the returned {@link Flux}.
* @param collectionName name of the collection to retrieve the objects from * @param collectionName name of the collection to retrieve the objects from.
* @return the converted collection * @return the converted collection.
*/ */
<T> Flux<T> findAll(Class<T> entityClass, String collectionName); <T> Flux<T> findAll(Class<T> entityClass, String collectionName);
@@ -264,9 +270,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* feature rich {@link Query}. * feature rich {@link Query}.
* *
* @param query the query class that specifies the criteria used to find a record and also an optional fields * @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification * specification.
* @param entityClass the parametrized type of the returned {@link Mono}. * @param entityClass the parametrized type of the returned {@link Mono}.
* @return the converted object * @return the converted object.
*/ */
<T> Mono<T> findOne(Query query, Class<T> entityClass); <T> Mono<T> findOne(Query query, Class<T> entityClass);
@@ -281,10 +287,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* feature rich {@link Query}. * feature rich {@link Query}.
* *
* @param query the query class that specifies the criteria used to find a record and also an optional fields * @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification * specification.
* @param entityClass the parametrized type of the returned {@link Mono}. * @param entityClass the parametrized type of the returned {@link Mono}.
* @param collectionName name of the collection to retrieve the objects from * @param collectionName name of the collection to retrieve the objects from.
* @return the converted object * @return the converted object.
*/ */
<T> Mono<T> findOne(Query query, Class<T> entityClass, String collectionName); <T> Mono<T> findOne(Query query, Class<T> entityClass, String collectionName);
@@ -295,7 +301,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* *
* @param query the {@link Query} class that specifies the criteria used to find a record. * @param query the {@link Query} class that specifies the criteria used to find a record.
* @param collectionName name of the collection to check for objects. * @param collectionName name of the collection to check for objects.
* @return * @return {@literal true} if the query yields a result.
*/ */
Mono<Boolean> exists(Query query, String collectionName); Mono<Boolean> exists(Query query, String collectionName);
@@ -304,7 +310,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* *
* @param query the {@link Query} class that specifies the criteria used to find a record. * @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parametrized type. * @param entityClass the parametrized type.
* @return * @return {@literal true} if the query yields a result.
*/ */
Mono<Boolean> exists(Query query, Class<?> entityClass); Mono<Boolean> exists(Query query, Class<?> entityClass);
@@ -312,11 +318,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Determine result of given {@link Query} contains at least one element. * Determine result of given {@link Query} contains at least one element.
* *
* @param query the {@link Query} class that specifies the criteria used to find a record. * @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parametrized type. * @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName name of the collection to check for objects. * @param collectionName name of the collection to check for objects.
* @return * @return {@literal true} if the query yields a result.
*/ */
Mono<Boolean> exists(Query query, Class<?> entityClass, String collectionName); Mono<Boolean> exists(Query query, @Nullable Class<?> entityClass, String collectionName);
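A short sketch of the exists variants, reusing the template and Person assumptions from the findOne example above:

    Mono<Boolean> hasMatthews = template.exists(query(where("lastname").is("Matthews")), Person.class);

    // against a named collection, without domain type information
    Mono<Boolean> hasMatthewsInPeople = template.exists(query(where("lastname").is("Matthews")), "people");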
/** /**
* Map the results of an ad-hoc query on the collection for the entity class to a {@link Flux} of the specified type. * Map the results of an ad-hoc query on the collection for the entity class to a {@link Flux} of the specified type.
@@ -328,9 +334,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* feature rich {@link Query}. * feature rich {@link Query}.
* *
* @param query the query class that specifies the criteria used to find a record and also an optional fields * @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification * specification. Must not be {@literal null}.
* @param entityClass the parametrized type of the returned {@link Flux}. * @param entityClass the parametrized type of the returned {@link Flux}. Must not be {@literal null}.
* @return the {@link Flux} of converted objects * @return the {@link Flux} of converted objects.
*/ */
<T> Flux<T> find(Query query, Class<T> entityClass); <T> Flux<T> find(Query query, Class<T> entityClass);
@@ -344,10 +350,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* feature rich {@link Query}. * feature rich {@link Query}.
* *
* @param query the query class that specifies the criteria used to find a record and also an optional fields * @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification * specification. Must not be {@literal null}.
* @param entityClass the parametrized type of the returned {@link Flux}. * @param entityClass the parametrized type of the returned {@link Flux}.
* @param collectionName name of the collection to retrieve the objects from * @param collectionName name of the collection to retrieve the objects from. Must not be {@literal null}.
* @return the {@link Flux} of converted objects * @return the {@link Flux} of converted objects.
*/ */
<T> Flux<T> find(Query query, Class<T> entityClass, String collectionName); <T> Flux<T> find(Query query, Class<T> entityClass, String collectionName);
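A sketch of find(...) returning a Flux, again with the earlier assumptions; getFirstname is a hypothetical accessor on Person:

    Flux<Person> matthews = template.find(query(where("lastname").is("Matthews")).limit(10), Person.class);
    matthews.map(Person::getFirstname).subscribe(System.out::println);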
@@ -355,9 +361,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Returns a document with the given id mapped onto the given class. The collection the query is ran against will be * Returns a document with the given id mapped onto the given class. The collection the query is ran against will be
* derived from the given target class as well. * derived from the given target class as well.
* *
* @param <T> * @param id the id of the document to return. Must not be {@literal null}.
* @param id the id of the document to return. * @param entityClass the type the document shall be converted into. Must not be {@literal null}.
* @param entityClass the type the document shall be converted into.
* @return the document with the given id mapped onto the given target class. * @return the document with the given id mapped onto the given target class.
*/ */
<T> Mono<T> findById(Object id, Class<T> entityClass); <T> Mono<T> findById(Object id, Class<T> entityClass);
@@ -365,11 +370,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/** /**
* Returns the document with the given id from the given collection mapped onto the given target class. * Returns the document with the given id from the given collection mapped onto the given target class.
* *
* @param id the id of the document to return * @param id the id of the document to return.
* @param entityClass the type to convert the document to * @param entityClass the type to convert the document to.
* @param collectionName the collection to query for the document * @param collectionName the collection to query for the document.
* @param <T> * @return the converted object.
* @return
*/ */
<T> Mono<T> findById(Object id, Class<T> entityClass, String collectionName); <T> Mono<T> findById(Object id, Class<T> entityClass, String collectionName);
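findById in a small round trip, assuming Person exposes a getId() accessor (hypothetical) and an insert has assigned the id:

    Mono<Person> reloaded = template.insert(new Person("Dave", "Matthews"))
            .flatMap(saved -> template.findById(saved.getId(), Person.class));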
@@ -458,7 +462,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* *
* @param near must not be {@literal null}. * @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}. * @param entityClass must not be {@literal null}.
* @return * @return the converted {@link GeoResult}s.
*/ */
<T> Flux<GeoResult<T>> geoNear(NearQuery near, Class<T> entityClass); <T> Flux<GeoResult<T>> geoNear(NearQuery near, Class<T> entityClass);
@@ -471,37 +475,37 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param entityClass must not be {@literal null}. * @param entityClass must not be {@literal null}.
* @param collectionName the collection to trigger the query against. If no collection name is given the entity class * @param collectionName the collection to trigger the query against. If no collection name is given the entity class
* will be inspected. * will be inspected.
* @return * @return the converted {@link GeoResult}s.
*/ */
<T> Flux<GeoResult<T>> geoNear(NearQuery near, Class<T> entityClass, String collectionName); <T> Flux<GeoResult<T>> geoNear(NearQuery near, Class<T> entityClass, String collectionName);
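A geoNear sketch, assuming a hypothetical Venue class whose location property carries a geospatial index; the geo types come from org.springframework.data.geo:

    import org.springframework.data.geo.Distance;
    import org.springframework.data.geo.GeoResult;
    import org.springframework.data.geo.Metrics;
    import org.springframework.data.geo.Point;
    import org.springframework.data.mongodb.core.query.NearQuery;

    NearQuery near = NearQuery.near(new Point(-73.99, 40.73)).maxDistance(new Distance(2, Metrics.KILOMETERS));
    Flux<GeoResult<Venue>> nearby = template.geoNear(near, Venue.class);
    nearby.map(GeoResult::getDistance).subscribe(System.out::println);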
/** /**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/> * Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}. * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* *
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional * @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification. * fields specification. Must not be {@literal null}.
* @param update the {@link Update} to apply on matching documents. * @param update the {@link Update} to apply on matching documents. Must not be {@literal null}.
* @param entityClass the parametrized type. * @param entityClass the parametrized type. Must not be {@literal null}.
* @return * @return the converted object that was updated before it was updated.
*/ */
<T> Mono<T> findAndModify(Query query, Update update, Class<T> entityClass); <T> Mono<T> findAndModify(Query query, Update update, Class<T> entityClass);
/** /**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/> * Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}. * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* *
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional * @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification. * fields specification. Must not be {@literal null}.
* @param update the {@link Update} to apply on matching documents. * @param update the {@link Update} to apply on matching documents. Must not be {@literal null}.
* @param entityClass the parametrized type. * @param entityClass the parametrized type. Must not be {@literal null}.
* @param collectionName the collection to query. * @param collectionName the collection to query. Must not be {@literal null}.
* @return * @return the converted object that was updated before it was updated.
*/ */
<T> Mono<T> findAndModify(Query query, Update update, Class<T> entityClass, String collectionName); <T> Mono<T> findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/** /**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/> * Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account. * {@link FindAndModifyOptions} into account.
* *
@@ -510,22 +514,24 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* @param update the {@link Update} to apply on matching documents. * @param update the {@link Update} to apply on matching documents.
* @param options the {@link FindAndModifyOptions} holding additional information. * @param options the {@link FindAndModifyOptions} holding additional information.
* @param entityClass the parametrized type. * @param entityClass the parametrized type.
* @return * @return the converted object that was updated. Depending on the value of {@link FindAndModifyOptions#isReturnNew()}
* this will either be the object as it was before the update or as it is after the update.
*/ */
<T> Mono<T> findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass); <T> Mono<T> findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
/** /**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/> * Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account. * {@link FindAndModifyOptions} into account.
* *
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional * @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification. * fields specification. Must not be {@literal null}.
* @param update the {@link Update} to apply on matching documents. * @param update the {@link Update} to apply on matching documents. Must not be {@literal null}.
* @param options the {@link FindAndModifyOptions} holding additional information. * @param options the {@link FindAndModifyOptions} holding additional information. Must not be {@literal null}.
* @param entityClass the parametrized type. * @param entityClass the parametrized type. Must not be {@literal null}.
* @param collectionName the collection to query. * @param collectionName the collection to query. Must not be {@literal null}.
* @return * @return the converted object that was updated. Depending on the value of {@link FindAndModifyOptions#isReturnNew()}
* this will either be the object as it was before the update or as it is after the update.
*/ */
<T> Mono<T> findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass, <T> Mono<T> findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName); String collectionName);
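findAndModify with FindAndModifyOptions, continuing the earlier assumptions; the visits counter field is hypothetical:

    import org.springframework.data.mongodb.core.FindAndModifyOptions;
    import org.springframework.data.mongodb.core.query.Update;

    Mono<Person> afterUpdate = template.findAndModify(
            query(where("lastname").is("Matthews")),
            new Update().inc("visits", 1),
            FindAndModifyOptions.options().returnNew(true), // emit the document as it is after the update
            Person.class);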
@@ -541,7 +547,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* feature rich {@link Query}. * feature rich {@link Query}.
* *
* @param query the query class that specifies the criteria used to find a record and also an optional fields * @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification * specification.
* @param entityClass the parametrized type of the returned {@link Mono}. * @param entityClass the parametrized type of the returned {@link Mono}.
* @return the converted object * @return the converted object
*/ */
@@ -558,19 +564,20 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* feature rich {@link Query}. * feature rich {@link Query}.
* *
* @param query the query class that specifies the criteria used to find a record and also an optional fields * @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification * specification.
* @param entityClass the parametrized type of the returned {@link Mono}. * @param entityClass the parametrized type of the returned {@link Mono}.
* @param collectionName name of the collection to retrieve the objects from. * @param collectionName name of the collection to retrieve the objects from.
* @return the converted object * @return the converted object.
*/ */
<T> Mono<T> findAndRemove(Query query, Class<T> entityClass, String collectionName); <T> Mono<T> findAndRemove(Query query, Class<T> entityClass, String collectionName);
/** /**
* Returns the number of documents for the given {@link Query} by querying the collection of the given entity class. * Returns the number of documents for the given {@link Query} by querying the collection of the given entity class.
* *
* @param query * @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* @param entityClass must not be {@literal null}. * {@literal null}.
* @return * @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the count of matching documents.
*/ */
Mono<Long> count(Query query, Class<?> entityClass); Mono<Long> count(Query query, Class<?> entityClass);
@@ -579,9 +586,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* must solely consist of document field references as we lack type information to map potential property references * must solely consist of document field references as we lack type information to map potential property references
* onto document fields. Use {@link #count(Query, Class, String)} to get full type specific support. * onto document fields. Use {@link #count(Query, Class, String)} to get full type specific support.
* *
* @param query * @param query the {@link Query} class that specifies the criteria used to find documents.
* @param collectionName must not be {@literal null} or empty. * @param collectionName must not be {@literal null} or empty.
* @return * @return the count of matching documents.
* @see #count(Query, Class, String) * @see #count(Query, Class, String)
*/ */
Mono<Long> count(Query query, String collectionName); Mono<Long> count(Query query, String collectionName);
@@ -590,12 +597,13 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity * Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}. * class to map the given {@link Query}.
* *
* @param query * @param query the {@link Query} class that specifies the criteria used to find documents. Must not be
* @param entityClass must not be {@literal null}. * {@literal null}.
* @param entityClass the parametrized type. Can be {@literal null}.
* @param collectionName must not be {@literal null} or empty. * @param collectionName must not be {@literal null} or empty.
* @return * @return the count of matching documents.
*/ */
Mono<Long> count(Query query, Class<?> entityClass, String collectionName); Mono<Long> count(Query query, @Nullable Class<?> entityClass, String collectionName);
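Counting with and without type information, reusing the earlier template (the second call uses a plain org.springframework.data.mongodb.core.query.Query, so no property-to-field mapping is applied):

    Mono<Long> matthewsCount = template.count(query(where("lastname").is("Matthews")), Person.class);
    Mono<Long> allPeople = template.count(new Query(), "people");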
/** /**
* Insert the object into the collection for the entity type of the object to save. * Insert the object into the collection for the entity type of the object to save.
@@ -605,14 +613,14 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a * If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your * String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See * property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" > * <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" >
* Spring's Type Conversion"</a> for more details. * Spring's Type Conversion"</a> for more details.
* <p/> * <p/>
* <p/> * <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method. * Insert is used to initially store the object into the database. To update an existing object use the save method.
* *
* @param objectToSave the object to store in the collection. * @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @return * @return the saved object.
*/ */
<T> Mono<T> insert(T objectToSave); <T> Mono<T> insert(T objectToSave);
@@ -624,27 +632,27 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <p/> * <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method. * Insert is used to initially store the object into the database. To update an existing object use the save method.
* *
* @param objectToSave the object to store in the collection * @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in * @param collectionName name of the collection to store the object in. Must not be {@literal null}.
* @return * @return the saved object.
*/ */
<T> Mono<T> insert(T objectToSave, String collectionName); <T> Mono<T> insert(T objectToSave, String collectionName);
/** /**
* Insert a Collection of objects into a collection in a single batch write to the database. * Insert a Collection of objects into a collection in a single batch write to the database.
* *
* @param batchToSave the batch of objects to save. * @param batchToSave the batch of objects to save. Must not be {@literal null}.
* @param entityClass class that determines the collection to use * @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return * @return the saved objects.
*/ */
<T> Flux<T> insert(Collection<? extends T> batchToSave, Class<?> entityClass); <T> Flux<T> insert(Collection<? extends T> batchToSave, Class<?> entityClass);
/** /**
* Insert a batch of objects into the specified collection in a single batch write to the database. * Insert a batch of objects into the specified collection in a single batch write to the database.
* *
* @param batchToSave the list of objects to save. * @param batchToSave the list of objects to save. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in * @param collectionName name of the collection to store the object in. Must not be {@literal null}.
* @return * @return the saved objects.
*/ */
<T> Flux<T> insert(Collection<? extends T> batchToSave, String collectionName); <T> Flux<T> insert(Collection<? extends T> batchToSave, String collectionName);
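Single and batch inserts with the earlier assumptions (java.util.Arrays is used to build the batch; the names are illustrative):

    Mono<Person> saved = template.insert(new Person("Dave", "Matthews"));

    Flux<Person> savedBatch = template.insert(
            Arrays.asList(new Person("Carter", "Beauford"), new Person("Boyd", "Tinsley")), Person.class);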
@@ -652,8 +660,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Insert a mixed Collection of objects into a database collection determining the collection name to use based on the * Insert a mixed Collection of objects into a database collection determining the collection name to use based on the
* class. * class.
* *
* @param objectsToSave the list of objects to save. * @param objectsToSave the list of objects to save. Must not be {@literal null}.
* @return * @return the saved objects.
*/ */
<T> Flux<T> insertAll(Collection<? extends T> objectsToSave); <T> Flux<T> insertAll(Collection<? extends T> objectsToSave);
@@ -665,32 +673,32 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a * If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your * String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See * property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" > * <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" >
* Spring's Type Conversion"</a> for more details. * Spring's Type Conversion"</a> for more details.
* <p/> * <p/>
* <p/> * <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method. * Insert is used to initially store the object into the database. To update an existing object use the save method.
* *
* @param objectToSave the object to store in the collection. * @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @return * @return the saved object.
*/ */
<T> Mono<T> insert(Mono<? extends T> objectToSave); <T> Mono<T> insert(Mono<? extends T> objectToSave);
/** /**
* Insert a Collection of objects into a collection in a single batch write to the database. * Insert a Collection of objects into a collection in a single batch write to the database.
* *
* @param batchToSave the publisher which provides objects to save. * @param batchToSave the publisher which provides objects to save. Must not be {@literal null}.
* @param entityClass class that determines the collection to use * @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return * @return the saved objects.
*/ */
<T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, Class<?> entityClass); <T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, Class<?> entityClass);
/** /**
* Insert objects into the specified collection in a single batch write to the database. * Insert objects into the specified collection in a single batch write to the database.
* *
* @param batchToSave the publisher which provides objects to save. * @param batchToSave the publisher which provides objects to save. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in * @param collectionName name of the collection to store the object in. Must not be {@literal null}.
* @return * @return the saved objects.
*/ */
<T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, String collectionName); <T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, String collectionName);
@@ -698,8 +706,8 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Insert a mixed Collection of objects into a database collection determining the collection name to use based on the * Insert a mixed Collection of objects into a database collection determining the collection name to use based on the
* class. * class.
* *
* @param objectsToSave the publisher which provides objects to save. * @param objectsToSave the publisher which provides objects to save. Must not be {@literal null}.
* @return * @return the saved objects.
*/ */
<T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> objectsToSave); <T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> objectsToSave);
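The publisher-based insertAll variants take the batch deferred in a Mono, which fits pipelines where the collection of objects is itself produced reactively. A minimal sketch under the same assumptions as above:

    Flux<Person> inserted = template.insertAll(
            Mono.just(Arrays.asList(new Person("Stefan", "Lessard"))), Person.class);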
@@ -713,11 +721,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a * If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your * String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See * property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" > * <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" >
* Spring's Type Conversion"</a> for more details. * Spring's Type Conversion"</a> for more details.
* *
* @param objectToSave the object to store in the collection * @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @return * @return the saved object.
*/ */
<T> Mono<T> save(T objectToSave); <T> Mono<T> save(T objectToSave);
@@ -731,12 +739,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a * If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your * String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a * property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">Spring's * http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's
* Type Conversion"</a> for more details. * Type Conversion"</a> for more details.
* *
* @param objectToSave the object to store in the collection * @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in * @param collectionName name of the collection to store the object in. Must not be {@literal null}.
* @return * @return the saved object.
*/ */
<T> Mono<T> save(T objectToSave, String collectionName); <T> Mono<T> save(T objectToSave, String collectionName);
@@ -750,11 +758,11 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a * If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your * String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See * property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" > * <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" >
* Spring's Type Conversion"</a> for more details. * Spring's Type Conversion"</a> for more details.
* *
* @param objectToSave the object to store in the collection * @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @return * @return the saved object.
*/ */
<T> Mono<T> save(Mono<? extends T> objectToSave); <T> Mono<T> save(Mono<? extends T> objectToSave);
@@ -768,12 +776,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a * If you object has an "Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your * String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a * property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">Spring's * http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's
* Type Conversion"</a> for more details. * Type Conversion"</a> for more details.
* *
* @param objectToSave the object to store in the collection * @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in * @param collectionName name of the collection to store the object in. Must not be {@literal null}.
* @return * @return the saved object.
*/ */
<T> Mono<T> save(Mono<? extends T> objectToSave, String collectionName); <T> Mono<T> save(Mono<? extends T> objectToSave, String collectionName);
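save(...) performs an insert-or-update decided by the id value; a short sketch with the earlier assumptions:

    Mono<Person> savedOrUpdated = template.save(new Person("Dave", "Matthews"));

    // deferred variant targeting an explicit collection
    Mono<Person> deferred = template.save(Mono.just(new Person("Dave", "Matthews")), "people");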
@@ -781,10 +789,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by * Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document. * combining the query document and the update document.
* *
* @param query the query document that specifies the criteria used to select a record to be upserted * @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* @param update the update document that contains the updated object or $ operators to manipulate the existing object * {@literal null}.
* @param entityClass class that determines the collection to use * @param update the update document that contains the updated object or $ operators to manipulate the existing
* @return the WriteResult which lets you access the results of the previous write. * object. Must not be {@literal null}.
* @param entityClass class that determines the collection to use. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/ */
Mono<UpdateResult> upsert(Query query, Update update, Class<?> entityClass); Mono<UpdateResult> upsert(Query query, Update update, Class<?> entityClass);
@@ -794,11 +804,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of * <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #upsert(Query, Update, Class, String)} to get full type specific support. * domain type information. Use {@link #upsert(Query, Update, Class, String)} to get full type specific support.
* *
* @param query the query document that specifies the criteria used to select a record to be updated * @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* {@literal null}.
* @param update the update document that contains the updated object or $ operators to manipulate the existing * @param update the update document that contains the updated object or $ operators to manipulate the existing
* object. * object. Must not be {@literal null}.
* @param collectionName name of the collection to update the object in * @param collectionName name of the collection to update the object in.
* @return the WriteResult which lets you access the results of the previous write. * @return the {@link UpdateResult} which lets you access the results of the previous write.
*/ */
Mono<UpdateResult> upsert(Query query, Update update, String collectionName); Mono<UpdateResult> upsert(Query query, Update update, String collectionName);
@@ -806,11 +817,13 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by * Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document. * combining the query document and the update document.
* *
* @param query the query document that specifies the criteria used to select a record to be upserted * @param query the query document that specifies the criteria used to select a record to be upserted. Must not be
* @param update the update document that contains the updated object or $ operators to manipulate the existing object * {@literal null}.
* @param entityClass class of the pojo to be operated on * @param update the update document that contains the updated object or $ operators to manipulate the existing
* @param collectionName name of the collection to update the object in * object. Must not be {@literal null}.
* @return the WriteResult which lets you access the results of the previous write. * @param entityClass class of the pojo to be operated on. Must not be {@literal null}.
* @param collectionName name of the collection to update the object in. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/ */
Mono<UpdateResult> upsert(Query query, Update update, Class<?> entityClass, String collectionName); Mono<UpdateResult> upsert(Query query, Update update, Class<?> entityClass, String collectionName);
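An upsert sketch; the active flag is a hypothetical field, and UpdateResult is the driver type from com.mongodb.client.result:

    Mono<UpdateResult> result = template.upsert(
            query(where("lastname").is("Matthews")), Update.update("active", true), Person.class);
    result.subscribe(r -> System.out.println(r.getUpsertedId()));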
@@ -818,11 +831,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Updates the first object that is found in the collection of the entity class that matches the query document with * Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document. * the provided update document.
* *
* @param query the query document that specifies the criteria used to select a record to be updated * @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param update the update document that contains the updated object or $ operators to manipulate the existing * {@literal null}.
* object. * @param update the update document that contains the updated object or $ operators to manipulate the existing. Must
* @param entityClass class that determines the collection to use * not be {@literal null}.
* @return the WriteResult which lets you access the results of the previous write. * @param entityClass class that determines the collection to use.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/ */
Mono<UpdateResult> updateFirst(Query query, Update update, Class<?> entityClass); Mono<UpdateResult> updateFirst(Query query, Update update, Class<?> entityClass);
@@ -832,26 +846,26 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of * <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateFirst(Query, Update, Class, String)} to get full type specific support. * domain type information. Use {@link #updateFirst(Query, Update, Class, String)} to get full type specific support.
* *
* @param query the query document that specifies the criteria used to select a record to be updated * @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param update the update document that contains the updated object or $ operators to manipulate the existing * {@literal null}.
* object. * @param update the update document that contains the updated object or $ operators to manipulate the existing. Must
* @param collectionName name of the collection to update the object in * not be {@literal null}.
* @return the WriteResult which lets you access the results of the previous write. * @param collectionName name of the collection to update the object in. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/ */
Mono<UpdateResult> updateFirst(Query query, Update update, String collectionName); Mono<UpdateResult> updateFirst(Query query, Update update, String collectionName);
/** /**
* Updates the first object that is found in the specified collection that matches the query document criteria with * Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document. <br /> * the provided updated document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateFirst(Query, Update, Class, String)} to get full type specific support.
* *
* @param query the query document that specifies the criteria used to select a record to be updated * @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param update the update document that contains the updated object or $ operators to manipulate the existing * {@literal null}.
* object. * @param update the update document that contains the updated object or $ operators to manipulate the existing. Must
* @param entityClass class of the pojo to be operated on * not be {@literal null}.
* @param collectionName name of the collection to update the object in * @param entityClass class of the pojo to be operated on. Must not be {@literal null}.
* @return the WriteResult which lets you access the results of the previous write. * @param collectionName name of the collection to update the object in. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/ */
Mono<UpdateResult> updateFirst(Query query, Update update, Class<?> entityClass, String collectionName); Mono<UpdateResult> updateFirst(Query query, Update update, Class<?> entityClass, String collectionName);
@@ -859,11 +873,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Updates all objects that are found in the collection for the entity class that matches the query document criteria * Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document. * with the provided updated document.
* *
* @param query the query document that specifies the criteria used to select a record to be updated * @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param update the update document that contains the updated object or $ operators to manipulate the existing * {@literal null}.
* object. * @param update the update document that contains the updated object or $ operators to manipulate the existing. Must
* @param entityClass class that determines the collection to use * not be {@literal null}.
* @return the WriteResult which lets you access the results of the previous write. * @param entityClass class of the pojo to be operated on. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/ */
Mono<UpdateResult> updateMulti(Query query, Update update, Class<?> entityClass); Mono<UpdateResult> updateMulti(Query query, Update update, Class<?> entityClass);
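updateFirst versus updateMulti under the earlier assumptions; only the multi variant touches every matching document:

    Update deactivate = new Update().set("active", false);

    Mono<UpdateResult> first = template.updateFirst(query(where("lastname").is("Matthews")), deactivate, Person.class);
    Mono<Long> touched = template.updateMulti(query(where("lastname").is("Matthews")), deactivate, Person.class)
            .map(UpdateResult::getModifiedCount);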
@@ -873,11 +888,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of * <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateMulti(Query, Update, Class, String)} to get full type specific support. * domain type information. Use {@link #updateMulti(Query, Update, Class, String)} to get full type specific support.
* *
* @param query the query document that specifies the criteria used to select a record to be updated * @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param update the update document that contains the updated object or $ operators to manipulate the existing * {@literal null}.
* object. * @param update the update document that contains the updated object or $ operators to manipulate the existing. Must
* @param collectionName name of the collection to update the object in * not be {@literal null}.
* @return the WriteResult which lets you access the results of the previous write. * @param collectionName name of the collection to update the object in. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/ */
Mono<UpdateResult> updateMulti(Query query, Update update, String collectionName); Mono<UpdateResult> updateMulti(Query query, Update update, String collectionName);
@@ -885,55 +901,57 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Updates all objects that are found in the collection for the entity class that matches the query document criteria * Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document. * with the provided updated document.
* *
* @param query the query document that specifies the criteria used to select a record to be updated * @param query the query document that specifies the criteria used to select a record to be updated. Must not be
* @param update the update document that contains the updated object or $ operators to manipulate the existing * {@literal null}.
* object. * @param update the update document that contains the updated object or $ operators to manipulate the existing. Must
* @param entityClass class of the pojo to be operated on * not be {@literal null}.
* @param collectionName name of the collection to update the object in * @param entityClass class of the pojo to be operated on. Must not be {@literal null}.
* @return the WriteResult which lets you access the results of the previous write. * @param collectionName name of the collection to update the object in. Must not be {@literal null}.
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/ */
Mono<UpdateResult> updateMulti(final Query query, final Update update, Class<?> entityClass, String collectionName); Mono<UpdateResult> updateMulti(Query query, Update update, Class<?> entityClass, String collectionName);
/** /**
* Remove the given object from the collection by id. * Remove the given object from the collection by id.
* *
* @param object * @param object must not be {@literal null}.
* @return * @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/ */
Mono<DeleteResult> remove(Object object); Mono<DeleteResult> remove(Object object);
/** /**
* Removes the given object from the given collection. * Removes the given object from the given collection.
* *
* @param object * @param object must not be {@literal null}.
* @param collection must not be {@literal null} or empty. * @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/ */
Mono<DeleteResult> remove(Object object, String collection); Mono<DeleteResult> remove(Object object, String collectionName);
/** /**
* Remove the given object from the collection by id. * Remove the given object from the collection by id.
* *
* @param objectToRemove * @param objectToRemove must not be {@literal null}.
* @return * @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/ */
Mono<DeleteResult> remove(Mono<? extends Object> objectToRemove); Mono<DeleteResult> remove(Mono<? extends Object> objectToRemove);
/** /**
* Removes the given object from the given collection. * Removes the given object from the given collection.
* *
* @param objectToRemove * @param objectToRemove must not be {@literal null}.
* @param collection must not be {@literal null} or empty. * @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @return * @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/ */
Mono<DeleteResult> remove(Mono<? extends Object> objectToRemove, String collection); Mono<DeleteResult> remove(Mono<? extends Object> objectToRemove, String collectionName);
/** /**
* Remove all documents that match the provided query document criteria from the the collection used to store the * Remove all documents that match the provided query document criteria from the the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query. * entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
* *
* @param query * @param query the query document that specifies the criteria used to remove a record.
* @param entityClass * @param entityClass class that determines the collection to use.
* @return * @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/ */
Mono<DeleteResult> remove(Query query, Class<?> entityClass); Mono<DeleteResult> remove(Query query, Class<?> entityClass);
@@ -941,12 +959,12 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* Remove all documents that match the provided query document criteria from the the collection used to store the * Remove all documents that match the provided query document criteria from the the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query. * entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
* *
* @param query * @param query the query document that specifies the criteria used to remove a record.
* @param entityClass * @param entityClass class of the pojo to be operated on. Can be {@literal null}.
* @param collectionName * @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @return * @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/ */
Mono<DeleteResult> remove(Query query, Class<?> entityClass, String collectionName); Mono<DeleteResult> remove(Query query, @Nullable Class<?> entityClass, String collectionName);
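A remove sketch; DeleteResult is the driver type referenced above (com.mongodb.client.result):

    Mono<DeleteResult> removed = template.remove(query(where("lastname").is("Matthews")), Person.class);
    removed.map(DeleteResult::getDeletedCount).subscribe(System.out::println);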
/** /**
* Remove all documents from the specified collection that match the provided query document criteria. There is no * Remove all documents from the specified collection that match the provided query document criteria. There is no
@@ -954,8 +972,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type * <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #remove(Query, Class, String)} to get full type specific support. * information. Use {@link #remove(Query, Class, String)} to get full type specific support.
* *
* @param query the query document that specifies the criteria used to remove a record * @param query the query document that specifies the criteria used to remove a record.
* @param collectionName name of the collection where the objects will removed * @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
*/ */
Mono<DeleteResult> remove(Query query, String collectionName); Mono<DeleteResult> remove(Query query, String collectionName);
@@ -964,18 +983,18 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type * <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #findAllAndRemove(Query, Class, String)} to get full type specific support. * information. Use {@link #findAllAndRemove(Query, Class, String)} to get full type specific support.
* *
* @param query * @param query the query document that specifies the criteria used to find and remove documents.
* @param collectionName * @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @return * @return the {@link Flux} converted objects deleted by this operation.
*/ */
<T> Flux<T> findAllAndRemove(Query query, String collectionName); <T> Flux<T> findAllAndRemove(Query query, String collectionName);
/** /**
* Returns and removes all documents matching the given query form the collection used to store the entityClass. * Returns and removes all documents matching the given query form the collection used to store the entityClass.
* *
* @param query * @param query the query document that specifies the criteria used to find and remove documents.
* @param entityClass * @param entityClass class of the pojo to be operated on.
* @return * @return the {@link Flux} converted objects deleted by this operation.
*/ */
<T> Flux<T> findAllAndRemove(Query query, Class<T> entityClass); <T> Flux<T> findAllAndRemove(Query query, Class<T> entityClass);
@@ -984,10 +1003,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* store the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in * store the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in
* the query. * the query.
* *
* @param query * @param query the query document that specifies the criteria used to find and remove documents.
* @param entityClass * @param entityClass class of the pojo to be operated on.
* @param collectionName * @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @return * @return the {@link Flux} converted objects deleted by this operation.
*/ */
<T> Flux<T> findAllAndRemove(Query query, Class<T> entityClass, String collectionName); <T> Flux<T> findAllAndRemove(Query query, Class<T> entityClass, String collectionName);
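findAllAndRemove returns the removed documents, which is handy when the deleted data still needs to be processed downstream:

    Flux<Person> removedPeople = template.findAllAndRemove(query(where("lastname").is("Matthews")), Person.class);
    removedPeople.subscribe(p -> System.out.println("removed " + p));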
@@ -1004,9 +1023,9 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* feature rich {@link Query}. * feature rich {@link Query}.
* *
* @param query the query class that specifies the criteria used to find a record and also an optional fields * @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification * specification.
* @param entityClass the parametrized type of the returned {@link Flux}. * @param entityClass the parametrized type of the returned {@link Flux}.
* @return the {@link Flux} of converted objects * @return the {@link Flux} of converted objects.
*/ */
<T> Flux<T> tail(Query query, Class<T> entityClass); <T> Flux<T> tail(Query query, Class<T> entityClass);
@@ -1023,10 +1042,10 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* feature rich {@link Query}. * feature rich {@link Query}.
* *
* @param query the query class that specifies the criteria used to find a record and also an optional fields * @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification * specification.
* @param entityClass the parametrized type of the returned {@link Flux}. * @param entityClass the parametrized type of the returned {@link Flux}.
* @param collectionName name of the collection to retrieve the objects from * @param collectionName name of the collection to retrieve the objects from.
* @return the {@link Flux} of converted objects * @return the {@link Flux} of converted objects.
*/ */
<T> Flux<T> tail(Query query, Class<T> entityClass, String collectionName); <T> Flux<T> tail(Query query, Class<T> entityClass, String collectionName);
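tail(...) opens a tailable cursor and therefore only works against a capped collection; the LogEntry type and "logs" collection here are hypothetical:

    Flux<LogEntry> stream = template.tail(new Query(), LogEntry.class, "logs");
    stream.subscribe(entry -> System.out.println("received " + entry));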


@@ -1,5 +1,5 @@
/* /*
* Copyright 2016-2017 the original author or authors. * Copyright 2016-2018 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -20,8 +20,6 @@ import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import lombok.NonNull; import lombok.NonNull;
import lombok.RequiredArgsConstructor; import lombok.RequiredArgsConstructor;
import org.springframework.data.projection.ProjectionInformation;
import org.springframework.util.ClassUtils;
import reactor.core.publisher.Flux; import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono; import reactor.core.publisher.Mono;
import reactor.util.function.Tuple2; import reactor.util.function.Tuple2;
@@ -40,6 +38,8 @@ import java.util.Set;
import java.util.function.Function; import java.util.function.Function;
import java.util.stream.Collectors; import java.util.stream.Collectors;
import javax.annotation.Nonnull;
import org.bson.Document; import org.bson.Document;
import org.bson.conversions.Bson; import org.bson.conversions.Bson;
import org.bson.types.ObjectId; import org.bson.types.ObjectId;
@@ -105,10 +105,13 @@ import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query; import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update; import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.util.MongoClientVersion; import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.data.projection.ProjectionInformation;
import org.springframework.data.projection.SpelAwareProxyProjectionFactory; import org.springframework.data.projection.SpelAwareProxyProjectionFactory;
import org.springframework.data.util.Optionals; import org.springframework.data.util.Optionals;
import org.springframework.data.util.Pair; import org.springframework.data.util.Pair;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils; import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils; import org.springframework.util.StringUtils;
@@ -122,6 +125,7 @@ import com.mongodb.MongoException;
import com.mongodb.ReadPreference; import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern; import com.mongodb.WriteConcern;
import com.mongodb.client.model.CreateCollectionOptions; import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.Filters; import com.mongodb.client.model.Filters;
import com.mongodb.client.model.FindOneAndDeleteOptions; import com.mongodb.client.model.FindOneAndDeleteOptions;
import com.mongodb.client.model.FindOneAndUpdateOptions; import com.mongodb.client.model.FindOneAndUpdateOptions;
@@ -179,12 +183,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final UpdateMapper updateMapper; private final UpdateMapper updateMapper;
private final SpelAwareProxyProjectionFactory projectionFactory; private final SpelAwareProxyProjectionFactory projectionFactory;
private WriteConcern writeConcern; private @Nullable WriteConcern writeConcern;
private WriteConcernResolver writeConcernResolver = DefaultWriteConcernResolver.INSTANCE; private WriteConcernResolver writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
private WriteResultChecking writeResultChecking = WriteResultChecking.NONE; private WriteResultChecking writeResultChecking = WriteResultChecking.NONE;
private ReadPreference readPreference; private @Nullable ReadPreference readPreference;
private ApplicationEventPublisher eventPublisher; private @Nullable ApplicationEventPublisher eventPublisher;
private MongoPersistentEntityIndexCreator indexCreator; private @Nullable MongoPersistentEntityIndexCreator indexCreator;
/** /**
* Constructor used for a basic template configuration. * Constructor used for a basic template configuration.
@@ -209,9 +213,10 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* Constructor used for a basic template configuration. * Constructor used for a basic template configuration.
* *
* @param mongoDatabaseFactory must not be {@literal null}. * @param mongoDatabaseFactory must not be {@literal null}.
* @param mongoConverter * @param mongoConverter can be {@literal null}.
*/ */
public ReactiveMongoTemplate(ReactiveMongoDatabaseFactory mongoDatabaseFactory, MongoConverter mongoConverter) { public ReactiveMongoTemplate(ReactiveMongoDatabaseFactory mongoDatabaseFactory,
@Nullable MongoConverter mongoConverter) {
Assert.notNull(mongoDatabaseFactory, "ReactiveMongoDatabaseFactory must not be null!"); Assert.notNull(mongoDatabaseFactory, "ReactiveMongoDatabaseFactory must not be null!");
@@ -226,7 +231,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
mappingContext = this.mongoConverter.getMappingContext(); mappingContext = this.mongoConverter.getMappingContext();
// We create indexes based on mapping events // We create indexes based on mapping events
if (null != mappingContext && mappingContext instanceof MongoMappingContext) { if (mappingContext instanceof MongoMappingContext) {
indexCreator = new MongoPersistentEntityIndexCreator((MongoMappingContext) mappingContext, indexCreator = new MongoPersistentEntityIndexCreator((MongoMappingContext) mappingContext,
(collectionName) -> IndexOperationsAdapter.blocking(indexOps(collectionName))); (collectionName) -> IndexOperationsAdapter.blocking(indexOps(collectionName)));
eventPublisher = new MongoMappingEventPublisher(indexCreator); eventPublisher = new MongoMappingEventPublisher(indexCreator);
@@ -242,7 +247,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* *
* @param resultChecking * @param resultChecking
*/ */
public void setWriteResultChecking(WriteResultChecking resultChecking) { public void setWriteResultChecking(@Nullable WriteResultChecking resultChecking) {
this.writeResultChecking = resultChecking == null ? DEFAULT_WRITE_RESULT_CHECKING : resultChecking; this.writeResultChecking = resultChecking == null ? DEFAULT_WRITE_RESULT_CHECKING : resultChecking;
} }
@@ -251,18 +256,18 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* configured on the {@link MongoDbFactory} will apply. If you configured a {@link Mongo} instance no * configured on the {@link MongoDbFactory} will apply. If you configured a {@link Mongo} instance no
* {@link WriteConcern} will be used. * {@link WriteConcern} will be used.
* *
* @param writeConcern * @param writeConcern can be {@literal null}.
*/ */
public void setWriteConcern(WriteConcern writeConcern) { public void setWriteConcern(@Nullable WriteConcern writeConcern) {
this.writeConcern = writeConcern; this.writeConcern = writeConcern;
} }
/** /**
* Configures the {@link WriteConcernResolver} to be used with the template. * Configures the {@link WriteConcernResolver} to be used with the template.
* *
* @param writeConcernResolver * @param writeConcernResolver can be {@literal null}.
*/ */
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) { public void setWriteConcernResolver(@Nullable WriteConcernResolver writeConcernResolver) {
this.writeConcernResolver = writeConcernResolver; this.writeConcernResolver = writeConcernResolver;
} }
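The setters above now accept null and fall back to sensible defaults. A hedged configuration sketch; the per-collection rule and the "audit" collection are invented for illustration, only setWriteConcern(…) and setWriteConcernResolver(…) come from the API shown:

import com.mongodb.WriteConcern;
import org.springframework.data.mongodb.core.MongoAction;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;

class WriteConcernConfigExample {

    void configure(ReactiveMongoTemplate template) {

        // Global default applied when no resolver rule matches.
        template.setWriteConcern(WriteConcern.MAJORITY);

        // Per-action override: journal writes for the (hypothetical) "audit" collection.
        template.setWriteConcernResolver((MongoAction action) ->
                "audit".equals(action.getCollectionName())
                        ? WriteConcern.JOURNALED
                        : action.getDefaultWriteConcern());
    }
}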
@@ -370,7 +375,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#executeCommand(org.bson.Document, com.mongodb.ReadPreference) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#executeCommand(org.bson.Document, com.mongodb.ReadPreference)
*/ */
public Mono<Document> executeCommand(final Document command, final ReadPreference readPreference) { public Mono<Document> executeCommand(final Document command, @Nullable ReadPreference readPreference) {
Assert.notNull(command, "Command must not be null!"); Assert.notNull(command, "Command must not be null!");
@@ -485,7 +490,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#createCollection(java.lang.Class, org.springframework.data.mongodb.core.CollectionOptions) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#createCollection(java.lang.Class, org.springframework.data.mongodb.core.CollectionOptions)
*/ */
public <T> Mono<MongoCollection<Document>> createCollection(Class<T> entityClass, public <T> Mono<MongoCollection<Document>> createCollection(Class<T> entityClass,
CollectionOptions collectionOptions) { @Nullable CollectionOptions collectionOptions) {
return createCollection(determineCollectionName(entityClass), collectionOptions); return createCollection(determineCollectionName(entityClass), collectionOptions);
} }
@@ -609,7 +614,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#exists(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#exists(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/ */
public Mono<Boolean> exists(final Query query, final Class<?> entityClass, String collectionName) { public Mono<Boolean> exists(final Query query, @Nullable Class<?> entityClass, String collectionName) {
if (query == null) { if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to exist can't be null"); throw new InvalidDataAccessApiUsageException("Query passed in to exist can't be null");
@@ -639,7 +644,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#find(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#find(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/ */
public <T> Flux<T> find(final Query query, Class<T> entityClass, String collectionName) { public <T> Flux<T> find(@Nullable Query query, Class<T> entityClass, String collectionName) {
if (query == null) { if (query == null) {
return findAll(entityClass, collectionName); return findAll(entityClass, collectionName);
@@ -722,14 +727,15 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @return never {@literal null}. * @return never {@literal null}.
*/ */
protected <O> Flux<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType, protected <O> Flux<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType,
AggregationOperationContext context) { @Nullable AggregationOperationContext context) {
Assert.notNull(aggregation, "Aggregation pipeline must not be null!"); Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!"); Assert.hasText(collectionName, "Collection name must not be null or empty!");
Assert.notNull(outputType, "Output type must not be null!"); Assert.notNull(outputType, "Output type must not be null!");
AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context; AggregationUtil aggregationUtil = new AggregationUtil(queryMapper, mappingContext);
Document command = aggregation.toDocument(collectionName, rootContext); AggregationOperationContext rootContext = aggregationUtil.prepareAggregationContext(aggregation, context);
Document command = aggregationUtil.createPipeline(collectionName, aggregation, rootContext);
AggregationOptions options = AggregationOptions.fromDocument(command); AggregationOptions options = AggregationOptions.fromDocument(command);
Assert.isTrue(!options.isExplain(), "Cannot use explain option with streaming!"); Assert.isTrue(!options.isExplain(), "Cannot use explain option with streaming!");
@@ -747,8 +753,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private <O> Flux<O> aggregateAndMap(MongoCollection<Document> collection, List<Document> pipeline, private <O> Flux<O> aggregateAndMap(MongoCollection<Document> collection, List<Document> pipeline,
AggregationOptions options, ReadDocumentCallback<O> readCallback) { AggregationOptions options, ReadDocumentCallback<O> readCallback) {
AggregatePublisher<Document> cursor = collection.aggregate(pipeline).allowDiskUse(options.isAllowDiskUse()) AggregatePublisher<Document> cursor = collection.aggregate(pipeline)
.useCursor(true); .allowDiskUse(options.isAllowDiskUse());
if (options.getCollation().isPresent()) { if (options.getCollation().isPresent()) {
cursor = cursor.collation(options.getCollation().map(Collation::toMongoCollation).get()); cursor = cursor.collation(options.getCollation().map(Collation::toMongoCollation).get());
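The streaming aggregation path above now delegates pipeline rendering to AggregationUtil and no longer forces useCursor(true). A minimal caller-side sketch, assuming an "orders" collection, an OrderSummary projection type, and the field names used below; none of these come from the diff:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.group;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.match;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import reactor.core.publisher.Flux;

// Assumed projection type; not part of the diff.
class OrderSummary {
    String customerId;
    long orders;
}

class AggregateExample {

    Flux<OrderSummary> summarize(ReactiveMongoTemplate template) {

        Aggregation aggregation = newAggregation(
                match(where("status").is("COMPLETED")),
                group("customerId").count().as("orders"),
                // Map the group key (_id) back onto the customerId property.
                project("orders").and("customerId").previousOperation());

        return template.aggregate(aggregation, "orders", OrderSummary.class);
    }
}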
@@ -904,7 +910,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#count(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#count(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/ */
public Mono<Long> count(final Query query, final Class<?> entityClass, String collectionName) { public Mono<Long> count(@Nullable Query query, @Nullable Class<?> entityClass, String collectionName) {
Assert.hasText(collectionName, "Collection name must not be null or empty!"); Assert.hasText(collectionName, "Collection name must not be null or empty!");
@@ -924,12 +930,15 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/ */
@Override @Override
public <T> Mono<T> insert(Mono<? extends T> objectToSave) { public <T> Mono<T> insert(Mono<? extends T> objectToSave) {
Assert.notNull(objectToSave, "Mono to insert must not be null!");
return objectToSave.flatMap(this::insert); return objectToSave.flatMap(this::insert);
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#insert(org.reactivestreams.Publisher, java.lang.Class) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#insert(reactor.core.publisher.Mono, java.lang.Class)
*/ */
@Override @Override
public <T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, Class<?> entityClass) { public <T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, Class<?> entityClass) {
@@ -938,10 +947,13 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#insert(org.reactivestreams.Publisher, java.lang.String) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#insert(reactor.core.publisher.Mono, java.lang.String)
*/ */
@Override @Override
public <T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, String collectionName) { public <T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, String collectionName) {
Assert.notNull(batchToSave, "Batch to insert must not be null!");
return Flux.from(batchToSave).flatMap(collection -> insert(collection, collectionName)); return Flux.from(batchToSave).flatMap(collection -> insert(collection, collectionName));
} }
@@ -951,6 +963,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/ */
public <T> Mono<T> insert(T objectToSave) { public <T> Mono<T> insert(T objectToSave) {
Assert.notNull(objectToSave, "Object to insert must not be null!");
ensureNotIterable(objectToSave); ensureNotIterable(objectToSave);
return insert(objectToSave, determineEntityCollectionName(objectToSave)); return insert(objectToSave, determineEntityCollectionName(objectToSave));
} }
@@ -961,6 +975,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/ */
public <T> Mono<T> insert(T objectToSave, String collectionName) { public <T> Mono<T> insert(T objectToSave, String collectionName) {
Assert.notNull(objectToSave, "Object to insert must not be null!");
ensureNotIterable(objectToSave); ensureNotIterable(objectToSave);
return doInsert(collectionName, objectToSave, this.mongoConverter); return doInsert(collectionName, objectToSave, this.mongoConverter);
} }
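The insert(…) variants above now assert non-null input eagerly instead of failing deep inside the reactive pipeline. A small usage sketch; the Person type and the "people" collection are assumptions:

import java.util.Arrays;

import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

// Assumed domain type; not part of the diff.
class Person {
    String name;
    Person(String name) { this.name = name; }
}

class InsertExample {

    Mono<Person> insertOne(ReactiveMongoTemplate template) {
        // Fails fast with IllegalArgumentException if the object were null.
        return template.insert(new Person("Ada"), "people");
    }

    Flux<Person> insertBatch(ReactiveMongoTemplate template) {
        return template.insert(Arrays.asList(new Person("Ada"), new Person("Grace")), "people");
    }
}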
@@ -1014,7 +1030,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#insertAll(org.reactivestreams.Publisher) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#insertAll(reactor.core.publisher.Mono)
*/ */
@Override @Override
public <T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> objectsToSave) { public <T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> objectsToSave) {
@@ -1084,6 +1100,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/ */
@Override @Override
public <T> Mono<T> save(Mono<? extends T> objectToSave) { public <T> Mono<T> save(Mono<? extends T> objectToSave) {
Assert.notNull(objectToSave, "Mono to save must not be null!");
return objectToSave.flatMap(this::save); return objectToSave.flatMap(this::save);
} }
@@ -1093,6 +1112,9 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/ */
@Override @Override
public <T> Mono<T> save(Mono<? extends T> objectToSave, String collectionName) { public <T> Mono<T> save(Mono<? extends T> objectToSave, String collectionName) {
Assert.notNull(objectToSave, "Mono to save must not be null!");
return objectToSave.flatMap(o -> save(o, collectionName)); return objectToSave.flatMap(o -> save(o, collectionName));
} }
@@ -1241,7 +1263,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
} }
private MongoCollection<Document> prepareCollection(MongoCollection<Document> collection, private MongoCollection<Document> prepareCollection(MongoCollection<Document> collection,
WriteConcern writeConcernToUse) { @Nullable WriteConcern writeConcernToUse) {
MongoCollection<Document> collectionToUse = collection; MongoCollection<Document> collectionToUse = collection;
if (writeConcernToUse != null) { if (writeConcernToUse != null) {
@@ -1355,8 +1377,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return doUpdate(collectionName, query, update, entityClass, false, true); return doUpdate(collectionName, query, update, entityClass, false, true);
} }
protected Mono<UpdateResult> doUpdate(final String collectionName, final Query query, final Update update, protected Mono<UpdateResult> doUpdate(final String collectionName, @Nullable Query query, @Nullable Update update,
final Class<?> entityClass, final boolean upsert, final boolean multi) { @Nullable Class<?> entityClass, final boolean upsert, final boolean multi) {
MongoPersistentEntity<?> entity = entityClass == null ? null : getPersistentEntity(entityClass); MongoPersistentEntity<?> entity = entityClass == null ? null : getPersistentEntity(entityClass);
@@ -1407,7 +1429,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return result.next(); return result.next();
} }
private void increaseVersionForUpdateIfNecessary(MongoPersistentEntity<?> persistentEntity, Update update) { private void increaseVersionForUpdateIfNecessary(@Nullable MongoPersistentEntity<?> persistentEntity, Update update) {
if (persistentEntity != null && persistentEntity.hasVersionProperty()) { if (persistentEntity != null && persistentEntity.hasVersionProperty()) {
String versionFieldName = persistentEntity.getRequiredVersionProperty().getFieldName(); String versionFieldName = persistentEntity.getRequiredVersionProperty().getFieldName();
@@ -1417,7 +1439,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
} }
} }
private boolean dbObjectContainsVersionProperty(Document document, MongoPersistentEntity<?> persistentEntity) { private boolean dbObjectContainsVersionProperty(Document document,
@Nullable MongoPersistentEntity<?> persistentEntity) {
if (persistentEntity == null || !persistentEntity.hasVersionProperty()) { if (persistentEntity == null || !persistentEntity.hasVersionProperty()) {
return false; return false;
@@ -1440,8 +1463,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#remove(reactor.core.publisher.Mono, java.lang.String) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#remove(reactor.core.publisher.Mono, java.lang.String)
*/ */
@Override @Override
public Mono<DeleteResult> remove(Mono<? extends Object> objectToRemove, String collection) { public Mono<DeleteResult> remove(Mono<? extends Object> objectToRemove, String collectionName) {
return objectToRemove.flatMap(o -> remove(objectToRemove, collection)); return objectToRemove.flatMap(it -> remove(it, collectionName));
} }
/* /*
@@ -1450,9 +1473,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/ */
public Mono<DeleteResult> remove(Object object) { public Mono<DeleteResult> remove(Object object) {
if (object == null) { Assert.notNull(object, "Object must not be null!");
return null;
}
return remove(getIdQueryFor(object), object.getClass()); return remove(getIdQueryFor(object), object.getClass());
} }
@@ -1461,15 +1482,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#remove(java.lang.Object, java.lang.String) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#remove(java.lang.Object, java.lang.String)
*/ */
public Mono<DeleteResult> remove(Object object, String collection) { public Mono<DeleteResult> remove(Object object, String collectionName) {
Assert.hasText(collection, "Collection name must not be null or empty!"); Assert.notNull(object, "Object must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
if (object == null) { return doRemove(collectionName, getIdQueryFor(object), object.getClass());
return null;
}
return doRemove(collection, getIdQueryFor(object), object.getClass());
} }
/** /**
@@ -1578,12 +1596,11 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#remove(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#remove(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/ */
public Mono<DeleteResult> remove(Query query, Class<?> entityClass, String collectionName) { public Mono<DeleteResult> remove(Query query, @Nullable Class<?> entityClass, String collectionName) {
return doRemove(collectionName, query, entityClass); return doRemove(collectionName, query, entityClass);
} }
protected <T> Mono<DeleteResult> doRemove(final String collectionName, final Query query, protected <T> Mono<DeleteResult> doRemove(String collectionName, Query query, @Nullable Class<T> entityClass) {
final Class<T> entityClass) {
if (query == null) { if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null!"); throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null!");
@@ -1596,27 +1613,39 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return execute(collectionName, collection -> { return execute(collectionName, collection -> {
maybeEmitEvent(new BeforeDeleteEvent<T>(queryObject, entityClass, collectionName)); Document removeQuey = queryMapper.getMappedObject(queryObject, entity);
Document dboq = queryMapper.getMappedObject(queryObject, entity); maybeEmitEvent(new BeforeDeleteEvent<T>(removeQuey, entityClass, collectionName));
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName, entityClass, MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName, entityClass,
null, queryObject); null, removeQuey);
final DeleteOptions deleteOptions = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(deleteOptions::collation);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction); WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
MongoCollection<Document> collectionToUse = prepareCollection(collection, writeConcernToUse); MongoCollection<Document> collectionToUse = prepareCollection(collection, writeConcernToUse);
if (LOGGER.isDebugEnabled()) { if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.", LOGGER.debug("Remove using query: {} in collection: {}.",
new Object[] { serializeToJsonSafely(dboq), collectionName }); new Object[] { serializeToJsonSafely(removeQuey), collectionName });
} }
query.getCollation().ifPresent(val -> { if (query.getLimit() > 0 || query.getSkip() > 0) {
// TODO: add collation support as soon as it's there! See https://jira.mongodb.org/browse/JAVARS-27 FindPublisher<Document> cursor = new QueryFindPublisherPreparer(query, entityClass)
throw new IllegalArgumentException("DeleteMany does currently not accept collation settings."); .prepare(collection.find(removeQuey)) //
}); .projection(new Document(ID_FIELD, 1));
return collectionToUse.deleteMany(dboq); return Flux.from(cursor) //
.map(doc -> doc.get(ID_FIELD)) //
.collectList() //
.flatMapMany(val -> {
return collectionToUse.deleteMany(new Document(ID_FIELD, new Document("$in", val)), deleteOptions);
});
} else {
return collectionToUse.deleteMany(removeQuey, deleteOptions);
}
}).doOnNext(deleteResult -> maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass, collectionName))) }).doOnNext(deleteResult -> maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass, collectionName)))
.next(); .next();
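The reworked doRemove(…) above honours Query limit/skip by first resolving the matching _id values and then issuing a single deleteMany with an $in filter, while also passing the query collation through DeleteOptions. A standalone sketch of that two-phase pattern against the reactive driver; the collection, filter, and limit parameter are assumptions, and this is not the template code itself:

import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.reactivestreams.client.MongoCollection;

import org.bson.Document;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

class LimitedDeleteExample {

    // Deletes at most 'limit' documents matching 'filter' by collecting their ids first.
    Mono<DeleteResult> deleteFirst(MongoCollection<Document> collection, Document filter, int limit) {

        return Flux.from(collection.find(filter)
                        .projection(new Document("_id", 1)) // fetch ids only
                        .limit(limit))
                .map(doc -> doc.get("_id"))
                .collectList()
                .flatMap(ids -> Mono.from(
                        collection.deleteMany(new Document("_id", new Document("$in", ids)), new DeleteOptions())));
    }
}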
@@ -1645,8 +1674,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/ */
@Override @Override
public <T> Flux<T> findAllAndRemove(Query query, String collectionName) { public <T> Flux<T> findAllAndRemove(Query query, String collectionName) {
return (Flux<T>) findAllAndRemove(query, Object.class, collectionName);
return findAllAndRemove(query, null, collectionName);
} }
/* /*
@@ -1681,7 +1709,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#tail(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String) * @see org.springframework.data.mongodb.core.ReactiveMongoOperations#tail(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/ */
@Override @Override
public <T> Flux<T> tail(Query query, Class<T> entityClass, String collectionName) { public <T> Flux<T> tail(@Nullable Query query, Class<T> entityClass, String collectionName) {
if (query == null) { if (query == null) {
@@ -1789,10 +1817,11 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @param query the query document that specifies the criteria used to find a record. * @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned. * @param fields the document that specifies the fields to be returned.
* @param entityClass the parameterized type of the returned list. * @param entityClass the parameterized type of the returned list.
* @param collation can be {@literal null}.
* @return the {@link List} of converted objects. * @return the {@link List} of converted objects.
*/ */
protected <T> Mono<T> doFindOne(String collectionName, Document query, Document fields, Class<T> entityClass, protected <T> Mono<T> doFindOne(String collectionName, Document query, @Nullable Document fields,
Collation collation) { Class<T> entityClass, @Nullable Collation collation) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass); MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
Document mappedQuery = queryMapper.getMappedObject(query, entity); Document mappedQuery = queryMapper.getMappedObject(query, entity);
@@ -1842,7 +1871,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
} }
protected <S, T> Flux<T> doFind(String collectionName, Document query, Document fields, Class<S> entityClass, protected <S, T> Flux<T> doFind(String collectionName, Document query, Document fields, Class<S> entityClass,
FindPublisherPreparer preparer, DocumentCallback<T> objectCallback) { @Nullable FindPublisherPreparer preparer, DocumentCallback<T> objectCallback) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass); MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
@@ -1889,15 +1918,14 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* For cases where {@code fields} is {@literal null} or {@literal empty} add fields required for creating the * For cases where {@code fields} is {@literal null} or {@literal empty} add fields required for creating the
* projection (target) type if the {@code targetType} is a {@literal closed interface projection}. * projection (target) type if the {@code targetType} is a {@literal closed interface projection}.
* *
* @param fields can be {@literal null}. * @param fields must not be {@literal null}.
* @param domainType must not be {@literal null}. * @param domainType must not be {@literal null}.
* @param targetType must not be {@literal null}. * @param targetType must not be {@literal null}.
* @return {@link Document} with fields to be included. * @return {@link Document} with fields to be included.
*/ */
private Document addFieldsForProjection(Document fields, Class<?> domainType, Class<?> targetType) { private Document addFieldsForProjection(Document fields, Class<?> domainType, Class<?> targetType) {
if ((fields != null && !fields.isEmpty()) || !targetType.isInterface() if (!fields.isEmpty() || !targetType.isInterface() || ClassUtils.isAssignable(domainType, targetType)) {
|| ClassUtils.isAssignable(domainType, targetType)) {
return fields; return fields;
} }
@@ -1910,9 +1938,10 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return fields; return fields;
} }
protected CreateCollectionOptions convertToCreateCollectionOptions(CollectionOptions collectionOptions) { protected CreateCollectionOptions convertToCreateCollectionOptions(@Nullable CollectionOptions collectionOptions) {
CreateCollectionOptions result = new CreateCollectionOptions(); CreateCollectionOptions result = new CreateCollectionOptions();
if (collectionOptions != null) { if (collectionOptions != null) {
collectionOptions.getCapped().ifPresent(result::capped); collectionOptions.getCapped().ifPresent(result::capped);
@@ -1920,6 +1949,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
collectionOptions.getMaxDocuments().ifPresent(result::maxDocuments); collectionOptions.getMaxDocuments().ifPresent(result::maxDocuments);
collectionOptions.getCollation().map(Collation::toMongoCollation).ifPresent(result::collation); collectionOptions.getCollation().map(Collation::toMongoCollation).ifPresent(result::collation);
} }
return result; return result;
} }
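convertToCreateCollectionOptions(…) above maps the Spring Data CollectionOptions (capped, size, maxDocuments, collation) onto the driver's CreateCollectionOptions. A caller-side sketch under the assumption that the fluent CollectionOptions API is available; the "log" collection and the size values are arbitrary illustration choices:

import com.mongodb.reactivestreams.client.MongoCollection;

import org.bson.Document;
import org.springframework.data.mongodb.core.CollectionOptions;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.query.Collation;
import reactor.core.publisher.Mono;

class CreateCollectionExample {

    Mono<MongoCollection<Document>> createCappedLog(ReactiveMongoTemplate template) {

        // Capped 1 MiB collection holding at most 10k documents, using an English collation.
        CollectionOptions options = CollectionOptions.empty()
                .capped()
                .size(1024 * 1024)
                .maxDocuments(10_000)
                .collation(Collation.of("en"));

        return template.createCollection("log", options);
    }
}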
@@ -1936,7 +1966,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @return the List of converted objects. * @return the List of converted objects.
*/ */
protected <T> Mono<T> doFindAndRemove(String collectionName, Document query, Document fields, Document sort, protected <T> Mono<T> doFindAndRemove(String collectionName, Document query, Document fields, Document sort,
Collation collation, Class<T> entityClass) { @Nullable Collation collation, Class<T> entityClass) {
if (LOGGER.isDebugEnabled()) { if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("findAndRemove using query: %s fields: %s sort: %s for class: %s in collection: %s", LOGGER.debug(String.format("findAndRemove using query: %s fields: %s sort: %s for class: %s in collection: %s",
@@ -1953,8 +1983,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
protected <T> Mono<T> doFindAndModify(String collectionName, Document query, Document fields, Document sort, protected <T> Mono<T> doFindAndModify(String collectionName, Document query, Document fields, Document sort,
Class<T> entityClass, Update update, FindAndModifyOptions options) { Class<T> entityClass, Update update, FindAndModifyOptions options) {
FindAndModifyOptions optionsToUse = options != null ? options : new FindAndModifyOptions();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass); MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
return Mono.defer(() -> { return Mono.defer(() -> {
@@ -1971,7 +1999,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
collectionName)); collectionName));
} }
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, optionsToUse), return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
new ReadDocumentCallback<T>(this.mongoConverter, entityClass, collectionName), collectionName); new ReadDocumentCallback<T>(this.mongoConverter, entityClass, collectionName), collectionName);
}); });
} }
@@ -1988,7 +2016,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @param savedObject * @param savedObject
* @param id * @param id
*/ */
private void populateIdIfNecessary(Object savedObject, Object id) { private void populateIdIfNecessary(Object savedObject, @Nullable Object id) {
if (id == null) { if (id == null) {
return; return;
@@ -2028,22 +2056,20 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
} }
protected void ensureNotIterable(Object o) { protected void ensureNotIterable(Object o) {
if (null != o) {
boolean isIterable = o.getClass().isArray(); boolean isIterable = o.getClass().isArray();
if (!isIterable) { if (!isIterable) {
for (Class iterableClass : ITERABLE_CLASSES) { for (Class iterableClass : ITERABLE_CLASSES) {
if (iterableClass.isAssignableFrom(o.getClass()) || o.getClass().getName().equals(iterableClass.getName())) { if (iterableClass.isAssignableFrom(o.getClass()) || o.getClass().getName().equals(iterableClass.getName())) {
isIterable = true; isIterable = true;
break; break;
}
} }
} }
}
if (isIterable) { if (isIterable) {
throw new IllegalArgumentException("Cannot use a collection here."); throw new IllegalArgumentException("Cannot use a collection here.");
}
} }
} }
@@ -2066,18 +2092,20 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* The returned {@link WriteConcern} will be defaulted to {@link WriteConcern#ACKNOWLEDGED} when * The returned {@link WriteConcern} will be defaulted to {@link WriteConcern#ACKNOWLEDGED} when
* {@link WriteResultChecking} is set to {@link WriteResultChecking#EXCEPTION}. * {@link WriteResultChecking} is set to {@link WriteResultChecking#EXCEPTION}.
* *
* @param mongoAction any WriteConcern already configured or null * @param mongoAction any WriteConcern already configured or {@literal null}.
* @return The prepared WriteConcern or null * @return The prepared WriteConcern or {@literal null}.
* @see #setWriteConcern(WriteConcern) * @see #setWriteConcern(WriteConcern)
* @see #setWriteConcernResolver(WriteConcernResolver) * @see #setWriteConcernResolver(WriteConcernResolver)
*/ */
@Nullable
protected WriteConcern prepareWriteConcern(MongoAction mongoAction) { protected WriteConcern prepareWriteConcern(MongoAction mongoAction) {
WriteConcern wc = writeConcernResolver.resolve(mongoAction); WriteConcern wc = writeConcernResolver.resolve(mongoAction);
return potentiallyForceAcknowledgedWrite(wc); return potentiallyForceAcknowledgedWrite(wc);
} }
private WriteConcern potentiallyForceAcknowledgedWrite(WriteConcern wc) { @Nullable
private WriteConcern potentiallyForceAcknowledgedWrite(@Nullable WriteConcern wc) {
if (ObjectUtils.nullSafeEquals(WriteResultChecking.EXCEPTION, writeResultChecking) if (ObjectUtils.nullSafeEquals(WriteResultChecking.EXCEPTION, writeResultChecking)
&& MongoClientVersion.isMongo3Driver()) { && MongoClientVersion.isMongo3Driver()) {
@@ -2121,14 +2149,14 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* *
* @param collectionCallback the callback to retrieve the {@link FindPublisher} with, must not be {@literal null}. * @param collectionCallback the callback to retrieve the {@link FindPublisher} with, must not be {@literal null}.
* @param preparer the {@link FindPublisherPreparer} to potentially modify the {@link FindPublisher} before iterating * @param preparer the {@link FindPublisherPreparer} to potentially modify the {@link FindPublisher} before iterating
* over it, may be {@literal null} * over it, may be {@literal null}.
* @param objectCallback the {@link DocumentCallback} to transform {@link Document}s into the actual domain type, must * @param objectCallback the {@link DocumentCallback} to transform {@link Document}s into the actual domain type, must
* not be {@literal null}. * not be {@literal null}.
* @param collectionName the collection to be queried, must not be {@literal null}. * @param collectionName the collection to be queried, must not be {@literal null}.
* @return * @return
*/ */
private <T> Flux<T> executeFindMultiInternal(ReactiveCollectionQueryCallback<Document> collectionCallback, private <T> Flux<T> executeFindMultiInternal(ReactiveCollectionQueryCallback<Document> collectionCallback,
FindPublisherPreparer preparer, DocumentCallback<T> objectCallback, String collectionName) { @Nullable FindPublisherPreparer preparer, DocumentCallback<T> objectCallback, String collectionName) {
return createFlux(collectionName, collection -> { return createFlux(collectionName, collection -> {
@@ -2184,17 +2212,23 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return resolved == null ? ex : resolved; return resolved == null ? ex : resolved;
} }
private MongoPersistentEntity<?> getPersistentEntity(Class<?> type) { @Nullable
private MongoPersistentEntity<?> getPersistentEntity(@Nullable Class<?> type) {
return type == null ? null : mappingContext.getPersistentEntity(type); return type == null ? null : mappingContext.getPersistentEntity(type);
} }
private MongoPersistentProperty getIdPropertyFor(Class<?> type) { @Nullable
private MongoPersistentProperty getIdPropertyFor(@Nullable Class<?> type) {
if (type == null) {
return null;
}
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(type); MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(type);
return persistentEntity != null ? persistentEntity.getIdProperty() : null; return persistentEntity != null ? persistentEntity.getIdProperty() : null;
} }
private <T> String determineEntityCollectionName(T obj) { private <T> String determineEntityCollectionName(@Nullable T obj) {
if (null != obj) { if (null != obj) {
return determineCollectionName(obj.getClass()); return determineCollectionName(obj.getClass());
@@ -2203,7 +2237,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return null; return null;
} }
String determineCollectionName(Class<?> entityClass) { String determineCollectionName(@Nullable Class<?> entityClass) {
if (entityClass == null) { if (entityClass == null) {
throw new InvalidDataAccessApiUsageException( throw new InvalidDataAccessApiUsageException(
@@ -2291,7 +2325,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final Optional<Document> fields; private final Optional<Document> fields;
private final Optional<Collation> collation; private final Optional<Collation> collation;
FindOneCallback(Document query, Document fields, Collation collation) { FindOneCallback(Document query, @Nullable Document fields, @Nullable Collation collation) {
this.query = query; this.query = query;
this.fields = Optional.ofNullable(fields); this.fields = Optional.ofNullable(fields);
this.collation = Optional.ofNullable(collation); this.collation = Optional.ofNullable(collation);
@@ -2327,10 +2361,10 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/ */
private static class FindCallback implements ReactiveCollectionQueryCallback<Document> { private static class FindCallback implements ReactiveCollectionQueryCallback<Document> {
private final Document query; private final @Nullable Document query;
private final Document fields; private final @Nullable Document fields;
FindCallback(Document query) { FindCallback(@Nullable Document query) {
this(query, null); this(query, null);
} }
@@ -2370,7 +2404,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final Document sort; private final Document sort;
private final Optional<Collation> collation; private final Optional<Collation> collation;
FindAndRemoveCallback(Document query, Document fields, Document sort, Collation collation) { FindAndRemoveCallback(Document query, Document fields, Document sort, @Nullable Collation collation) {
this.query = query; this.query = query;
this.fields = fields; this.fields = fields;
@@ -2509,7 +2543,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
this.collectionName = collectionName; this.collectionName = collectionName;
} }
public T doWith(Document object) { public T doWith(@Nullable Document object) {
if (null != object) { if (null != object) {
maybeEmitEvent(new AfterLoadEvent<T>(object, type, collectionName)); maybeEmitEvent(new AfterLoadEvent<T>(object, type, collectionName));
@@ -2540,7 +2574,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final @NonNull Class<T> targetType; private final @NonNull Class<T> targetType;
private final @NonNull String collectionName; private final @NonNull String collectionName;
public T doWith(Document object) { @Nullable
public T doWith(@Nullable Document object) {
if (object == null) { if (object == null) {
return null; return null;
@@ -2604,10 +2639,10 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/ */
class QueryFindPublisherPreparer implements FindPublisherPreparer { class QueryFindPublisherPreparer implements FindPublisherPreparer {
private final Query query; private final @Nullable Query query;
private final Class<?> type; private final @Nullable Class<?> type;
QueryFindPublisherPreparer(Query query, Class<?> type) { QueryFindPublisherPreparer(@Nullable Query query, @Nullable Class<?> type) {
this.query = query; this.query = query;
this.type = type; this.type = type;
@@ -2687,12 +2722,14 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
static class NoOpDbRefResolver implements DbRefResolver { static class NoOpDbRefResolver implements DbRefResolver {
@Override @Override
public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback, @Nullable
DbRefProxyHandler proxyHandler) { public Object resolveDbRef(@Nonnull MongoPersistentProperty property, @Nonnull DBRef dbref,
@Nonnull DbRefResolverCallback callback, @Nonnull DbRefProxyHandler proxyHandler) {
return null; return null;
} }
@Override @Override
@Nullable
public DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation, public DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation,
MongoPersistentEntity<?> entity, Object id) { MongoPersistentEntity<?> entity, Object id) {
return null; return null;
@@ -2705,7 +2742,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
@Override @Override
public List<Document> bulkFetch(List<DBRef> dbRefs) { public List<Document> bulkFetch(List<DBRef> dbRefs) {
return null; return Collections.emptyList();
} }
} }
} }


@@ -1,5 +1,5 @@
/* /*
* Copyright 2014-2015 the original author or authors. * Copyright 2014-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -19,6 +19,7 @@ import java.util.Set;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript; import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript; import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.lang.Nullable;
import com.mongodb.DB; import com.mongodb.DB;
@@ -56,6 +57,7 @@ public interface ScriptOperations {
* @return the script evaluation result. * @return the script evaluation result.
* @throws org.springframework.dao.DataAccessException * @throws org.springframework.dao.DataAccessException
*/ */
@Nullable
Object execute(ExecutableMongoScript script, Object... args); Object execute(ExecutableMongoScript script, Object... args);
/** /**
@@ -65,6 +67,7 @@ public interface ScriptOperations {
* @param args * @param args
* @return * @return
*/ */
@Nullable
Object call(String scriptName, Object... args); Object call(String scriptName, Object... args);
/** /**


@@ -1,149 +1,149 @@
/* /*
* Copyright 2011-2017 the original author or authors. * Copyright 2011-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* http://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and * See the License for the specific language governing permissions and
* limitations under the License. * limitations under the License.
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import java.net.UnknownHostException; import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
import org.springframework.beans.factory.DisposableBean; import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.dao.DataAccessException; import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.dao.support.PersistenceExceptionTranslator; import org.springframework.lang.Nullable;
import org.springframework.data.mongodb.MongoDbFactory; import org.springframework.util.Assert;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.DB; import com.mongodb.MongoClient;
import com.mongodb.MongoClient; import com.mongodb.MongoClientURI;
import com.mongodb.MongoClientURI; import com.mongodb.WriteConcern;
import com.mongodb.WriteConcern; import com.mongodb.client.MongoDatabase;
import com.mongodb.client.MongoDatabase;
/**
/** * Factory to create {@link MongoDatabase} instances from a {@link MongoClient} instance.
* Factory to create {@link DB} instances from a {@link MongoClient} instance. *
* * @author Mark Pollack
* @author Mark Pollack * @author Oliver Gierke
* @author Oliver Gierke * @author Thomas Darimont
* @author Thomas Darimont * @author Christoph Strobl
* @author Christoph Strobl * @author George Moraitis
*/ * @author Mark Paluch
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory { */
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
private final MongoClient mongoClient;
private final String databaseName; private final MongoClient mongoClient;
private final boolean mongoInstanceCreated; private final String databaseName;
private final PersistenceExceptionTranslator exceptionTranslator; private final boolean mongoInstanceCreated;
private final PersistenceExceptionTranslator exceptionTranslator;
private WriteConcern writeConcern;
private @Nullable WriteConcern writeConcern;
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}. /**
* * Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}.
* @param uri must not be {@literal null}. *
* @throws UnknownHostException * @param uri must not be {@literal null}.
* @since 1.7 * @since 1.7
*/ */
public SimpleMongoDbFactory(MongoClientURI uri) { public SimpleMongoDbFactory(MongoClientURI uri) {
this(new MongoClient(uri), uri.getDatabase(), true); this(new MongoClient(uri), uri.getDatabase(), true);
} }
/** /**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}. * Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}.
* *
* @param mongoClient must not be {@literal null}. * @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null}. * @param databaseName must not be {@literal null}.
* @since 1.7 * @since 1.7
*/ */
public SimpleMongoDbFactory(MongoClient mongoClient, String databaseName) { public SimpleMongoDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false); this(mongoClient, databaseName, false);
} }
/** /**
* @param client * @param mongoClient
* @param databaseName * @param databaseName
* @param mongoInstanceCreated * @param mongoInstanceCreated
* @since 1.7 * @since 1.7
*/ */
private SimpleMongoDbFactory(MongoClient mongoClient, String databaseName, boolean mongoInstanceCreated) { private SimpleMongoDbFactory(MongoClient mongoClient, String databaseName, boolean mongoInstanceCreated) {
Assert.notNull(mongoClient, "MongoClient must not be null!"); Assert.notNull(mongoClient, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!"); Assert.hasText(databaseName, "Database name must not be empty!");
Assert.isTrue(databaseName.matches("[\\w-]+"), Assert.isTrue(databaseName.matches("[^/\\\\.$\"\\s]+"),
"Database name must only contain letters, numbers, underscores and dashes!"); "Database name must not contain slashes, dots, spaces, quotes, or dollar signs!");
this.mongoClient = mongoClient; this.mongoClient = mongoClient;
this.databaseName = databaseName; this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated; this.mongoInstanceCreated = mongoInstanceCreated;
this.exceptionTranslator = new MongoExceptionTranslator(); this.exceptionTranslator = new MongoExceptionTranslator();
} }
/** /**
* Configures the {@link WriteConcern} to be used on the {@link DB} instance being created. * Configures the {@link WriteConcern} to be used on the {@link DB} instance being created.
* *
* @param writeConcern the writeConcern to set * @param writeConcern the writeConcern to set
*/ */
public void setWriteConcern(WriteConcern writeConcern) { public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern; this.writeConcern = writeConcern;
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb() * @see org.springframework.data.mongodb.MongoDbFactory#getDb()
*/ */
public MongoDatabase getDb() throws DataAccessException { public MongoDatabase getDb() throws DataAccessException {
return getDb(databaseName); return getDb(databaseName);
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String) * @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/ */
public MongoDatabase getDb(String dbName) throws DataAccessException { public MongoDatabase getDb(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty."); Assert.hasText(dbName, "Database name must not be empty.");
MongoDatabase db = mongoClient.getDatabase(dbName); MongoDatabase db = mongoClient.getDatabase(dbName);
if (writeConcern == null) { if (writeConcern == null) {
return db; return db;
} }
return db.withWriteConcern(writeConcern); return db.withWriteConcern(writeConcern);
} }
/** /**
* Clean up the Mongo instance if it was created by the factory itself. * Clean up the Mongo instance if it was created by the factory itself.
* *
* @see DisposableBean#destroy() * @see DisposableBean#destroy()
*/ */
public void destroy() throws Exception { public void destroy() throws Exception {
if (mongoInstanceCreated) { if (mongoInstanceCreated) {
mongoClient.close(); mongoClient.close();
} }
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator() * @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/ */
@Override @Override
public PersistenceExceptionTranslator getExceptionTranslator() { public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator; return this.exceptionTranslator;
} }
@SuppressWarnings("deprecation") @SuppressWarnings("deprecation")
@Override @Override
public DB getLegacyDb() { public DB getLegacyDb() {
return mongoClient.getDB(databaseName); return mongoClient.getDB(databaseName);
} }
} }
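The relaxed name check above replaces the old letters/digits/underscore/dash rule with MongoDB's own restriction (no slashes, dots, spaces, quotes, or dollar signs). A short sketch of the effect; the client setup and database names are illustrative only:

import com.mongodb.MongoClient;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

class FactoryNamingExample {

    void showNameValidation() {

        MongoClient client = new MongoClient(); // localhost:27017

        // Accepted now: '+' was rejected by the old [\w-]+ rule but is not in the new exclusion set.
        SimpleMongoDbFactory factory = new SimpleMongoDbFactory(client, "orders+archive");

        // Still rejected: dots remain illegal in MongoDB database names.
        // new SimpleMongoDbFactory(client, "orders.archive"); // -> IllegalArgumentException

        factory.getDb();
    }
}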


@@ -1,5 +1,5 @@
/* /*
* Copyright 2016 the original author or authors. * Copyright 2016-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -21,6 +21,7 @@ import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException; import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator; import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory; import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import com.mongodb.ConnectionString; import com.mongodb.ConnectionString;
@@ -41,9 +42,10 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
private final MongoClient mongo; private final MongoClient mongo;
private final String databaseName; private final String databaseName;
private final boolean mongoInstanceCreated; private final boolean mongoInstanceCreated;
private final PersistenceExceptionTranslator exceptionTranslator; private final PersistenceExceptionTranslator exceptionTranslator;
private WriteConcern writeConcern; private @Nullable WriteConcern writeConcern;
/** /**
* Creates a new {@link SimpleReactiveMongoDatabaseFactory} instance from the given {@link ConnectionString}. * Creates a new {@link SimpleReactiveMongoDatabaseFactory} instance from the given {@link ConnectionString}.
@@ -70,8 +72,8 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
Assert.notNull(client, "MongoClient must not be null!"); Assert.notNull(client, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!"); Assert.hasText(databaseName, "Database name must not be empty!");
Assert.isTrue(databaseName.matches("[\\w-]+"), Assert.isTrue(databaseName.matches("[^/\\\\.$\"\\s]+"),
"Database name must only contain letters, numbers, underscores and dashes!"); "Database name must not contain slashes, dots, spaces, quotes, or dollar signs!");
this.mongo = client; this.mongo = client;
this.databaseName = databaseName; this.databaseName = databaseName;


@@ -1,5 +1,5 @@
/* /*
* Copyright 2011-2012 the original author or authors. * Copyright 2011-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -15,6 +15,8 @@
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import org.springframework.lang.Nullable;
import com.mongodb.WriteConcern; import com.mongodb.WriteConcern;
/** /**
@@ -33,5 +35,6 @@ public interface WriteConcernResolver {
* should not be resolved. * should not be resolved.
* @return a {@link WriteConcern} based on the passed in {@link MongoAction} value, maybe {@literal null}. * @return a {@link WriteConcern} based on the passed in {@link MongoAction} value, maybe {@literal null}.
*/ */
@Nullable
WriteConcern resolve(MongoAction action); WriteConcern resolve(MongoAction action);
} }


@@ -1,5 +1,5 @@
/*
- * Copyright 2016. the original author or authors.
+ * Copyright 2016-2018. the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -20,12 +20,15 @@ import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
+ import java.util.Map;
import org.bson.Document;
+ import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
 * @author Christoph Strobl
+ * @author Matt Morrissette
 * @since 1.10
 */
abstract class AbstractAggregationExpression implements AggregationExpression {
@@ -46,29 +49,7 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
- Object valueToUse;
- if (value instanceof List) {
- List<Object> arguments = (List<Object>) value;
- List<Object> args = new ArrayList<Object>(arguments.size());
- for (Object val : arguments) {
- args.add(unpack(val, context));
- }
- valueToUse = args;
- } else if (value instanceof java.util.Map) {
- Document dbo = new Document();
- for (java.util.Map.Entry<String, Object> entry : ((java.util.Map<String, Object>) value).entrySet()) {
- dbo.put(entry.getKey(), unpack(entry.getValue(), context));
- }
- valueToUse = dbo;
- } else {
- valueToUse = unpack(value, context);
- }
- return new Document(getMongoMethod(), valueToUse);
+ return new Document(getMongoMethod(), unpack(value, context));
}
protected static List<Field> asFields(String... fieldRefs) {
@@ -94,14 +75,23 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
if (value instanceof List) {
List<Object> sourceList = (List<Object>) value;
- List<Object> mappedList = new ArrayList<Object>(sourceList.size());
- for (Object item : sourceList) {
- mappedList.add(unpack(item, context));
- }
+ List<Object> mappedList = new ArrayList<>(sourceList.size());
+ sourceList.stream().map((item) -> unpack(item, context)).forEach(mappedList::add);
return mappedList;
}
+ if (value instanceof Map) {
+ Document targetDocument = new Document();
+ Map<String, Object> sourceMap = (Map<String, Object>) value;
+ sourceMap.forEach((k, v) -> targetDocument.append(k, unpack(v, context)));
+ return targetDocument;
+ }
return value;
}
@@ -112,9 +102,7 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
List<Object> clone = new ArrayList<Object>((List) this.value);
if (value instanceof List) {
- for (Object val : (List) value) {
- clone.add(val);
- }
+ clone.addAll((List) value);
} else {
clone.add(value);
}
@@ -127,10 +115,9 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
@SuppressWarnings("unchecked")
protected java.util.Map<String, Object> append(String key, Object value) {
- if (!(this.value instanceof java.util.Map)) {
- throw new IllegalArgumentException("o_O");
- }
- java.util.Map<String, Object> clone = new LinkedHashMap<String, Object>((java.util.Map<String, Object>) this.value);
+ Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
+ java.util.Map<String, Object> clone = new LinkedHashMap<>((java.util.Map) this.value);
clone.put(key, value);
return clone;
@@ -144,7 +131,67 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
if (value instanceof java.util.Map) {
return new ArrayList<Object>(((java.util.Map) value).values());
}
- return new ArrayList<Object>(Collections.singletonList(value));
+ return new ArrayList<>(Collections.singletonList(value));
}
+ /**
+ * Get the value at a given index.
+ *
+ * @param index
+ * @param <T>
+ * @return
+ * @since 2.1
+ */
+ @SuppressWarnings("unchecked")
+ protected <T> T get(int index) {
+ return (T) values().get(index);
+ }
+ /**
+ * Get the value for a given key.
+ *
+ * @param key
+ * @param <T>
+ * @return
+ * @since 2.1
+ */
+ @SuppressWarnings("unchecked")
+ protected <T> T get(Object key) {
+ Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
+ return (T) ((java.util.Map<String, Object>) this.value).get(key);
+ }
+ /**
+ * Get the argument map.
+ *
+ * @since 2.1
+ * @return
+ */
+ @SuppressWarnings("unchecked")
+ protected java.util.Map<String, Object> argumentMap() {
+ Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
+ return Collections.unmodifiableMap((java.util.Map) value);
+ }
+ /**
+ * Check if the given key is available.
+ *
+ * @param key
+ * @return
+ * @since 2.1
+ */
+ @SuppressWarnings("unchecked")
+ protected boolean contains(Object key) {
+ if (!(this.value instanceof java.util.Map)) {
+ return false;
+ }
+ return ((java.util.Map<String, Object>) this.value).containsKey(key);
+ }
protected abstract String getMongoMethod();
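Because unpack(…) now walks Map values as well, expressions that keep their arguments in a Map render nested field references and expressions inside that map. A rough sketch of what that looks like from calling code, using the ConvertOperators type added later in this changeset (collection and field names are made up; the comment shows the expected shape, not a guaranteed literal output):

import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ConvertOperators;

class MapUnpackSketch {

	Document render() {
		// Convert stores { input: field("price"), to: "decimal", onError: 0 } in a Map internally;
		// the Map-aware unpack(...) resolves the field entry to "$price" while keeping the other keys
		return ConvertOperators.valueOf("price").convertTo("decimal").onErrorReturn(0)
				.toDocument(Aggregation.DEFAULT_CONTEXT);
		// expected shape: { "$convert" : { "input" : "$price", "to" : "decimal", "onError" : 0 } }
	}
}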


@@ -32,6 +32,7 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.SerializationUtils;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
@@ -600,6 +601,16 @@ public class Aggregation {
return SerializationUtils.serializeToJsonSafely(toDocument("__collection__", DEFAULT_CONTEXT));
}
+ /**
+ * Get {@link AggregationOptions} to apply.
+ *
+ * @return never {@literal null}.
+ * @since 2.0.3
+ */
+ public AggregationOptions getOptions() {
+ return options;
+ }
/**
 * Describes the system variables available in MongoDB aggregation framework pipeline expressions.
 *
@@ -619,7 +630,7 @@ public class Aggregation {
 * @param fieldRef may be {@literal null}.
 * @return
 */
- public static boolean isReferingToSystemVariable(String fieldRef) {
+ public static boolean isReferingToSystemVariable(@Nullable String fieldRef) {
if (fieldRef == null || !fieldRef.startsWith(PREFIX) || fieldRef.length() <= 2) {
return false;
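The new getOptions() accessor exposes the options an Aggregation was built with. A short usage sketch (the pipeline and option values are arbitrary):

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.query.Criteria;

class AggregationOptionsAccessSketch {

	AggregationOptions optionsOf() {
		Aggregation aggregation = Aggregation
				.newAggregation(Aggregation.match(Criteria.where("status").is("OPEN")))
				.withOptions(AggregationOptions.builder().allowDiskUse(true).build());

		// the options used to build the pipeline are now readable again, e.g. for logging
		return aggregation.getOptions();
	}
}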


@@ -1,5 +1,5 @@
/*
- * Copyright 2013 the original author or authors.
+ * Copyright 2013-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -21,14 +21,16 @@ import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldRefe
import org.springframework.data.mongodb.core.spel.ExpressionNode;
import org.springframework.data.mongodb.core.spel.ExpressionTransformationContextSupport;
import org.springframework.data.mongodb.core.spel.ExpressionTransformer;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
 * Interface to type an {@link ExpressionTransformer} to the contained
 * {@link AggregationExpressionTransformationContext}.
 *
 * @author Oliver Gierke
 * @author Christoph Strobl
+ * @author Mark Paluch
 * @since 1.4
 */
interface AggregationExpressionTransformer
@@ -36,7 +38,7 @@ interface AggregationExpressionTransformer
/**
 * A special {@link ExpressionTransformationContextSupport} to be aware of the {@link AggregationOperationContext}.
 *
 * @author Oliver Gierke
 * @author Thomas Darimont
 */
@@ -47,14 +49,14 @@ interface AggregationExpressionTransformer
/**
 * Creates an {@link AggregationExpressionTransformationContext}.
 *
 * @param currentNode must not be {@literal null}.
- * @param parentNode
- * @param previousOperationObject
+ * @param parentNode may be {@literal null}.
+ * @param previousOperationObject may be {@literal null}.
 * @param aggregationContext must not be {@literal null}.
 */
- public AggregationExpressionTransformationContext(T currentNode, ExpressionNode parentNode,
- Document previousOperationObject, AggregationOperationContext context) {
+ public AggregationExpressionTransformationContext(T currentNode, @Nullable ExpressionNode parentNode,
+ @Nullable Document previousOperationObject, AggregationOperationContext context) {
super(currentNode, parentNode, previousOperationObject);
@@ -64,7 +66,7 @@ interface AggregationExpressionTransformer
/**
 * Returns the underlying {@link AggregationOperationContext}.
 *
 * @return
 */
public AggregationOperationContext getAggregationContext() {
@@ -73,7 +75,7 @@ interface AggregationExpressionTransformer
/**
 * Returns the {@link FieldReference} for the current {@link ExpressionNode}.
 *
 * @return
 */
public FieldReference getFieldReference() {


@@ -41,7 +41,7 @@ class AggregationOperationRenderer {
 * {@link Document} representation.
 *
 * @param operations must not be {@literal null}.
- * @param context must not be {@literal null}.
+ * @param rootContext must not be {@literal null}.
 * @return the {@link List} of {@link Document}.
 */
static List<Document> toDocument(List<AggregationOperation> operations, AggregationOperationContext rootContext) {
@@ -59,7 +59,7 @@ class AggregationOperationRenderer {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
ExposedFields fields = exposedFieldsOperation.getFields();
- if (operation instanceof InheritsFieldsAggregationOperation) {
+ if (operation instanceof InheritsFieldsAggregationOperation || exposedFieldsOperation.inheritsFields()) {
contextToUse = new InheritingExposedFieldsAggregationOperationContext(fields, contextToUse);
} else {
contextToUse = fields.exposesNoFields() ? DEFAULT_CONTEXT


@@ -19,6 +19,7 @@ import java.util.Optional;
import org.bson.Document;
import org.springframework.data.mongodb.core.query.Collation;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
@@ -70,7 +71,8 @@ public class AggregationOptions {
 * @param collation collation for string comparison. Can be {@literal null}.
 * @since 2.0
 */
- public AggregationOptions(boolean allowDiskUse, boolean explain, Document cursor, Collation collation) {
+ public AggregationOptions(boolean allowDiskUse, boolean explain, @Nullable Document cursor,
+ @Nullable Collation collation) {
this.allowDiskUse = allowDiskUse;
this.explain = explain;
@@ -242,8 +244,8 @@ public class AggregationOptions {
private boolean allowDiskUse;
private boolean explain;
- private Document cursor;
- private Collation collation;
+ private @Nullable Document cursor;
+ private @Nullable Collation collation;
/**
 * Defines whether to off-load intensive sort-operations to disk.
@@ -300,7 +302,7 @@ public class AggregationOptions {
 * @param collation can be {@literal null}.
 * @return
 */
- public Builder collation(Collation collation) {
+ public Builder collation(@Nullable Collation collation) {
this.collation = collation;
return this;
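With the constructor and Builder.collation(…) accepting null, options can be assembled without a collation and still passed through explicitly. A small sketch under that assumption (the Collation value is illustrative):

import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.lang.Nullable;

class AggregationOptionsBuilderSketch {

	AggregationOptions build(@Nullable Collation collation) {
		return AggregationOptions.builder()
				.allowDiskUse(true)
				// passing null is now an accepted way of saying "no collation"
				.collation(collation != null ? collation : null)
				.build();
	}

	AggregationOptions localized() {
		return build(Collation.of("en"));
	}
}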


@@ -20,11 +20,12 @@ import java.util.Iterator;
import java.util.List;
import org.bson.Document;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
 * Collects the results of executing an aggregation operation.
 *
 * @author Tobias Trelle
 * @author Oliver Gierke
 * @author Thomas Darimont
@@ -37,11 +38,11 @@ public class AggregationResults<T> implements Iterable<T> {
private final List<T> mappedResults;
private final Document rawResults;
- private final String serverUsed;
+ private final @Nullable String serverUsed;
/**
 * Creates a new {@link AggregationResults} instance from the given mapped and raw results.
 *
 * @param mappedResults must not be {@literal null}.
 * @param rawResults must not be {@literal null}.
 */
@@ -57,7 +58,7 @@ public class AggregationResults<T> implements Iterable<T> {
/**
 * Returns the aggregation results.
 *
 * @return
 */
public List<T> getMappedResults() {
@@ -66,10 +67,11 @@ public class AggregationResults<T> implements Iterable<T> {
/**
 * Returns the unique mapped result. Assumes no result or exactly one.
 *
 * @return
 * @throws IllegalArgumentException in case more than one result is available.
 */
+ @Nullable
public T getUniqueMappedResult() {
Assert.isTrue(mappedResults.size() < 2, "Expected unique result or null, but got more than one!");
return mappedResults.size() == 1 ? mappedResults.get(0) : null;
@@ -85,16 +87,17 @@ public class AggregationResults<T> implements Iterable<T> {
/**
 * Returns the server that has been used to perform the aggregation.
 *
 * @return
 */
+ @Nullable
public String getServerUsed() {
return serverUsed;
}
/**
 * Returns the raw result that was returned by the server.
 *
 * @return
 * @since 1.6
 */
@@ -102,6 +105,7 @@ public class AggregationResults<T> implements Iterable<T> {
return rawResults;
}
+ @Nullable
private String parseServerUsed() {
Object object = rawResults.get("serverUsed");
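getUniqueMappedResult() and getServerUsed() are now annotated @Nullable, so callers should treat an absent value as a legal outcome rather than an error. A brief sketch (domain type, collection name, and pipeline are placeholders):

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.query.Criteria;

class UniqueResultSketch {

	Document findSingleOpenOrder(MongoOperations operations) {
		AggregationResults<Document> results = operations.aggregate(
				Aggregation.newAggregation(Aggregation.match(Criteria.where("status").is("OPEN")), Aggregation.limit(1)),
				"orders", Document.class);

		Document unique = results.getUniqueMappedResult(); // may be null when nothing matched
		return unique != null ? unique : new Document();
	}
}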


@@ -1,5 +1,5 @@
/*
- * Copyright 2016. the original author or authors.
+ * Copyright 2016-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -20,7 +20,7 @@ import org.springframework.util.Assert;
/**
 * An {@link AggregationExpression} that renders a MongoDB Aggregation Framework expression from the AST of a
- * <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/expressions.html">SpEL
+ * <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#expressions">SpEL
 * expression</a>. <br />
 * <br />
 * <strong>Samples:</strong> <br />
@@ -35,6 +35,7 @@ import org.springframework.util.Assert;
 * </code>
 *
 * @author Christoph Strobl
+ * @author Mark Paluch
 * @see SpelExpressionTransformer
 * @since 1.10
 */


@@ -92,7 +92,7 @@ public class ArithmeticOperators {
 * @return
 */
public Abs abs() {
- return fieldReference != null ? Abs.absoluteValueOf(fieldReference) : Abs.absoluteValueOf(expression);
+ return usesFieldRef() ? Abs.absoluteValueOf(fieldReference) : Abs.absoluteValueOf(expression);
}
/**
@@ -134,7 +134,7 @@ public class ArithmeticOperators {
}
private Add createAdd() {
- return fieldReference != null ? Add.valueOf(fieldReference) : Add.valueOf(expression);
+ return usesFieldRef() ? Add.valueOf(fieldReference) : Add.valueOf(expression);
}
/**
@@ -144,7 +144,7 @@ public class ArithmeticOperators {
 * @return
 */
public Ceil ceil() {
- return fieldReference != null ? Ceil.ceilValueOf(fieldReference) : Ceil.ceilValueOf(expression);
+ return usesFieldRef() ? Ceil.ceilValueOf(fieldReference) : Ceil.ceilValueOf(expression);
}
/**
@@ -186,7 +186,7 @@ public class ArithmeticOperators {
}
private Divide createDivide() {
- return fieldReference != null ? Divide.valueOf(fieldReference) : Divide.valueOf(expression);
+ return usesFieldRef() ? Divide.valueOf(fieldReference) : Divide.valueOf(expression);
}
/**
@@ -195,7 +195,7 @@ public class ArithmeticOperators {
 * @return
 */
public Exp exp() {
- return fieldReference != null ? Exp.expValueOf(fieldReference) : Exp.expValueOf(expression);
+ return usesFieldRef() ? Exp.expValueOf(fieldReference) : Exp.expValueOf(expression);
}
/**
@@ -205,7 +205,7 @@ public class ArithmeticOperators {
 * @return
 */
public Floor floor() {
- return fieldReference != null ? Floor.floorValueOf(fieldReference) : Floor.floorValueOf(expression);
+ return usesFieldRef() ? Floor.floorValueOf(fieldReference) : Floor.floorValueOf(expression);
}
/**
@@ -215,7 +215,7 @@ public class ArithmeticOperators {
 * @return
 */
public Ln ln() {
- return fieldReference != null ? Ln.lnValueOf(fieldReference) : Ln.lnValueOf(expression);
+ return usesFieldRef() ? Ln.lnValueOf(fieldReference) : Ln.lnValueOf(expression);
}
/**
@@ -258,7 +258,7 @@ public class ArithmeticOperators {
}
private Log createLog() {
- return fieldReference != null ? Log.valueOf(fieldReference) : Log.valueOf(expression);
+ return usesFieldRef() ? Log.valueOf(fieldReference) : Log.valueOf(expression);
}
/**
@@ -267,7 +267,7 @@ public class ArithmeticOperators {
 * @return
 */
public Log10 log10() {
- return fieldReference != null ? Log10.log10ValueOf(fieldReference) : Log10.log10ValueOf(expression);
+ return usesFieldRef() ? Log10.log10ValueOf(fieldReference) : Log10.log10ValueOf(expression);
}
/**
@@ -310,7 +310,7 @@ public class ArithmeticOperators {
}
private Mod createMod() {
- return fieldReference != null ? Mod.valueOf(fieldReference) : Mod.valueOf(expression);
+ return usesFieldRef() ? Mod.valueOf(fieldReference) : Mod.valueOf(expression);
}
/**
@@ -350,7 +350,7 @@ public class ArithmeticOperators {
}
private Multiply createMultiply() {
- return fieldReference != null ? Multiply.valueOf(fieldReference) : Multiply.valueOf(expression);
+ return usesFieldRef() ? Multiply.valueOf(fieldReference) : Multiply.valueOf(expression);
}
/**
@@ -390,7 +390,7 @@ public class ArithmeticOperators {
}
private Pow createPow() {
- return fieldReference != null ? Pow.valueOf(fieldReference) : Pow.valueOf(expression);
+ return usesFieldRef() ? Pow.valueOf(fieldReference) : Pow.valueOf(expression);
}
/**
@@ -399,7 +399,7 @@ public class ArithmeticOperators {
 * @return
 */
public Sqrt sqrt() {
- return fieldReference != null ? Sqrt.sqrtOf(fieldReference) : Sqrt.sqrtOf(expression);
+ return usesFieldRef() ? Sqrt.sqrtOf(fieldReference) : Sqrt.sqrtOf(expression);
}
/**
@@ -439,7 +439,7 @@ public class ArithmeticOperators {
}
private Subtract createSubtract() {
- return fieldReference != null ? Subtract.valueOf(fieldReference) : Subtract.valueOf(expression);
+ return usesFieldRef() ? Subtract.valueOf(fieldReference) : Subtract.valueOf(expression);
}
/**
@@ -448,7 +448,7 @@ public class ArithmeticOperators {
 * @return
 */
public Trunc trunc() {
- return fieldReference != null ? Trunc.truncValueOf(fieldReference) : Trunc.truncValueOf(expression);
+ return usesFieldRef() ? Trunc.truncValueOf(fieldReference) : Trunc.truncValueOf(expression);
}
/**
@@ -457,7 +457,7 @@ public class ArithmeticOperators {
 * @return
 */
public Sum sum() {
- return fieldReference != null ? AccumulatorOperators.Sum.sumOf(fieldReference)
+ return usesFieldRef() ? AccumulatorOperators.Sum.sumOf(fieldReference)
: AccumulatorOperators.Sum.sumOf(expression);
}
@@ -467,7 +467,7 @@ public class ArithmeticOperators {
 * @return
 */
public Avg avg() {
- return fieldReference != null ? AccumulatorOperators.Avg.avgOf(fieldReference)
+ return usesFieldRef() ? AccumulatorOperators.Avg.avgOf(fieldReference)
: AccumulatorOperators.Avg.avgOf(expression);
}
@@ -477,7 +477,7 @@ public class ArithmeticOperators {
 * @return
 */
public Max max() {
- return fieldReference != null ? AccumulatorOperators.Max.maxOf(fieldReference)
+ return usesFieldRef() ? AccumulatorOperators.Max.maxOf(fieldReference)
: AccumulatorOperators.Max.maxOf(expression);
}
@@ -487,7 +487,7 @@ public class ArithmeticOperators {
 * @return
 */
public Min min() {
- return fieldReference != null ? AccumulatorOperators.Min.minOf(fieldReference)
+ return usesFieldRef() ? AccumulatorOperators.Min.minOf(fieldReference)
: AccumulatorOperators.Min.minOf(expression);
}
@@ -497,7 +497,7 @@ public class ArithmeticOperators {
 * @return
 */
public StdDevPop stdDevPop() {
- return fieldReference != null ? AccumulatorOperators.StdDevPop.stdDevPopOf(fieldReference)
+ return usesFieldRef() ? AccumulatorOperators.StdDevPop.stdDevPopOf(fieldReference)
: AccumulatorOperators.StdDevPop.stdDevPopOf(expression);
}
@@ -507,9 +507,13 @@ public class ArithmeticOperators {
 * @return
 */
public StdDevSamp stdDevSamp() {
- return fieldReference != null ? AccumulatorOperators.StdDevSamp.stdDevSampOf(fieldReference)
+ return usesFieldRef() ? AccumulatorOperators.StdDevSamp.stdDevSampOf(fieldReference)
: AccumulatorOperators.StdDevSamp.stdDevSampOf(expression);
}
+ private boolean usesFieldRef() {
+ return fieldReference != null;
+ }
}
/**
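The repeated null checks above are now funnelled through usesFieldRef() without changing how the factory is used. For reference, a typical call site looks like the following sketch (field and alias names are made up):

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ArithmeticOperators;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

class ArithmeticFactorySketch {

	ProjectionOperation project() {
		// both factory branches (field reference vs. expression) behave exactly as before
		return Aggregation.project("orderId")
				.and(ArithmeticOperators.valueOf("netPrice").multiplyBy(1.19)).as("grossPrice");
	}
}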


@@ -1,5 +1,5 @@
/*
- * Copyright 2016. the original author or authors.
+ * Copyright 2016-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -25,12 +25,14 @@ import org.springframework.data.domain.Range;
import org.springframework.data.mongodb.core.aggregation.ArrayOperators.Filter.AsBuilder;
import org.springframework.data.mongodb.core.aggregation.ArrayOperators.Reduce.PropertyExpression;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
 * Gateway to {@literal array} aggregation operations.
 *
 * @author Christoph Strobl
+ * @author Mark Paluch
 * @since 1.0
 */
public class ArrayOperators {
@@ -223,15 +225,10 @@ public class ArrayOperators {
 * @param expression must not be {@literal null}.
 * @return
 */
- public ArrayOperatorFactory.ReduceInitialValueBuilder reduce(final AggregationExpression expression) {
- return new ArrayOperatorFactory.ReduceInitialValueBuilder() {
- @Override
- public Reduce startingWith(Object initialValue) {
- return (usesFieldRef() ? Reduce.arrayOf(fieldReference) : Reduce.arrayOf(expression))
- .withInitialValue(initialValue).reduce(expression);
- }
- };
+ public ArrayOperatorFactory.ReduceInitialValueBuilder reduce(AggregationExpression expression) {
+ return initialValue -> (usesFieldRef() ? Reduce.arrayOf(fieldReference)
+ : Reduce.arrayOf(ArrayOperatorFactory.this.expression)).withInitialValue(initialValue).reduce(expression);
}
/**
@@ -241,16 +238,10 @@ public class ArrayOperators {
 * @param expressions
 * @return
 */
- public ArrayOperatorFactory.ReduceInitialValueBuilder reduce(final PropertyExpression... expressions) {
- return new ArrayOperatorFactory.ReduceInitialValueBuilder() {
- @Override
- public Reduce startingWith(Object initialValue) {
- return (usesFieldRef() ? Reduce.arrayOf(fieldReference) : Reduce.arrayOf(expression))
- .withInitialValue(initialValue).reduce(expressions);
- }
- };
+ public ArrayOperatorFactory.ReduceInitialValueBuilder reduce(PropertyExpression... expressions) {
+ return initialValue -> (usesFieldRef() ? Reduce.arrayOf(fieldReference) : Reduce.arrayOf(expression))
+ .withInitialValue(initialValue).reduce(expressions);
}
/**
@@ -414,9 +405,9 @@ public class ArrayOperators {
 */
public static class Filter implements AggregationExpression {
- private Object input;
- private ExposedField as;
- private Object condition;
+ private @Nullable Object input;
+ private @Nullable ExposedField as;
+ private @Nullable Object condition;
private Filter() {
// used by builder
@@ -1103,12 +1094,10 @@ public class ArrayOperators {
/**
 * Start creating new {@link Reduce}.
 *
- * @param expression must not be {@literal null}.
+ * @param arrayValueExpression must not be {@literal null}.
 * @return
 */
- public static InitialValueBuilder arrayOf(final AggregationExpression expression) {
- Assert.notNull(expression, "AggregationExpression must not be null");
+ public static InitialValueBuilder arrayOf(final AggregationExpression arrayValueExpression) {
return new InitialValueBuilder() {
@@ -1123,14 +1112,14 @@ public class ArrayOperators {
public Reduce reduce(AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression must not be null");
- return new Reduce(expression, initialValue, Collections.singletonList(expression));
+ return new Reduce(arrayValueExpression, initialValue, Collections.singletonList(expression));
}
@Override
public Reduce reduce(PropertyExpression... expressions) {
Assert.notNull(expressions, "PropertyExpressions must not be null");
- return new Reduce(expression, initialValue, Arrays.<AggregationExpression> asList(expressions));
+ return new Reduce(arrayValueExpression, initialValue, Arrays.asList(expressions));
}
};
}


@@ -1,5 +1,5 @@
/*
- * Copyright 2016. the original author or authors.
+ * Copyright 2016-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.


@@ -1,5 +1,5 @@
/*
- * Copyright 2016. the original author or authors.
+ * Copyright 2016-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -26,6 +26,7 @@ import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Co
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Switch.CaseOperator;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -33,6 +34,7 @@ import org.springframework.util.ClassUtils;
 * Gateway to {@literal conditional expressions} that evaluate their argument expressions as booleans to a value.
 *
 * @author Mark Paluch
+ * @author Christoph Strobl
 * @since 1.10
 */
public class ConditionalOperators {
@@ -121,9 +123,11 @@ public class ConditionalOperators {
public static class ConditionalOperatorFactory {
- private final String fieldReference;
- private final AggregationExpression expression;
- private final CriteriaDefinition criteriaDefinition;
+ private final @Nullable String fieldReference;
+ private final @Nullable AggregationExpression expression;
+ private final @Nullable CriteriaDefinition criteriaDefinition;
/**
 * Creates new {@link ConditionalOperatorFactory} for given {@literal fieldReference}.
@@ -358,7 +362,7 @@ public class ConditionalOperators {
 */
static final class IfNullOperatorBuilder implements IfNullBuilder, ThenBuilder {
- private Object condition;
+ private @Nullable Object condition;
private IfNullOperatorBuilder() {}
@@ -850,8 +854,8 @@ public class ConditionalOperators {
 */
static class ConditionalExpressionBuilder implements WhenBuilder, ThenBuilder, OtherwiseBuilder {
- private Object condition;
- private Object thenValue;
+ private @Nullable Object condition;
+ private @Nullable Object thenValue;
private ConditionalExpressionBuilder() {}


@@ -0,0 +1,670 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Collections;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
* Gateway to {@literal convert} aggregation operations.
*
* @author Christoph Strobl
* @since 2.0.10
*/
public class ConvertOperators {
/**
* Take the field referenced by given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static ConvertOperatorFactory valueOf(String fieldReference) {
return new ConvertOperatorFactory(fieldReference);
}
/**
* Take the value resulting from the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static ConvertOperatorFactory valueOf(AggregationExpression expression) {
return new ConvertOperatorFactory(expression);
}
/**
* @author Christoph Strobl
*/
public static class ConvertOperatorFactory {
private final @Nullable String fieldReference;
private final @Nullable AggregationExpression expression;
/**
* Creates new {@link ConvertOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public ConvertOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
}
/**
* Creates new {@link ConvertOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public ConvertOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
}
/**
* Creates new {@link Convert aggregation expression} that takes the associated value and converts it into the type
* specified by the given {@code stringTypeIdentifier}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @param stringTypeIdentifier must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert convertTo(String stringTypeIdentifier) {
return createConvert().to(stringTypeIdentifier);
}
/**
* Creates new {@link Convert aggregation expression} that takes the associated value and converts it into the type
* specified by the given {@code numericTypeIdentifier}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @param numericTypeIdentifier must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert convertTo(int numericTypeIdentifier) {
return createConvert().to(numericTypeIdentifier);
}
/**
* Creates new {@link Convert aggregation expression} that takes the associated value and converts it into the type
* specified by the value of the given {@link Field field reference}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert convertToTypeOf(String fieldReference) {
return createConvert().toTypeOf(fieldReference);
}
/**
* Creates new {@link Convert aggregation expression} that takes the associated value and converts it into the type
* specified by the given {@link AggregationExpression expression}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert convertToTypeOf(AggregationExpression expression) {
return createConvert().toTypeOf(expression);
}
/**
* Creates new {@link ToBool aggregation expression} for {@code $toBool} that converts a value to boolean. Shorthand
* for {@link #convertTo(String) #convertTo("bool")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return new instance of {@link ToBool}.
*/
public ToBool convertToBoolean() {
return ToBool.toBoolean(valueObject());
}
/**
* Creates new {@link ToDate aggregation expression} for {@code $toDate} that converts a value to a date. Shorthand
* for {@link #convertTo(String) #convertTo("date")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return new instance of {@link ToDate}.
*/
public ToDate convertToDate() {
return ToDate.toDate(valueObject());
}
/**
* Creates new {@link ToDecimal aggregation expression} for {@code $toDecimal} that converts a value to a decimal.
* Shorthand for {@link #convertTo(String) #convertTo("decimal")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return new instance of {@link ToDecimal}.
*/
public ToDecimal convertToDecimal() {
return ToDecimal.toDecimal(valueObject());
}
/**
* Creates new {@link ToDouble aggregation expression} for {@code $toDouble} that converts a value to a double.
* Shorthand for {@link #convertTo(String) #convertTo("double")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return new instance of {@link ToDouble}.
*/
public ToDouble convertToDouble() {
return ToDouble.toDouble(valueObject());
}
/**
* Creates new {@link ToInt aggregation expression} for {@code $toInt} that converts a value to an int. Shorthand
* for {@link #convertTo(String) #convertTo("int")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return new instance of {@link ToInt}.
*/
public ToInt convertToInt() {
return ToInt.toInt(valueObject());
}
/**
* Creates new {@link ToLong aggregation expression} for {@code $toLong} that converts a value to a long. Shorthand
* for {@link #convertTo(String) #convertTo("long")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return new instance of {@link ToLong}.
*/
public ToLong convertToLong() {
return ToLong.toLong(valueObject());
}
/**
* Creates new {@link ToObjectId aggregation expression} for {@code $toObjectId} that converts a value to an objectId. Shorthand
* for {@link #convertTo(String) #convertTo("objectId")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return new instance of {@link ToObjectId}.
*/
public ToObjectId convertToObjectId() {
return ToObjectId.toObjectId(valueObject());
}
/**
* Creates new {@link ToString aggregation expression} for {@code $toString} that converts a value to a string. Shorthand
* for {@link #convertTo(String) #convertTo("string")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return new instance of {@link ToString}.
*/
public ToString convertToString() {
return ToString.toString(valueObject());
}
private Convert createConvert() {
return usesFieldRef() ? Convert.convertValueOf(fieldReference) : Convert.convertValueOf(expression);
}
private Object valueObject() {
return usesFieldRef() ? Fields.field(fieldReference) : expression;
}
private boolean usesFieldRef() {
return fieldReference != null;
}
}
/**
* {@link AggregationExpression} for {@code $convert} that converts a value to a specified type. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/convert/">https://docs.mongodb.com/manual/reference/operator/aggregation/convert/</a>
* @since 2.0.10
*/
public static class Convert extends AbstractAggregationExpression {
private Convert(Object value) {
super(value);
}
/**
* Creates new {@link Convert} using the given value for the {@literal input} attribute.
*
* @param value must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public static Convert convertValue(Object value) {
return new Convert(Collections.singletonMap("input", value));
}
/**
* Creates new {@link Convert} using the value of the provided {@link Field fieldReference} as {@literal input}
* value.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public static Convert convertValueOf(String fieldReference) {
return convertValue(Fields.field(fieldReference));
}
/**
* Creates new {@link Convert} using the result of the provided {@link AggregationExpression expression} as
* {@literal input} value.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public static Convert convertValueOf(AggregationExpression expression) {
return convertValue(expression);
}
/**
* Specify the conversion target type via its {@link String} representation.
* <ul>
* <li>double</li>
* <li>string</li>
* <li>objectId</li>
* <li>bool</li>
* <li>date</li>
* <li>int</li>
* <li>long</li>
* <li>decimal</li>
* </ul>
*
* @param stringTypeIdentifier must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert to(String stringTypeIdentifier) {
return new Convert(append("to", stringTypeIdentifier));
}
/**
* Specify the conversion target type via its numeric representation.
* <dl>
* <dt>1</dt>
* <dd>double</dd>
* <dt>2</dt>
* <dd>string</dd>
* <dt>7</dt>
* <dd>objectId</dd>
* <dt>8</dt>
* <dd>bool</dd>
* <dt>9</dt>
* <dd>date</dd>
* <dt>16</dt>
* <dd>int</dd>
* <dt>18</dt>
* <dd>long</dd>
* <dt>19</dt>
* <dd>decimal</dd>
* </dl>
*
* @param numericTypeIdentifier the numeric representation of the target type.
* @return new instance of {@link Convert}.
*/
public Convert to(int numericTypeIdentifier) {
return new Convert(append("to", numericTypeIdentifier));
}
/**
* Specify the conversion target type via the value of the given field.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert toTypeOf(String fieldReference) {
return new Convert(append("to", Fields.field(fieldReference)));
}
/**
* Specify the conversion target type via the value of the given {@link AggregationExpression expression}.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert toTypeOf(AggregationExpression expression) {
return new Convert(append("to", expression));
}
/**
* Optionally specify the value to return on encountering an error during conversion.
*
* @param value must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert onErrorReturn(Object value) {
return new Convert(append("onError", value));
}
/**
* Optionally specify the field holding the value to return on encountering an error during conversion.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert onErrorReturnValueOf(String fieldReference) {
return onErrorReturn(Fields.field(fieldReference));
}
/**
* Optionally specify the expression to evaluate and return on encountering an error during conversion.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert onErrorReturnValueOf(AggregationExpression expression) {
return onErrorReturn(expression);
}
/**
* Optionally specify the value to return when the input is {@literal null} or missing.
*
* @param value must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert onNullReturn(Object value) {
return new Convert(append("onNull", value));
}
/**
* Optionally specify the field holding the value to return when the input is {@literal null} or missing.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert onNullReturnValueOf(String fieldReference) {
return onNullReturn(Fields.field(fieldReference));
}
/**
* Optionally specify the expression to evaluate and return when the input is {@literal null} or missing.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link Convert}.
*/
public Convert onNullReturnValueOf(AggregationExpression expression) {
return onNullReturn(expression);
}
@Override
protected String getMongoMethod() {
return "$convert";
}
}
/**
* {@link AggregationExpression} for {@code $toBool} that converts a value to {@literal boolean}. Shorthand for
* {@link Convert#to(String) Convert#to("bool")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/toBool/">https://docs.mongodb.com/manual/reference/operator/aggregation/toBool/</a>
* @since 2.0.10
*/
public static class ToBool extends AbstractAggregationExpression {
private ToBool(Object value) {
super(value);
}
/**
* Creates new {@link ToBool} using the given value as input.
*
* @param value must not be {@literal null}.
* @return new instance of {@link ToBool}.
*/
public static ToBool toBoolean(Object value) {
return new ToBool(value);
}
@Override
protected String getMongoMethod() {
return "$toBool";
}
}
/**
* {@link AggregationExpression} for {@code $toDate} that converts a value to {@literal date}. Shorthand for
* {@link Convert#to(String) Convert#to("date")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/toDate/">https://docs.mongodb.com/manual/reference/operator/aggregation/toDate/</a>
* @since 2.0.10
*/
public static class ToDate extends AbstractAggregationExpression {
private ToDate(Object value) {
super(value);
}
/**
* Creates new {@link ToDate} using the given value as input.
*
* @param value must not be {@literal null}.
* @return new instance of {@link ToDate}.
*/
public static ToDate toDate(Object value) {
return new ToDate(value);
}
@Override
protected String getMongoMethod() {
return "$toDate";
}
}
/**
* {@link AggregationExpression} for {@code $toDecimal} that converts a value to {@literal decimal}. Shorthand for
* {@link Convert#to(String) Convert#to("decimal")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/toDecimal/">https://docs.mongodb.com/manual/reference/operator/aggregation/toDecimal/</a>
* @since 2.0.10
*/
public static class ToDecimal extends AbstractAggregationExpression {
private ToDecimal(Object value) {
super(value);
}
/**
* Creates new {@link ToDecimal} using the given value as input.
*
* @param value must not be {@literal null}.
* @return new instance of {@link ToDecimal}.
*/
public static ToDecimal toDecimal(Object value) {
return new ToDecimal(value);
}
@Override
protected String getMongoMethod() {
return "$toDecimal";
}
}
/**
* {@link AggregationExpression} for {@code $toDouble} that converts a value to {@literal double}. Shorthand for
* {@link Convert#to(String) Convert#to("double")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/toDouble/">https://docs.mongodb.com/manual/reference/operator/aggregation/toDouble/</a>
* @since 2.0.10
*/
public static class ToDouble extends AbstractAggregationExpression {
private ToDouble(Object value) {
super(value);
}
/**
* Creates new {@link ToDouble} using the given value as input.
*
* @param value must not be {@literal null}.
* @return new instance of {@link ToDouble}.
*/
public static ToDouble toDouble(Object value) {
return new ToDouble(value);
}
@Override
protected String getMongoMethod() {
return "$toDouble";
}
}
/**
* {@link AggregationExpression} for {@code $toInt} that converts a value to {@literal integer}. Shorthand for
* {@link Convert#to(String) Convert#to("int")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/toInt/">https://docs.mongodb.com/manual/reference/operator/aggregation/toInt/</a>
* @since 2.0.10
*/
public static class ToInt extends AbstractAggregationExpression {
private ToInt(Object value) {
super(value);
}
/**
* Creates new {@link ToInt} using the given value as input.
*
* @param value must not be {@literal null}.
* @return new instance of {@link ToInt}.
*/
public static ToInt toInt(Object value) {
return new ToInt(value);
}
@Override
protected String getMongoMethod() {
return "$toInt";
}
}
/**
* {@link AggregationExpression} for {@code $toLong} that converts a value to {@literal long}. Shorthand for
* {@link Convert#to(String) Convert#to("long")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/toLong/">https://docs.mongodb.com/manual/reference/operator/aggregation/toLong/</a>
* @since 2.0.10
*/
public static class ToLong extends AbstractAggregationExpression {
private ToLong(Object value) {
super(value);
}
/**
* Creates new {@link ToLong} using the given value as input.
*
* @param value must not be {@literal null}.
* @return new instance of {@link ToLong}.
*/
public static ToLong toLong(Object value) {
return new ToLong(value);
}
@Override
protected String getMongoMethod() {
return "$toLong";
}
}
/**
* {@link AggregationExpression} for {@code $toObjectId} that converts a value to {@literal objectId}. Shorthand for
* {@link Convert#to(String) Convert#to("objectId")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/toObjectId/">https://docs.mongodb.com/manual/reference/operator/aggregation/toObjectId/</a>
* @since 2.0.10
*/
public static class ToObjectId extends AbstractAggregationExpression {
private ToObjectId(Object value) {
super(value);
}
/**
* Creates new {@link ToObjectId} using the given value as input.
*
* @param value must not be {@literal null}.
* @return new instance of {@link ToObjectId}.
*/
public static ToObjectId toObjectId(Object value) {
return new ToObjectId(value);
}
@Override
protected String getMongoMethod() {
return "$toObjectId";
}
}
/**
* {@link AggregationExpression} for {@code $toString} that converts a value to {@literal string}. Shorthand for
* {@link Convert#to(String) Convert#to("string")}. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/toString/">https://docs.mongodb.com/manual/reference/operator/aggregation/toString/</a>
* @since 2.0.10
*/
public static class ToString extends AbstractAggregationExpression {
private ToString(Object value) {
super(value);
}
/**
* Creates new {@link ToString} using the given value as input.
*
* @param value must not be {@literal null}.
* @return new instance of {@link ToString}.
*/
public static ToString toString(Object value) {
return new ToString(value);
}
@Override
protected String getMongoMethod() {
return "$toString";
}
}
}
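A short usage sketch for the new $convert support in a projection (collection, field, and alias names are made up; the onError fallback is optional):

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.ConvertOperators;

class ConvertUsageSketch {

	AggregationResults<Document> pricesAsDecimal(MongoOperations operations) {
		Aggregation aggregation = Aggregation.newAggregation(
				Aggregation.project("orderId")
						// $convert with an onError fallback; convertToDecimal() would be the $toDecimal shorthand
						.and(ConvertOperators.valueOf("price").convertTo("decimal").onErrorReturn(0)).as("price"));

		return operations.aggregate(aggregation, "orders", Document.class);
	}
}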


@@ -1,5 +1,5 @@
/*
- * Copyright 2013-2016 the original author or authors.
+ * Copyright 2013-2017 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -22,14 +22,14 @@ import java.util.Iterator;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
- import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.CompositeIterator;
import org.springframework.util.ObjectUtils;
/**
 * Value object to capture the fields exposed by an {@link AggregationOperation}.
 *
 * @author Oliver Gierke
 * @author Thomas Darimont
 * @author Mark Paluch
@@ -43,9 +43,19 @@ public final class ExposedFields implements Iterable<ExposedField> {
private final List<ExposedField> originalFields;
private final List<ExposedField> syntheticFields;
+ /**
+ * Returns an empty {@link ExposedFields} instance.
+ *
+ * @return
+ * @since 2.0
+ */
+ public static ExposedFields empty() {
+ return EMPTY;
+ }
/**
 * Creates a new {@link ExposedFields} instance from the given {@link ExposedField}s.
 *
 * @param fields must not be {@literal null}.
 * @return
 */
@@ -55,7 +65,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Creates a new {@link ExposedFields} instance from the given {@link ExposedField}s.
 *
 * @param fields must not be {@literal null}.
 * @return
 */
@@ -72,7 +82,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Creates synthetic {@link ExposedFields} from the given {@link Fields}.
 *
 * @param fields must not be {@literal null}.
 * @return
 */
@@ -82,7 +92,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Creates non-synthetic {@link ExposedFields} from the given {@link Fields}.
 *
 * @param fields must not be {@literal null}.
 * @return
 */
@@ -92,7 +102,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Creates a new {@link ExposedFields} instance for the given fields in either synthetic or non-synthetic way.
 *
 * @param fields must not be {@literal null}.
 * @param synthetic
 * @return
@@ -111,7 +121,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Creates a new {@link ExposedFields} with the given originals and synthetics.
 *
 * @param originals must not be {@literal null}.
 * @param synthetic must not be {@literal null}.
 */
@@ -123,7 +133,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Creates a new {@link ExposedFields} adding the given {@link ExposedField}.
 *
 * @param field must not be {@literal null}.
 * @return
 */
@@ -140,10 +150,11 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Returns the field with the given name or {@literal null} if no field with the given name is available.
 *
 * @param name
 * @return
 */
+ @Nullable
public ExposedField getField(String name) {
for (ExposedField field : this) {
@@ -157,7 +168,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Returns whether the {@link ExposedFields} exposes no non-synthetic fields at all.
 *
 * @return
 */
boolean exposesNoNonSyntheticFields() {
@@ -166,7 +177,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Returns whether the {@link ExposedFields} exposes a single non-synthetic field only.
 *
 * @return
 */
boolean exposesSingleNonSyntheticFieldOnly() {
@@ -175,7 +186,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Returns whether the {@link ExposedFields} exposes no fields at all.
 *
 * @return
 */
boolean exposesNoFields() {
@@ -184,7 +195,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Returns whether the {@link ExposedFields} exposes a single field only.
 *
 * @return
 */
boolean exposesSingleFieldOnly() {
@@ -198,7 +209,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
return originalFields.size() + syntheticFields.size();
}
/*
 * (non-Javadoc)
 * @see java.lang.Iterable#iterator()
 */
@@ -219,7 +230,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * A single exposed field.
 *
 * @author Oliver Gierke
 */
static class ExposedField implements Field {
@@ -229,7 +240,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/**
 * Creates a new {@link ExposedField} with the given key.
* *
* @param key must not be {@literal null} or empty. * @param key must not be {@literal null} or empty.
* @param synthetic whether the exposed field is synthetic. * @param synthetic whether the exposed field is synthetic.
*/ */
@@ -239,7 +250,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/** /**
* Creates a new {@link ExposedField} for the given {@link Field}. * Creates a new {@link ExposedField} for the given {@link Field}.
* *
* @param delegate must not be {@literal null}. * @param delegate must not be {@literal null}.
* @param synthetic whether the exposed field is synthetic. * @param synthetic whether the exposed field is synthetic.
*/ */
@@ -249,7 +260,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
this.synthetic = synthetic; this.synthetic = synthetic;
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Field#getKey() * @see org.springframework.data.mongodb.core.aggregation.Field#getKey()
*/ */
@@ -285,7 +296,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/** /**
* Returns whether the field can be referred to using the given name. * Returns whether the field can be referred to using the given name.
* *
* @param name * @param name
* @return * @return
*/ */
@@ -302,7 +313,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
return String.format("AggregationField: %s, synthetic: %s", field, synthetic); return String.format("AggregationField: %s, synthetic: %s", field, synthetic);
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object) * @see java.lang.Object#equals(java.lang.Object)
*/ */
@@ -364,7 +375,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/** /**
* A reference to an {@link ExposedField}. * A reference to an {@link ExposedField}.
* *
* @author Oliver Gierke * @author Oliver Gierke
*/ */
static class DirectFieldReference implements FieldReference { static class DirectFieldReference implements FieldReference {
@@ -373,7 +384,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
/** /**
* Creates a new {@link FieldReference} for the given {@link ExposedField}. * Creates a new {@link FieldReference} for the given {@link ExposedField}.
* *
* @param field must not be {@literal null}. * @param field must not be {@literal null}.
*/ */
public DirectFieldReference(ExposedField field) { public DirectFieldReference(ExposedField field) {
@@ -408,14 +419,14 @@ public final class ExposedFields implements Iterable<ExposedField> {
@Override @Override
public String toString() { public String toString() {
if(getRaw().startsWith("$")) { if (getRaw().startsWith("$")) {
return getRaw(); return getRaw();
} }
return String.format("$%s", getRaw()); return String.format("$%s", getRaw());
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object) * @see java.lang.Object#equals(java.lang.Object)
*/ */
@@ -435,7 +446,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
return this.field.equals(that.field); return this.field.equals(that.field);
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see java.lang.Object#hashCode() * @see java.lang.Object#hashCode()
*/ */

View File

@@ -1,5 +1,5 @@
/*
- * Copyright 2013-2016 the original author or authors.
+ * Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,9 +16,10 @@
package org.springframework.data.mongodb.core.aggregation;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
@@ -53,7 +54,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
this.rootContext = rootContext;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(org.bson.Document)
*/
@@ -62,7 +63,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
return rootContext.getMappedObject(document);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@@ -71,7 +72,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
return getReference(field, field.getTarget());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@@ -83,11 +84,11 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
/**
* Returns a {@link FieldReference} to the given {@link Field} with the given {@code name}.
*
- * @param field may be {@literal null}
+ * @param field may be {@literal null}.
- * @param name must not be {@literal null}
+ * @param name must not be {@literal null}.
* @return
*/
- private FieldReference getReference(Field field, String name) {
+ private FieldReference getReference(@Nullable Field field, String name) {
Assert.notNull(name, "Name must not be null!");
@@ -100,13 +101,15 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
}
/**
- * Resolves a {@link field}/{@link name} for a {@link FieldReference} if possible.
+ * Resolves a {@link Field}/{@code name} for a {@link FieldReference} if possible.
*
- * @param field may be {@literal null}
+ * @param field may be {@literal null}.
- * @param name must not be {@literal null}
+ * @param name must not be {@literal null}.
- * @return the resolved reference or {@literal null}
+ * @return the resolved reference or {@literal null}.
*/
- protected FieldReference resolveExposedField(Field field, String name) {
+ @Nullable
+ protected FieldReference resolveExposedField(@Nullable Field field, String name) {
ExposedField exposedField = exposedFields.getField(name);
if (exposedField != null) {

View File

@@ -23,13 +23,14 @@ import java.util.Iterator;
import java.util.List;
import java.util.Map;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
/**
* Value object to capture a list of {@link Field} instances.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.3
@@ -46,7 +47,7 @@ public final class Fields implements Iterable<Field> {
/**
* Creates a new {@link Fields} instance from the given {@link Fields}.
*
* @param fields must not be {@literal null} or empty.
* @return
*/
@@ -58,7 +59,7 @@ public final class Fields implements Iterable<Field> {
/**
* Creates a new {@link Fields} instance for {@link Field}s with the given names.
*
* @param names must not be {@literal null}.
* @return
*/
@@ -77,7 +78,7 @@ public final class Fields implements Iterable<Field> {
/**
* Creates a {@link Field} with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
*/
@@ -101,7 +102,7 @@ public final class Fields implements Iterable<Field> {
/**
* Creates a new {@link Fields} instance using the given {@link Field}s.
*
* @param fields must not be {@literal null}.
*/
private Fields(List<Field> fields) {
@@ -139,7 +140,7 @@ public final class Fields implements Iterable<Field> {
/**
* Creates a new {@link Fields} instance with a new {@link Field} of the given name added.
*
* @param name must not be {@literal null}.
* @return
*/
@@ -166,6 +167,7 @@ public final class Fields implements Iterable<Field> {
return result;
}
@Nullable
public Field getField(String name) {
for (Field field : fields) {
@@ -177,7 +179,7 @@ public final class Fields implements Iterable<Field> {
return null;
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
@@ -196,7 +198,7 @@ public final class Fields implements Iterable<Field> {
/**
* Value object to encapsulate a field in an aggregation operation.
*
* @author Oliver Gierke
*/
static class AggregationField implements Field {
@@ -207,7 +209,7 @@ public final class Fields implements Iterable<Field> {
/**
* Creates an aggregation field with the given {@code name}.
*
* @see AggregationField#AggregationField(String, String).
* @param name must not be {@literal null} or empty
*/
@@ -220,15 +222,15 @@ public final class Fields implements Iterable<Field> {
* <p>
* The {@code name} serves as an alias for the actual backing document field denoted by {@code target}. If no target
* is set explicitly, the name will be used as target.
*
* @param name must not be {@literal null} or empty
* @param target
*/
- public AggregationField(String name, String target) {
+ public AggregationField(String name, @Nullable String target) {
raw = name;
- String nameToSet = cleanUp(name);
+ String nameToSet = name != null ? cleanUp(name) : null;
- String targetToSet = cleanUp(target);
+ String targetToSet = target != null ? cleanUp(target) : null;
Assert.hasText(nameToSet, "AggregationField name must not be null or empty!");
@@ -241,11 +243,7 @@ public final class Fields implements Iterable<Field> {
}
}
- private static final String cleanUp(String source) {
+ private static String cleanUp(String source) {
if (source == null) {
return source;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(source)) {
return source;
@@ -301,7 +299,7 @@ public final class Fields implements Iterable<Field> {
return raw;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@@ -310,7 +308,7 @@ public final class Fields implements Iterable<Field> {
return String.format("AggregationField - name: %s, target: %s", name, target);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@@ -330,7 +328,7 @@ public final class Fields implements Iterable<Field> {
return this.name.equals(that.name) && ObjectUtils.nullSafeEquals(this.target, that.target);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/

View File

@@ -34,10 +34,25 @@ public interface FieldsExposingAggregationOperation extends AggregationOperation
ExposedFields getFields();
/**
- * Marker interface for {@link AggregationOperation} that inherits fields from previous operations.
+ * @return {@literal true} to conditionally inherit fields from previous operations.
* @since 2.0.6
*/
- static interface InheritsFieldsAggregationOperation extends FieldsExposingAggregationOperation {
+ default boolean inheritsFields() {
return false;
}
/**
* Marker interface for {@link AggregationOperation} that inherits fields from previous operations.
*/
interface InheritsFieldsAggregationOperation extends FieldsExposingAggregationOperation {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#inheritsFields()
*/
@Override
default boolean inheritsFields() {
return true;
}
}
}

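With the new default method, an operation only needs to implement the marker interface to opt into field inheritance. A minimal sketch of a hypothetical custom stage follows; the class name, the synthetic "score" field, the $addFields payload, and the assumption that ExposedFields.synthetic(…) is accessible are all illustrative and not part of the change above.

```java
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.ExposedFields;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;

// Hypothetical stage: exposes a synthetic "score" field and, by implementing the
// marker interface, reports inheritsFields() == true so fields from earlier
// pipeline stages remain visible to subsequent ones.
class AddScoreOperation implements InheritsFieldsAggregationOperation {

	@Override
	public ExposedFields getFields() {
		return ExposedFields.synthetic(Fields.fields("score"));
	}

	@Override
	public Document toDocument(AggregationOperationContext context) {
		return new Document("$addFields", new Document("score", 1));
	}
}
```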
View File

@@ -1,5 +1,5 @@
/*
- * Copyright 2013-2016 the original author or authors.
+ * Copyright 2013-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,40 +17,71 @@ package org.springframework.data.mongodb.core.aggregation;
import org.bson.Document;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
* Represents a {@code geoNear} aggregation operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#geoNear(NearQuery, String)} instead of creating
* instances of this class directly.
*
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.3
* @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/geoNear/">MongoDB Aggregation Framework:
* $geoNear</a>
*/
public class GeoNearOperation implements AggregationOperation {
private final NearQuery nearQuery;
private final String distanceField;
private final @Nullable String indexKey;
/**
* Creates a new {@link GeoNearOperation} from the given {@link NearQuery} and the given distance field. The
* {@code distanceField} defines output field that contains the calculated distance.
*
- * @param query must not be {@literal null}.
+ * @param nearQuery must not be {@literal null}.
* @param distanceField must not be {@literal null}.
*/
public GeoNearOperation(NearQuery nearQuery, String distanceField) {
this(nearQuery, distanceField, null);
}
/**
* Creates a new {@link GeoNearOperation} from the given {@link NearQuery} and the given distance field. The
* {@code distanceField} defines output field that contains the calculated distance.
*
* @param nearQuery must not be {@literal null}.
* @param distanceField must not be {@literal null}.
* @param indexKey can be {@literal null};
* @since 2.0.10
*/
private GeoNearOperation(NearQuery nearQuery, String distanceField, @Nullable String indexKey) {
Assert.notNull(nearQuery, "NearQuery must not be null.");
Assert.hasLength(distanceField, "Distance field must not be null or empty.");
this.nearQuery = nearQuery;
this.distanceField = distanceField;
this.indexKey = indexKey;
}
- /*
+ /**
* Optionally specify the geospatial index to use via the field to use in the calculation. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @param key the geospatial index field to use when calculating the distance.
* @return new instance of {@link GeoNearOperation}.
* @since 2.0.10
*/
public GeoNearOperation useIndex(String key) {
return new GeoNearOperation(nearQuery, distanceField, key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@@ -60,6 +91,10 @@ public class GeoNearOperation implements AggregationOperation {
Document command = context.getMappedObject(nearQuery.toDocument());
command.put("distanceField", distanceField);
if (StringUtils.hasText(indexKey)) {
command.put("key", indexKey);
}
return new Document("$geoNear", command);
}
}

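A hedged usage sketch of the new index option: coordinates, field names, and the "location" index key below are made up, and per the Javadoc the emitted key entry is only honoured by MongoDB 4.0 or later.

```java
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.GeoNearOperation;
import org.springframework.data.mongodb.core.query.NearQuery;

class GeoNearIndexExample {

	// "location" must match a geospatial index that actually exists on the collection.
	static Aggregation geoNearWithIndex() {

		NearQuery near = NearQuery.near(new Point(-73.99, 40.73)).maxDistance(5);
		GeoNearOperation geoNear = Aggregation.geoNear(near, "distance").useIndex("location");

		// The rendered stage now carries the extra entry:
		// { $geoNear : { ..., distanceField : "distance", key : "location" } }
		return Aggregation.newAggregation(geoNear);
	}
}
```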
View File

@@ -25,6 +25,7 @@ import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -51,12 +52,12 @@ public class GraphLookupOperation implements InheritsFieldsAggregationOperation
private final Field connectFrom;
private final Field connectTo;
private final Field as;
- private final Long maxDepth;
+ private final @Nullable Long maxDepth;
- private final Field depthField;
+ private final @Nullable Field depthField;
- private final CriteriaDefinition restrictSearchWithMatch;
+ private final @Nullable CriteriaDefinition restrictSearchWithMatch;
private GraphLookupOperation(String from, List<Object> startWith, Field connectFrom, Field connectTo, Field as,
- Long maxDepth, Field depthField, CriteriaDefinition restrictSearchWithMatch) {
+ @Nullable Long maxDepth, @Nullable Field depthField, @Nullable CriteriaDefinition restrictSearchWithMatch) {
this.from = from;
this.startWith = startWith;
@@ -214,9 +215,9 @@ public class GraphLookupOperation implements InheritsFieldsAggregationOperation
static final class GraphLookupOperationFromBuilder
implements FromBuilder, StartWithBuilder, ConnectFromBuilder, ConnectToBuilder {
- private String from;
+ private @Nullable String from;
- private List<? extends Object> startWith;
+ private @Nullable List<? extends Object> startWith;
- private String connectFrom;
+ private @Nullable String connectFrom;
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.FromBuilder#from(java.lang.String)
@@ -336,9 +337,9 @@ public class GraphLookupOperation implements InheritsFieldsAggregationOperation
private final List<Object> startWith;
private final Field connectFrom;
private final Field connectTo;
- private Long maxDepth;
+ private @Nullable Long maxDepth;
- private Field depthField;
+ private @Nullable Field depthField;
- private CriteriaDefinition restrictSearchWithMatch;
+ private @Nullable CriteriaDefinition restrictSearchWithMatch;
protected GraphLookupOperationBuilder(String from, List<? extends Object> startWith, String connectFrom,
String connectTo) {

View File

@@ -19,25 +19,26 @@ import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Locale;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
* Encapsulates the aggregation framework {@code $group}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#group(Fields)} instead of creating instances of this
* class directly.
*
* @author Sebastian Herold
* @author Thomas Darimont
* @author Oliver Gierke
* @author Gustavo de Geus
* @author Christoph Strobl
* @author Mark Paluch
* @author Sergey Shcherbakov
* @since 1.3
* @see <a href="https://docs.mongodb.org/manual/reference/aggregation/group/">MongoDB Aggregation Framework: $group</a>
*/
@@ -52,7 +53,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link GroupOperation} including the given {@link Fields}.
*
* @param fields must not be {@literal null}.
*/
public GroupOperation(Fields fields) {
@@ -63,7 +64,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link GroupOperation} from the given {@link GroupOperation}.
*
* @param groupOperation must not be {@literal null}.
*/
protected GroupOperation(GroupOperation groupOperation) {
@@ -72,7 +73,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link GroupOperation} from the given {@link GroupOperation} and the given {@link Operation}s.
*
* @param groupOperation
* @param nextOperations
*/
@@ -89,7 +90,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link GroupOperation} from the current one adding the given {@link Operation}.
*
* @param operation must not be {@literal null}.
* @return
*/
@@ -99,7 +100,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Builder for {@link GroupOperation}s on a field.
*
* @author Thomas Darimont
*/
public static final class GroupOperationBuilder {
@@ -109,7 +110,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link GroupOperationBuilder} from the given {@link GroupOperation} and {@link Operation}.
*
* @param groupOperation
* @param operation
*/
@@ -124,7 +125,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Allows to specify an alias for the new-operation operation.
*
* @param alias
* @return
*/
@@ -138,7 +139,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
* <p>
* Count expressions are emulated via {@code $sum: 1}.
* <p>
*
* @return
*/
public GroupOperationBuilder count() {
@@ -147,7 +148,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for a {@code $sum}-expression for the given field-reference.
*
* @param reference
* @return
*/
@@ -155,13 +156,28 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return sum(reference, null);
}
- private GroupOperationBuilder sum(String reference, Object value) {
+ /**
* Generates an {@link GroupOperationBuilder} for an {@code $sum}-expression for the given
* {@link AggregationExpression}.
*
* @param expr must not be {@literal null}.
* @return new instance of {@link GroupOperationBuilder}. Never {@literal null}.
* @throws IllegalArgumentException when {@code expr} is {@literal null}.
* @since 1.10.8
*/
public GroupOperationBuilder sum(AggregationExpression expr) {
Assert.notNull(expr, "Expr must not be null!");
return newBuilder(GroupOps.SUM, null, expr);
}
private GroupOperationBuilder sum(@Nullable String reference, @Nullable Object value) {
return newBuilder(GroupOps.SUM, reference, value);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $add_to_set}-expression for the given field-reference.
*
* @param reference
* @return
*/
@@ -171,7 +187,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for an {@code $add_to_set}-expression for the given value.
*
* @param value
* @return
*/
@@ -179,13 +195,13 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return addToSet(null, value);
}
- private GroupOperationBuilder addToSet(String reference, Object value) {
+ private GroupOperationBuilder addToSet(@Nullable String reference, @Nullable Object value) {
return newBuilder(GroupOps.ADD_TO_SET, reference, value);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $last}-expression for the given field-reference.
*
* @param reference
* @return
*/
@@ -196,7 +212,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for an {@code $last}-expression for the given
* {@link AggregationExpression}.
*
* @param expr
* @return
*/
@@ -206,7 +222,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for a {@code $first}-expression for the given field-reference.
*
* @param reference
* @return
*/
@@ -217,7 +233,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for a {@code $first}-expression for the given
* {@link AggregationExpression}.
*
* @param expr
* @return
*/
@@ -227,7 +243,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for an {@code $avg}-expression for the given field-reference.
*
* @param reference
* @return
*/
@@ -238,7 +254,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for an {@code $avg}-expression for the given
* {@link AggregationExpression}.
*
* @param expr
* @return
*/
@@ -248,7 +264,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for an {@code $push}-expression for the given field-reference.
*
* @param reference
* @return
*/
@@ -258,7 +274,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for an {@code $push}-expression for the given value.
*
* @param value
* @return
*/
@@ -266,13 +282,13 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return push(null, value);
}
- private GroupOperationBuilder push(String reference, Object value) {
+ private GroupOperationBuilder push(@Nullable String reference, @Nullable Object value) {
return newBuilder(GroupOps.PUSH, reference, value);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $min}-expression that for the given field-reference.
*
* @param reference
* @return
*/
@@ -283,7 +299,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for an {@code $min}-expression that for the given
* {@link AggregationExpression}.
*
* @param expr
* @return
*/
@@ -293,7 +309,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for an {@code $max}-expression that for the given field-reference.
*
* @param reference
* @return
*/
@@ -304,7 +320,7 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@link GroupOperationBuilder} for an {@code $max}-expression that for the given
* {@link AggregationExpression}.
*
* @param expr
* @return
*/
@@ -325,7 +341,8 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
}
/**
- * Generates an {@link GroupOperationBuilder} for an {@code $stdDevSamp}-expression that for the given {@link AggregationExpression}.
+ * Generates an {@link GroupOperationBuilder} for an {@code $stdDevSamp}-expression that for the given
* {@link AggregationExpression}.
*
* @param expr must not be {@literal null}.
* @return never {@literal null}.
@@ -347,7 +364,8 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
}
/**
- * Generates an {@link GroupOperationBuilder} for an {@code $stdDevPop}-expression that for the given {@link AggregationExpression}.
+ * Generates an {@link GroupOperationBuilder} for an {@code $stdDevPop}-expression that for the given
* {@link AggregationExpression}.
*
* @param expr must not be {@literal null}.
* @return never {@literal null}.
@@ -357,11 +375,11 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.STD_DEV_POP, null, expr);
}
- private GroupOperationBuilder newBuilder(Keyword keyword, String reference, Object value) {
+ private GroupOperationBuilder newBuilder(Keyword keyword, @Nullable String reference, @Nullable Object value) {
return new GroupOperationBuilder(this, new Operation(keyword, null, reference, value));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getFields()
*/
@@ -421,7 +439,8 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
private static enum GroupOps implements Keyword {
- SUM("$sum"), LAST("$last"), FIRST("$first"), PUSH("$push"), AVG("$avg"), MIN("$min"), MAX("$max"), ADD_TO_SET("$addToSet"), STD_DEV_POP("$stdDevPop"), STD_DEV_SAMP("$stdDevSamp");
+ SUM("$sum"), LAST("$last"), FIRST("$first"), PUSH("$push"), AVG("$avg"), MIN("$min"), MAX("$max"), ADD_TO_SET(
"$addToSet"), STD_DEV_POP("$stdDevPop"), STD_DEV_SAMP("$stdDevSamp");
private String mongoOperator;
@@ -429,7 +448,6 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
this.mongoOperator = mongoOperator;
}
@Override
public String toString() {
return mongoOperator;
@@ -439,11 +457,11 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
static class Operation implements AggregationOperation {
private final Keyword op;
- private final String key;
+ private final @Nullable String key;
- private final String reference;
+ private final @Nullable String reference;
- private final Object value;
+ private final @Nullable Object value;
- public Operation(Keyword op, String key, String reference, Object value) {
+ public Operation(Keyword op, @Nullable String key, @Nullable String reference, @Nullable Object value) {
this.op = op;
this.key = key;

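A short sketch of the new sum(AggregationExpression) overload: the "customerId", "price", and "quantity" fields describe an invented collection shape, and only the sum(…) call itself is taken from the diff above.

```java
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ArithmeticOperators;
import org.springframework.data.mongodb.core.aggregation.GroupOperation;

class GroupSumExpressionExample {

	// Groups by customerId and sums price * quantity per group, i.e.
	// { $group : { _id : "$customerId", total : { $sum : { $multiply : ["$price", "$quantity"] } } } }
	static GroupOperation totalPerCustomer() {
		return Aggregation.group("customerId")
				.sum(ArithmeticOperators.valueOf("price").multiplyBy("quantity")).as("total");
	}
}
```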
View File

@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.aggregation;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
@@ -28,14 +29,15 @@ import org.springframework.util.Assert;
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.9
- * @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/lookup/">MongoDB Aggregation Framework: $lookup</a>
+ * @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/lookup/">MongoDB Aggregation Framework:
* $lookup</a>
*/
public class LookupOperation implements FieldsExposingAggregationOperation, InheritsFieldsAggregationOperation {
- private Field from;
+ private final Field from;
- private Field localField;
+ private final Field localField;
- private Field foreignField;
+ private final Field foreignField;
- private ExposedField as;
+ private final ExposedField as;
/**
* Creates a new {@link LookupOperation} for the given {@link Field}s.
@@ -58,10 +60,6 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
this.as = new ExposedField(as, true);
}
private LookupOperation() {
// used by builder
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
@@ -143,11 +141,10 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
public static final class LookupOperationBuilder
implements FromBuilder, LocalFieldBuilder, ForeignFieldBuilder, AsBuilder {
- private final LookupOperation lookupOperation;
+ private @Nullable Field from;
private @Nullable Field localField;
- private LookupOperationBuilder() {
+ private @Nullable Field foreignField;
- this.lookupOperation = new LookupOperation();
+ private @Nullable ExposedField as;
}
/**
* Creates new builder for {@link LookupOperation}.
@@ -162,7 +159,7 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
public LocalFieldBuilder from(String name) {
Assert.hasText(name, "'From' must not be null or empty!");
- lookupOperation.from = Fields.field(name);
+ from = Fields.field(name);
return this;
}
@@ -170,16 +167,16 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
public LookupOperation as(String name) {
Assert.hasText(name, "'As' must not be null or empty!");
- lookupOperation.as = new ExposedField(Fields.field(name), true);
+ as = new ExposedField(Fields.field(name), true);
- return new LookupOperation(lookupOperation.from, lookupOperation.localField, lookupOperation.foreignField,
+ return new LookupOperation(from, localField, foreignField,
- lookupOperation.as);
+ as);
}
@Override
public AsBuilder foreignField(String name) {
Assert.hasText(name, "'ForeignField' must not be null or empty!");
- lookupOperation.foreignField = Fields.field(name);
+ foreignField = Fields.field(name);
return this;
}
@@ -187,7 +184,7 @@ public class LookupOperation implements FieldsExposingAggregationOperation, Inhe
public ForeignFieldBuilder localField(String name) {
Assert.hasText(name, "'LocalField' must not be null or empty!");
- lookupOperation.localField = Fields.field(name);
+ localField = Fields.field(name);
return this;
}
}

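The builder now collects the four values itself and only materialises the immutable LookupOperation in as(…). A usage sketch with made-up collection and field names:

```java
import org.springframework.data.mongodb.core.aggregation.LookupOperation;

class LookupBuilderExample {

	// Produces { $lookup : { from : "orders", localField : "_id",
	//                        foreignField : "customerId", as : "customerOrders" } }
	static LookupOperation ordersLookup() {
		return LookupOperation.newLookup()
				.from("orders")
				.localField("_id")
				.foreignField("customerId")
				.as("customerOrders");
	}
}
```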
View File

@@ -22,13 +22,13 @@ import java.util.Collections;
import java.util.List;
import org.bson.Document;
- import org.springframework.data.mongodb.core.aggregation.VariableOperators.Let.ExpressionVariable;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.FieldProjection;
import org.springframework.data.mongodb.core.aggregation.VariableOperators.Let.ExpressionVariable;
+ import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
@@ -39,14 +39,15 @@ import org.springframework.util.Assert;
* <p>
* We recommend to use the static factory method {@link Aggregation#project(Fields)} instead of creating instances of
* this class directly.
*
* @author Tobias Trelle
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.3
- * @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/project/">MongoDB Aggregation Framework: $project</a>
+ * @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/project/">MongoDB Aggregation Framework:
* $project</a>
*/
public class ProjectionOperation implements FieldsExposingAggregationOperation {
@@ -65,7 +66,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
*
* @param fields must not be {@literal null}.
*/
public ProjectionOperation(Fields fields) {
@@ -75,7 +76,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Copy constructor to allow building up {@link ProjectionOperation} instances from already existing
* {@link Projection}s.
*
* @param current must not be {@literal null}.
* @param projections must not be {@literal null}.
*/
@@ -91,18 +92,18 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link ProjectionOperation} with the current {@link Projection}s and the given one.
*
* @param projection must not be {@literal null}.
* @return
*/
private ProjectionOperation and(Projection projection) {
- return new ProjectionOperation(this.projections, Arrays.asList(projection));
+ return new ProjectionOperation(this.projections, Collections.singletonList(projection));
}
/**
* Creates a new {@link ProjectionOperation} with the current {@link Projection}s replacing the last current one with
* the given one.
*
* @param projection must not be {@literal null}.
* @return
*/
@@ -110,12 +111,12 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
List<Projection> projections = this.projections.isEmpty() ? Collections.<Projection> emptyList()
: this.projections.subList(0, this.projections.size() - 1);
- return new ProjectionOperation(projections, Arrays.asList(projection));
+ return new ProjectionOperation(projections, Collections.singletonList(projection));
}
/**
* Creates a new {@link ProjectionOperationBuilder} to define a projection for the field with the given name.
*
* @param name must not be {@literal null} or empty.
* @return
*/
@@ -133,24 +134,19 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Excludes the given fields from the projection.
*
* @param fieldNames must not be {@literal null}.
* @return
*/
public ProjectionOperation andExclude(String... fieldNames) {
- for (String fieldName : fieldNames) {
- Assert.isTrue(Fields.UNDERSCORE_ID.equals(fieldName),
- String.format(EXCLUSION_ERROR, fieldName, Fields.UNDERSCORE_ID));
- }
List<FieldProjection> excludeProjections = FieldProjection.from(Fields.fields(fieldNames), false);
return new ProjectionOperation(this.projections, excludeProjections);
}
/**
* Includes the given fields into the projection.
*
* @param fieldNames must not be {@literal null}.
* @return
*/
@@ -162,7 +158,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Includes the given fields into the projection.
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -184,7 +180,19 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
fields = fields == null ? ExposedFields.from(field) : fields.and(field);
}
- return fields;
+ return fields != null ? fields : ExposedFields.empty();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#inheritsFields()
*/
@Override
public boolean inheritsFields() {
return projections.stream().filter(FieldProjection.class::isInstance) //
.map(FieldProjection.class::cast) //
.anyMatch(FieldProjection::isExcluded);
}
/*
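With the assertion removed, andExclude(…) accepts fields other than _id, and a projection containing such an exclusion now reports inheritsFields() == true, so the remaining fields stay available to later stages. A sketch with an invented field name:

```java
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

class ExcludeProjectionExample {

	// Renders as { $project : { password : 0 } }; previously only "_id" could be excluded.
	static ProjectionOperation withoutPassword() {
		return Aggregation.project().andExclude("password");
	}
}
```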
@@ -205,7 +213,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Base class for {@link ProjectionOperationBuilder}s. * Base class for {@link ProjectionOperationBuilder}s.
* *
* @author Thomas Darimont * @author Thomas Darimont
*/ */
private static abstract class AbstractProjectionOperationBuilder implements AggregationOperation { private static abstract class AbstractProjectionOperationBuilder implements AggregationOperation {
@@ -215,7 +223,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Creates a new {@link AbstractProjectionOperationBuilder} fot the given value and {@link ProjectionOperation}. * Creates a new {@link AbstractProjectionOperationBuilder} fot the given value and {@link ProjectionOperation}.
* *
* @param value must not be {@literal null}. * @param value must not be {@literal null}.
* @param operation must not be {@literal null}. * @param operation must not be {@literal null}.
*/ */
@@ -228,7 +236,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
this.operation = operation; this.operation = operation;
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@@ -239,7 +247,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Returns the finally to be applied {@link ProjectionOperation} with the given alias.
*
* @param alias will never be {@literal null} or empty.
* @return
*/
@@ -266,7 +274,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* An {@link ProjectionOperationBuilder} that is used for SpEL expression based projections.
*
* @author Thomas Darimont
*/
public static class ExpressionProjectionOperationBuilder extends ProjectionOperationBuilder {
@@ -277,7 +285,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link ExpressionProjectionOperationBuilder} for the given value, {@link ProjectionOperation} and
* parameters.
*
* @param expression must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
@@ -325,7 +333,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* A {@link Projection} based on a SpEL expression.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
@@ -338,7 +346,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link ExpressionProjection} for the given field, SpEL expression and parameters.
*
* @param field must not be {@literal null}.
* @param expression must not be {@literal null} or empty.
* @param parameters must not be {@literal null}.
@@ -354,7 +362,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
this.params = parameters.clone();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@@ -372,7 +380,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Builder for {@link ProjectionOperation}s on a field.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
@@ -382,19 +390,19 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
private static final String NUMBER_NOT_NULL = "Number must not be null!";
private static final String FIELD_REFERENCE_NOT_NULL = "Field reference must not be null!";
- private final String name;
- private final OperationProjection previousProjection;
+ private final @Nullable String name;
+ private final @Nullable OperationProjection previousProjection;
/**
* Creates a new {@link ProjectionOperationBuilder} for the field with the given name on top of the given
* {@link ProjectionOperation}.
*
* @param name must not be {@literal null} or empty.
* @param operation must not be {@literal null}.
* @param previousProjection the previous operation projection, may be {@literal null}.
*/
public ProjectionOperationBuilder(String name, ProjectionOperation operation,
- OperationProjection previousProjection) {
+ @Nullable OperationProjection previousProjection) {
super(name, operation);
this.name = name;
@@ -404,13 +412,13 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link ProjectionOperationBuilder} for the field with the given value on top of the given
* {@link ProjectionOperation}.
*
* @param value
* @param operation
* @param previousProjection
*/
protected ProjectionOperationBuilder(Object value, ProjectionOperation operation,
- OperationProjection previousProjection) {
+ @Nullable OperationProjection previousProjection) {
super(value, operation);
@@ -421,28 +429,28 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Projects the result of the previous operation onto the current field. Will automatically add an exclusion for
* {@code _id} as what would be held in it by default will now go into the field just projected into.
*
* @return
*/
public ProjectionOperation previousOperation() {
return this.operation.andExclude(Fields.UNDERSCORE_ID) //
- .and(new PreviousOperationProjection(name));
+ .and(new PreviousOperationProjection(getRequiredName()));
}
/**
* Defines a nested field binding for the current field.
*
* @param fields must not be {@literal null}.
* @return
*/
public ProjectionOperation nested(Fields fields) {
- return this.operation.and(new NestedFieldProjection(name, fields));
+ return this.operation.and(new NestedFieldProjection(getRequiredName(), fields));
}
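A hedged sketch of previousOperation() and nested(…) in use; the tag-count pipeline follows the shape of the common reference-documentation example, and all field names ("tags", "n", "tag", "destination", "x", "y", "a", "b") are illustrative.

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

class BuilderSketch {

	Aggregation tagCount() {
		// previousOperation() maps the implicit _id produced by the preceding group stage onto "tag".
		return newAggregation(
				unwind("tags"),
				group("tags").count().as("n"),
				project("n").and("tag").previousOperation(),
				sort(Sort.Direction.DESC, "n"));
	}

	ProjectionOperation nestedBinding() {
		// Renders roughly as { destination: { x: "$a", y: "$b" } }.
		return project().and("destination")
				.nested(Fields.from(Fields.field("x", "a"), Fields.field("y", "b")));
	}
}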
/**
* Allows to specify an alias for the previous projection operation.
*
* @param alias
* @return
*/
@@ -454,10 +462,10 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
}
if (value instanceof AggregationExpression) {
- return this.operation.and(new ExpressionProjection(Fields.field(alias), (AggregationExpression) value));
+ return this.operation.and(new ExpressionProjection(Fields.field(alias, alias), (AggregationExpression) value));
}
- return this.operation.and(new FieldProjection(Fields.field(alias, name), null));
+ return this.operation.and(new FieldProjection(Fields.field(alias, getRequiredName()), null));
}
/*
@@ -468,7 +476,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
public ProjectionOperation applyCondition(Cond cond) {
Assert.notNull(cond, "ConditionalOperator must not be null!");
- return this.operation.and(new ExpressionProjection(Fields.field(name), cond));
+ return this.operation.and(new ExpressionProjection(Fields.field(getRequiredName()), cond));
}
/*
@@ -479,12 +487,12 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
public ProjectionOperation applyCondition(IfNull ifNull) {
Assert.notNull(ifNull, "IfNullOperator must not be null!");
- return this.operation.and(new ExpressionProjection(Fields.field(name), ifNull));
+ return this.operation.and(new ExpressionProjection(Fields.field(getRequiredName()), ifNull));
}
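For applyCondition(…), a small hedged example (field names are illustrative) that falls back to a default when a field is null or missing:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

import org.springframework.data.mongodb.core.aggregation.ConditionalOperators;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

class ConditionSketch {

	ProjectionOperation discountOrZero() {
		// $ifNull: projects 0 when "discount" is absent, rendered under the "discount" key.
		return project("item").and("discount")
				.applyCondition(ConditionalOperators.ifNull("discount").then(0));
	}
}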
/** /**
* Generates an {@code $add} expression that adds the given number to the previously mentioned field. * Generates an {@code $add} expression that adds the given number to the previously mentioned field.
* *
* @param number * @param number
* @return * @return
*/ */
@@ -496,7 +504,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Generates an {@code $add} expression that adds the value of the given field to the previously mentioned field. * Generates an {@code $add} expression that adds the value of the given field to the previously mentioned field.
* *
* @param fieldReference * @param fieldReference
* @return * @return
*/ */
@@ -508,7 +516,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Generates an {@code $subtract} expression that subtracts the given number to the previously mentioned field. * Generates an {@code $subtract} expression that subtracts the given number to the previously mentioned field.
* *
* @param number * @param number
* @return * @return
*/ */
@@ -521,7 +529,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Generates an {@code $subtract} expression that subtracts the value of the given field to the previously mentioned * Generates an {@code $subtract} expression that subtracts the value of the given field to the previously mentioned
* field. * field.
* *
* @param fieldReference * @param fieldReference
* @return * @return
*/ */
@@ -547,7 +555,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Generates an {@code $multiply} expression that multiplies the given number with the previously mentioned field. * Generates an {@code $multiply} expression that multiplies the given number with the previously mentioned field.
* *
* @param number * @param number
* @return * @return
*/ */
@@ -560,7 +568,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Generates an {@code $multiply} expression that multiplies the value of the given field with the previously * Generates an {@code $multiply} expression that multiplies the value of the given field with the previously
* mentioned field. * mentioned field.
* *
* @param fieldReference * @param fieldReference
* @return * @return
*/ */
@@ -586,7 +594,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Generates an {@code $divide} expression that divides the previously mentioned field by the given number. * Generates an {@code $divide} expression that divides the previously mentioned field by the given number.
* *
* @param number * @param number
* @return * @return
*/ */
@@ -600,7 +608,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Generates an {@code $divide} expression that divides the value of the given field by the previously mentioned * Generates an {@code $divide} expression that divides the value of the given field by the previously mentioned
* field. * field.
* *
* @param fieldReference * @param fieldReference
* @return * @return
*/ */
@@ -627,7 +635,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Generates an {@code $mod} expression that divides the previously mentioned field by the given number and returns * Generates an {@code $mod} expression that divides the previously mentioned field by the given number and returns
* the remainder. * the remainder.
* *
* @param number * @param number
* @return * @return
*/ */
@@ -789,7 +797,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10
*/
public ProjectionOperationBuilder filter(String as, AggregationExpression condition) {
- return this.operation.and(ArrayOperators.Filter.filter(name).as(as).by(condition));
+ return this.operation.and(ArrayOperators.Filter.filter(getRequiredName()).as(as).by(condition));
}
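A rough sketch of the filter(…) shortcut above, assuming illustrative field names ("items", "item", "value") and that the per-element variable can be referenced by name inside the condition:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

import org.springframework.data.mongodb.core.aggregation.ComparisonOperators.Gte;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

class FilterSketch {

	ProjectionOperation bigItems() {
		// Keeps array elements of "items" whose "value" is at least 100; "item" is the
		// $filter variable used inside the condition.
		return project()
				.and("items").filter("item", Gte.valueOf("item.value").greaterThanEqualToValue(100))
				.as("bigItems");
	}
}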
// SET OPERATORS // SET OPERATORS
@@ -894,7 +902,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder absoluteValue() { public ProjectionOperationBuilder absoluteValue() {
return this.operation.and(ArithmeticOperators.Abs.absoluteValueOf(name)); return this.operation.and(ArithmeticOperators.Abs.absoluteValueOf(getRequiredName()));
} }
/** /**
@@ -905,7 +913,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder ceil() { public ProjectionOperationBuilder ceil() {
return this.operation.and(ArithmeticOperators.Ceil.ceilValueOf(name)); return this.operation.and(ArithmeticOperators.Ceil.ceilValueOf(getRequiredName()));
} }
/** /**
@@ -916,7 +924,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder exp() { public ProjectionOperationBuilder exp() {
return this.operation.and(ArithmeticOperators.Exp.expValueOf(name)); return this.operation.and(ArithmeticOperators.Exp.expValueOf(getRequiredName()));
} }
/** /**
@@ -927,7 +935,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder floor() { public ProjectionOperationBuilder floor() {
return this.operation.and(ArithmeticOperators.Floor.floorValueOf(name)); return this.operation.and(ArithmeticOperators.Floor.floorValueOf(getRequiredName()));
} }
/** /**
@@ -938,7 +946,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder ln() { public ProjectionOperationBuilder ln() {
return this.operation.and(ArithmeticOperators.Ln.lnValueOf(name)); return this.operation.and(ArithmeticOperators.Ln.lnValueOf(getRequiredName()));
} }
/** /**
@@ -950,7 +958,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder log(String baseFieldRef) { public ProjectionOperationBuilder log(String baseFieldRef) {
return this.operation.and(ArithmeticOperators.Log.valueOf(name).log(baseFieldRef)); return this.operation.and(ArithmeticOperators.Log.valueOf(getRequiredName()).log(baseFieldRef));
} }
/** /**
@@ -962,7 +970,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder log(Number base) { public ProjectionOperationBuilder log(Number base) {
return this.operation.and(ArithmeticOperators.Log.valueOf(name).log(base)); return this.operation.and(ArithmeticOperators.Log.valueOf(getRequiredName()).log(base));
} }
/** /**
@@ -974,7 +982,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder log(AggregationExpression base) { public ProjectionOperationBuilder log(AggregationExpression base) {
return this.operation.and(ArithmeticOperators.Log.valueOf(name).log(base)); return this.operation.and(ArithmeticOperators.Log.valueOf(getRequiredName()).log(base));
} }
/** /**
@@ -985,7 +993,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder log10() { public ProjectionOperationBuilder log10() {
return this.operation.and(ArithmeticOperators.Log10.log10ValueOf(name)); return this.operation.and(ArithmeticOperators.Log10.log10ValueOf(getRequiredName()));
} }
/** /**
@@ -997,7 +1005,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder pow(String exponentFieldRef) { public ProjectionOperationBuilder pow(String exponentFieldRef) {
return this.operation.and(ArithmeticOperators.Pow.valueOf(name).pow(exponentFieldRef)); return this.operation.and(ArithmeticOperators.Pow.valueOf(getRequiredName()).pow(exponentFieldRef));
} }
/** /**
@@ -1009,7 +1017,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder pow(Number exponent) { public ProjectionOperationBuilder pow(Number exponent) {
return this.operation.and(ArithmeticOperators.Pow.valueOf(name).pow(exponent)); return this.operation.and(ArithmeticOperators.Pow.valueOf(getRequiredName()).pow(exponent));
} }
/** /**
@@ -1021,7 +1029,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder pow(AggregationExpression exponentExpression) { public ProjectionOperationBuilder pow(AggregationExpression exponentExpression) {
return this.operation.and(ArithmeticOperators.Pow.valueOf(name).pow(exponentExpression)); return this.operation.and(ArithmeticOperators.Pow.valueOf(getRequiredName()).pow(exponentExpression));
} }
/** /**
@@ -1032,7 +1040,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder sqrt() { public ProjectionOperationBuilder sqrt() {
return this.operation.and(ArithmeticOperators.Sqrt.sqrtOf(name)); return this.operation.and(ArithmeticOperators.Sqrt.sqrtOf(getRequiredName()));
} }
/** /**
@@ -1042,7 +1050,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder trunc() { public ProjectionOperationBuilder trunc() {
return this.operation.and(ArithmeticOperators.Trunc.truncValueOf(name)); return this.operation.and(ArithmeticOperators.Trunc.truncValueOf(getRequiredName()));
} }
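The arithmetic builders shown in this region all follow the same pattern; a minimal hedged example with illustrative field names:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

class ArithmeticSketch {

	ProjectionOperation grossPrice() {
		// Roughly { $multiply: [ "$netPrice", 1.19 ] }, exposed as "grossPrice".
		return project("item").and("netPrice").multiply(1.19).as("grossPrice");
	}
}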
/** /**
@@ -1089,7 +1097,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder toLower() { public ProjectionOperationBuilder toLower() {
return this.operation.and(StringOperators.ToLower.lowerValueOf(name)); return this.operation.and(StringOperators.ToLower.lowerValueOf(getRequiredName()));
} }
/** /**
@@ -1100,7 +1108,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder toUpper() { public ProjectionOperationBuilder toUpper() {
return this.operation.and(StringOperators.ToUpper.upperValueOf(name)); return this.operation.and(StringOperators.ToUpper.upperValueOf(getRequiredName()));
} }
/** /**
@@ -1171,7 +1179,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder isArray() { public ProjectionOperationBuilder isArray() {
return this.operation.and(ArrayOperators.IsArray.isArray(name)); return this.operation.and(ArrayOperators.IsArray.isArray(getRequiredName()));
} }
/** /**
@@ -1181,7 +1189,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10 * @since 1.10
*/ */
public ProjectionOperationBuilder asLiteral() { public ProjectionOperationBuilder asLiteral() {
return this.operation.and(LiteralOperators.Literal.asLiteral(name)); return this.operation.and(LiteralOperators.Literal.asLiteral(getRequiredName()));
} }
/** /**
@@ -1193,7 +1201,19 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @since 1.10
*/
public ProjectionOperationBuilder dateAsFormattedString(String format) {
- return this.operation.and(DateOperators.DateToString.dateOf(name).toString(format));
+ return this.operation.and(DateOperators.DateToString.dateOf(getRequiredName()).toString(format));
}
/**
* Generates a {@code $dateToString} expression that takes the date representation of the previously mentioned field
* using the server default format. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return
* @since 2.0.10
*/
public ProjectionOperationBuilder dateAsFormattedString() {
return this.operation.and(DateOperators.DateToString.dateOf(getRequiredName()).defaultFormat());
}
/**
@@ -1225,6 +1245,13 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return this.operation.and(VariableOperators.Let.define(variables).andApply(in));
}
private String getRequiredName() {
Assert.state(name != null, "Projection field name must not be null!");
return name;
}
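A short hedged example of dateAsFormattedString(…) with an illustrative field name; the new no-arg variant relies on the server default format and therefore needs MongoDB 4.0+:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

class DateToStringSketch {

	ProjectionOperation formattedDay() {
		// { $dateToString: { format: "%Y-%m-%d", date: "$createdAt" } }, exposed as "day".
		return project().and("createdAt").dateAsFormattedString("%Y-%m-%d").as("day");
	}
}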
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext) * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
@@ -1236,7 +1263,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Adds a generic projection for the current field. * Adds a generic projection for the current field.
* *
* @param operation the operation key, e.g. {@code $add}. * @param operation the operation key, e.g. {@code $add}.
* @param values the values to be set for the projection operation. * @param values the values to be set for the projection operation.
* @return * @return
@@ -1249,7 +1276,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* A {@link Projection} to pull in the result of the previous operation. * A {@link Projection} to pull in the result of the previous operation.
* *
* @author Oliver Gierke * @author Oliver Gierke
*/ */
static class PreviousOperationProjection extends Projection { static class PreviousOperationProjection extends Projection {
@@ -1258,7 +1285,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Creates a new {@link PreviousOperationProjection} for the field with the given name. * Creates a new {@link PreviousOperationProjection} for the field with the given name.
* *
* @param name must not be {@literal null} or empty. * @param name must not be {@literal null} or empty.
*/ */
public PreviousOperationProjection(String name) { public PreviousOperationProjection(String name) {
@@ -1266,7 +1293,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
this.name = name; this.name = name;
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext) * @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/ */
@@ -1278,7 +1305,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* A {@link FieldProjection} to map a result of a previous {@link AggregationOperation} to a new field. * A {@link FieldProjection} to map a result of a previous {@link AggregationOperation} to a new field.
* *
* @author Oliver Gierke * @author Oliver Gierke
* @author Thomas Darimont * @author Thomas Darimont
* @author Mark Paluch * @author Mark Paluch
@@ -1286,11 +1313,11 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
static class FieldProjection extends Projection {
private final Field field;
- private final Object value;
+ private final @Nullable Object value;
/**
* Creates a new {@link FieldProjection} for the field of the given name, assigning the given value.
*
* @param name must not be {@literal null} or empty.
* @param value
*/
@@ -1298,7 +1325,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
this(Fields.field(name), value);
}
- private FieldProjection(Field field, Object value) {
+ private FieldProjection(Field field, @Nullable Object value) {
super(new ExposedField(field.getName(), true));
@@ -1309,7 +1336,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Factory method to easily create {@link FieldProjection}s for the given {@link Fields}. Fields are projected as * Factory method to easily create {@link FieldProjection}s for the given {@link Fields}. Fields are projected as
* references with their given name. A field {@code foo} will be projected as: {@code foo : 1 } . * references with their given name. A field {@code foo} will be projected as: {@code foo : 1 } .
* *
* @param fields the {@link Fields} to in- or exclude, must not be {@literal null}. * @param fields the {@link Fields} to in- or exclude, must not be {@literal null}.
* @return * @return
*/ */
@@ -1319,12 +1346,12 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Factory method to easily create {@link FieldProjection}s for the given {@link Fields}.
*
* @param fields the {@link Fields} to in- or exclude, must not be {@literal null}.
* @param value to use for the given field.
* @return
*/
- public static List<FieldProjection> from(Fields fields, Object value) {
+ public static List<FieldProjection> from(Fields fields, @Nullable Object value) {
Assert.notNull(fields, "Fields must not be null!");
List<FieldProjection> projections = new ArrayList<FieldProjection>();
@@ -1336,7 +1363,14 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return projections;
}
/**
* @return {@literal true} if this field is excluded.
*/
public boolean isExcluded() {
return Boolean.FALSE.equals(value);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@@ -1375,12 +1409,12 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new {@link OperationProjection} for the given field.
*
* @param field the name of the field to add the operation projection for, must not be {@literal null} or empty.
* @param operation the actual operation key, must not be {@literal null} or empty.
* @param values the values to pass into the operation, must not be {@literal null}.
*/
- public OperationProjection(Field field, String operation, Object[] values) {
+ OperationProjection(Field field, String operation, Object[] values) {
super(field);
@@ -1407,7 +1441,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values.size());
- result.add(context.getReference(getField().getName()).toString());
+ result.add(context.getReference(getField()).toString());
for (Object element : values) {
@@ -1429,7 +1463,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Returns the field that holds the {@link OperationProjection}.
*
* @return
*/
protected Field getField() {
@@ -1452,11 +1486,11 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Creates a new instance of this {@link OperationProjection} with the given alias.
*
* @param alias the alias to set
* @return
*/
- public OperationProjection withAlias(String alias) {
+ OperationProjection withAlias(String alias) {
final Field aliasedField = Fields.field(alias, this.field.getName());
return new OperationProjection(aliasedField, operation, values.toArray()) {
@@ -1486,14 +1520,14 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
private final String name;
private final Fields fields;
- public NestedFieldProjection(String name, Fields fields) {
+ NestedFieldProjection(String name, Fields fields) {
super(Fields.field(name));
this.name = name;
this.fields = fields;
}
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext) * @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/ */
@@ -1512,7 +1546,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Extracts the minute from a date expression. * Extracts the minute from a date expression.
* *
* @return * @return
*/ */
public ProjectionOperationBuilder extractMinute() { public ProjectionOperationBuilder extractMinute() {
@@ -1521,7 +1555,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Extracts the hour from a date expression. * Extracts the hour from a date expression.
* *
* @return * @return
*/ */
public ProjectionOperationBuilder extractHour() { public ProjectionOperationBuilder extractHour() {
@@ -1530,7 +1564,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Extracts the second from a date expression. * Extracts the second from a date expression.
* *
* @return * @return
*/ */
public ProjectionOperationBuilder extractSecond() { public ProjectionOperationBuilder extractSecond() {
@@ -1539,7 +1573,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Extracts the millisecond from a date expression. * Extracts the millisecond from a date expression.
* *
* @return * @return
*/ */
public ProjectionOperationBuilder extractMillisecond() { public ProjectionOperationBuilder extractMillisecond() {
@@ -1548,7 +1582,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Extracts the year from a date expression. * Extracts the year from a date expression.
* *
* @return * @return
*/ */
public ProjectionOperationBuilder extractYear() { public ProjectionOperationBuilder extractYear() {
@@ -1557,7 +1591,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Extracts the month from a date expression. * Extracts the month from a date expression.
* *
* @return * @return
*/ */
public ProjectionOperationBuilder extractMonth() { public ProjectionOperationBuilder extractMonth() {
@@ -1566,7 +1600,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Extracts the week from a date expression. * Extracts the week from a date expression.
* *
* @return * @return
*/ */
public ProjectionOperationBuilder extractWeek() { public ProjectionOperationBuilder extractWeek() {
@@ -1575,7 +1609,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Extracts the dayOfYear from a date expression. * Extracts the dayOfYear from a date expression.
* *
* @return * @return
*/ */
public ProjectionOperationBuilder extractDayOfYear() { public ProjectionOperationBuilder extractDayOfYear() {
@@ -1584,7 +1618,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Extracts the dayOfMonth from a date expression. * Extracts the dayOfMonth from a date expression.
* *
* @return * @return
*/ */
public ProjectionOperationBuilder extractDayOfMonth() { public ProjectionOperationBuilder extractDayOfMonth() {
@@ -1593,7 +1627,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Extracts the dayOfWeek from a date expression. * Extracts the dayOfWeek from a date expression.
* *
* @return * @return
*/ */
public ProjectionOperationBuilder extractDayOfWeek() { public ProjectionOperationBuilder extractDayOfWeek() {
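The extract* builders above all wrap the corresponding date aggregation operators; a small hedged example with an illustrative field name:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

class DateExtractionSketch {

	ProjectionOperation yearAndMonth() {
		// $year and $month of the "createdAt" field, exposed as "year" and "month".
		return project()
				.and("createdAt").extractYear().as("year")
				.and("createdAt").extractMonth().as("month");
	}
}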
@@ -1603,7 +1637,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Base class for {@link Projection} implementations. * Base class for {@link Projection} implementations.
* *
* @author Oliver Gierke * @author Oliver Gierke
*/ */
private static abstract class Projection { private static abstract class Projection {
@@ -1612,7 +1646,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Creates new {@link Projection} for the given {@link Field}. * Creates new {@link Projection} for the given {@link Field}.
* *
* @param field must not be {@literal null}. * @param field must not be {@literal null}.
*/ */
public Projection(Field field) { public Projection(Field field) {
@@ -1623,7 +1657,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Returns the field exposed by the {@link Projection}. * Returns the field exposed by the {@link Projection}.
* *
* @return will never be {@literal null}. * @return will never be {@literal null}.
*/ */
public ExposedField getExposedField() { public ExposedField getExposedField() {
@@ -1633,7 +1667,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Renders the current {@link Projection} into a {@link Document} based on the given * Renders the current {@link Projection} into a {@link Document} based on the given
* {@link AggregationOperationContext}. * {@link AggregationOperationContext}.
* *
* @param context will never be {@literal null}. * @param context will never be {@literal null}.
* @return * @return
*/ */
@@ -1650,7 +1684,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/** /**
* Creates a new {@link ExpressionProjection}. * Creates a new {@link ExpressionProjection}.
* *
* @param field * @param field
* @param expression * @param expression
*/ */

@@ -43,6 +43,7 @@ import org.springframework.expression.spel.ast.PropertyOrFieldReference;
import org.springframework.expression.spel.standard.SpelExpression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import org.springframework.util.ObjectUtils;
@@ -64,7 +65,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
/**
* Creates a new {@link SpelExpressionTransformer}.
*/
- public SpelExpressionTransformer() {
+ SpelExpressionTransformer() {
List<ExpressionNodeConversion<? extends ExpressionNode>> conversions = new ArrayList<ExpressionNodeConversion<? extends ExpressionNode>>();
conversions.add(new OperatorNodeConversion(this));
@@ -190,12 +191,12 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
* the previous context.
*
* @param node must not be {@literal null}.
- * @param parent
- * @param operation
+ * @param parent may be {@literal null}.
+ * @param operation may be {@literal null}.
* @param context must not be {@literal null}.
* @return
*/
- protected Object transform(ExpressionNode node, ExpressionNode parent, Document operation,
+ protected Object transform(ExpressionNode node, @Nullable ExpressionNode parent, @Nullable Document operation,
AggregationExpressionTransformationContext<?> context) {
Assert.notNull(node, "ExpressionNode must not be null!");
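From the caller's side, this transformer is what backs SpEL-based projections; a hedged sketch with illustrative field names:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.project;

import org.springframework.data.mongodb.core.aggregation.ProjectionOperation;

class SpelProjectionSketch {

	ProjectionOperation grossViaSpel() {
		// The SpEL expression is translated into roughly { $multiply: [ "$netPrice", 1.19 ] }.
		return project("item").andExpression("netPrice * 1.19").as("grossPrice");
	}
}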
@@ -290,7 +291,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
} }
private Object convertUnaryMinusOp(ExpressionTransformationContextSupport<OperatorNode> context,
- Object leftResult) {
+ @Nullable Object leftResult) {
Object result = leftResult instanceof Number ? leftResult
: new Document("$multiply", Arrays.<Object> asList(Integer.valueOf(-1), leftResult));
@@ -320,7 +321,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
*/ */
private static class IndexerNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
- public IndexerNodeConversion(AggregationExpressionTransformer transformer) {
+ IndexerNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
@@ -350,7 +351,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
*/ */
private static class InlineListNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
- public InlineListNodeConversion(AggregationExpressionTransformer transformer) {
+ InlineListNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
@@ -358,6 +359,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
*/
@Nullable
@Override
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
@@ -389,7 +391,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
*/ */
private static class PropertyOrFieldReferenceNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
- public PropertyOrFieldReferenceNodeConversion(AggregationExpressionTransformer transformer) {
+ PropertyOrFieldReferenceNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
@@ -422,7 +424,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
*/ */
private static class LiteralNodeConversion extends ExpressionNodeConversion<LiteralNode> {
- public LiteralNodeConversion(AggregationExpressionTransformer transformer) {
+ LiteralNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
@@ -469,7 +471,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
*/ */
private static class MethodReferenceNodeConversion extends ExpressionNodeConversion<MethodReferenceNode> {
- public MethodReferenceNodeConversion(AggregationExpressionTransformer transformer) {
+ MethodReferenceNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
@@ -483,6 +485,8 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
MethodReferenceNode node = context.getCurrentNode();
AggregationMethodReference methodReference = node.getMethodReference();
Assert.state(methodReference != null, "Cannot resolve current node to AggregationMethodReference!");
Object args = null;
if (ObjectUtils.nullSafeEquals(methodReference.getArgumentType(), ArgumentType.SINGLE)) {
@@ -519,7 +523,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
*/ */
private static class CompoundExpressionNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
- public CompoundExpressionNodeConversion(AggregationExpressionTransformer transformer) {
+ CompoundExpressionNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
@@ -561,7 +565,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
*
* @param transformer must not be {@literal null}.
*/
- public NotOperatorNodeConversion(AggregationExpressionTransformer transformer) {
+ NotOperatorNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
@@ -603,7 +607,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
*
* @param transformer must not be {@literal null}.
*/
- public ValueRetrievingNodeConversion(AggregationExpressionTransformer transformer) {
+ ValueRetrievingNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}

@@ -1,5 +1,5 @@
/*
- * Copyright 2016. the original author or authors.
+ * Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,6 +26,7 @@ import org.springframework.util.Assert;
* Gateway to {@literal String} aggregation operations.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.10
*/
public class StringOperators {
@@ -122,7 +123,7 @@ public class StringOperators {
}
private Concat createConcat() {
- return fieldReference != null ? Concat.valueOf(fieldReference) : Concat.valueOf(expression);
+ return usesFieldRef() ? Concat.valueOf(fieldReference) : Concat.valueOf(expression);
}
/** /**
@@ -149,7 +150,7 @@ public class StringOperators {
}
private Substr createSubstr() {
- return fieldReference != null ? Substr.valueOf(fieldReference) : Substr.valueOf(expression);
+ return usesFieldRef() ? Substr.valueOf(fieldReference) : Substr.valueOf(expression);
}
/**
@@ -158,7 +159,7 @@ public class StringOperators {
* @return
*/
public ToLower toLower() {
- return fieldReference != null ? ToLower.lowerValueOf(fieldReference) : ToLower.lowerValueOf(expression);
+ return usesFieldRef() ? ToLower.lowerValueOf(fieldReference) : ToLower.lowerValueOf(expression);
}
/**
@@ -167,7 +168,7 @@ public class StringOperators {
* @return
*/
public ToUpper toUpper() {
- return fieldReference != null ? ToUpper.upperValueOf(fieldReference) : ToUpper.upperValueOf(expression);
+ return usesFieldRef() ? ToUpper.upperValueOf(fieldReference) : ToUpper.upperValueOf(expression);
}
/** /**
@@ -210,7 +211,7 @@ public class StringOperators {
}
private StrCaseCmp createStrCaseCmp() {
- return fieldReference != null ? StrCaseCmp.valueOf(fieldReference) : StrCaseCmp.valueOf(expression);
+ return usesFieldRef() ? StrCaseCmp.valueOf(fieldReference) : StrCaseCmp.valueOf(expression);
}
/**
@@ -256,7 +257,7 @@ public class StringOperators {
}
private IndexOfBytes.SubstringBuilder createIndexOfBytesSubstringBuilder() {
- return fieldReference != null ? IndexOfBytes.valueOf(fieldReference) : IndexOfBytes.valueOf(expression);
+ return usesFieldRef() ? IndexOfBytes.valueOf(fieldReference) : IndexOfBytes.valueOf(expression);
}
/**
@@ -302,7 +303,7 @@ public class StringOperators {
}
private IndexOfCP.SubstringBuilder createIndexOfCPSubstringBuilder() {
- return fieldReference != null ? IndexOfCP.valueOf(fieldReference) : IndexOfCP.valueOf(expression);
+ return usesFieldRef() ? IndexOfCP.valueOf(fieldReference) : IndexOfCP.valueOf(expression);
}
/**
@@ -339,7 +340,7 @@ public class StringOperators {
}
private Split createSplit() {
- return fieldReference != null ? Split.valueOf(fieldReference) : Split.valueOf(expression);
+ return usesFieldRef() ? Split.valueOf(fieldReference) : Split.valueOf(expression);
}
/** /**
@@ -349,8 +350,7 @@ public class StringOperators {
* @return
*/
public StrLenBytes length() {
- return fieldReference != null ? StrLenBytes.stringLengthOf(fieldReference)
- : StrLenBytes.stringLengthOf(expression);
+ return usesFieldRef() ? StrLenBytes.stringLengthOf(fieldReference) : StrLenBytes.stringLengthOf(expression);
}
/**
@@ -360,7 +360,7 @@ public class StringOperators {
* @return
*/
public StrLenCP lengthCP() {
- return fieldReference != null ? StrLenCP.stringLengthOfCP(fieldReference) : StrLenCP.stringLengthOfCP(expression);
+ return usesFieldRef() ? StrLenCP.stringLengthOfCP(fieldReference) : StrLenCP.stringLengthOfCP(expression);
}
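A minimal hedged example of the length operators, assuming an illustrative field name "name":

import org.springframework.data.mongodb.core.aggregation.StringOperators;
import org.springframework.data.mongodb.core.aggregation.StringOperators.StrLenCP;

class LengthSketch {

	StrLenCP nameLength() {
		// { $strLenCP: "$name" } – counts code points rather than bytes ($strLenBytes).
		return StringOperators.valueOf("name").lengthCP();
	}
}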
/** /**
@@ -387,7 +387,137 @@ public class StringOperators {
}
private SubstrCP createSubstrCP() {
- return fieldReference != null ? SubstrCP.valueOf(fieldReference) : SubstrCP.valueOf(expression);
+ return usesFieldRef() ? SubstrCP.valueOf(fieldReference) : SubstrCP.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated string representation and trims whitespaces
* from the beginning and end. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return new instance of {@link Trim}.
* @since 2.0.10
*/
public Trim trim() {
return createTrim();
}
/**
* Creates new {@link AggregationExpression} that takes the associated string representation and trims the given
* character sequence from the beginning and end. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @param chars must not be {@literal null}.
* @return new instance of {@link Trim}.
* @since 2.0.10
*/
public Trim trim(String chars) {
return trim().chars(chars);
}
/**
* Creates new {@link AggregationExpression} that takes the associated string representation and trims the character
* sequence resulting from the given {@link AggregationExpression} from the beginning and end. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link Trim}.
* @since 2.0.10
*/
public Trim trim(AggregationExpression expression) {
return trim().charsOf(expression);
}
private Trim createTrim() {
return usesFieldRef() ? Trim.valueOf(fieldReference) : Trim.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated string representation and trims whitespaces
* from the beginning. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return new instance of {@link LTrim}.
* @since 2.0.10
*/
public LTrim ltrim() {
return createLTrim();
}
/**
* Creates new {@link AggregationExpression} that takes the associated string representation and trims the given
* character sequence from the beginning. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @param chars must not be {@literal null}.
* @return new instance of {@link LTrim}.
* @since 2.0.10
*/
public LTrim ltrim(String chars) {
return ltrim().chars(chars);
}
/**
* Creates new {@link AggregationExpression} that takes the associated string representation and trims the character
* sequence resulting from the given {@link AggregationExpression} from the beginning. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link LTrim}.
* @since 2.0.10
*/
public LTrim ltrim(AggregationExpression expression) {
return ltrim().charsOf(expression);
}
private LTrim createLTrim() {
return usesFieldRef() ? LTrim.valueOf(fieldReference) : LTrim.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated string representation and trims whitespaces
* from the end. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @return new instance of {@link RTrim}.
* @since 2.0.10
*/
public RTrim rtrim() {
return createRTrim();
}
/**
* Creates new {@link AggregationExpression} that takes the associated string representation and trims the given
* character sequence from the end. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @param chars must not be {@literal null}.
* @return new instance of {@link RTrim}.
* @since 2.0.10
*/
public RTrim rtrim(String chars) {
return rtrim().chars(chars);
}
/**
* Creates new {@link AggregationExpression} that takes the associated string representation and trims the character
* sequence resulting from the given {@link AggregationExpression} from the end. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link RTrim}.
* @since 2.0.10
*/
public RTrim rtrim(AggregationExpression expression) {
return rtrim().charsOf(expression);
}
private RTrim createRTrim() {
return usesFieldRef() ? RTrim.valueOf(fieldReference) : RTrim.valueOf(expression);
}
private boolean usesFieldRef() {
return fieldReference != null;
} }
} }
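A hedged sketch of the new trim factory methods, assuming an illustrative field name "name"; both require MongoDB 4.0 or later:

import org.springframework.data.mongodb.core.aggregation.StringOperators;
import org.springframework.data.mongodb.core.aggregation.StringOperators.LTrim;
import org.springframework.data.mongodb.core.aggregation.StringOperators.Trim;

class TrimFactorySketch {

	Trim trimDashes() {
		// { $trim: { input: "$name", chars: "-" } }
		return StringOperators.valueOf("name").trim("-");
	}

	LTrim ltrimWhitespace() {
		// { $ltrim: { input: "$name" } }
		return StringOperators.valueOf("name").ltrim();
	}
}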
@@ -1067,4 +1197,257 @@ public class StringOperators {
return new SubstrCP(append(Arrays.asList(start, nrOfChars)));
}
}
/**
* {@link AggregationExpression} for {@code $trim} which removes whitespace or the specified characters from the
* beginning and end of a string. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @since 2.0.10
*/
public static class Trim extends AbstractAggregationExpression {
private Trim(Object value) {
super(value);
}
/**
* Creates new {@link Trim} using the value of the provided {@link Field fieldReference} as {@literal input} value.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link Trim}.
*/
public static Trim valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Trim(Collections.singletonMap("input", Fields.field(fieldReference)));
}
/**
* Creates new {@link Trim} using the result of the provided {@link AggregationExpression} as {@literal input}
* value.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link Trim}.
*/
public static Trim valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Trim(Collections.singletonMap("input", expression));
}
/**
* Optionally specify the character(s) to trim from the beginning and end.
*
* @param chars must not be {@literal null}.
* @return new instance of {@link Trim}.
*/
public Trim chars(String chars) {
Assert.notNull(chars, "Chars must not be null!");
return new Trim(append("chars", chars));
}
/**
* Optionally specify the reference to the {@link Field field} holding the character values to trim from the
* beginning and end.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link Trim}.
*/
public Trim charsOf(String fieldReference) {
return new Trim(append("chars", Fields.field(fieldReference)));
}
/**
* Optionally specify the {@link AggregationExpression} evaluating to the character sequence to trim from the
* beginning and end.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link Trim}.
*/
public Trim charsOf(AggregationExpression expression) {
return new Trim(append("chars", expression));
}
/**
* Remove whitespace or the specified characters from the beginning of a string.<br />
*
* @return new instance of {@link LTrim}.
*/
public LTrim left() {
return new LTrim(argumentMap());
}
/**
* Remove whitespace or the specified characters from the end of a string.<br />
*
* @return new instance of {@link RTrim}.
*/
public RTrim right() {
return new RTrim(argumentMap());
}
@Override
protected String getMongoMethod() {
return "$trim";
}
}
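A small hedged example of rendering a {@code Trim} expression directly (field name and chars are illustrative); left() narrows the generic $trim arguments down to $ltrim:

import org.bson.Document;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.StringOperators.Trim;

class TrimRenderSketch {

	Document rendered() {
		// Yields roughly { $ltrim: { input: "$name", chars: "-" } }.
		return Trim.valueOf("name").chars("-").left().toDocument(Aggregation.DEFAULT_CONTEXT);
	}
}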
/**
* {@link AggregationExpression} for {@code $ltrim} which removes whitespace or the specified characters from the
* beginning of a string. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @since 2.0.10
*/
public static class LTrim extends AbstractAggregationExpression {
private LTrim(Object value) {
super(value);
}
/**
* Creates new {@link LTrim} using the value of the provided {@link Field fieldReference} as {@literal input} value.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link LTrim}.
*/
public static LTrim valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new LTrim(Collections.singletonMap("input", Fields.field(fieldReference)));
}
/**
* Creates new {@link LTrim} using the result of the provided {@link AggregationExpression} as {@literal input}
* value.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link LTrim}.
*/
public static LTrim valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new LTrim(Collections.singletonMap("input", expression));
}
/**
* Optionally specify the character(s) to trim from the beginning.
*
* @param chars must not be {@literal null}.
* @return new instance of {@link LTrim}.
*/
public LTrim chars(String chars) {
Assert.notNull(chars, "Chars must not be null!");
return new LTrim(append("chars", chars));
}
/**
* Optionally specify the reference to the {@link Field field} holding the character values to trim from the
* beginning.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link LTrim}.
*/
public LTrim charsOf(String fieldReference) {
return new LTrim(append("chars", Fields.field(fieldReference)));
}
/**
* Optionally specify the {@link AggregationExpression} evaluating to the character sequence to trim from the
* beginning.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link LTrim}.
*/
public LTrim charsOf(AggregationExpression expression) {
return new LTrim(append("chars", expression));
}
@Override
protected String getMongoMethod() {
return "$ltrim";
}
}
/**
* {@link AggregationExpression} for {@code $rtrim} which removes whitespace or the specified characters from the end
* of a string. <br />
* <strong>NOTE:</strong> Requires MongoDB 4.0 or later.
*
* @author Christoph Strobl
* @since 2.0.10
*/
public static class RTrim extends AbstractAggregationExpression {
private RTrim(Object value) {
super(value);
}
/**
* Creates new {@link RTrim} using the value of the provided {@link Field fieldReference} as {@literal input} value.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link RTrim}.
*/
public static RTrim valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new RTrim(Collections.singletonMap("input", Fields.field(fieldReference)));
}
/**
* Creates new {@link RTrim} using the result of the provided {@link AggregationExpression} as {@literal input}
* value.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link RTrim}.
*/
public static RTrim valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new RTrim(Collections.singletonMap("input", expression));
}
/**
* Optionally specify the character(s) to trim from the end.
*
* @param chars must not be {@literal null}.
* @return new instance of {@link RTrim}.
*/
public RTrim chars(String chars) {
Assert.notNull(chars, "Chars must not be null!");
return new RTrim(append("chars", chars));
}
/**
* Optionally specify the reference to the {@link Field field} holding the character values to trim from the end.
*
* @param fieldReference must not be {@literal null}.
* @return new instance of {@link RTrim}.
*/
public RTrim charsOf(String fieldReference) {
return new RTrim(append("chars", Fields.field(fieldReference)));
}
/**
* Optionally specify the {@link AggregationExpression} evaluating to the character sequence to trim from the end.
*
* @param expression must not be {@literal null}.
* @return new instance of {@link RTrim}.
*/
public RTrim charsOf(AggregationExpression expression) {
return new RTrim(append("chars", expression));
}
@Override
protected String getMongoMethod() {
return "$rtrim";
}
}
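A companion sketch for the one-sided variants; left() and right() switch a configured Trim to $ltrim or $rtrim while keeping its arguments (values again purely illustrative):

import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.StringOperators.Trim;

class OneSidedTrimSketch {

	// Expected to render { $ltrim : { input : "$name", chars : " " } }; right() would yield $rtrim.
	Document renderLeftTrim() {
		return Trim.valueOf("name").chars(" ").left().toDocument(Aggregation.DEFAULT_CONTEXT);
	}
}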
}

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.aggregation;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
+import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
@@ -36,7 +37,7 @@ public class UnwindOperation
implements AggregationOperation, FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation {
private final ExposedField field;
-private final ExposedField arrayIndex;
+private final @Nullable ExposedField arrayIndex;
private final boolean preserveNullAndEmptyArrays;
/**
@@ -185,8 +186,8 @@ public class UnwindOperation
*/
public static final class UnwindOperationBuilder implements PathBuilder, IndexBuilder, EmptyArraysBuilder {
-private Field field;
-private Field arrayIndex;
+private @Nullable Field field;
+private @Nullable Field arrayIndex;
private UnwindOperationBuilder() {}

View File

@@ -22,6 +22,7 @@ import java.util.List;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.VariableOperators.Let.ExpressionVariable;
+import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
@@ -334,8 +335,8 @@ public class VariableOperators {
*/
public static class ExpressionVariable {
-private final String variableName;
-private final Object expression;
+private final @Nullable String variableName;
+private final @Nullable Object expression;
/**
* Creates new {@link ExpressionVariable}.
@@ -343,7 +344,7 @@ public class VariableOperators {
* @param variableName can be {@literal null}.
* @param expression can be {@literal null}.
*/
-private ExpressionVariable(String variableName, Object expression) {
+private ExpressionVariable(@Nullable String variableName, @Nullable Object expression) {
this.variableName = variableName;
this.expression = expression;

View File

@@ -1,5 +1,8 @@
/**
* Support for the MongoDB aggregation framework.
+ *
* @since 1.3
*/
-package org.springframework.data.mongodb.core.aggregation;
+@org.springframework.lang.NonNullApi
+package org.springframework.data.mongodb.core.aggregation;

View File

@@ -28,6 +28,8 @@ import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerT
import org.springframework.data.mongodb.core.convert.MongoConverters.ObjectIdToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.ObjectIdToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToObjectIdConverter;
+import org.springframework.lang.Nullable;
+import org.springframework.util.Assert;
/**
* Base class for {@link MongoConverter} implementations. Sets up a {@link GenericConversionService} and populates basic
@@ -36,6 +38,7 @@ import org.springframework.data.mongodb.core.convert.MongoConverters.StringToObj
* @author Jon Brisbin
* @author Oliver Gierke
* @author Mark Paluch
+ * @author Christoph Strobl
*/
public abstract class AbstractMongoConverter implements MongoConverter, InitializingBean {
@@ -46,18 +49,20 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
/**
* Creates a new {@link AbstractMongoConverter} using the given {@link GenericConversionService}.
*
- * @param conversionService
+ * @param conversionService can be {@literal null} and defaults to {@link DefaultConversionService}.
*/
-public AbstractMongoConverter(GenericConversionService conversionService) {
+public AbstractMongoConverter(@Nullable GenericConversionService conversionService) {
this.conversionService = conversionService == null ? new DefaultConversionService() : conversionService;
}
/**
* Registers the given custom conversions with the converter.
*
- * @param conversions
+ * @param conversions must not be {@literal null}.
*/
public void setCustomConversions(CustomConversions conversions) {
+Assert.notNull(conversions, "Conversions must not be null!");
this.conversions = conversions;
}
@@ -66,7 +71,7 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
*
* @param instantiators
*/
-public void setInstantiators(EntityInstantiators instantiators) {
+public void setInstantiators(@Nullable EntityInstantiators instantiators) {
this.instantiators = instantiators == null ? new EntityInstantiators() : instantiators;
}
@@ -91,18 +96,11 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
conversions.registerConvertersIn(conversionService);
}
-/*
- * (non-Javadoc)
- * @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object)
- */
-public Object convertToMongoType(Object obj) {
-return convertToMongoType(obj, null);
-}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.convert.MongoConverter#getConversionService()
*/
+@Override
public ConversionService getConversionService() {
return conversionService;
}

View File

@@ -1,5 +1,5 @@
/*
- * Copyright 2014 the original author or authors.
+ * Copyright 2014-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,13 +16,15 @@
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
+import org.springframework.lang.Nullable;
import com.mongodb.DBRef;
/**
* @author Oliver Gierke
+ * @author Mark Paluch
*/
public interface DbRefProxyHandler {
-Object populateId(MongoPersistentProperty property, DBRef source, Object proxy);
+Object populateId(MongoPersistentProperty property, @Nullable DBRef source, Object proxy);
}

View File

@@ -21,6 +21,7 @@ import org.bson.Document;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
+import org.springframework.lang.Nullable;
import com.mongodb.DBRef;
@@ -45,6 +46,7 @@ public interface DbRefResolver {
* @param callback will never be {@literal null}.
* @return
*/
+@Nullable
Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
DbRefProxyHandler proxyHandler);
@@ -67,6 +69,7 @@ public interface DbRefResolver {
* @return
* @since 1.7
*/
+@Nullable
Document fetch(DBRef dbRef);
/**

View File

@@ -23,6 +23,7 @@ import org.springframework.data.mapping.model.SpELContext;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
+import org.springframework.lang.Nullable;
import com.mongodb.DBRef;
@@ -56,7 +57,7 @@ class DefaultDbRefProxyHandler implements DbRefProxyHandler {
* @see org.springframework.data.mongodb.core.convert.DbRefProxyHandler#populateId(com.mongodb.DBRef, java.lang.Object)
*/
@Override
-public Object populateId(MongoPersistentProperty property, DBRef source, Object proxy) {
+public Object populateId(MongoPersistentProperty property, @Nullable DBRef source, Object proxy) {
if (source == null) {
return proxy;

View File

@@ -23,9 +23,11 @@ import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.lang.reflect.Method;
import java.util.ArrayList;
+import java.util.Collection;
import java.util.Collections;
-import java.util.Comparator;
import java.util.List;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
@@ -42,6 +44,7 @@ import org.springframework.data.mongodb.LazyLoadingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
+import org.springframework.lang.Nullable;
import org.springframework.objenesis.ObjenesisStd;
import org.springframework.util.Assert;
import org.springframework.util.ReflectionUtils;
@@ -91,6 +94,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
Assert.notNull(property, "Property must not be null!");
Assert.notNull(callback, "Callback must not be null!");
+Assert.notNull(handler, "Handler must not be null!");
if (isLazyDbRef(property)) {
return createLazyLoadingProxy(property, dbref, callback, handler);
@@ -142,8 +146,8 @@
}
String collection = refs.iterator().next().getCollectionName();
-List<Object> ids = new ArrayList<Object>(refs.size());
+List<Object> ids = new ArrayList<>(refs.size());
for (DBRef ref : refs) {
if (!collection.equals(ref.getCollectionName())) {
@@ -155,10 +159,14 @@ public class DefaultDbRefResolver implements DbRefResolver {
}
MongoDatabase db = mongoDbFactory.getDb();
-List<Document> result = new ArrayList<>();
-db.getCollection(collection).find(new Document("_id", new Document("$in", ids))).into(result);
-result.sort(new DbRefByReferencePositionComparator(ids));
-return result;
+List<Document> result = db.getCollection(collection) //
+.find(new Document("_id", new Document("$in", ids))) //
+.into(new ArrayList<>());
+return ids.stream() //
+.flatMap(id -> documentWithId(id, result)) //
+.collect(Collectors.toList());
}
/** /**
@@ -170,8 +178,8 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @param callback must not be {@literal null}.
* @return
*/
-private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
-DbRefProxyHandler handler) {
+private Object createLazyLoadingProxy(MongoPersistentProperty property, @Nullable DBRef dbref,
+DbRefResolverCallback callback, DbRefProxyHandler handler) {
Class<?> propertyType = property.getType();
LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
@@ -223,6 +231,20 @@ public class DefaultDbRefResolver implements DbRefResolver {
return property.getDBRef() != null && property.getDBRef().lazy();
}
/**
* Returns document with the given identifier from the given list of {@link Document}s.
*
* @param identifier
* @param documents
* @return
*/
private static Stream<Document> documentWithId(Object identifier, Collection<Document> documents) {
return documents.stream() //
.filter(it -> it.get("_id").equals(identifier)) //
.limit(1);
}
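The ordering contract of the rewritten bulk fetch is easy to miss in the diff, so here is a small standalone model of it (plain Java, not part of the patch): documents come back from the driver in arbitrary order and are re-aligned with the requested reference ids.

import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import org.bson.Document;

class BulkFetchOrderingSketch {

	// Re-aligns fetched documents with the order of the requested ids, mirroring
	// ids.stream().flatMap(id -> documentWithId(id, result)) above.
	static List<Document> orderByIds(List<Object> ids, Collection<Document> unordered) {
		return ids.stream().flatMap(id -> documentWithId(id, unordered)).collect(Collectors.toList());
	}

	private static Stream<Document> documentWithId(Object identifier, Collection<Document> documents) {
		return documents.stream().filter(it -> it.get("_id").equals(identifier)).limit(1);
	}

	public static void main(String[] args) {
		Document first = new Document("_id", 1);
		Document second = new Document("_id", 2);
		// Prints the document with _id 2 first, because that id was requested first.
		System.out.println(orderByIds(Arrays.<Object> asList(2, 1), Arrays.asList(first, second)));
	}
}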
/**
* A {@link MethodInterceptor} that is used within a lazy loading proxy. The property resolving is delegated to a
* {@link DbRefResolverCallback}. The resolving process is triggered by a method invocation on the proxy and is
@@ -242,8 +264,8 @@ public class DefaultDbRefResolver implements DbRefResolver {
private final PersistenceExceptionTranslator exceptionTranslator;
private volatile boolean resolved;
-private Object result;
-private DBRef dbref;
+private final @Nullable DBRef dbref;
+private @Nullable Object result;
static {
try {
@@ -263,7 +285,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @param dbref can be {@literal null}.
* @param callback must not be {@literal null}.
*/
-public LazyLoadingInterceptor(MongoPersistentProperty property, DBRef dbref,
+public LazyLoadingInterceptor(MongoPersistentProperty property, @Nullable DBRef dbref,
PersistenceExceptionTranslator exceptionTranslator, DbRefResolverCallback callback) {
Assert.notNull(property, "Property must not be null!");
@@ -281,7 +303,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @see org.aopalliance.intercept.MethodInterceptor#invoke(org.aopalliance.intercept.MethodInvocation)
*/
@Override
-public Object invoke(MethodInvocation invocation) throws Throwable {
+public Object invoke(@Nullable MethodInvocation invocation) throws Throwable {
return intercept(invocation.getThis(), invocation.getMethod(), invocation.getArguments(), null);
}
@@ -289,8 +311,9 @@ public class DefaultDbRefResolver implements DbRefResolver {
* (non-Javadoc)
* @see org.springframework.cglib.proxy.MethodInterceptor#intercept(java.lang.Object, java.lang.reflect.Method, java.lang.Object[], org.springframework.cglib.proxy.MethodProxy)
*/
+@Nullable
@Override
-public Object intercept(Object obj, Method method, Object[] args, MethodProxy proxy) throws Throwable {
+public Object intercept(Object obj, Method method, Object[] args, @Nullable MethodProxy proxy) throws Throwable {
if (INITIALIZE_METHOD.equals(method)) {
return ensureResolved();
@@ -337,7 +360,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @param proxy
* @return
*/
-private String proxyToString(Object proxy) {
+private String proxyToString(@Nullable Object proxy) {
StringBuilder description = new StringBuilder();
if (dbref != null) {
@@ -358,7 +381,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @param proxy
* @return
*/
-private int proxyHashCode(Object proxy) {
+private int proxyHashCode(@Nullable Object proxy) {
return proxyToString(proxy).hashCode();
}
@@ -369,7 +392,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @param that
* @return
*/
-private boolean proxyEquals(Object proxy, Object that) {
+private boolean proxyEquals(@Nullable Object proxy, Object that) {
if (!(that instanceof LazyLoadingProxy)) {
return false;
@@ -387,6 +410,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
*
* @return
*/
+@Nullable
private Object ensureResolved() {
if (!resolved) {
@@ -430,6 +454,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
*
* @return
*/
+@Nullable
private synchronized Object resolve() {
if (!resolved) {
@@ -449,37 +474,4 @@ public class DefaultDbRefResolver implements DbRefResolver {
return result;
}
}
-/**
- * {@link Comparator} for sorting {@link Document} that have been loaded in random order by a predefined list of
- * reference identifiers.
- *
- * @author Christoph Strobl
- * @author Oliver Gierke
- * @since 1.10
- */
-private static class DbRefByReferencePositionComparator implements Comparator<Document> {
-private final List<Object> reference;
-/**
- * Creates a new {@link DbRefByReferencePositionComparator} for the given list of reference identifiers.
- *
- * @param referenceIds must not be {@literal null}.
- */
-public DbRefByReferencePositionComparator(List<Object> referenceIds) {
-Assert.notNull(referenceIds, "Reference identifiers must not be null!");
-this.reference = new ArrayList<Object>(referenceIds);
-}
-/*
- * (non-Javadoc)
- * @see java.util.Comparator#compare(java.lang.Object, java.lang.Object)
- */
-@Override
-public int compare(Document o1, Document o2) {
-return Integer.compare(reference.indexOf(o1.get("_id")), reference.indexOf(o2.get("_id")));
-}
-}
}

View File

@@ -18,7 +18,6 @@ package org.springframework.data.mongodb.core.convert;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
-import java.util.Optional;
import java.util.Set;
import org.bson.Document;
@@ -32,6 +31,7 @@ import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
+import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBList;
@@ -57,26 +57,27 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<Bson> implements M
private static final TypeInformation<Map> MAP_TYPE_INFO = ClassTypeInformation.from(Map.class);
private final TypeAliasAccessor<Bson> accessor;
-private final String typeKey;
+private final @Nullable String typeKey;
public DefaultMongoTypeMapper() {
this(DEFAULT_TYPE_KEY);
}
-public DefaultMongoTypeMapper(String typeKey) {
+public DefaultMongoTypeMapper(@Nullable String typeKey) {
this(typeKey, Arrays.asList(new SimpleTypeInformationMapper()));
}
-public DefaultMongoTypeMapper(String typeKey, MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext) {
+public DefaultMongoTypeMapper(@Nullable String typeKey,
+MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext) {
this(typeKey, new DocumentTypeAliasAccessor(typeKey), mappingContext,
Arrays.asList(new SimpleTypeInformationMapper()));
}
-public DefaultMongoTypeMapper(String typeKey, List<? extends TypeInformationMapper> mappers) {
+public DefaultMongoTypeMapper(@Nullable String typeKey, List<? extends TypeInformationMapper> mappers) {
this(typeKey, new DocumentTypeAliasAccessor(typeKey), null, mappers);
}
-private DefaultMongoTypeMapper(String typeKey, TypeAliasAccessor<Bson> accessor,
+private DefaultMongoTypeMapper(@Nullable String typeKey, TypeAliasAccessor<Bson> accessor,
MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext,
List<? extends TypeInformationMapper> mappers) {
@@ -99,9 +100,9 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<Bson> implements M
* @see org.springframework.data.mongodb.core.convert.MongoTypeMapper#writeTypeRestrictions(java.util.Set)
*/
@Override
-public void writeTypeRestrictions(Document result, Set<Class<?>> restrictedTypes) {
+public void writeTypeRestrictions(Document result, @Nullable Set<Class<?>> restrictedTypes) {
-if (restrictedTypes == null || restrictedTypes.isEmpty()) {
+if (ObjectUtils.isEmpty(restrictedTypes)) {
return;
}
@@ -111,7 +112,7 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<Bson> implements M
Alias typeAlias = getAliasFor(ClassTypeInformation.from(restrictedType));
-if (typeAlias != null && !ObjectUtils.nullSafeEquals(Alias.NONE, typeAlias) && typeAlias.isPresent()) {
+if (!ObjectUtils.nullSafeEquals(Alias.NONE, typeAlias) && typeAlias.isPresent()) {
restrictedMappedTypes.add(typeAlias.getValue());
}
}
@@ -135,9 +136,9 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<Bson> implements M
*/
public static final class DocumentTypeAliasAccessor implements TypeAliasAccessor<Bson> {
-private final String typeKey;
+private final @Nullable String typeKey;
-public DocumentTypeAliasAccessor(String typeKey) {
+public DocumentTypeAliasAccessor(@Nullable String typeKey) {
this.typeKey = typeKey;
}

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.convert;
+import lombok.Getter;
import java.util.Arrays;
import java.util.Iterator;
import java.util.Map;
@@ -23,6 +25,7 @@ import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.util.BsonUtils;
+import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
@@ -39,7 +42,7 @@ import com.mongodb.DBObject;
*/
class DocumentAccessor {
-private final Bson document;
+private final @Getter Bson document;
/**
* Creates a new {@link DocumentAccessor} for the given {@link Document}.
@@ -65,7 +68,7 @@ class DocumentAccessor {
* @param prop must not be {@literal null}.
* @param value
*/
-public void put(MongoPersistentProperty prop, Object value) {
+public void put(MongoPersistentProperty prop, @Nullable Object value) {
Assert.notNull(prop, "MongoPersistentProperty must not be null!");
String fieldName = prop.getFieldName();
@@ -98,6 +101,7 @@ class DocumentAccessor {
* @param property must not be {@literal null}.
* @return
*/
+@Nullable
public Object get(MongoPersistentProperty property) {
String fieldName = property.getFieldName();
@@ -135,15 +139,21 @@ class DocumentAccessor {
String fieldName = property.getFieldName();
+if (this.document instanceof Document) {
+if (((Document) this.document).containsKey(fieldName)) {
+return true;
+}
+} else if (this.document instanceof DBObject) {
+if (((DBObject) this.document).containsField(fieldName)) {
+return true;
+}
+}
if (!fieldName.contains(".")) {
+return false;
-if (this.document instanceof Document) {
-return ((Document) this.document).containsKey(fieldName);
-}
-if (this.document instanceof DBObject) {
-return ((DBObject) this.document).containsField(fieldName);
-}
}
String[] parts = fieldName.split("\\.");
@@ -176,6 +186,7 @@ class DocumentAccessor {
* @param source can be {@literal null}.
* @return
*/
+@Nullable
@SuppressWarnings("unchecked")
private static Map<String, Object> getAsMap(Object source) {

View File

@@ -1,5 +1,5 @@
/*
- * Copyright 2012-2016 the original author or authors.
+ * Copyright 2012-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,7 @@ import org.springframework.context.expression.MapAccessor;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.PropertyAccessor;
import org.springframework.expression.TypedValue;
+import org.springframework.lang.Nullable;
/**
* {@link PropertyAccessor} to allow entity based field access to {@link Document}s.
@@ -47,7 +48,7 @@ class DocumentPropertyAccessor extends MapAccessor {
* @see org.springframework.context.expression.MapAccessor#canRead(org.springframework.expression.EvaluationContext, java.lang.Object, java.lang.String)
*/
@Override
-public boolean canRead(EvaluationContext context, Object target, String name) {
+public boolean canRead(EvaluationContext context, @Nullable Object target, String name) {
return true;
}
@@ -57,7 +58,11 @@ class DocumentPropertyAccessor extends MapAccessor {
*/
@Override
@SuppressWarnings("unchecked")
-public TypedValue read(EvaluationContext context, Object target, String name) {
+public TypedValue read(EvaluationContext context, @Nullable Object target, String name) {
+if (target == null) {
+return TypedValue.NULL;
+}
Map<String, Object> source = (Map<String, Object>) target;

View File

@@ -15,10 +15,13 @@
*/
package org.springframework.data.mongodb.core.convert;
+import java.text.Collator;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
+import java.util.Map;
+import java.util.TreeMap;
import org.bson.Document;
import org.springframework.core.convert.converter.Converter;
@@ -41,15 +44,17 @@ import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.core.geo.GeoJsonPolygon;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.data.mongodb.core.query.GeoCommand;
+import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBList;
+import com.mongodb.Function;
/**
* Wrapper class to contain useful geo structure converters for the usage with Mongo.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
@@ -58,6 +63,26 @@ import com.mongodb.BasicDBList;
*/
abstract class GeoConverters {
+private final static Map<String, Function<Document, GeoJson<?>>> converters;
+static {
+Collator caseInsensitive = Collator.getInstance();
+caseInsensitive.setStrength(Collator.PRIMARY);
+Map<String, Function<Document, GeoJson<?>>> geoConverters = new TreeMap<>(caseInsensitive);
+geoConverters.put("point", DocumentToGeoJsonPointConverter.INSTANCE::convert);
+geoConverters.put("multipoint", DocumentToGeoJsonMultiPointConverter.INSTANCE::convert);
+geoConverters.put("linestring", DocumentToGeoJsonLineStringConverter.INSTANCE::convert);
+geoConverters.put("multilinestring", DocumentToGeoJsonMultiLineStringConverter.INSTANCE::convert);
+geoConverters.put("polygon", DocumentToGeoJsonPolygonConverter.INSTANCE::convert);
+geoConverters.put("multipolygon", DocumentToGeoJsonMultiPolygonConverter.INSTANCE::convert);
+geoConverters.put("geometrycollection", DocumentToGeoJsonGeometryCollectionConverter.INSTANCE::convert);
+converters = geoConverters;
+}
/**
* Private constructor to prevent instantiation.
*/
@@ -65,7 +90,7 @@ abstract class GeoConverters {
/**
* Returns the geo converters to be registered.
*
* @return
*/
@SuppressWarnings("unchecked")
@@ -91,17 +116,18 @@ abstract class GeoConverters {
, DocumentToGeoJsonMultiLineStringConverter.INSTANCE //
, DocumentToGeoJsonMultiPointConverter.INSTANCE //
, DocumentToGeoJsonMultiPolygonConverter.INSTANCE //
-, DocumentToGeoJsonGeometryCollectionConverter.INSTANCE);
+, DocumentToGeoJsonGeometryCollectionConverter.INSTANCE //
+, DocumentToGeoJsonConverter.INSTANCE);
}
/**
* Converts a {@link List} of {@link Double}s into a {@link Point}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
-static enum DocumentToPointConverter implements Converter<Document, Point> {
+enum DocumentToPointConverter implements Converter<Document, Point> {
INSTANCE;
@@ -128,11 +154,11 @@ abstract class GeoConverters {
/** /**
* Converts a {@link Point} into a {@link List} of {@link Double}s. * Converts a {@link Point} into a {@link List} of {@link Double}s.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
static enum PointToDocumentConverter implements Converter<Point, Document> { enum PointToDocumentConverter implements Converter<Point, Document> {
INSTANCE; INSTANCE;
@@ -147,13 +173,13 @@ abstract class GeoConverters {
} }
/** /**
* Converts a {@link Box} into a {@link BasicDBList}. * Converts a {@link Box} into a {@link Document}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
@WritingConverter @WritingConverter
static enum BoxToDocumentConverter implements Converter<Box, Document> { enum BoxToDocumentConverter implements Converter<Box, Document> {
INSTANCE; INSTANCE;
@@ -176,13 +202,13 @@ abstract class GeoConverters {
} }
/** /**
* Converts a {@link BasicDBList} into a {@link Box}. * Converts a {@link Document} into a {@link Box}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
@ReadingConverter @ReadingConverter
static enum DocumentToBoxConverter implements Converter<Document, Box> { enum DocumentToBoxConverter implements Converter<Document, Box> {
INSTANCE; INSTANCE;
@@ -205,12 +231,12 @@ abstract class GeoConverters {
} }
/** /**
* Converts a {@link Circle} into a {@link BasicDBList}. * Converts a {@link Circle} into a {@link Document}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
static enum CircleToDocumentConverter implements Converter<Circle, Document> { enum CircleToDocumentConverter implements Converter<Circle, Document> {
INSTANCE; INSTANCE;
@@ -235,7 +261,7 @@ abstract class GeoConverters {
/** /**
* Converts a {@link Document} into a {@link Circle}. * Converts a {@link Document} into a {@link Circle}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
@@ -276,8 +302,8 @@ abstract class GeoConverters {
} }
/** /**
* Converts a {@link Sphere} into a {@link BasicDBList}. * Converts a {@link Sphere} into a {@link Document}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
@@ -305,13 +331,13 @@ abstract class GeoConverters {
} }
/** /**
* Converts a {@link BasicDBList} into a {@link Sphere}. * Converts a {@link Document} into a {@link Sphere}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
@ReadingConverter @ReadingConverter
static enum DocumentToSphereConverter implements Converter<Document, Sphere> { enum DocumentToSphereConverter implements Converter<Document, Sphere> {
INSTANCE; INSTANCE;
@@ -347,12 +373,12 @@ abstract class GeoConverters {
} }
/** /**
* Converts a {@link Polygon} into a {@link BasicDBList}. * Converts a {@link Polygon} into a {@link Document}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
static enum PolygonToDocumentConverter implements Converter<Polygon, Document> { enum PolygonToDocumentConverter implements Converter<Polygon, Document> {
INSTANCE; INSTANCE;
@@ -368,7 +394,7 @@ abstract class GeoConverters {
} }
List<Point> points = source.getPoints(); List<Point> points = source.getPoints();
List<Document> pointTuples = new ArrayList<Document>(points.size()); List<Document> pointTuples = new ArrayList<>(points.size());
for (Point point : points) { for (Point point : points) {
pointTuples.add(PointToDocumentConverter.INSTANCE.convert(point)); pointTuples.add(PointToDocumentConverter.INSTANCE.convert(point));
@@ -381,13 +407,13 @@ abstract class GeoConverters {
} }
/** /**
* Converts a {@link BasicDBList} into a {@link Polygon}. * Converts a {@link Document} into a {@link Polygon}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
@ReadingConverter @ReadingConverter
static enum DocumentToPolygonConverter implements Converter<Document, Polygon> { enum DocumentToPolygonConverter implements Converter<Document, Polygon> {
INSTANCE; INSTANCE;
@@ -404,7 +430,7 @@ abstract class GeoConverters {
} }
List<Document> points = (List<Document>) source.get("points"); List<Document> points = (List<Document>) source.get("points");
List<Point> newPoints = new ArrayList<Point>(points.size()); List<Point> newPoints = new ArrayList<>(points.size());
for (Document element : points) { for (Document element : points) {
@@ -417,12 +443,12 @@ abstract class GeoConverters {
} }
/** /**
* Converts a {@link Sphere} into a {@link BasicDBList}. * Converts a {@link Sphere} into a {@link Document}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
static enum GeoCommandToDocumentConverter implements Converter<GeoCommand, Document> { enum GeoCommandToDocumentConverter implements Converter<GeoCommand, Document> {
INSTANCE; INSTANCE;
@@ -482,7 +508,7 @@ abstract class GeoConverters {
* @since 1.7 * @since 1.7
*/ */
@SuppressWarnings("rawtypes") @SuppressWarnings("rawtypes")
static enum GeoJsonToDocumentConverter implements Converter<GeoJson, Document> { enum GeoJsonToDocumentConverter implements Converter<GeoJson, Document> {
INSTANCE; INSTANCE;
@@ -545,7 +571,7 @@ abstract class GeoConverters {
* @author Christoph Strobl * @author Christoph Strobl
* @since 1.7 * @since 1.7
*/ */
static enum GeoJsonPointToDocumentConverter implements Converter<GeoJsonPoint, Document> { enum GeoJsonPointToDocumentConverter implements Converter<GeoJsonPoint, Document> {
INSTANCE; INSTANCE;
@@ -563,7 +589,7 @@ abstract class GeoConverters {
* @author Christoph Strobl * @author Christoph Strobl
* @since 1.7 * @since 1.7
*/ */
static enum GeoJsonPolygonToDocumentConverter implements Converter<GeoJsonPolygon, Document> { enum GeoJsonPolygonToDocumentConverter implements Converter<GeoJsonPolygon, Document> {
INSTANCE; INSTANCE;
@@ -581,7 +607,7 @@ abstract class GeoConverters {
* @author Christoph Strobl * @author Christoph Strobl
* @since 1.7 * @since 1.7
*/ */
static enum DocumentToGeoJsonPointConverter implements Converter<Document, GeoJsonPoint> { enum DocumentToGeoJsonPointConverter implements Converter<Document, GeoJsonPoint> {
INSTANCE; INSTANCE;
@@ -609,7 +635,7 @@ abstract class GeoConverters {
* @author Christoph Strobl * @author Christoph Strobl
* @since 1.7 * @since 1.7
*/ */
static enum DocumentToGeoJsonPolygonConverter implements Converter<Document, GeoJsonPolygon> { enum DocumentToGeoJsonPolygonConverter implements Converter<Document, GeoJsonPolygon> {
INSTANCE; INSTANCE;
@@ -654,7 +680,7 @@ abstract class GeoConverters {
String.format("Cannot convert type '%s' to MultiPolygon.", source.get("type"))); String.format("Cannot convert type '%s' to MultiPolygon.", source.get("type")));
List dbl = (List) source.get("coordinates"); List dbl = (List) source.get("coordinates");
List<GeoJsonPolygon> polygones = new ArrayList<GeoJsonPolygon>(); List<GeoJsonPolygon> polygones = new ArrayList<>();
for (Object polygon : dbl) { for (Object polygon : dbl) {
polygones.add(toGeoJsonPolygon((List) polygon)); polygones.add(toGeoJsonPolygon((List) polygon));
@@ -668,7 +694,7 @@ abstract class GeoConverters {
* @author Christoph Strobl * @author Christoph Strobl
* @since 1.7 * @since 1.7
*/ */
static enum DocumentToGeoJsonLineStringConverter implements Converter<Document, GeoJsonLineString> { enum DocumentToGeoJsonLineStringConverter implements Converter<Document, GeoJsonLineString> {
INSTANCE; INSTANCE;
@@ -696,7 +722,7 @@ abstract class GeoConverters {
* @author Christoph Strobl * @author Christoph Strobl
* @since 1.7 * @since 1.7
*/ */
static enum DocumentToGeoJsonMultiPointConverter implements Converter<Document, GeoJsonMultiPoint> { enum DocumentToGeoJsonMultiPointConverter implements Converter<Document, GeoJsonMultiPoint> {
INSTANCE; INSTANCE;
@@ -724,7 +750,7 @@ abstract class GeoConverters {
* @author Christoph Strobl * @author Christoph Strobl
* @since 1.7 * @since 1.7
*/ */
static enum DocumentToGeoJsonMultiLineStringConverter implements Converter<Document, GeoJsonMultiLineString> { enum DocumentToGeoJsonMultiLineStringConverter implements Converter<Document, GeoJsonMultiLineString> {
INSTANCE; INSTANCE;
@@ -756,7 +782,7 @@ abstract class GeoConverters {
* @author Christoph Strobl * @author Christoph Strobl
* @since 1.7 * @since 1.7
*/ */
static enum DocumentToGeoJsonGeometryCollectionConverter implements Converter<Document, GeoJsonGeometryCollection> { enum DocumentToGeoJsonGeometryCollectionConverter implements Converter<Document, GeoJsonGeometryCollection> {
INSTANCE; INSTANCE;
@@ -775,41 +801,12 @@ abstract class GeoConverters {
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "GeometryCollection"),
String.format("Cannot convert type '%s' to GeometryCollection.", source.get("type")));
-List<GeoJson<?>> geometries = new ArrayList<GeoJson<?>>();
+List<GeoJson<?>> geometries = new ArrayList<>();
for (Object o : (List) source.get("geometries")) {
-geometries.add(convertGeometries((Document) o));
+geometries.add(toGenericGeoJson((Document) o));
}
return new GeoJsonGeometryCollection(geometries);
}
-private static GeoJson<?> convertGeometries(Document source) {
-Object type = source.get("type");
-if (ObjectUtils.nullSafeEquals(type, "Point")) {
-return DocumentToGeoJsonPointConverter.INSTANCE.convert(source);
-}
-if (ObjectUtils.nullSafeEquals(type, "MultiPoint")) {
-return DocumentToGeoJsonMultiPointConverter.INSTANCE.convert(source);
-}
-if (ObjectUtils.nullSafeEquals(type, "LineString")) {
-return DocumentToGeoJsonLineStringConverter.INSTANCE.convert(source);
-}
-if (ObjectUtils.nullSafeEquals(type, "MultiLineString")) {
-return DocumentToGeoJsonMultiLineStringConverter.INSTANCE.convert(source);
-}
-if (ObjectUtils.nullSafeEquals(type, "Polygon")) {
-return DocumentToGeoJsonPolygonConverter.INSTANCE.convert(source);
-}
-if (ObjectUtils.nullSafeEquals(type, "MultiPolygon")) {
-return DocumentToGeoJsonMultiPolygonConverter.INSTANCE.convert(source);
-}
-throw new IllegalArgumentException(String.format("Cannot convert unknown GeoJson type %s", type));
-}
}
}
@@ -819,7 +816,7 @@ abstract class GeoConverters {
/**
* Converts a coordinate pairs nested in in {@link BasicDBList} into {@link GeoJsonPoint}s.
*
* @param listOfCoordinatePairs
* @return
* @since 1.7
@@ -827,7 +824,7 @@ abstract class GeoConverters {
@SuppressWarnings("unchecked")
static List<Point> toListOfPoint(List listOfCoordinatePairs) {
-List<Point> points = new ArrayList<Point>();
+List<Point> points = new ArrayList<>();
for (Object point : listOfCoordinatePairs) {
@@ -843,7 +840,7 @@ abstract class GeoConverters {
/**
* Converts a coordinate pairs nested in in {@link BasicDBList} into {@link GeoJsonPolygon}.
*
* @param dbList
* @return
* @since 1.7
@@ -852,6 +849,46 @@ abstract class GeoConverters {
return new GeoJsonPolygon(toListOfPoint((List) dbList.get(0)));
}
/**
* Converter implementation transforming a {@link Document} into a concrete {@link GeoJson} based on the embedded
* {@literal type} information.
*
* @since 2.1
* @author Christoph Strobl
*/
@ReadingConverter
enum DocumentToGeoJsonConverter implements Converter<Document, GeoJson> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Nullable
@Override
public GeoJson convert(Document source) {
return toGenericGeoJson(source);
}
}
private static GeoJson<?> toGenericGeoJson(Document source) {
String type = source.get("type", String.class);
if(type != null) {
Function<Document, GeoJson<?>> converter = converters.get(type);
if(converter != null){
return converter.apply(source);
}
}
throw new IllegalArgumentException(
String.format("No converter found capable of converting GeoJson type %s.", type));
}
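The case-insensitive dispatch relies on a Collator-keyed TreeMap; the following standalone sketch (using java.util.function.Function instead of the MongoDB driver's Function, with invented names) shows why "Point", "point" and "POINT" all resolve to the same entry.

import java.text.Collator;
import java.util.Map;
import java.util.TreeMap;
import java.util.function.Function;

class CaseInsensitiveDispatchSketch {

	public static void main(String[] args) {
		// PRIMARY strength ignores case (and accents), so differently cased keys collapse into one entry.
		Collator caseInsensitive = Collator.getInstance();
		caseInsensitive.setStrength(Collator.PRIMARY);

		Map<String, Function<String, String>> converters = new TreeMap<>(caseInsensitive);
		converters.put("point", type -> "resolved Point converter for '" + type + "'");

		System.out.println(converters.get("Point").apply("Point"));
		System.out.println(converters.get("POINT").apply("POINT"));
	}
}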
private static double toPrimitiveDoubleValue(Object value) {
Assert.isInstanceOf(Number.class, value, "Argument must be a Number.");

View File

@@ -1,5 +1,5 @@
/*
- * Copyright 2014 the original author or authors.
+ * Copyright 2014-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,21 +16,23 @@
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver.LazyLoadingInterceptor;
+import org.springframework.lang.Nullable;
import com.mongodb.DBRef;
/**
* Allows direct interaction with the underlying {@link LazyLoadingInterceptor}.
*
* @author Thomas Darimont
* @author Christoph Strobl
+ * @author Mark Paluch
* @since 1.5
*/
public interface LazyLoadingProxy {
/**
* Initializes the proxy and returns the wrapped value.
*
* @return
* @since 1.5
*/
@@ -38,9 +40,10 @@ public interface LazyLoadingProxy {
/**
* Returns the {@link DBRef} represented by this {@link LazyLoadingProxy}, may be null.
*
* @return
* @since 1.5
*/
+@Nullable
DBRef toDBRef();
}

View File

@@ -15,17 +15,8 @@
*/
package org.springframework.data.mongodb.core.convert;
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.Collection;
-import java.util.Collections;
-import java.util.HashSet;
-import java.util.LinkedHashMap;
-import java.util.List;
-import java.util.Map;
+import java.util.*;
import java.util.Map.Entry;
-import java.util.Optional;
-import java.util.Set;
import org.bson.Document; import org.bson.Document;
import org.bson.conversions.Bson; import org.bson.conversions.Bson;
@@ -40,13 +31,13 @@ import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association;
-import org.springframework.data.mapping.PersistentProperty;
+import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.PersistentPropertyAccessor;
-import org.springframework.data.mapping.PreferredConstructor;
import org.springframework.data.mapping.PreferredConstructor.Parameter;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mapping.model.DefaultSpELExpressionEvaluator;
-import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.model.ParameterValueProvider;
import org.springframework.data.mapping.model.PersistentEntityParameterValueProvider;
import org.springframework.data.mapping.model.PropertyValueProvider;
@@ -61,6 +52,7 @@ import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
+import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils;
@@ -86,6 +78,7 @@ import com.mongodb.DBRef;
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware, ValueResolver { public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware, ValueResolver {
private static final String INCOMPATIBLE_TYPES = "Cannot convert %1$s of type %2$s into an instance of %3$s! Implement a custom Converter<%2$s, %3$s> and register it with the CustomConversions. Parent object was: %4$s"; private static final String INCOMPATIBLE_TYPES = "Cannot convert %1$s of type %2$s into an instance of %3$s! Implement a custom Converter<%2$s, %3$s> and register it with the CustomConversions. Parent object was: %4$s";
private static final String INVALID_TYPE_TO_READ = "Expected to read Document %s into type %s but didn't find a PersistentEntity for the latter!";
protected static final Logger LOGGER = LoggerFactory.getLogger(MappingMongoConverter.class); protected static final Logger LOGGER = LoggerFactory.getLogger(MappingMongoConverter.class);
@@ -94,9 +87,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected final DbRefResolver dbRefResolver; protected final DbRefResolver dbRefResolver;
protected final DefaultDbRefProxyHandler dbRefProxyHandler; protected final DefaultDbRefProxyHandler dbRefProxyHandler;
protected ApplicationContext applicationContext; protected @Nullable ApplicationContext applicationContext;
protected MongoTypeMapper typeMapper; protected MongoTypeMapper typeMapper;
protected String mapKeyDotReplacement = null; protected @Nullable String mapKeyDotReplacement = null;
private SpELContext spELContext; private SpELContext spELContext;
@@ -142,11 +135,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* {@link DefaultMongoTypeMapper} by default. Setting this to {@literal null} will reset the {@link TypeMapper} to the * {@link DefaultMongoTypeMapper} by default. Setting this to {@literal null} will reset the {@link TypeMapper} to the
* default one. * default one.
* *
* @param typeMapper the typeMapper to set * @param typeMapper the typeMapper to set. Can be {@literal null}.
*/ */
public void setTypeMapper(MongoTypeMapper typeMapper) { public void setTypeMapper(@Nullable MongoTypeMapper typeMapper) {
this.typeMapper = typeMapper == null this.typeMapper = typeMapper == null
? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext) : typeMapper; ? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext)
: typeMapper;
} }
/* /*
@@ -164,9 +158,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* object to fail. If further customization of the translation is needed, have a look at * object to fail. If further customization of the translation is needed, have a look at
* {@link #potentiallyEscapeMapKey(String)} as well as {@link #potentiallyUnescapeMapKey(String)}. * {@link #potentiallyEscapeMapKey(String)} as well as {@link #potentiallyUnescapeMapKey(String)}.
* *
* @param mapKeyDotReplacement the mapKeyDotReplacement to set * @param mapKeyDotReplacement the mapKeyDotReplacement to set. Can be {@literal null}.
*/ */
public void setMapKeyDotReplacement(String mapKeyDotReplacement) { public void setMapKeyDotReplacement(@Nullable String mapKeyDotReplacement) {
this.mapKeyDotReplacement = mapKeyDotReplacement; this.mapKeyDotReplacement = mapKeyDotReplacement;
} }
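A short configuration sketch for the now nullable mapKeyDotReplacement setter, assuming a MappingMongoConverter bean is customized elsewhere in the application configuration.

import org.springframework.data.mongodb.core.convert.MappingMongoConverter;

class ConverterCustomization {

    // Map keys containing "." clash with MongoDB's dot notation; the replacement is applied
    // on write and reverted on read via potentiallyEscapeMapKey/potentiallyUnescapeMapKey.
    void customize(MappingMongoConverter converter) {
        converter.setMapKeyDotReplacement("~");
    }
}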
@@ -200,8 +194,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return read(type, bson, ObjectPath.ROOT); return read(type, bson, ObjectPath.ROOT);
} }
@Nullable
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
private <S extends Object> S read(TypeInformation<S> type, Bson bson, ObjectPath path) { private <S extends Object> S read(TypeInformation<S> type, @Nullable Bson bson, ObjectPath path) {
if (null == bson) { if (null == bson) {
return null; return null;
@@ -241,11 +236,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Document target = bson instanceof BasicDBObject ? new Document((BasicDBObject) bson) : (Document) bson; Document target = bson instanceof BasicDBObject ? new Document((BasicDBObject) bson) : (Document) bson;
return read((MongoPersistentEntity<S>) mappingContext.getRequiredPersistentEntity(typeToUse), target, path); MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(typeToUse);
if (entity == null) {
throw new MappingException(String.format(INVALID_TYPE_TO_READ, target, typeToUse.getType()));
}
return read((MongoPersistentEntity<S>) entity, target, path);
} }
private ParameterValueProvider<MongoPersistentProperty> getParameterProvider(MongoPersistentEntity<?> entity, private ParameterValueProvider<MongoPersistentProperty> getParameterProvider(MongoPersistentEntity<?> entity,
Bson source, DefaultSpELExpressionEvaluator evaluator, ObjectPath path) { DocumentAccessor source, DefaultSpELExpressionEvaluator evaluator, ObjectPath path) {
MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, path); MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, path);
PersistentEntityParameterValueProvider<MongoPersistentProperty> parameterProvider = new PersistentEntityParameterValueProvider<>( PersistentEntityParameterValueProvider<MongoPersistentProperty> parameterProvider = new PersistentEntityParameterValueProvider<>(
@@ -258,8 +259,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final Document bson, final ObjectPath path) { private <S extends Object> S read(final MongoPersistentEntity<S> entity, final Document bson, final ObjectPath path) {
DefaultSpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(bson, spELContext); DefaultSpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(bson, spELContext);
DocumentAccessor documentAccessor = new DocumentAccessor(bson);
PreferredConstructor<S, MongoPersistentProperty> constructor = entity.getPersistenceConstructor();
ParameterValueProvider<MongoPersistentProperty> provider = constructor != null && constructor.hasParameters() //
? getParameterProvider(entity, documentAccessor, evaluator, path) //
: NoOpParameterValueProvider.INSTANCE;
ParameterValueProvider<MongoPersistentProperty> provider = getParameterProvider(entity, bson, evaluator, path);
EntityInstantiator instantiator = instantiators.getInstantiatorFor(entity); EntityInstantiator instantiator = instantiators.getInstantiatorFor(entity);
S instance = instantiator.createInstance(entity, provider); S instance = instantiator.createInstance(entity, provider);
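The hunk above only builds a parameter value provider when the persistence constructor actually declares parameters. A standalone sketch of that pattern, with illustrative names that are not part of the library API:

import java.util.function.Supplier;

interface ValueProvider {
    Object valueFor(String parameterName);
}

enum NoOpValueProvider implements ValueProvider {

    INSTANCE;

    @Override
    public Object valueFor(String parameterName) {
        return null; // nothing to resolve for a no-arg constructor
    }
}

class ProviderSelection {

    // The costly provider (document accessor, SpEL evaluator, object path) is only built when
    // needed; otherwise a shared, stateless enum singleton stands in.
    static ValueProvider select(boolean constructorHasParameters, Supplier<ValueProvider> expensiveProvider) {
        return constructorHasParameters ? expensiveProvider.get() : NoOpValueProvider.INSTANCE;
    }
}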
@@ -267,7 +274,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
conversionService); conversionService);
MongoPersistentProperty idProperty = entity.getIdProperty(); MongoPersistentProperty idProperty = entity.getIdProperty();
DocumentAccessor documentAccessor = new DocumentAccessor(bson);
// make sure id property is set before all other properties // make sure id property is set before all other properties
Object idValue = null; Object idValue = null;
@@ -283,9 +289,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
MongoDbPropertyValueProvider valueProvider = new MongoDbPropertyValueProvider(documentAccessor, evaluator, MongoDbPropertyValueProvider valueProvider = new MongoDbPropertyValueProvider(documentAccessor, evaluator,
currentPath); currentPath);
DbRefResolverCallback callback = new DefaultDbRefResolverCallback(bson, currentPath, evaluator, readProperties(entity, accessor, idProperty, documentAccessor, valueProvider, currentPath, evaluator);
MappingMongoConverter.this);
readProperties(entity, accessor, idProperty, documentAccessor, valueProvider, callback);
return instance; return instance;
} }
@@ -300,14 +304,19 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} }
private void readProperties(MongoPersistentEntity<?> entity, PersistentPropertyAccessor accessor, private void readProperties(MongoPersistentEntity<?> entity, PersistentPropertyAccessor accessor,
MongoPersistentProperty idProperty, DocumentAccessor documentAccessor, @Nullable MongoPersistentProperty idProperty, DocumentAccessor documentAccessor,
MongoDbPropertyValueProvider valueProvider, DbRefResolverCallback callback) { MongoDbPropertyValueProvider valueProvider, ObjectPath currentPath, SpELExpressionEvaluator evaluator) {
DbRefResolverCallback callback = null;
for (MongoPersistentProperty prop : entity) { for (MongoPersistentProperty prop : entity) {
if(prop.isAssociation() && !entity.isConstructorArgument(prop)) { if (callback == null) {
readAssociation(prop.getAssociation(), accessor, documentAccessor, dbRefProxyHandler, callback ); callback = getDbRefResolverCallback(documentAccessor, currentPath, evaluator);
}
if (prop.isAssociation() && !entity.isConstructorArgument(prop)) {
readAssociation(prop.getRequiredAssociation(), accessor, documentAccessor, dbRefProxyHandler, callback);
continue; continue;
} }
// we skip the id property since it was already set // we skip the id property since it was already set
@@ -319,8 +328,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
continue; continue;
} }
if(prop.isAssociation()) { if (prop.isAssociation()) {
readAssociation(prop.getAssociation(), accessor, documentAccessor, dbRefProxyHandler, callback ); readAssociation(prop.getRequiredAssociation(), accessor, documentAccessor, dbRefProxyHandler, callback);
continue; continue;
} }
@@ -328,27 +337,34 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} }
} }
private DbRefResolverCallback getDbRefResolverCallback(DocumentAccessor documentAccessor, ObjectPath currentPath,
SpELExpressionEvaluator evaluator) {
return new DefaultDbRefResolverCallback(documentAccessor.getDocument(), currentPath, evaluator,
MappingMongoConverter.this);
}
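DbRefResolverCallback instances are now created lazily, only once the first association property is encountered. A generic sketch of that lazy-initialization idea (not the actual DbRefResolverCallback API):

import java.util.function.Supplier;

class Lazy<T> {

    private final Supplier<T> supplier;
    private T value;

    Lazy(Supplier<T> supplier) {
        this.supplier = supplier;
    }

    T get() {
        if (value == null) {
            value = supplier.get(); // created at most once, on first access
        }
        return value;
    }
}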
private void readAssociation(Association<MongoPersistentProperty> association, PersistentPropertyAccessor accessor, private void readAssociation(Association<MongoPersistentProperty> association, PersistentPropertyAccessor accessor,
DocumentAccessor documentAccessor, DbRefProxyHandler handler, DbRefResolverCallback callback) { DocumentAccessor documentAccessor, DbRefProxyHandler handler, DbRefResolverCallback callback) {
MongoPersistentProperty property = association.getInverse(); MongoPersistentProperty property = association.getInverse();
Object value = documentAccessor.get(property); Object value = documentAccessor.get(property);
if (value == null) { if (value == null) {
return; return;
} }
DBRef dbref = value instanceof DBRef ? (DBRef) value : null; DBRef dbref = value instanceof DBRef ? (DBRef) value : null;
accessor.setProperty(property, dbRefResolver.resolveDbRef(property, dbref, callback, handler)); accessor.setProperty(property, dbRefResolver.resolveDbRef(property, dbref, callback, handler));
} }
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#toDBRef(java.lang.Object, org.springframework.data.mongodb.core.mapping.MongoPersistentProperty) * @see org.springframework.data.mongodb.core.convert.MongoWriter#toDBRef(java.lang.Object, org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/ */
public DBRef toDBRef(Object object, MongoPersistentProperty referringProperty) { public DBRef toDBRef(Object object, @Nullable MongoPersistentProperty referringProperty) {
org.springframework.data.mongodb.core.mapping.DBRef annotation = null; org.springframework.data.mongodb.core.mapping.DBRef annotation;
if (referringProperty != null) { if (referringProperty != null) {
annotation = referringProperty.getDBRef(); annotation = referringProperty.getDBRef();
@@ -369,7 +385,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* *
* @see org.springframework.data.mongodb.core.convert.MongoWriter#write(java.lang.Object, com.mongodb.Document) * @see org.springframework.data.mongodb.core.convert.MongoWriter#write(java.lang.Object, com.mongodb.Document)
*/ */
public void write(final Object obj, final Bson bson) { public void write(Object obj, Bson bson) {
if (null == obj) { if (null == obj) {
return; return;
@@ -385,20 +401,32 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
removeFromMap(bson, "_id"); removeFromMap(bson, "_id");
} }
boolean handledByCustomConverter = conversions.hasCustomWriteTarget(entityType, Document.class); if (requiresTypeHint(entityType)) {
if (!handledByCustomConverter && !(bson instanceof Collection)) {
typeMapper.writeType(type, bson); typeMapper.writeType(type, bson);
} }
} }
/**
* Check if a given type requires a type hint (aka {@literal _class} attribute) when writing to the document.
*
* @param type must not be {@literal null}.
* @return {@literal true} if not a simple type, {@link Collection} or type with custom write target.
*/
private boolean requiresTypeHint(Class<?> type) {
return !conversions.isSimpleType(type) && !ClassUtils.isAssignable(Collection.class, type)
&& !conversions.hasCustomWriteTarget(type, Document.class);
}
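A self-contained restatement of the requiresTypeHint(...) check shown above, assuming a standalone MongoCustomConversions instance without additional converters:

import java.util.Collection;
import java.util.Collections;

import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.util.ClassUtils;

class TypeHintCheck {

    private final MongoCustomConversions conversions = new MongoCustomConversions(Collections.emptyList());

    // true if writing a value of this type should also emit the _class attribute
    boolean requiresTypeHint(Class<?> type) {
        return !conversions.isSimpleType(type) //
                && !ClassUtils.isAssignable(Collection.class, type) //
                && !conversions.hasCustomWriteTarget(type, Document.class);
    }
}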
/** /**
* Internal write conversion method which should be used for nested invocations. * Internal write conversion method which should be used for nested invocations.
* *
* @param obj * @param obj
* @param bson * @param bson
* @param typeHint
*/ */
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
protected void writeInternal(final Object obj, final Bson bson, final TypeInformation<?> typeHint) { protected void writeInternal(@Nullable Object obj, Bson bson, @Nullable TypeInformation<?> typeHint) {
if (null == obj) { if (null == obj) {
return; return;
@@ -419,7 +447,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} }
if (Collection.class.isAssignableFrom(entityType)) { if (Collection.class.isAssignableFrom(entityType)) {
writeCollectionInternal((Collection<?>) obj, ClassTypeInformation.LIST, (BasicDBList) bson); writeCollectionInternal((Collection<?>) obj, ClassTypeInformation.LIST, (Collection) bson);
return; return;
} }
@@ -428,7 +456,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
addCustomTypeKeyIfNecessary(typeHint, obj, bson); addCustomTypeKeyIfNecessary(typeHint, obj, bson);
} }
protected void writeInternal(Object obj, final Bson bson, MongoPersistentEntity<?> entity) { protected void writeInternal(@Nullable Object obj, Bson bson, @Nullable MongoPersistentEntity<?> entity) {
if (obj == null) { if (obj == null) {
return; return;
@@ -454,17 +482,16 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} }
private void writeProperties(Bson bson, MongoPersistentEntity<?> entity, PersistentPropertyAccessor accessor, private void writeProperties(Bson bson, MongoPersistentEntity<?> entity, PersistentPropertyAccessor accessor,
DocumentAccessor dbObjectAccessor, MongoPersistentProperty idProperty) { DocumentAccessor dbObjectAccessor, @Nullable MongoPersistentProperty idProperty) {
// Write the properties // Write the properties
for (MongoPersistentProperty prop : entity) { for (MongoPersistentProperty prop : entity) {
if (prop.equals(idProperty) || !prop.isWritable()) { if (prop.equals(idProperty) || !prop.isWritable()) {
continue; continue;
} }
if(prop.isAssociation()) { if (prop.isAssociation()) {
writeAssociation(prop.getAssociation(), accessor, dbObjectAccessor); writeAssociation(prop.getRequiredAssociation(), accessor, dbObjectAccessor);
continue; continue;
} }
@@ -485,13 +512,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
private void writeAssociation(Association<MongoPersistentProperty> association, PersistentPropertyAccessor accessor, private void writeAssociation(Association<MongoPersistentProperty> association, PersistentPropertyAccessor accessor,
DocumentAccessor dbObjectAccessor) { DocumentAccessor dbObjectAccessor) {
MongoPersistentProperty inverseProp = association.getInverse(); MongoPersistentProperty inverseProp = association.getInverse();
writePropertyInternal(accessor.getProperty(inverseProp), dbObjectAccessor, inverseProp); writePropertyInternal(accessor.getProperty(inverseProp), dbObjectAccessor, inverseProp);
} }
@SuppressWarnings({ "unchecked" }) @SuppressWarnings({ "unchecked" })
protected void writePropertyInternal(Object obj, DocumentAccessor accessor, MongoPersistentProperty prop) { protected void writePropertyInternal(@Nullable Object obj, DocumentAccessor accessor, MongoPersistentProperty prop) {
if (obj == null) { if (obj == null) {
return; return;
@@ -549,7 +576,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} }
MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass()) MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass())
? mappingContext.getRequiredPersistentEntity(obj.getClass()) : mappingContext.getRequiredPersistentEntity(type); ? mappingContext.getRequiredPersistentEntity(obj.getClass())
: mappingContext.getRequiredPersistentEntity(type);
Object existingValue = accessor.get(prop); Object existingValue = accessor.get(prop);
Document document = existingValue instanceof Document ? (Document) existingValue : new Document(); Document document = existingValue instanceof Document ? (Document) existingValue : new Document();
@@ -645,17 +673,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} }
/** /**
* Populates the given {@link BasicDBList} with values from the given {@link Collection}. * Populates the given {@link Collection sink} with converted values from the given {@link Collection source}.
* *
* @param source the collection to create a {@link BasicDBList} for, must not be {@literal null}. * @param source the collection to create a {@link Collection} for, must not be {@literal null}.
* @param type the {@link TypeInformation} to consider or {@literal null} if unknown. * @param type the {@link TypeInformation} to consider or {@literal null} if unknown.
* @param sink the {@link BasicDBList} to write to. * @param sink the {@link Collection} to write to.
* @return * @return
*/ */
private BasicDBList writeCollectionInternal(Collection<?> source, TypeInformation<?> type, BasicDBList sink) { @SuppressWarnings("unchecked")
private List<Object> writeCollectionInternal(Collection<?> source, @Nullable TypeInformation<?> type,
Collection<?> sink) {
TypeInformation<?> componentType = null; TypeInformation<?> componentType = null;
List<Object> collection = sink instanceof List ? (List<Object>) sink : new ArrayList<>(sink);
if (type != null) { if (type != null) {
componentType = type.getComponentType(); componentType = type.getComponentType();
} }
@@ -665,17 +697,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Class<?> elementType = element == null ? null : element.getClass(); Class<?> elementType = element == null ? null : element.getClass();
if (elementType == null || conversions.isSimpleType(elementType)) { if (elementType == null || conversions.isSimpleType(elementType)) {
sink.add(getPotentiallyConvertedSimpleWrite(element)); collection.add(getPotentiallyConvertedSimpleWrite(element));
} else if (element instanceof Collection || elementType.isArray()) { } else if (element instanceof Collection || elementType.isArray()) {
sink.add(writeCollectionInternal(asCollection(element), componentType, new BasicDBList())); collection.add(writeCollectionInternal(asCollection(element), componentType, new BasicDBList()));
} else { } else {
Document document = new Document(); Document document = new Document();
writeInternal(element, document, componentType); writeInternal(element, document, componentType);
sink.add(document); collection.add(document);
} }
} }
return sink; return collection;
} }
/** /**
@@ -768,7 +800,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} }
return conversions.hasCustomWriteTarget(key.getClass(), String.class) return conversions.hasCustomWriteTarget(key.getClass(), String.class)
? (String) getPotentiallyConvertedSimpleWrite(key) : key.toString(); ? (String) getPotentiallyConvertedSimpleWrite(key)
: key.toString();
} }
/** /**
@@ -790,7 +823,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param value must not be {@literal null}. * @param value must not be {@literal null}.
* @param bson must not be {@literal null}. * @param bson must not be {@literal null}.
*/ */
protected void addCustomTypeKeyIfNecessary(TypeInformation<?> type, Object value, Bson bson) { protected void addCustomTypeKeyIfNecessary(@Nullable TypeInformation<?> type, Object value, Bson bson) {
Class<?> reference = type != null ? type.getActualType().getType() : Object.class; Class<?> reference = type != null ? type.getActualType().getType() : Object.class;
Class<?> valueType = ClassUtils.getUserClass(value.getClass()); Class<?> valueType = ClassUtils.getUserClass(value.getClass());
@@ -824,7 +857,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param value * @param value
* @return * @return
*/ */
private Object getPotentiallyConvertedSimpleWrite(Object value) { @Nullable
private Object getPotentiallyConvertedSimpleWrite(@Nullable Object value) {
if (value == null) { if (value == null) {
return null; return null;
@@ -855,10 +889,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param target must not be {@literal null}. * @param target must not be {@literal null}.
* @return * @return
*/ */
@Nullable
@SuppressWarnings({ "rawtypes", "unchecked" }) @SuppressWarnings({ "rawtypes", "unchecked" })
private Object getPotentiallyConvertedSimpleRead(Object value, Class<?> target) { private Object getPotentiallyConvertedSimpleRead(@Nullable Object value, @Nullable Class<?> target) {
if (value == null || target == null || target.isAssignableFrom(value.getClass())) { if (value == null || target == null || ClassUtils.isAssignableValue(target, value)) {
return value; return value;
} }
@@ -921,58 +956,61 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* Reads the given {@link BasicDBList} into a collection of the given {@link TypeInformation}. * Reads the given {@link BasicDBList} into a collection of the given {@link TypeInformation}.
* *
* @param targetType must not be {@literal null}. * @param targetType must not be {@literal null}.
* @param sourceValue must not be {@literal null}. * @param source must not be {@literal null}.
* @param path must not be {@literal null}. * @param path must not be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}. * @return the converted {@link Collection} or array, will never be {@literal null}.
*/ */
@SuppressWarnings({ "rawtypes", "unchecked" }) @SuppressWarnings("unchecked")
private Object readCollectionOrArray(TypeInformation<?> targetType, List sourceValue, ObjectPath path) { private Object readCollectionOrArray(TypeInformation<?> targetType, Collection<?> source, ObjectPath path) {
Assert.notNull(targetType, "Target type must not be null!"); Assert.notNull(targetType, "Target type must not be null!");
Assert.notNull(path, "Object path must not be null!"); Assert.notNull(path, "Object path must not be null!");
Class<?> collectionType = targetType.getType(); Class<?> collectionType = targetType.getType();
collectionType = Collection.class.isAssignableFrom(collectionType) //
? collectionType //
: List.class;
TypeInformation<?> componentType = targetType.getComponentType() != null ? targetType.getComponentType() TypeInformation<?> componentType = targetType.getComponentType() != null //
? targetType.getComponentType() //
: ClassTypeInformation.OBJECT; : ClassTypeInformation.OBJECT;
Class<?> rawComponentType = componentType.getType(); Class<?> rawComponentType = componentType.getType();
collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class; Collection<Object> items = targetType.getType().isArray() //
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<>(sourceValue.size()) ? new ArrayList<>(source.size()) //
: CollectionFactory.createCollection(collectionType, rawComponentType, sourceValue.size()); : CollectionFactory.createCollection(collectionType, rawComponentType, source.size());
if (sourceValue.isEmpty()) { if (source.isEmpty()) {
return getPotentiallyConvertedSimpleRead(items, collectionType); return getPotentiallyConvertedSimpleRead(items, targetType.getType());
} }
if (!DBRef.class.equals(rawComponentType) && isCollectionOfDbRefWhereBulkFetchIsPossible(sourceValue)) { if (!DBRef.class.equals(rawComponentType) && isCollectionOfDbRefWhereBulkFetchIsPossible(source)) {
List<Object> objects = bulkReadAndConvertDBRefs((List<DBRef>) sourceValue, componentType, path, rawComponentType); List<Object> objects = bulkReadAndConvertDBRefs((List<DBRef>) source, componentType, path, rawComponentType);
return getPotentiallyConvertedSimpleRead(objects, targetType.getType()); return getPotentiallyConvertedSimpleRead(objects, targetType.getType());
} }
for (Object dbObjItem : sourceValue) { for (Object element : source) {
if (dbObjItem instanceof DBRef) { if (element instanceof DBRef) {
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem items.add(DBRef.class.equals(rawComponentType) ? element
: readAndConvertDBRef((DBRef) dbObjItem, componentType, path, rawComponentType)); : readAndConvertDBRef((DBRef) element, componentType, path, rawComponentType));
} else if (dbObjItem instanceof Document) { } else if (element instanceof Document) {
items.add(read(componentType, (Document) dbObjItem, path)); items.add(read(componentType, (Document) element, path));
} else if (dbObjItem instanceof BasicDBObject) { } else if (element instanceof BasicDBObject) {
items.add(read(componentType, (BasicDBObject) dbObjItem, path)); items.add(read(componentType, (BasicDBObject) element, path));
} else { } else {
if (dbObjItem instanceof Collection) { if (!Object.class.equals(rawComponentType) && element instanceof Collection) {
if (!rawComponentType.isArray() && !ClassUtils.isAssignable(Iterable.class, rawComponentType)) { if (!rawComponentType.isArray() && !ClassUtils.isAssignable(Iterable.class, rawComponentType)) {
throw new MappingException( throw new MappingException(
String.format(INCOMPATIBLE_TYPES, dbObjItem, dbObjItem.getClass(), rawComponentType, path)); String.format(INCOMPATIBLE_TYPES, element, element.getClass(), rawComponentType, path));
} }
} }
if (element instanceof List) {
if (dbObjItem instanceof List) { items.add(readCollectionOrArray(componentType, (Collection<Object>) element, path));
items.add(readCollectionOrArray(ClassTypeInformation.OBJECT, (List) dbObjItem, path));
} else { } else {
items.add(getPotentiallyConvertedSimpleRead(dbObjItem, rawComponentType)); items.add(getPotentiallyConvertedSimpleRead(element, rawComponentType));
} }
} }
} }
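The reworked readCollectionOrArray(...) picks its target container before converting elements. A small sketch of that selection step using Spring's CollectionFactory, with illustrative parameter names:

import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

import org.springframework.core.CollectionFactory;

class CollectionTargets {

    // Arrays are collected into an ArrayList first; otherwise CollectionFactory instantiates a
    // collection matching the declared type, falling back to List for non-Collection targets.
    static Collection<Object> createTarget(Class<?> declaredType, Class<?> componentType, int size) {

        if (declaredType.isArray()) {
            return new ArrayList<>(size);
        }

        Class<?> collectionType = Collection.class.isAssignableFrom(declaredType) ? declaredType : List.class;
        return CollectionFactory.createCollection(collectionType, componentType, size);
    }
}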
@@ -1033,8 +1071,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
map.put(key, DBRef.class.equals(rawValueType) ? value map.put(key, DBRef.class.equals(rawValueType) ? value
: readAndConvertDBRef((DBRef) value, defaultedValueType, ObjectPath.ROOT, rawValueType)); : readAndConvertDBRef((DBRef) value, defaultedValueType, ObjectPath.ROOT, rawValueType));
} else if (value instanceof List) { } else if (value instanceof List) {
map.put(key, map.put(key, readCollectionOrArray(valueType != null ? valueType : ClassTypeInformation.LIST,
readCollectionOrArray(valueType != null ? valueType : ClassTypeInformation.LIST, (List) value, path)); (List<Object>) value, path));
} else { } else {
map.put(key, getPotentiallyConvertedSimpleRead(value, rawValueType)); map.put(key, getPotentiallyConvertedSimpleRead(value, rawValueType));
} }
@@ -1058,7 +1096,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
String.format("Cannot read %s. as map. Given Bson must be a Document or DBObject!", bson.getClass())); String.format("Cannot read %s. as map. Given Bson must be a Document or DBObject!", bson.getClass()));
} }
private static void addToMap(Bson bson, String key, Object value) { private static void addToMap(Bson bson, String key, @Nullable Object value) {
if (bson instanceof Document) { if (bson instanceof Document) {
((Document) bson).put(key, value); ((Document) bson).put(key, value);
@@ -1072,8 +1110,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
"Cannot add key/value pair to %s. as map. Given Bson must be a Document or DBObject!", bson.getClass())); "Cannot add key/value pair to %s. as map. Given Bson must be a Document or DBObject!", bson.getClass()));
} }
@SuppressWarnings("unchecked") private static void addAllToMap(Bson bson, Map<String, ?> value) {
private static void addAllToMap(Bson bson, Map value) {
if (bson instanceof Document) { if (bson instanceof Document) {
((Document) bson).putAll(value); ((Document) bson).putAll(value);
@@ -1109,8 +1146,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object, org.springframework.data.util.TypeInformation) * @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object, org.springframework.data.util.TypeInformation)
*/ */
@Nullable
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
public Object convertToMongoType(Object obj, TypeInformation<?> typeInformation) { @Override
public Object convertToMongoType(@Nullable Object obj, TypeInformation<?> typeInformation) {
if (obj == null) { if (obj == null) {
return null; return null;
@@ -1126,10 +1165,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return getPotentiallyConvertedSimpleWrite(obj); return getPotentiallyConvertedSimpleWrite(obj);
} }
TypeInformation<?> typeHint = typeInformation;
if (obj instanceof List) { if (obj instanceof List) {
return maybeConvertList((List<Object>) obj, typeHint); return maybeConvertList((List<Object>) obj, typeInformation);
} }
if (obj instanceof Document) { if (obj instanceof Document) {
@@ -1137,7 +1174,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Document newValueDocument = new Document(); Document newValueDocument = new Document();
for (String vk : ((Document) obj).keySet()) { for (String vk : ((Document) obj).keySet()) {
Object o = ((Document) obj).get(vk); Object o = ((Document) obj).get(vk);
newValueDocument.put(vk, convertToMongoType(o, typeHint)); newValueDocument.put(vk, convertToMongoType(o, typeInformation));
} }
return newValueDocument; return newValueDocument;
} }
@@ -1148,7 +1185,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
for (String vk : ((DBObject) obj).keySet()) { for (String vk : ((DBObject) obj).keySet()) {
Object o = ((DBObject) obj).get(vk); Object o = ((DBObject) obj).get(vk);
newValueDbo.put(vk, convertToMongoType(o, typeHint)); newValueDbo.put(vk, convertToMongoType(o, typeInformation));
} }
return newValueDbo; return newValueDbo;
@@ -1159,18 +1196,18 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Document result = new Document(); Document result = new Document();
for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) { for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue(), typeHint)); result.put(entry.getKey().toString(), convertToMongoType(entry.getValue(), typeInformation));
} }
return result; return result;
} }
if (obj.getClass().isArray()) { if (obj.getClass().isArray()) {
return maybeConvertList(Arrays.asList((Object[]) obj), typeHint); return maybeConvertList(Arrays.asList((Object[]) obj), typeInformation);
} }
if (obj instanceof Collection) { if (obj instanceof Collection) {
return maybeConvertList((Collection<?>) obj, typeHint); return maybeConvertList((Collection<?>) obj, typeInformation);
} }
Document newDocument = new Document(); Document newDocument = new Document();
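A hedged usage sketch of the public convertToMongoType(...) entry point whose internals are touched above; the injected MongoConverter is assumed to be the application's configured instance.

import java.util.Arrays;

import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;

class MongoTypeConversion {

    private final MongoConverter converter;

    MongoTypeConversion(MongoConverter converter) {
        this.converter = converter;
    }

    Document tagsCriteria() {
        // Lists, arrays, Documents, DBObjects and Maps are converted recursively;
        // simple and custom-converted types pass through unchanged.
        Object converted = converter.convertToMongoType(Arrays.asList("spring", "data"));
        return new Document("tags", converted);
    }
}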
@@ -1205,6 +1242,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param recursively whether to apply the removal recursively * @param recursively whether to apply the removal recursively
* @return * @return
*/ */
@SuppressWarnings("unchecked")
private Object removeTypeInfo(Object object, boolean recursively) { private Object removeTypeInfo(Object object, boolean recursively) {
if (!(object instanceof Document)) { if (!(object instanceof Document)) {
@@ -1225,7 +1263,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
removeTypeInfo(element, recursively); removeTypeInfo(element, recursively);
} }
} else if (value instanceof List) { } else if (value instanceof List) {
for (Object element : (List) value) { for (Object element : (List<Object>) value) {
removeTypeInfo(element, recursively); removeTypeInfo(element, recursively);
} }
} else { } else {
@@ -1268,7 +1306,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* *
* @param source must not be {@literal null}. * @param source must not be {@literal null}.
* @param evaluator must not be {@literal null}. * @param evaluator must not be {@literal null}.
* @param path can be {@literal null}. * @param path must not be {@literal null}.
*/ */
public MongoDbPropertyValueProvider(Bson source, SpELExpressionEvaluator evaluator, ObjectPath path) { public MongoDbPropertyValueProvider(Bson source, SpELExpressionEvaluator evaluator, ObjectPath path) {
@@ -1287,7 +1325,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* *
* @param accessor must not be {@literal null}. * @param accessor must not be {@literal null}.
* @param evaluator must not be {@literal null}. * @param evaluator must not be {@literal null}.
* @param path can be {@literal null}. * @param path must not be {@literal null}.
*/ */
public MongoDbPropertyValueProvider(DocumentAccessor accessor, SpELExpressionEvaluator evaluator, ObjectPath path) { public MongoDbPropertyValueProvider(DocumentAccessor accessor, SpELExpressionEvaluator evaluator, ObjectPath path) {
@@ -1304,6 +1342,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.convert.PropertyValueProvider#getPropertyValue(org.springframework.data.mapping.PersistentProperty) * @see org.springframework.data.convert.PropertyValueProvider#getPropertyValue(org.springframework.data.mapping.PersistentProperty)
*/ */
@Nullable
public <T> T getPropertyValue(MongoPersistentProperty property) { public <T> T getPropertyValue(MongoPersistentProperty property) {
String expression = property.getSpelExpression(); String expression = property.getSpelExpression();
@@ -1353,6 +1392,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} }
} }
@Nullable
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
<T> T readValue(Object value, TypeInformation<?> type, ObjectPath path) { <T> T readValue(Object value, TypeInformation<?> type, ObjectPath path) {
@@ -1363,7 +1403,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} else if (value instanceof DBRef) { } else if (value instanceof DBRef) {
return potentiallyReadOrResolveDbRef((DBRef) value, type, path, rawType); return potentiallyReadOrResolveDbRef((DBRef) value, type, path, rawType);
} else if (value instanceof List) { } else if (value instanceof List) {
return (T) readCollectionOrArray(type, (List) value, path); return (T) readCollectionOrArray(type, (List<Object>) value, path);
} else if (value instanceof Document) { } else if (value instanceof Document) {
return (T) read(type, (Document) value, path); return (T) read(type, (Document) value, path);
} else if (value instanceof DBObject) { } else if (value instanceof DBObject) {
@@ -1373,18 +1413,22 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} }
} }
@Nullable
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
private <T> T potentiallyReadOrResolveDbRef(DBRef dbref, TypeInformation<?> type, ObjectPath path, Class<?> rawType) { private <T> T potentiallyReadOrResolveDbRef(@Nullable DBRef dbref, TypeInformation<?> type, ObjectPath path,
Class<?> rawType) {
if (rawType.equals(DBRef.class)) { if (rawType.equals(DBRef.class)) {
return (T) dbref; return (T) dbref;
} }
T object = dbref == null ? null : path.getPathItem(dbref.getId(), dbref.getCollectionName(), (Class<T>) rawType); T object = dbref == null ? null : path.getPathItem(dbref.getId(), dbref.getCollectionName(), (Class<T>) rawType);
return object != null ? object : readAndConvertDBRef(dbref, type, path, rawType); return object != null ? object : readAndConvertDBRef(dbref, type, path, rawType);
} }
private <T> T readAndConvertDBRef(DBRef dbref, TypeInformation<?> type, ObjectPath path, final Class<?> rawType) { @Nullable
private <T> T readAndConvertDBRef(@Nullable DBRef dbref, TypeInformation<?> type, ObjectPath path,
final Class<?> rawType) {
List<T> result = bulkReadAndConvertDBRefs(Collections.singletonList(dbref), type, path, rawType); List<T> result = bulkReadAndConvertDBRefs(Collections.singletonList(dbref), type, path, rawType);
return CollectionUtils.isEmpty(result) ? null : result.iterator().next(); return CollectionUtils.isEmpty(result) ? null : result.iterator().next();
@@ -1414,7 +1458,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} }
List<Document> referencedRawDocuments = dbrefs.size() == 1 List<Document> referencedRawDocuments = dbrefs.size() == 1
? Collections.singletonList(readRef(dbrefs.iterator().next())) : bulkReadRefs(dbrefs); ? Collections.singletonList(readRef(dbrefs.iterator().next()))
: bulkReadRefs(dbrefs);
String collectionName = dbrefs.iterator().next().getCollectionName(); String collectionName = dbrefs.iterator().next().getCollectionName();
List<T> targeList = new ArrayList<>(dbrefs.size()); List<T> targeList = new ArrayList<>(dbrefs.size());
@@ -1474,7 +1519,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param source must not be {@literal null}. * @param source must not be {@literal null}.
* @return * @return
*/ */
private static boolean isCollectionOfDbRefWhereBulkFetchIsPossible(Iterable<Object> source) { private static boolean isCollectionOfDbRefWhereBulkFetchIsPossible(Iterable<?> source) {
Assert.notNull(source, "Iterable of DBRefs must not be null!"); Assert.notNull(source, "Iterable of DBRefs must not be null!");
@@ -1506,4 +1551,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
static class NestedDocument { static class NestedDocument {
} }
enum NoOpParameterValueProvider implements ParameterValueProvider<MongoPersistentProperty> {
INSTANCE;
@Override
public <T> T getParameterValue(Parameter<T, MongoPersistentProperty> parameter) {
return null;
}
}
} }


@@ -1,44 +1,44 @@
/* /*
* Copyright 2010-2016 the original author or authors. * Copyright 2010-2016 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* http://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and * See the License for the specific language governing permissions and
* limitations under the License. * limitations under the License.
*/ */
package org.springframework.data.mongodb.core.convert; package org.springframework.data.mongodb.core.convert;
import org.bson.Document; import org.bson.Document;
import org.bson.conversions.Bson; import org.bson.conversions.Bson;
import org.springframework.data.convert.EntityConverter; import org.springframework.data.convert.EntityConverter;
import org.springframework.data.convert.EntityReader; import org.springframework.data.convert.EntityReader;
import org.springframework.data.convert.TypeMapper; import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity; import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty; import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
/** /**
* Central Mongo specific converter interface which combines {@link MongoWriter} and {@link MongoReader}. * Central Mongo specific converter interface which combines {@link MongoWriter} and {@link MongoReader}.
* *
* @author Oliver Gierke * @author Oliver Gierke
* @author Thomas Darimont * @author Thomas Darimont
* @author Christoph Strobl * @author Christoph Strobl
*/ */
public interface MongoConverter public interface MongoConverter
extends EntityConverter<MongoPersistentEntity<?>, MongoPersistentProperty, Object, Bson>, MongoWriter<Object>, extends EntityConverter<MongoPersistentEntity<?>, MongoPersistentProperty, Object, Bson>, MongoWriter<Object>,
EntityReader<Object, Bson> { EntityReader<Object, Bson> {
/** /**
* Returns the {@link TypeMapper} being used to write type information into {@link Document}s created with that * Returns the {@link TypeMapper} being used to write type information into {@link Document}s created with that
* converter. * converter.
* *
* @return will never be {@literal null}. * @return will never be {@literal null}.
*/ */
MongoTypeMapper getTypeMapper(); MongoTypeMapper getTypeMapper();
} }
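A hedged sketch of using the exposed MongoTypeMapper to stamp a type hint onto a hand-built Document; the helper class is hypothetical and assumes the application's configured MongoConverter is injected.

import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;

class TypeHintStamping {

    private final MongoConverter converter;

    TypeHintStamping(MongoConverter converter) {
        this.converter = converter;
    }

    Document withTypeHint(Document raw, Class<?> type) {
        converter.getTypeMapper().writeType(type, raw); // writes the _class attribute by default
        return raw;
    }
}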


@@ -1,5 +1,5 @@
/* /*
* Copyright 2011-2016 the original author or authors. * Copyright 2011-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -15,8 +15,6 @@
*/ */
package org.springframework.data.mongodb.core.convert; package org.springframework.data.mongodb.core.convert;
import reactor.core.publisher.Flux;
import java.math.BigDecimal; import java.math.BigDecimal;
import java.math.BigInteger; import java.math.BigInteger;
import java.net.MalformedURLException; import java.net.MalformedURLException;
@@ -29,9 +27,9 @@ import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong; import java.util.concurrent.atomic.AtomicLong;
import org.bson.Document; import org.bson.Document;
import org.bson.types.Binary;
import org.bson.types.Code; import org.bson.types.Code;
import org.bson.types.ObjectId; import org.bson.types.ObjectId;
import org.reactivestreams.Publisher;
import org.springframework.core.convert.ConversionFailedException; import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor; import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.ConditionalConverter; import org.springframework.core.convert.converter.ConditionalConverter;
@@ -41,6 +39,7 @@ import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter; import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.query.Term; import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.script.NamedMongoScript; import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import org.springframework.util.NumberUtils; import org.springframework.util.NumberUtils;
import org.springframework.util.StringUtils; import org.springframework.util.StringUtils;
@@ -66,9 +65,9 @@ abstract class MongoConverters {
* @return * @return
* @since 1.9 * @since 1.9
*/ */
public static Collection<Object> getConvertersToRegister() { static Collection<Object> getConvertersToRegister() {
List<Object> converters = new ArrayList<Object>(); List<Object> converters = new ArrayList<>();
converters.add(BigDecimalToStringConverter.INSTANCE); converters.add(BigDecimalToStringConverter.INSTANCE);
converters.add(StringToBigDecimalConverter.INSTANCE); converters.add(StringToBigDecimalConverter.INSTANCE);
@@ -86,6 +85,7 @@ abstract class MongoConverters {
converters.add(AtomicLongToLongConverter.INSTANCE); converters.add(AtomicLongToLongConverter.INSTANCE);
converters.add(LongToAtomicLongConverter.INSTANCE); converters.add(LongToAtomicLongConverter.INSTANCE);
converters.add(IntegerToAtomicIntegerConverter.INSTANCE); converters.add(IntegerToAtomicIntegerConverter.INSTANCE);
converters.add(BinaryToByteArrayConverter.INSTANCE);
return converters; return converters;
} }
@@ -95,7 +95,7 @@ abstract class MongoConverters {
* *
* @author Oliver Gierke * @author Oliver Gierke
*/ */
public static enum ObjectIdToStringConverter implements Converter<ObjectId, String> { enum ObjectIdToStringConverter implements Converter<ObjectId, String> {
INSTANCE; INSTANCE;
public String convert(ObjectId id) { public String convert(ObjectId id) {
@@ -108,7 +108,7 @@ abstract class MongoConverters {
* *
* @author Oliver Gierke * @author Oliver Gierke
*/ */
public static enum StringToObjectIdConverter implements Converter<String, ObjectId> { enum StringToObjectIdConverter implements Converter<String, ObjectId> {
INSTANCE; INSTANCE;
public ObjectId convert(String source) { public ObjectId convert(String source) {
@@ -121,7 +121,7 @@ abstract class MongoConverters {
* *
* @author Oliver Gierke * @author Oliver Gierke
*/ */
public static enum ObjectIdToBigIntegerConverter implements Converter<ObjectId, BigInteger> { enum ObjectIdToBigIntegerConverter implements Converter<ObjectId, BigInteger> {
INSTANCE; INSTANCE;
public BigInteger convert(ObjectId source) { public BigInteger convert(ObjectId source) {
@@ -134,7 +134,7 @@ abstract class MongoConverters {
* *
* @author Oliver Gierke * @author Oliver Gierke
*/ */
public static enum BigIntegerToObjectIdConverter implements Converter<BigInteger, ObjectId> { enum BigIntegerToObjectIdConverter implements Converter<BigInteger, ObjectId> {
INSTANCE; INSTANCE;
public ObjectId convert(BigInteger source) { public ObjectId convert(BigInteger source) {
@@ -142,7 +142,7 @@ abstract class MongoConverters {
} }
} }
public static enum BigDecimalToStringConverter implements Converter<BigDecimal, String> { enum BigDecimalToStringConverter implements Converter<BigDecimal, String> {
INSTANCE; INSTANCE;
public String convert(BigDecimal source) { public String convert(BigDecimal source) {
@@ -150,7 +150,7 @@ abstract class MongoConverters {
} }
} }
public static enum StringToBigDecimalConverter implements Converter<String, BigDecimal> { enum StringToBigDecimalConverter implements Converter<String, BigDecimal> {
INSTANCE; INSTANCE;
public BigDecimal convert(String source) { public BigDecimal convert(String source) {
@@ -158,7 +158,7 @@ abstract class MongoConverters {
} }
} }
public static enum BigIntegerToStringConverter implements Converter<BigInteger, String> { enum BigIntegerToStringConverter implements Converter<BigInteger, String> {
INSTANCE; INSTANCE;
public String convert(BigInteger source) { public String convert(BigInteger source) {
@@ -166,7 +166,7 @@ abstract class MongoConverters {
} }
} }
public static enum StringToBigIntegerConverter implements Converter<String, BigInteger> { enum StringToBigIntegerConverter implements Converter<String, BigInteger> {
INSTANCE; INSTANCE;
public BigInteger convert(String source) { public BigInteger convert(String source) {
@@ -174,7 +174,7 @@ abstract class MongoConverters {
} }
} }
public static enum URLToStringConverter implements Converter<URL, String> { enum URLToStringConverter implements Converter<URL, String> {
INSTANCE; INSTANCE;
public String convert(URL source) { public String convert(URL source) {
@@ -182,7 +182,7 @@ abstract class MongoConverters {
} }
} }
public static enum StringToURLConverter implements Converter<String, URL> { enum StringToURLConverter implements Converter<String, URL> {
INSTANCE; INSTANCE;
private static final TypeDescriptor SOURCE = TypeDescriptor.valueOf(String.class); private static final TypeDescriptor SOURCE = TypeDescriptor.valueOf(String.class);
@@ -199,7 +199,7 @@ abstract class MongoConverters {
} }
@ReadingConverter @ReadingConverter
public static enum DocumentToStringConverter implements Converter<Document, String> { enum DocumentToStringConverter implements Converter<Document, String> {
INSTANCE; INSTANCE;
@@ -219,7 +219,7 @@ abstract class MongoConverters {
* @since 1.6 * @since 1.6
*/ */
@WritingConverter @WritingConverter
public static enum TermToStringConverter implements Converter<Term, String> { enum TermToStringConverter implements Converter<Term, String> {
INSTANCE; INSTANCE;
@@ -233,7 +233,7 @@ abstract class MongoConverters {
* @author Christoph Strobl * @author Christoph Strobl
* @since 1.7 * @since 1.7
*/ */
public static enum DocumentToNamedMongoScriptConverter implements Converter<Document, NamedMongoScript> { enum DocumentToNamedMongoScriptConverter implements Converter<Document, NamedMongoScript> {
INSTANCE; INSTANCE;
@@ -255,7 +255,7 @@ abstract class MongoConverters {
* @author Christoph Strobl * @author Christoph Strobl
* @since 1.7 * @since 1.7
*/ */
public static enum NamedMongoScriptToDocumentConverter implements Converter<NamedMongoScript, Document> { enum NamedMongoScriptToDocumentConverter implements Converter<NamedMongoScript, Document> {
INSTANCE; INSTANCE;
@@ -282,7 +282,7 @@ abstract class MongoConverters {
* @since 1.9 * @since 1.9
*/ */
@WritingConverter @WritingConverter
public static enum CurrencyToStringConverter implements Converter<Currency, String> { enum CurrencyToStringConverter implements Converter<Currency, String> {
INSTANCE; INSTANCE;
@@ -303,7 +303,7 @@ abstract class MongoConverters {
* @since 1.9 * @since 1.9
*/ */
@ReadingConverter @ReadingConverter
public static enum StringToCurrencyConverter implements Converter<String, Currency> { enum StringToCurrencyConverter implements Converter<String, Currency> {
INSTANCE; INSTANCE;
@@ -326,7 +326,7 @@ abstract class MongoConverters {
* @since 1.9 * @since 1.9
*/ */
@WritingConverter @WritingConverter
public static enum NumberToNumberConverterFactory implements ConverterFactory<Number, Number>, ConditionalConverter { enum NumberToNumberConverterFactory implements ConverterFactory<Number, Number>, ConditionalConverter {
INSTANCE; INSTANCE;
@@ -391,7 +391,7 @@ abstract class MongoConverters {
* @since 1.10 * @since 1.10
*/ */
@WritingConverter @WritingConverter
public static enum AtomicLongToLongConverter implements Converter<AtomicLong, Long> { enum AtomicLongToLongConverter implements Converter<AtomicLong, Long> {
INSTANCE; INSTANCE;
@Override @Override
@@ -407,7 +407,7 @@ abstract class MongoConverters {
* @since 1.10 * @since 1.10
*/ */
@WritingConverter @WritingConverter
public static enum AtomicIntegerToIntegerConverter implements Converter<AtomicInteger, Integer> { enum AtomicIntegerToIntegerConverter implements Converter<AtomicInteger, Integer> {
INSTANCE; INSTANCE;
@Override @Override
@@ -423,7 +423,7 @@ abstract class MongoConverters {
* @since 1.10 * @since 1.10
*/ */
@ReadingConverter @ReadingConverter
public static enum LongToAtomicLongConverter implements Converter<Long, AtomicLong> { enum LongToAtomicLongConverter implements Converter<Long, AtomicLong> {
INSTANCE; INSTANCE;
@Override @Override
@@ -439,7 +439,7 @@ abstract class MongoConverters {
* @since 1.10 * @since 1.10
*/ */
@ReadingConverter @ReadingConverter
public static enum IntegerToAtomicIntegerConverter implements Converter<Integer, AtomicInteger> { enum IntegerToAtomicIntegerConverter implements Converter<Integer, AtomicInteger> {
INSTANCE; INSTANCE;
@Override @Override
@@ -447,4 +447,22 @@ abstract class MongoConverters {
return source != null ? new AtomicInteger(source) : null; return source != null ? new AtomicInteger(source) : null;
} }
} }
/**
* {@link Converter} implementation converting {@link Binary} into {@code byte[]}.
*
* @author Christoph Strobl
* @since 2.0.1
*/
@ReadingConverter
enum BinaryToByteArrayConverter implements Converter<Binary, byte[]> {
INSTANCE;
@Nullable
@Override
public byte[] convert(Binary source) {
return source.getData();
}
}
} }
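The newly registered BinaryToByteArrayConverter follows the same enum-singleton converter shape users can adopt themselves. A hedged sketch of registering an additional reading converter via MongoCustomConversions; the converter below is illustrative only and charset handling is omitted:

import java.util.Collections;

import org.bson.types.Binary;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;

class ConverterSetup {

    @ReadingConverter
    enum BinaryToStringConverter implements Converter<Binary, String> {

        INSTANCE;

        @Override
        public String convert(Binary source) {
            return new String(source.getData()); // charset handling omitted for brevity
        }
    }

    MongoCustomConversions customConversions() {
        return new MongoCustomConversions(Collections.singletonList(BinaryToStringConverter.INSTANCE));
    }
}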


@@ -28,6 +28,7 @@ import org.springframework.core.convert.converter.GenericConverter;
import org.springframework.data.convert.JodaTimeConverters; import org.springframework.data.convert.JodaTimeConverters;
import org.springframework.data.convert.WritingConverter; import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes; import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.lang.Nullable;
/** /**
* Value object to capture custom conversion. {@link MongoCustomConversions} also act as factory for * Value object to capture custom conversion. {@link MongoCustomConversions} also act as factory for
@@ -94,8 +95,8 @@ public class MongoCustomConversions extends org.springframework.data.convert.Cus
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.core.convert.converter.GenericConverter#convert(java.lang.Object, org.springframework.core.convert.TypeDescriptor, org.springframework.core.convert.TypeDescriptor) * @see org.springframework.core.convert.converter.GenericConverter#convert(java.lang.Object, org.springframework.core.convert.TypeDescriptor, org.springframework.core.convert.TypeDescriptor)
*/ */
public Object convert(Object source, TypeDescriptor sourceType, TypeDescriptor targetType) { public Object convert(@Nullable Object source, TypeDescriptor sourceType, TypeDescriptor targetType) {
return source.toString(); return source != null ? source.toString() : null;
} }
} }
} }


@@ -39,23 +39,35 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
 import org.springframework.data.mongodb.core.query.MongoRegexCreator;
 import org.springframework.data.mongodb.core.query.MongoRegexCreator.MatchMode;
 import org.springframework.data.mongodb.core.query.SerializationUtils;
+import org.springframework.data.mongodb.core.query.UntypedExampleMatcher;
 import org.springframework.data.support.ExampleMatcherAccessor;
 import org.springframework.data.util.TypeInformation;
 import org.springframework.util.Assert;
+import org.springframework.util.ClassUtils;
 import org.springframework.util.ObjectUtils;
 import org.springframework.util.StringUtils;

 /**
+ * Mapper from {@link Example} to a query {@link Document}.
+ *
  * @author Christoph Strobl
  * @author Mark Paluch
  * @author Jens Schauder
  * @since 1.8
+ * @see Example
+ * @see org.springframework.data.domain.ExampleMatcher
+ * @see UntypedExampleMatcher
  */
 public class MongoExampleMapper {

 	private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
 	private final MongoConverter converter;

+	/**
+	 * Create a new {@link MongoTypeMapper} given {@link MongoConverter}.
+	 *
+	 * @param converter must not be {@literal null}.
+	 */
 	public MongoExampleMapper(MongoConverter converter) {

 		this.converter = converter;

@@ -91,7 +103,7 @@ public class MongoExampleMapper {
 		Document reference = (Document) converter.convertToMongoType(example.getProbe());

-		if (entity.getIdProperty() != null) {
+		if (entity.getIdProperty() != null && ClassUtils.isAssignable(entity.getType(), example.getProbeType())) {

 			Object identifier = entity.getIdentifierAccessor(example.getProbe()).getIdentifier();

 			if (identifier == null) {

@@ -107,85 +119,17 @@ public class MongoExampleMapper {
 				: new Document(SerializationUtils.flattenMap(reference));
 		Document result = example.getMatcher().isAllMatching() ? flattened : orConcatenate(flattened);

-		this.converter.getTypeMapper().writeTypeRestrictions(result, getTypesToMatch(example));
-		return result;
-	}
-
-	private static Document orConcatenate(Document source) {
-
-		List<Document> foo = new ArrayList<Document>(source.keySet().size());
-
-		for (String key : source.keySet()) {
-			foo.add(new Document(key, source.get(key)));
-		}
-
-		return new Document("$or", foo);
-	}
-
-	private Set<Class<?>> getTypesToMatch(Example<?> example) {
-
-		Set<Class<?>> types = new HashSet<Class<?>>();
-
-		for (TypeInformation<?> reference : mappingContext.getManagedTypes()) {
-			if (example.getProbeType().isAssignableFrom(reference.getType())) {
-				types.add(reference.getType());
-			}
-		}
-
-		return types;
-	}
-
-	private String getMappedPropertyPath(String path, Class<?> probeType) {
-
-		MongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(probeType);
-
-		Iterator<String> parts = Arrays.asList(path.split("\\.")).iterator();
-
-		final Stack<MongoPersistentProperty> stack = new Stack<MongoPersistentProperty>();
-
-		List<String> resultParts = new ArrayList<String>();
-
-		while (parts.hasNext()) {
-
-			String part = parts.next();
-			MongoPersistentProperty prop = entity.getPersistentProperty(part);
-
-			if (prop == null) {
-
-				entity.doWithProperties((PropertyHandler<MongoPersistentProperty>) property -> {
-					if (property.getFieldName().equals(part)) {
-						stack.push(property);
-					}
-				});
-
-				if (stack.isEmpty()) {
-					return "";
-				}
-
-				prop = stack.pop();
-			}
-
-			resultParts.add(prop.getName());
-
-			if (prop.isEntity() && mappingContext.hasPersistentEntityFor(prop.getActualType())) {
-				entity = mappingContext.getRequiredPersistentEntity(prop.getActualType());
-			} else {
-				break;
-			}
-		}
-
-		return StringUtils.collectionToDelimitedString(resultParts, ".");
+		return updateTypeRestrictions(result, example);
 	}

 	private void applyPropertySpecs(String path, Document source, Class<?> probeType,
 			ExampleMatcherAccessor exampleSpecAccessor) {

-		if (!(source instanceof Document)) {
+		if (source == null) {
 			return;
 		}

-		Iterator<Map.Entry<String, Object>> iter = ((Document) source).entrySet().iterator();
+		Iterator<Map.Entry<String, Object>> iter = source.entrySet().iterator();

 		while (iter.hasNext()) {

@@ -237,11 +181,106 @@ public class MongoExampleMapper {
 		}
 	}

-	private boolean isEmptyIdProperty(Entry<String, Object> entry) {
+	private String getMappedPropertyPath(String path, Class<?> probeType) {
+
+		MongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(probeType);
+
+		Iterator<String> parts = Arrays.asList(path.split("\\.")).iterator();
+
+		final Stack<MongoPersistentProperty> stack = new Stack<>();
+
+		List<String> resultParts = new ArrayList<>();
+
+		while (parts.hasNext()) {
+
+			String part = parts.next();
+			MongoPersistentProperty prop = entity.getPersistentProperty(part);
+
+			if (prop == null) {
+
+				entity.doWithProperties((PropertyHandler<MongoPersistentProperty>) property -> {
+					if (property.getFieldName().equals(part)) {
+						stack.push(property);
+					}
+				});
+
+				if (stack.isEmpty()) {
+					return "";
+				}
+
+				prop = stack.pop();
+			}
+
+			resultParts.add(prop.getName());
+
+			if (prop.isEntity() && mappingContext.hasPersistentEntityFor(prop.getActualType())) {
+				entity = mappingContext.getRequiredPersistentEntity(prop.getActualType());
+			} else {
+				break;
+			}
+		}
+
+		return StringUtils.collectionToDelimitedString(resultParts, ".");
+	}
+
+	private Document updateTypeRestrictions(Document query, Example example) {
+
+		Document result = new Document();
+
+		if (isTypeRestricting(example)) {
+
+			result.putAll(query);
+			this.converter.getTypeMapper().writeTypeRestrictions(result, getTypesToMatch(example));
+			return result;
+		}
+
+		for (Map.Entry<String, Object> entry : query.entrySet()) {
+			if (!this.converter.getTypeMapper().isTypeKey(entry.getKey())) {
+				result.put(entry.getKey(), entry.getValue());
+			}
+		}
+
+		return result;
+	}
+
+	private boolean isTypeRestricting(Example example) {
+
+		if (example.getMatcher() instanceof UntypedExampleMatcher) {
+			return false;
+		}
+
+		if (example.getMatcher().getIgnoredPaths().isEmpty()) {
+			return true;
+		}
+
+		for (String path : example.getMatcher().getIgnoredPaths()) {
+			if (this.converter.getTypeMapper().isTypeKey(path)) {
+				return false;
+			}
+		}
+
+		return true;
+	}
+
+	private Set<Class<?>> getTypesToMatch(Example<?> example) {
+
+		Set<Class<?>> types = new HashSet<>();
+
+		for (TypeInformation<?> reference : mappingContext.getManagedTypes()) {
+			if (example.getProbeType().isAssignableFrom(reference.getType())) {
+				types.add(reference.getType());
+			}
+		}
+
+		return types;
+	}
+
+	private static boolean isEmptyIdProperty(Entry<String, Object> entry) {
 		return entry.getKey().equals("_id") && entry.getValue() == null || entry.getValue().equals(Optional.empty());
 	}

-	private void applyStringMatcher(Map.Entry<String, Object> entry, StringMatcher stringMatcher, boolean ignoreCase) {
+	private static void applyStringMatcher(Map.Entry<String, Object> entry, StringMatcher stringMatcher,
+			boolean ignoreCase) {

 		Document document = new Document();

@@ -264,9 +303,20 @@ public class MongoExampleMapper {
 		}
 	}

+	private static Document orConcatenate(Document source) {
+
+		List<Document> or = new ArrayList<>(source.keySet().size());
+
+		for (String key : source.keySet()) {
+			or.add(new Document(key, source.get(key)));
+		}
+
+		return new Document("$or", or);
+	}
+
 	/**
 	 * Return the {@link MatchMode} for the given {@link StringMatcher}.
 	 *
 	 * @param matcher must not be {@literal null}.
 	 * @return
 	 */
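The practical effect of the UntypedExampleMatcher handling above is that Query-by-Example can skip the _class type restriction, so a probe object no longer has to be of the mapped type of the documents it matches. A minimal sketch, assuming hypothetical PersonProbe/Person classes and a "people" collection (all illustrative):

import static org.springframework.data.mongodb.core.query.Criteria.byExample;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.List;

import org.springframework.data.domain.Example;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.UntypedExampleMatcher;

class ExampleQueries {

	// Hypothetical probe type; only the field names need to line up with the stored documents.
	static class PersonProbe {
		String lastname;
	}

	// Hypothetical target type mapped to the same collection.
	static class Person {
		String id;
		String firstname;
		String lastname;
	}

	static List<Person> findByLastname(MongoOperations operations, String lastname) {

		PersonProbe probe = new PersonProbe();
		probe.lastname = lastname;

		// UntypedExampleMatcher suppresses the _class restriction, so the probe type
		// does not have to match the queried type.
		Example<PersonProbe> example = Example.of(probe, UntypedExampleMatcher.matching());

		return operations.find(query(byExample(example)), Person.class, "people");
	}
}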


@@ -1,64 +1,69 @@
/* /*
* Copyright 2010-2016 the original author or authors. * Copyright 2010-2016 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* http://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and * See the License for the specific language governing permissions and
* limitations under the License. * limitations under the License.
*/ */
package org.springframework.data.mongodb.core.convert; package org.springframework.data.mongodb.core.convert;
import org.bson.conversions.Bson; import org.bson.conversions.Bson;
import org.springframework.data.convert.EntityWriter; import org.springframework.data.convert.EntityWriter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty; import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.TypeInformation; import org.springframework.data.util.TypeInformation;
import org.springframework.lang.Nullable;
import com.mongodb.DBRef;
import com.mongodb.DBRef;
/**
* A MongoWriter is responsible for converting an object of type T to the native MongoDB representation Document. /**
* * A MongoWriter is responsible for converting an object of type T to the native MongoDB representation Document.
* @param <T> the type of the object to convert to a Document *
* @author Mark Pollack * @param <T> the type of the object to convert to a Document
* @author Thomas Risberg * @author Mark Pollack
* @author Oliver Gierke * @author Thomas Risberg
* @author Christoph Strobl * @author Oliver Gierke
*/ * @author Christoph Strobl
public interface MongoWriter<T> extends EntityWriter<T, Bson> { */
public interface MongoWriter<T> extends EntityWriter<T, Bson> {
/**
* Converts the given object into one Mongo will be able to store natively. If the given object can already be stored /**
* as is, no conversion will happen. * Converts the given object into one Mongo will be able to store natively. If the given object can already be stored
* * as is, no conversion will happen.
* @param obj can be {@literal null}. *
* @return * @param obj can be {@literal null}.
*/ * @return
Object convertToMongoType(Object obj); */
@Nullable
/** default Object convertToMongoType(@Nullable Object obj) {
* Converts the given object into one Mongo will be able to store natively but retains the type information in case return convertToMongoType(obj, null);
* the given {@link TypeInformation} differs from the given object type. }
*
* @param obj can be {@literal null}. /**
* @param typeInformation can be {@literal null}. * Converts the given object into one Mongo will be able to store natively but retains the type information in case
* @return * the given {@link TypeInformation} differs from the given object type.
*/ *
Object convertToMongoType(Object obj, TypeInformation<?> typeInformation); * @param obj can be {@literal null}.
* @param typeInformation can be {@literal null}.
/** * @return
* Creates a {@link DBRef} to refer to the given object. */
* @Nullable
* @param object the object to create a {@link DBRef} to link to. The object's type has to carry an id attribute. Object convertToMongoType(@Nullable Object obj, @Nullable TypeInformation<?> typeInformation);
* @param referingProperty the client-side property referring to the object which might carry additional metadata for
* the {@link DBRef} object to create. Can be {@literal null}. /**
* @return will never be {@literal null}. * Creates a {@link DBRef} to refer to the given object.
*/ *
DBRef toDBRef(Object object, MongoPersistentProperty referingProperty); * @param object the object to create a {@link DBRef} to link to. The object's type has to carry an id attribute.
} * @param referingProperty the client-side property referring to the object which might carry additional metadata for
* the {@link DBRef} object to create. Can be {@literal null}.
* @return will never be {@literal null}.
*/
DBRef toDBRef(Object object, @Nullable MongoPersistentProperty referingProperty);
}
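The single-argument convertToMongoType(obj) is now a default method that delegates to convertToMongoType(obj, null), and the @Nullable annotations document that both input and result may be null. A minimal caller-side sketch, assuming an injected MongoConverter (which implements MongoWriter); the helper class and method names are illustrative only:

import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.lang.Nullable;

class ConversionHelper {

	private final MongoConverter converter; // e.g. a MappingMongoConverter, which implements MongoWriter

	ConversionHelper(MongoConverter converter) {
		this.converter = converter;
	}

	@Nullable
	Object toMongoValue(@Nullable Object value) {
		// Uses the new default method, i.e. convertToMongoType(value, null);
		// the result is annotated @Nullable, so callers should expect null for a null input.
		return converter.convertToMongoType(value);
	}

	Document toDocument(Object entity) {
		// Writing a whole entity still goes through EntityWriter#write(Object, Bson).
		Document sink = new Document();
		converter.write(entity, sink);
		return sink;
	}
}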

Some files were not shown because too many files have changed in this diff.