Compare commits

...

232 Commits

Author SHA1 Message Date
Oliver Drotbohm
6b4432a771 DATAMONGO-2186 - Release version 2.0.14 (Kay SR14). 2019-04-01 19:05:43 +02:00
Oliver Drotbohm
ddb56ab974 DATAMONGO-2186 - Prepare 2.0.14 (Kay SR14). 2019-04-01 19:05:14 +02:00
Oliver Drotbohm
79c15341f3 DATAMONGO-2186 - Updated changelog. 2019-04-01 19:05:09 +02:00
Oliver Drotbohm
a1200bb096 DATAMONGO-2243 - Updated changelog. 2019-04-01 18:52:22 +02:00
Oliver Drotbohm
4f4951d94c DATAMONGO-2185 - Updated changelog. 2019-04-01 13:54:20 +02:00
Spring Operator
3cffabaa8a DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (i.e. if using a URL shortener).

# Fixed URLs

## Fixed Success
These URLs were switched to an https URL with a 2xx status. While the status was successful, your review is still recommended.

* [ ] http://www.apache.org/licenses/ with 1 occurrence migrated to:
  https://www.apache.org/licenses/ ([https](https://www.apache.org/licenses/) result 200).
* [ ] http://www.apache.org/licenses/LICENSE-2.0 with 701 occurrences migrated to:
  https://www.apache.org/licenses/LICENSE-2.0 ([https](https://www.apache.org/licenses/LICENSE-2.0) result 200).

Original Pull Request: #699
2019-03-22 09:55:23 +01:00
Spring Operator
fa70a76980 DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (i.e. if using a URL shortener).

# Fixed URLs

## Fixed Success
These URLs were switched to an https URL with a 2xx status. While the status was successful, your review is still recommended.

* http://maven.apache.org/xsd/maven-4.0.0.xsd with 3 occurrences migrated to:
  https://maven.apache.org/xsd/maven-4.0.0.xsd ([https](https://maven.apache.org/xsd/maven-4.0.0.xsd) result 200).
* http://www.gopivotal.com (302) with 6 occurrences migrated to:
  https://pivotal.io ([https](https://www.gopivotal.com) result 200).
* http://maven.apache.org/maven-v4_0_0.xsd with 2 occurrences migrated to:
  https://maven.apache.org/maven-v4_0_0.xsd ([https](https://maven.apache.org/maven-v4_0_0.xsd) result 301).
* http://projects.spring.io/spring-data-mongodb with 1 occurrence migrated to:
  https://projects.spring.io/spring-data-mongodb ([https](https://projects.spring.io/spring-data-mongodb) result 301).
* http://www.pivotal.io with 1 occurrence migrated to:
  https://www.pivotal.io ([https](https://www.pivotal.io) result 301).

# Ignored
These URLs were intentionally ignored.

* http://maven.apache.org/POM/4.0.0 with 10 occurrences
* http://www.w3.org/2001/XMLSchema-instance with 5 occurrences

Original Pull Request: #665
2019-03-19 12:55:08 +01:00
Christoph Strobl
06c77426e7 DATAMONGO-2225 - Fix potential NPE in MongoExampleMapper. 2019-03-18 13:51:45 +01:00
Christoph Strobl
8be58a1f49 DATAMONGO-2164 - Updated changelog. 2019-03-07 10:30:16 +01:00
Mark Paluch
4a141a251f DATAMONGO-2219 - Fix ReactiveMongoTemplate.findAllAndRemove(…) if the query yields no results.
ReactiveMongoTemplate.findAllAndRemove(…) now completes successfully without emitting a result if the find query yields no hits. We no longer issue the subsequent remove query when there are no prior results.

Original Pull Request: #657
2019-03-05 13:16:31 +01:00
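A minimal sketch of the fixed behavior, assuming a hypothetical Person entity and an injected ReactiveMongoTemplate (names are illustrative only):

```java
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import reactor.test.StepVerifier;

class FindAllAndRemoveSketch {

    static class Person {} // hypothetical mapped entity

    void completesEmptyWithoutRemoving(ReactiveMongoTemplate template) {

        Query query = Query.query(Criteria.where("lastname").is("does-not-exist"));

        // With the fix, an empty find result completes the Flux without emitting
        // anything and without issuing the follow-up remove command.
        template.findAllAndRemove(query, Person.class)
                .as(StepVerifier::create)
                .verifyComplete();
    }
}
```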
Mark Paluch
7d642a3b0b DATAMONGO-2187 - Updated changelog. 2019-02-13 11:47:57 +01:00
Mark Paluch
1efa5ffc18 DATAMONGO-2145 - Updated changelog. 2019-01-10 14:15:41 +01:00
Mark Paluch
f17b699d25 DATAMONGO-2144 - After release cleanups. 2019-01-10 12:03:26 +01:00
Mark Paluch
4320baf636 DATAMONGO-2144 - Prepare next development iteration. 2019-01-10 12:03:24 +01:00
Mark Paluch
ea0502bf0a DATAMONGO-2144 - Release version 2.0.13 (Kay SR13). 2019-01-10 11:10:22 +01:00
Mark Paluch
0eadb05c0d DATAMONGO-2144 - Prepare 2.0.13 (Kay SR13). 2019-01-10 11:09:23 +01:00
Mark Paluch
e160240f46 DATAMONGO-2144 - Updated changelog. 2019-01-10 11:09:17 +01:00
Mark Paluch
6250c95af6 DATAMONGO-2143 - Updated changelog. 2019-01-10 11:01:23 +01:00
Christoph Strobl
0324ae6706 DATAMONGO-2174 - Fix InvalidPersistentPropertyPath exception when updating documents.
MetadataBackedField.getPath() now returns null instead of throwing an error for fields that are not part of the domain model. This allows adding any field when updating an entity.

Original pull request: #633.
2019-01-09 10:34:56 +01:00
Mark Paluch
4b6a058335 DATAMONGO-2170 - Polishing.
Use ObjectUtils to compute the hash code, as the hash code implementation contained artifacts that do not belong there. Extract test method.

Original pull request: #629.
2019-01-07 13:11:39 +01:00
Christoph Strobl
acdee76ff3 DATAMONGO-2170 - Return null instead of empty string for IndexInfo#getPartialFilterExpression when not set.
We now return null instead of an empty string when calling IndexInfo#getPartialFilterExpression. The method had already been marked as returning null values, and we now comply with that contract and return value expectation.

Original pull request: #629.
2019-01-07 13:11:09 +01:00
Mark Paluch
7240823a19 DATAMONGO-2175 - Update copyright years to 2019. 2019-01-02 14:12:38 +01:00
Christoph Strobl
ed7705173c DATAMONGO-2160 - Updated changelog. 2018-12-11 11:43:20 +01:00
Mark Paluch
607730b2fe DATAMONGO-2149 - Polishing.
Add ticket reference to follow-up ticket regarding array matching on partial DBRef expressions.

Related ticket: DATAMONGO-2154
Original pull request: #623.
2018-11-30 14:49:35 +01:00
Christoph Strobl
752544aa64 DATAMONGO-2149 - Fix $slice in fields projection when pointing to array of DBRefs.
We now no longer try to convert the actual slice parameters into a DBRef.

Original pull request: #623.
2018-11-30 14:49:28 +01:00
Mark Paluch
1e4cc2e0e4 DATAMONGO-2121 - Updated changelog. 2018-11-27 14:54:07 +01:00
Mark Paluch
5df59ba852 DATAMONGO-2109 - After release cleanups. 2018-11-27 12:16:34 +01:00
Mark Paluch
3d683f6f02 DATAMONGO-2109 - Prepare next development iteration. 2018-11-27 12:16:32 +01:00
Mark Paluch
2b47f44531 DATAMONGO-2109 - Release version 2.0.12 (Kay SR12). 2018-11-27 11:45:49 +01:00
Mark Paluch
79534fa426 DATAMONGO-2109 - Prepare 2.0.12 (Kay SR12). 2018-11-27 11:44:58 +01:00
Mark Paluch
b8e76ecae4 DATAMONGO-2109 - Updated changelog. 2018-11-27 11:44:52 +01:00
Mark Paluch
3e2b060611 DATAMONGO-2110 - Updated changelog. 2018-11-27 11:27:23 +01:00
Mark Paluch
4699219728 DATAMONGO-2119 - Polishing.
Convert anonymous JSON callback class into a private static one. Use an expressive Pattern constant.

Original pull request: #621.
2018-11-23 09:56:24 +01:00
Christoph Strobl
a483d95cde DATAMONGO-2119 - Allow SpEL usage for annotated $regex query.
Original pull request: #621.
2018-11-23 09:56:24 +01:00
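A hedged sketch of what this enables, using a hypothetical repository and field name; `?#{[0]}` is the SpEL parameter-access syntax for string-based queries:

```java
import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

class Person {} // hypothetical domain type with a "firstname" property

interface PersonRepository extends MongoRepository<Person, String> {

    // The SpEL expression inside $regex is now evaluated when the query is bound.
    @Query("{ 'firstname': { '$regex': ?#{[0]} } }")
    List<Person> findByFirstnameRegex(String regex);
}
```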
Oliver Drotbohm
84f35f5655 DATAMONGO-2135 - Polishing. 2018-11-15 15:30:41 +01:00
Oliver Drotbohm
2ec0f93325 DATAMONGO-2135 - Default to intermediate List for properties typed to Collection.
We now defensively create a List rather than a LinkedHashSet (which Spring's CollectionFactory.createCollection(…) defaults to) to make sure we're not accidentally dropping values that are considered equal according to their Java class definition.
2018-11-15 15:30:30 +01:00
Mark Paluch
48f9422a66 DATAMONGO-2107 - Updated changelog. 2018-10-29 14:30:33 +01:00
Mark Paluch
a2349405af DATAMONGO-2118 - Polishing.
Fix typo in reactive repositories reference documentation.

Original pull request: #611.
2018-10-26 10:08:07 +02:00
Mona Mohamadinia
e12ab354f7 DATAMONGO-2118 - Fix typo in repositories reference documentation.
Original pull request: #611.
2018-10-26 10:08:07 +02:00
Mark Paluch
04b20fa9c0 DATAMONGO-2098 - Polishing.
Annotate methods and parameters with Nullable. Use diamond syntax where appropriate.

Original pull request: #612.
2018-10-25 15:35:31 +02:00
Zied Yaich
81c46f04d6 DATAMONGO-2098 - Fix typo in MappingMongoConverterParser method.
Original pull request: #612.
2018-10-25 15:35:31 +02:00
Mark Paluch
ce905c80fe DATAMONGO-2083 - Updated changelog. 2018-10-15 14:19:05 +02:00
Mark Paluch
831e4f9ef1 DATAMONGO-2084 - After release cleanups. 2018-10-15 12:28:13 +02:00
Mark Paluch
5ce293a871 DATAMONGO-2084 - Prepare next development iteration. 2018-10-15 12:28:12 +02:00
Mark Paluch
bcd61f0dae DATAMONGO-2084 - Release version 2.0.11 (Kay SR11). 2018-10-15 12:00:38 +02:00
Mark Paluch
478594c3ca DATAMONGO-2084 - Prepare 2.0.11 (Kay SR11). 2018-10-15 11:59:39 +02:00
Mark Paluch
bb101d5e18 DATAMONGO-2084 - Updated changelog. 2018-10-15 11:59:31 +02:00
Mark Paluch
a5bc7a2a08 DATAMONGO-2094 - Updated changelog. 2018-10-15 11:37:27 +02:00
Christoph Strobl
6720967e19 DATAMONGO-2101 - Fix DBObject to GeoJson conversion.
Querydsl still wraps MongoDB data in DBObject, which causes trouble with the registered converters that deal with Document-to-entity conversion. Therefore we now try to extract the argument map from the DBObject and transfer it to a Document so that the converters kick in where applicable.

Original pull request: #614.
2018-10-08 09:40:23 +02:00
Mark Paluch
99a4661e81 DATAMONGO-2096 - Polishing.
Migrate assertions to AssertJ.

Original pull request: #613.
2018-10-05 15:02:45 +02:00
Christoph Strobl
338bc30b96 DATAMONGO-2096 - Fix target field name for GraphLookup aggregation operation.
We now make sure to use the target field name instead of the alias when processing GraphLookupOperation.

Original pull request: #613.
2018-10-05 15:02:45 +02:00
Mark Paluch
7fa3f0068b DATAMONGO-2061 - Updated changelog. 2018-09-21 08:13:15 -04:00
Khaled Baklouti
abc74fdcc6 DATAMONGO-2087 - Fix typo in MongoRepository.
Original Pull Request: #610
2018-09-20 09:18:44 +02:00
Mark Paluch
3a895588c8 DATAMONGO-2086 - Fix Fluent API Kotlin extension generics to allow projections.
We fixed the Kotlin extension generics to properly use projections by ignoring the source type of the Fluent API object. Previously, the source and target type were linked, which prevented the use of a different result type.
2018-09-17 13:59:45 +02:00
Mark Paluch
f79d98ce23 DATAMONGO-2034 - After release cleanups. 2018-09-10 13:52:27 +02:00
Mark Paluch
2bcc0d8185 DATAMONGO-2034 - Prepare next development iteration. 2018-09-10 13:52:26 +02:00
Mark Paluch
c8846d3d1c DATAMONGO-2034 - Release version 2.0.10 (Kay SR10). 2018-09-10 12:52:18 +02:00
Mark Paluch
a4835c8fcf DATAMONGO-2034 - Prepare 2.0.10 (Kay SR10). 2018-09-10 12:51:27 +02:00
Mark Paluch
7875c8399f DATAMONGO-2034 - Updated changelog. 2018-09-10 12:51:22 +02:00
Mark Paluch
9046857721 DATAMONGO-2035 - Updated changelog. 2018-09-10 10:20:56 +02:00
Oliver Gierke
e8bb63c9f7 DATAMONGO-2076 - Fixed attribute substitution in reactive MongoDB section.
We now redeclare the Asciidoctor Maven plugin to register the store specific attributes. Apparently they must not contain dots, so we replaced them with dashes.
2018-08-30 11:45:08 +02:00
Oliver Gierke
b431a56a95 DATAMONGO-2076 - Fixed attribute substitution in getting started section. 2018-08-30 09:32:05 +02:00
Oliver Gierke
dc820017e0 DATAMONGO-2033 - Updated changelog. 2018-08-20 11:07:56 +02:00
Oliver Gierke
34ce87b80c DATAMONGO-2046 - Performance improvements in mapping and conversion subsystem.
In MappingMongoConverter, we now avoid the creation of a ParameterValueProvider for parameter-less constructors. We also skip property population if the entity can be constructed entirely through constructor creation. Replaced the lambda in MappingMongoConverter.readAndPopulateIdentifier(…) with a direct call to ….readIdValue(…). ObjectPath now uses decomposed ObjectPathItems to avoid array copying and creation. It now stores a reference to its parent, and the ObjectPathItem fields are merged into ObjectPath, which reduces the number of objects created during reads.

Extended CachingMongoPersistentProperty with DBRef caching. Turned key access in DocumentAccessor into an optimistic lookup. DbRefResolverCallbacks are now created lazily.

Related tickets: DATACMNS-1366.
Original pull request: #602.
2018-08-15 16:12:22 +02:00
Mark Paluch
9098d509a5 DATAMONGO-2055 - Polishing.
Move test to UpdateMapperUnitTests.

Original pull request: #600.
2018-08-15 15:59:55 +02:00
Christoph Strobl
861c8279a3 DATAMONGO-2055 - Allow position modifier to be negative using push at position on Update.
Original pull request: #600.
2018-08-15 15:53:59 +02:00
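A small sketch, assuming a hypothetical `scores` array property; the negative position is what this change newly permits:

```java
import org.springframework.data.mongodb.core.query.Update;

class PushAtNegativePositionSketch {

    Update update() {
        // $position: -2 inserts the value before the last existing array element.
        return new Update().push("scores").atPosition(-2).value(42);
    }
}
```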
Mark Paluch
e545787e7e DATAMONGO-2050 - Polishing.
Tweak Javadoc.

Original pull request: #596.
2018-08-15 15:18:19 +02:00
Christoph Strobl
38ccdc5dfc DATAMONGO-2050 - Polishing.
Move to AssertJ.

Original pull request: #596.
2018-08-15 15:05:06 +02:00
Christoph Strobl
7a34cc73d8 DATAMONGO-2050 - Allow to specify the index to use for $geoNear aggregation operation.
Original pull request: #596.
2018-08-15 15:05:04 +02:00
Mark Paluch
ba6fa834e5 DATAMONGO-2051 - Polishing.
Use method argument types to avoid false positives with different method signatures.

Original pull request: #597.
Related pull request: #598.
2018-08-14 16:37:32 +02:00
Christoph Strobl
7100cd17be DATAMONGO-2051 - Add support for SCRAM-SHA-256 authentication mechanism to MongoCredentialPropertyEditor.
Original pull request: #597.
Related pull request: #598.
2018-08-14 16:33:39 +02:00
Mark Paluch
7c65472e2d DATAMONGO-2049 - Polishing.
Add static import for assertThat(…).

Original pull request: #594.
2018-08-14 10:51:41 +02:00
Christoph Strobl
f98f586a23 DATAMONGO-2049 - Add support for $ltrim, $rtrim, and $trim.
Original pull request: #594.
2018-08-14 10:51:41 +02:00
Mark Paluch
19b5b6b6f0 DATAMONGO-2048 - Polishing.
Javadoc tweaks.

Original pull request: #595.
2018-08-13 16:00:40 +02:00
Christoph Strobl
b9ffa9b89d DATAMONGO-2048 - Add support for MongoDB 4.0 $convert aggregation operator.
We now support the following type conversion aggregation operators:

* $convert
* $toBool
* $toDate
* $toDecimal
* $toDouble
* $toInt
* $toLong
* $toObjectId
* $toString

Original pull request: #595.
2018-08-13 16:00:40 +02:00
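For orientation, a sketch of the raw MongoDB operator these additions cover, built as plain BSON documents (field names hypothetical); the Spring Data builder API itself is not shown here:

```java
import org.bson.Document;

class ConvertOperatorSketch {

    Document projectStage() {
        // { $project: { priceAsInt: { $convert: { input: "$price", to: "int", onError: 0 } } } }
        Document convert = new Document("$convert",
                new Document("input", "$price").append("to", "int").append("onError", 0));
        return new Document("$project", new Document("priceAsInt", convert));
    }
}
```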
Mark Paluch
3ba589072f DATAMONGO-2047 - Polishing.
Retain previous options when calling withTimezone(…)/onNull…(…). Add tests. Javadoc.

Original pull request: #593.
2018-08-13 13:27:27 +02:00
Christoph Strobl
e237c5dfc4 DATAMONGO-2047 - Update $dateToString and $dateFromString aggregation operators to match MongoDB 4.0 changes.
We added the format and onNull options to DateFromString and changed format to an optional parameter.

Original pull request: #593.
2018-08-13 13:27:26 +02:00
Mark Paluch
ecb560cdbc DATAMONGO-2045 - Polishing.
Use diamond syntax where possible. Add initial size to HashMap instances with known number of elements. Fix typos in private constant names. Fix duplicate error code ids.

Original pull request: #592.
2018-08-13 10:31:28 +02:00
Mark Paluch
fc4a21775a DATAMONGO-2043 - Polishing.
Slightly tweak Javadoc.

Original pull request: #589.
2018-08-08 11:01:14 +02:00
Christoph Strobl
ae62e70c52 DATAMONGO-2043 - Omit type hint when mapping simple types.
Original pull request: #589.
2018-08-08 11:01:09 +02:00
Christoph Strobl
f83622709d DATAMONGO-2027 - Polishing.
Remove duplicate tests and fix assertions on existing ones. Move tests over to AssertJ and fix the output database not being applied correctly.

Original Pull Request: #588
2018-08-07 13:37:22 +02:00
Mark Paluch
83d218081c DATAMONGO-2027 - Consider MapReduce output type.
We now consider the output type (collection output) when rendering the MapReduce command. Previously, all output was returned inline without storing the results in the configured collection.

Original Pull Request: #588

# Conflicts:
#	spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/ReactiveMongoTemplate.java
#	spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/mapreduce/ReactiveMapReduceTests.java
2018-08-07 13:20:46 +02:00
Mark Paluch
70fe406602 DATAMONGO-2006 - Updated changelog. 2018-07-27 11:45:23 +02:00
Mark Paluch
18046e9040 DATAMONGO-2007 - After release cleanups. 2018-07-26 15:23:24 +02:00
Mark Paluch
69310552e3 DATAMONGO-2007 - Prepare next development iteration. 2018-07-26 15:23:22 +02:00
Mark Paluch
b8f093269d DATAMONGO-2007 - Release version 2.0.9 (Kay SR9). 2018-07-26 14:44:00 +02:00
Mark Paluch
172db96fea DATAMONGO-2007 - Prepare 2.0.9 (Kay SR9). 2018-07-26 14:43:06 +02:00
Mark Paluch
c8381c734b DATAMONGO-2007 - Updated changelog. 2018-07-26 14:42:54 +02:00
Mark Paluch
bf82964474 DATAMONGO-1982 - Updated changelog. 2018-07-26 14:03:20 +02:00
Mark Paluch
2d0495874f DATAMONGO-2029 - Encode collections of UUID and byte array query method arguments to their binary form.
We now convert collections that only contain UUID or byte array items to a BSON list that contains the encoded form of these items. Previously, we only converted single UUID and byte array values into $binary, so lists rendered to e.g. $uuid, which does not work for queries.

Encoding is now encapsulated in strategy objects that implement the encoding only for their type. This allows breaking up the conditional flow and improves the organization of responsibilities.
2018-07-25 15:16:15 +02:00
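A sketch of a query method that benefits from the fix (repository and property names hypothetical); each UUID in the collection is now rendered in its $binary form inside the resulting $in criteria:

```java
import java.util.Collection;
import java.util.List;
import java.util.UUID;

import org.springframework.data.mongodb.repository.MongoRepository;

class Ticket {} // hypothetical domain type with a UUID "token" property

interface TicketRepository extends MongoRepository<Ticket, String> {

    List<Ticket> findByTokenIn(Collection<UUID> tokens);
}
```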
Mark Paluch
82c91cbb71 DATAMONGO-2030 - Reinstantiate existsBy queries for reactive repositories.
We now support existsBy queries for reactive repositories to align with blocking repository support. ExistsBy support got lost during merging and is now back in place.

Extract boolean flag counting into BooleanUtil.
2018-07-23 16:34:12 +02:00
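A sketch of the reinstated derivation, using hypothetical types:

```java
import org.springframework.data.mongodb.repository.ReactiveMongoRepository;

import reactor.core.publisher.Mono;

class Person {} // hypothetical domain type with a "lastname" property

interface ReactivePersonRepository extends ReactiveMongoRepository<Person, String> {

    // existsBy… derivation works again for reactive repositories.
    Mono<Boolean> existsByLastname(String lastname);
}
```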
Christoph Strobl
4d309bd7f0 DATAMONGO-2011 - Relax type check when mapping collections.
Original pull request: #587.
2018-07-13 12:55:07 +02:00
Mark Paluch
6f011b0fa1 DATAMONGO-2021 - Polishing.
Adapt getResources(…) to use the file id and no longer the file name when opening a download stream. Add author tag.

Original pull request: #581.
2018-07-06 13:12:36 +02:00
Niklas Helge Hanft
1a3b9e3c42 DATAMONGO-2021 - Use getObjectId() instead of getFilename() for opening the GridFS download stream.
Using the file name leads to duplicate resource streams as file names are not unique. We therefore use the file's ObjectId to look up the file content.

Original pull request: #581.
2018-07-06 13:12:36 +02:00
Mark Paluch
5a37468103 DATAMONGO-2016 - Polishing.
Fail gracefully if query string parameter has no value. Reformat test. Convert assertions to AssertJ.

Original pull request: #578.
2018-07-04 11:25:38 +02:00
Stephen Tyler Conrad
d4b0963550 DATAMONGO-2016 - Fix username/password extraction in MongoCredentialPropertyEditor.
MongoCredentialPropertyEditor now inspects the connection URI for the appropriate delimiter tokens. Previously, inspection used the question mark character as the username/password delimiter.

Original pull request: #578.
2018-07-04 11:25:35 +02:00
Mark Paluch
468c497525 DATAMONGO-1969 - After release cleanups. 2018-06-13 21:24:35 +02:00
Mark Paluch
4562f39d7a DATAMONGO-1969 - Prepare next development iteration. 2018-06-13 21:24:33 +02:00
Mark Paluch
49957e8c6e DATAMONGO-1969 - Release version 2.0.8 (Kay SR8). 2018-06-13 15:13:01 +02:00
Mark Paluch
b462b35284 DATAMONGO-1969 - Prepare 2.0.8 (Kay SR8). 2018-06-13 15:12:06 +02:00
Mark Paluch
445388bb5f DATAMONGO-1969 - Updated changelog. 2018-06-13 15:12:00 +02:00
Mark Paluch
61e9eac49b DATAMONGO-1967 - Updated changelog. 2018-06-13 15:01:58 +02:00
Mark Paluch
c219f6e7f2 DATAMONGO-2003 - Polishing.
Add nullability annotation to MongoParameterAccessor.getPoint(). Remove superfluous casts.

Convert MongoQueryCreatorUnitTests to use AssertJ assertions.

Original pull request: #570.
2018-06-11 14:19:02 +02:00
Christoph Strobl
1ab130ffca DATAMONGO-2003 - Fix derived query using regex pattern with options.
We now consider regex pattern options when using the pattern as a derived finder argument.

Original pull request: #570.
2018-06-11 14:19:01 +02:00
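A sketch of an affected derived finder (types hypothetical); the flags compiled into the Pattern, e.g. CASE_INSENSITIVE, are now carried over as regex options:

```java
import java.util.List;
import java.util.regex.Pattern;

import org.springframework.data.mongodb.repository.MongoRepository;

class Person {} // hypothetical domain type with a "firstname" property

interface PersonRepository extends MongoRepository<Person, String> {

    List<Person> findByFirstname(Pattern pattern);
}

// usage: repository.findByFirstname(Pattern.compile("^oli.*", Pattern.CASE_INSENSITIVE));
```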
Oliver Gierke
a4d6a0cf8a DATAMONGO-2002 - Fixed Criteria.equals(…) for usage with Pattern instances.
For Criteria instances that use regular expressions we now properly compare the two Pattern instances produced by also including the pattern flags in the comparison.
2018-06-07 19:12:44 +02:00
Mark Paluch
c28f725f48 DATAMONGO-1979 - Polishing.
Convert ReactiveMongoRepositoryTests to AssertJ. Add missing verifyComplete() steps to StepVerifier.

Original pull request: #566.
2018-06-07 10:06:37 +02:00
Mark Paluch
a71f50f15c DATAMONGO-1998 - Polishing.
Switch id field name check to equals or to match the last property path segment.

Original pull request: #567.
2018-06-06 11:35:22 +02:00
Christoph Strobl
0ad715f806 DATAMONGO-1998 - Fix Querydsl id handling for nested property references using ObjectId hex String representation.
We now follow the conversion rules for id properties with a valid ObjectId representation when parsing Querydsl queries.

Original pull request: #567.
2018-06-06 11:35:22 +02:00
Mark Paluch
ba559c223a DATAMONGO-1986 - Polishing.
Refactor duplicated code into AggregationUtil.

Original pull request: #564.
2018-06-06 10:37:20 +02:00
Christoph Strobl
5f3ad68114 DATAMONGO-1986 - Always provide a typed AggregationOperationContext for TypedAggregation.
We now initialize a TypeBasedAggregationOperationContext for TypedAggregations if no context is provided. This makes sure that potential Criteria objects are run through the QueryMapper.
In case the default context is used, we now also make sure to at least run the aggregation pipeline through the QueryMapper to avoid passing non-MongoDB simple types on to the driver.

Original pull request: #564.
2018-06-06 10:37:20 +02:00
Mark Paluch
28b18d25cb DATAMONGO-1988 - Polishing.
Match exactly on either top-level properties or leaf properties instead of accepting the property/field name suffix.

Original pull request: #565.
2018-06-05 11:14:11 +02:00
Christoph Strobl
22c0e5029c DATAMONGO-1988 - Fix query creation for id property references using ObjectId hex String representation.
We now follow the conversion rules for id properties with a valid ObjectId representation when creating queries. Prior to this change, e.g. String values would have been turned into ObjectIds when saving a document, but not when querying for it.

Original pull request: #565.
2018-06-05 11:13:01 +02:00
Christoph Strobl
4582d3152c DATAMONGO-1927 - Updated changelog. 2018-05-17 10:32:57 +02:00
Sébastien Deleuze
d219e8ed7c DATAMONGO-1980 - Fix a typo in CriteriaExtensions.kt.
Original pull request: #563.
2018-05-16 09:43:34 +02:00
Victor
ab7740faf5 DATAMONGO-1978 - Fix minor typo in Field.positionKey field name.
Original pull request: #558.
2018-05-15 12:30:06 +02:00
Mark Paluch
0fba00311d DATAMONGO-1466 - Polishing.
Switch conditionals to Map-based Function registry to pick the appropriate converter. Fix typos in method names.

Original pull request: #561.
2018-05-15 11:30:21 +02:00
Christoph Strobl
33863999e6 DATAMONGO-1466 - Polishing.
Just some minor code style improvements.

Original pull request: #561.
2018-05-15 11:30:21 +02:00
Christoph Strobl
ae18958955 DATAMONGO-1466 - Add embedded typeinformation-based reading GeoJSON converter.
Original pull request: #561.
2018-05-15 11:30:21 +02:00
Mark Paluch
489d637a00 DATAMONGO-1974 - Polishing.
Fix typos, links, and code fences.

Original pull request: #559.
2018-05-11 15:29:57 +02:00
Jay Bryant
1b5ce651be DATAMONGO-1974 - Full editing pass for Spring Data MongoDB.
Full editing pass of the Spring Data MongoDB reference guide. I also adjusted index.adoc to work with the changes I made to the build project, so that we get Epub and PDF as well as HTML.

Original pull request: #559.
2018-05-11 15:29:18 +02:00
Mark Paluch
e035210917 DATAMONGO-1971 - Polishing.
Remove outdated profiles.

Original pull request: #554.
2018-05-09 16:35:07 +02:00
Mark Paluch
57fc260c43 DATAMONGO-1971 - Install MongoDB 3.7.9 on TravisCI.
We now download and unpack MongoDB directly instead of using TravisCI's outdated MongoDB version.

Original pull request: #554.
2018-05-09 16:35:04 +02:00
Mark Paluch
f9ec63425e DATAMONGO-1918 - After release cleanups. 2018-05-08 15:04:28 +02:00
Mark Paluch
aedb50d728 DATAMONGO-1918 - Prepare next development iteration. 2018-05-08 15:04:26 +02:00
Mark Paluch
dbf4990f60 DATAMONGO-1918 - Release version 2.0.7 (Kay SR7). 2018-05-08 14:15:27 +02:00
Mark Paluch
c5c43158c2 DATAMONGO-1918 - Prepare 2.0.7 (Kay SR7). 2018-05-08 14:14:32 +02:00
Mark Paluch
56ffe7913d DATAMONGO-1918 - Updated changelog. 2018-05-08 14:14:23 +02:00
Mark Paluch
eae263eebc DATAMONGO-1917 - Updated changelog. 2018-05-08 12:22:53 +02:00
Mark Paluch
0dd2fa3dce DATAMONGO-1943 - Polishing.
Reduce visibility. Use List interface instead of concrete type.

Original pull request: #556.
2018-05-07 16:20:52 +02:00
Christoph Strobl
e648ea5903 DATAMONGO-1943 - Fix ClassCastException caused by SpringDataMongodbSerializer.
We now convert List-typed predicates to BasicDBList to meet MongodbSerializer's expectations for top-level lists used for the $and operator.

Original pull request: #556.
2018-05-07 16:20:52 +02:00
Mark Paluch
f389812b7c DATAMONGO-1869 - Updated changelog. 2018-04-13 15:11:29 +02:00
Mark Paluch
2127ddcbb8 DATAMONGO-1893 - Polishing.
Inherit fields from previous operation if at least one field is excluded. Extend FieldsExposingAggregationOperation to conditionally inherit fields.

Original pull request: #538.
2018-04-06 10:45:52 +02:00
Christoph Strobl
7f9ab3bb44 DATAMONGO-1893 - Allow exclusion of other fields than _id in aggregation $project.
As of MongoDB 3.4, exclusion of fields other than _id is allowed, so we removed the limitation from our code.

Original pull request: #538.
2018-04-06 10:45:52 +02:00
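A sketch of the now-permitted exclusion (field name hypothetical); note that it requires MongoDB 3.4 or newer on the server:

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;

class ProjectionExclusionSketch {

    Aggregation aggregation() {
        // Excluding a field other than _id in $project no longer throws on the client side.
        return newAggregation(project().andExclude("firstname"));
    }
}
```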
Mark Paluch
aea40ca490 DATAMONGO-1888 - After release cleanups. 2018-04-04 16:42:33 +02:00
Mark Paluch
fb8d03db31 DATAMONGO-1888 - Prepare next development iteration. 2018-04-04 16:42:30 +02:00
Mark Paluch
890f08f19a DATAMONGO-1888 - Release version 2.0.6 (Kay SR6). 2018-04-04 15:53:22 +02:00
Mark Paluch
be58472777 DATAMONGO-1888 - Prepare 2.0.6 (Kay SR6). 2018-04-04 15:52:31 +02:00
Mark Paluch
b082d4ad98 DATAMONGO-1888 - Updated changelog. 2018-04-04 15:52:22 +02:00
Mark Paluch
e80b031f54 DATAMONGO-1857 - Updated changelog. 2018-04-04 15:16:20 +02:00
Mark Paluch
50b017c08b DATAMONGO-1903 - Polishing.
Remove the client-side operating system check, as operating system-dependent constraints depend on the server. Add check on whitespace. Add author tags. Extend tests.

Adapt check in SimpleReactiveMongoDatabaseFactory accordingly. Remove superfluous UnknownHostException declaration in reactive database factory. Replace references to legacy types in Javadoc with references to current ones.

Original pull request: #546.
2018-04-03 13:44:19 +02:00
George Moraitis
78429eb33d DATAMONGO-1903 - Align database name check in SimpleMongoDbFactory with MongoDB limitations.
We now test database names against the current (3.6) MongoDB specifications for database names.

Original pull request: #546.
2018-04-03 13:44:16 +02:00
Mark Paluch
3ed0bd7a18 DATAMONGO-1916 - Polishing.
Remove unused final keywords from method parameters and unused variables. Add nullable annotations to parameters that can be null. Fix generics.

Original pull request: #547.
2018-04-03 11:33:13 +02:00
Christoph Strobl
cbc923c727 DATAMONGO-1916 - Fix potential ClassCastException in MappingMongoConverter#writeInternal when writing collections.
Original pull request: #547.
2018-04-03 11:32:53 +02:00
Mark Paluch
f6ca0049b6 DATAMONGO-1834 - Polishing.
Increase visibility of Timezone factory methods. Add missing nullable annotation. Tweaked Javadoc. Add tests for Timezone using expressions/field references.

Original Pull Request: #539
2018-03-28 11:40:14 +02:00
Christoph Strobl
82c9b0c662 DATAMONGO-1834 - Polishing.
Remove DateFactory and split up tests.
Introduce dedicated Timezone abstraction and update existing factories to apply the timezone if appropriate. Update builders and align code style.

Original Pull Request: #539
2018-03-28 11:39:58 +02:00
Matt Morrissette
3ca2349ce3 DATAMONGO-1834 - Add support for MongoDB 3.6 DateOperators $dateFromString, $dateFromParts and $dateToParts including timezones.
Original Pull Request: #539
2018-03-28 11:34:57 +02:00
Oliver Gierke
a76f157457 DATAMONGO-1915 - Removed explicit declaration of Jackson library versions. 2018-03-27 19:35:24 +02:00
Christoph Strobl
560a6a5bc2 DATAMONGO-1911 - Polishing.
Use native MongoDB Codec facilities to render binary and uuid.

Original Pull Request: #544
2018-03-27 14:17:46 +02:00
Mark Paluch
51d5c52193 DATAMONGO-1911 - Fix UUID serialization in String-based queries.
We now render to the correct UUID representation in String-based queries. Unquoted values render to $binary representation, quoted UUIDs are rendered with their toString() value.

Previously we used JSON.serialize() to encode values to JSON. The com.mongodb.util.JSON serializer does not produce JSON that is compatible with Document.parse. It uses an older JSON format that preceded the MongoDB Extended JSON specification.

Original Pull Request: #544
2018-03-27 14:04:34 +02:00
Mark Paluch
56b6748068 DATAMONGO-1913 - Add missing nullable annotations to GridFsTemplate. 2018-03-26 14:10:53 +02:00
Felipe Zanardo Affonso
1e19f405cc DATAMONGO-1909 - Fix typo on return statement.
Original pull request: #523.
2018-03-21 16:05:25 +01:00
Mark Paluch
54d2c122eb DATAMONGO-1907 - Polishing.
Rename test method to reflect test subject.

Switch from flatMap(…) to map(…) to avoid overhead of Mono creation.

Original pull request: #541.
2018-03-21 09:54:07 +01:00
Ruben J Garcia
b47c5704e7 DATAMONGO-1907 - Adjust SimpleReactiveMongoRepository.findOne(…) to complete without exception on empty result
We now no longer emit an exception via SimpleReactiveMongoRepository.findOne(Example) if the query completes without yielding a result. Previously findOne(Example) emitted a NoSuchElementException if the query returned no result.

Original pull request: #541.
2018-03-21 09:51:32 +01:00
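A sketch of the adjusted contract, assuming a hypothetical reactive repository:

```java
import org.springframework.data.domain.Example;
import org.springframework.data.mongodb.repository.ReactiveMongoRepository;

import reactor.test.StepVerifier;

class Person {} // hypothetical domain type

interface ReactivePersonRepository extends ReactiveMongoRepository<Person, String> {}

class FindOneByExampleSketch {

    void completesEmptyInsteadOfErroring(ReactivePersonRepository repository) {
        // Previously this emitted a NoSuchElementException when nothing matched.
        repository.findOne(Example.of(new Person()))
                .as(StepVerifier::create)
                .verifyComplete();
    }
}
```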
Oliver Gierke
6b0b1cd97d DATAMONGO-1904 - Optimizations in MappingMongoConverter.readCollectionOrArray(…).
Switched to ClassUtils.isAssignableValue(…) in getPotentiallyConvertedSimpleRead(…) as it transparently handles primitives and their wrapper types so that we can avoid the superfluous invocation of the converter infrastructure.
2018-03-15 15:03:53 +01:00
Oliver Gierke
35bbc604aa DATAMONGO-1904 - Fixed handling of nested arrays on reads in MappingMongoConverter.
We now properly forward the component type information into recursive calls to MappingMongoConverter.readCollectionOrArray(…).
2018-03-15 15:03:51 +01:00
Oliver Gierke
9ade830a10 DATAMONGO-1901 - Added project.root configuration to make JavaDoc generation work again.
Related ticket: https://github.com/spring-projects/spring-data-build/issues/527.
2018-03-14 09:37:46 +01:00
Oliver Gierke
8fbff50f4f DATAMONGO-1898 - Added unit tests for the conversion handling of enums implementing interfaces.
Related tickets: DATACMNS-1278.
2018-03-12 11:07:40 +01:00
Oliver Gierke
14b49638a0 DATAMONGO-1896 - SimpleMongoRepository.saveAll(…) now consistently uses aggregate collection for inserts.
We previously used MongoTemplate.insertAll(…), which determines the collection to insert each element into based on its type and which - in cases of entity inheritance - will use dedicated collections for sub-types of the aggregate root. Subsequent lookups of the entities will then fail, as those are executed against the collection the aggregate root is mapped to.

We now rather use ….insert(Collection, String), handing over the aggregate root's collection explicitly.
2018-03-09 00:03:44 +01:00
Mark Paluch
dc31f4f32f DATAMONGO-1882 - After release cleanups. 2018-02-28 10:43:35 +01:00
Mark Paluch
708f9ac7b3 DATAMONGO-1882 - Prepare next development iteration. 2018-02-28 10:43:34 +01:00
Mark Paluch
17d6100426 DATAMONGO-1882 - Release version 2.0.5 (Kay SR5). 2018-02-28 10:14:58 +01:00
Mark Paluch
27a4e25880 DATAMONGO-1882 - Prepare 2.0.5 (Kay SR5). 2018-02-28 10:14:05 +01:00
Mark Paluch
d378bcb442 DATAMONGO-1882 - Updated changelog. 2018-02-28 10:13:57 +01:00
Mark Paluch
f6505c7758 DATAMONGO-1859 - After release cleanups. 2018-02-19 20:29:08 +01:00
Mark Paluch
d25f88c70e DATAMONGO-1859 - Prepare next development iteration. 2018-02-19 20:29:06 +01:00
Mark Paluch
cec6edfa26 DATAMONGO-1859 - Release version 2.0.4 (Kay SR4). 2018-02-19 19:46:53 +01:00
Mark Paluch
3261936e8a DATAMONGO-1859 - Prepare 2.0.4 (Kay SR4). 2018-02-19 19:46:05 +01:00
Mark Paluch
d2d471d135 DATAMONGO-1859 - Updated changelog. 2018-02-19 19:45:58 +01:00
Mark Paluch
bcd2de000c DATAMONGO-1870 - Polishing.
Extend copyright license years. Slightly reword documentation. Use IntStream and insertAll to create test fixture.

Original pull request: #532.
Related pull request: #531.
2018-02-15 10:56:14 +01:00
Christoph Strobl
c873e49d71 DATAMONGO-1870 - Consider skip/limit on MongoOperations.remove(Query, Class).
We now use an _id lookup for remove operations that query with limit or skip parameters. This allows more fine-grained control over the documents removed.

Original pull request: #532.
Related pull request: #531.
2018-02-15 10:56:05 +01:00
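A sketch of a bounded remove (entity and field names hypothetical); the limit now restricts which of the matching documents are removed:

```java
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class BoundedRemoveSketch {

    static class Task {} // hypothetical domain type

    void removeTenFinishedTasks(MongoOperations operations) {

        Query query = Query.query(Criteria.where("state").is("DONE"))
                .with(Sort.by("createdAt").ascending())
                .limit(10);

        // Only the ten documents selected by the query are removed.
        operations.remove(query, Task.class);
    }
}
```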
Christoph Strobl
4ebcac19bc DATAMONGO-1860 - Polishing.
Fix references to QuerydslPredicateExecutor.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
Mark Paluch
78212948bc DATAMONGO-1860 - Polishing.
Fix type references in Javadoc. Change lambdas to method references where applicable.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
Mark Paluch
38575baec1 DATAMONGO-1860 - Retrieve result count via QuerydslMongoPredicateExecutor only for paging.
We now use AbstractMongodbQuery.fetch() instead of AbstractMongodbQuery.fetchResults() to execute MongoDB queries. fetchResults() executes a find(…) and a count(…) query. Retrieving the record count is an expensive operation in MongoDB and the count is not always required. For a regular find(…), the count is ignored; for paging, the count(…) is only required in certain result/request scenarios.

Original Pull Request: #529
2018-02-14 13:36:17 +01:00
Mark Paluch
f1a3c37a79 DATAMONGO-1865 - Polishing.
Adapt to collection name retrieval during query execution. Slightly reword documentation and JavaDoc.

Original pull request: #530.
2018-02-14 12:01:44 +01:00
Christoph Strobl
c668a47243 DATAMONGO-1865 - Avoid IncorrectResultSizeDataAccessException for derived findFirst/findTop queries.
We now return the first result when executing findFirst/findTop queries. This fixes a glitch introduced in the Kay release throwing IncorrectResultSizeDataAccessException for single entity executions returning more than one result, which is explicitly not the desired behavior in this case.

Original pull request: #530.
2018-02-14 12:01:44 +01:00
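A sketch of an affected derived query (types hypothetical); with the fix it returns the first match instead of throwing IncorrectResultSizeDataAccessException when several documents match:

```java
import org.springframework.data.mongodb.repository.MongoRepository;

class Person {} // hypothetical domain type with a "lastname" property

interface PersonRepository extends MongoRepository<Person, String> {

    // Returns the first match even if several documents share the lastname.
    Person findFirstByLastname(String lastname);
}
```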
Mark Paluch
6a20ddf5a2 DATAMONGO-1871 - Polishing.
Migrate test to AssertJ.

Original pull request: #533.
2018-02-14 11:05:20 +01:00
Christoph Strobl
cec6526543 DATAMONGO-1871 - Fix AggregationExpression aliasing.
We now make sure to allow a nested property alias by setting the target.

Original pull request: #533.
2018-02-14 11:05:17 +01:00
Oliver Gierke
46ea58f3b9 DATAMONGO-1872 - Polishing.
Fixed @since tag for newly introduced method in MongoEntityMetadata.
2018-02-13 12:24:09 +01:00
Oliver Gierke
ebaea8d22f DATAMONGO-1872 - Repository query execution doesn't prematurely fix collection to be queried.
We now avoid calling ….inCollection(…) with a fixed, one-time calculated collection name to make sure we dynamically resolve the collections. That's necessary to make sure SpEL expressions in @Document are evaluated for every query execution.
2018-02-13 12:18:19 +01:00
Christoph Strobl
ed6aaeed25 DATAMONGO-1794 - Updated changelog. 2018-02-06 11:14:01 +01:00
Mark Paluch
89b1b6fbb2 DATAMONGO-1830 - After release cleanups. 2018-01-24 13:46:10 +01:00
Mark Paluch
23769301b5 DATAMONGO-1830 - Prepare next development iteration. 2018-01-24 13:46:08 +01:00
Mark Paluch
3399160acf DATAMONGO-1830 - Release version 2.0.3 (Kay SR3). 2018-01-24 13:21:24 +01:00
Mark Paluch
32a8ee9b31 DATAMONGO-1830 - Prepare 2.0.3 (Kay SR3). 2018-01-24 13:20:39 +01:00
Mark Paluch
17cea70abc DATAMONGO-1830 - Updated changelog. 2018-01-24 13:20:34 +01:00
Mark Paluch
07731c39ba DATAMONGO-1858 - Fix line endings to LF. 2018-01-24 12:57:24 +01:00
Mark Paluch
c5b580b82b DATAMONGO-1829 - Updated changelog. 2018-01-24 12:22:10 +01:00
Mark Paluch
9a1385186e DATAMONGO-1843 - Polishing.
Convert anonymous classes to lambdas. Typo fixes. Migrate test to AssertJ.

Original pull request: #526.
2018-01-23 10:34:46 +01:00
Christoph Strobl
704524d7f4 DATAMONGO-1843 - Fix parameter shadowing in ArrayOperators reduce.
Original pull request: #526.
2018-01-23 10:34:35 +01:00
Christoph Strobl
cc9a3ac8da DATAMONGO-1850 - Polishing.
Remove blank line, add tests and migrate to AssertJ.

Original Pull Request: #527
2018-01-22 16:16:52 +01:00
Mark Paluch
acb68f3ca4 DATAMONGO-1850 - Guard GridFsResource.getContentType() against absent file metadata.
We now fall back to GridFS.getContentType() if GridFS metadata is absent to prevent a null dereference.

Original Pull Request: #527
2018-01-22 15:14:35 +01:00
Mark Paluch
3088f0469e DATAMONGO-1824 - Polishing.
Move method from AggregationCommandPreparer and AggregationResultPostProcessor to BatchAggregationLoader. Extract field names to constants. Tiny renames to variables. Add unit test for aggregation response without cursor use. Migrate test to AssertJ.

Original pull request: #521.
2017-12-15 14:30:24 +01:00
Christoph Strobl
a1ae04881d DATAMONGO-1824 - Skip tests no longer applicable for MongoDB 3.6.
Original pull request: #521.
2017-12-15 14:25:31 +01:00
Christoph Strobl
6f55c66060 DATAMONGO-1824 - Fix aggregation execution for MongoDB 3.6.
We now send aggregation commands along with a cursor batch size for compatibility with MongoDB 3.6, which no longer supports aggregations without a cursor. We consume the whole cursor before returning and converting results and omit the 16MB aggregation result limit. For MongoDB versions not supporting aggregation cursors we return results directly.

Original pull request: #521.
2017-12-15 14:25:20 +01:00
Christoph Strobl
f86447bd04 DATAMONGO-1831 - Fix array type conversion for empty source.
We now make sure that we convert empty sources to the corresponding target type. This prevents entity instantiation from failing due to incorrect argument types when invoking the constructor.

Original pull request: #520.
2017-12-02 12:18:55 -08:00
Mark Paluch
1bb4324b2e DATAMONGO-1816 - After release cleanups. 2017-11-27 16:42:54 +01:00
Mark Paluch
856506f121 DATAMONGO-1816 - Prepare next development iteration. 2017-11-27 16:42:52 +01:00
Mark Paluch
2a81dc75a8 DATAMONGO-1816 - Release version 2.0.2 (Kay SR2). 2017-11-27 16:12:34 +01:00
Mark Paluch
58cd4c08ca DATAMONGO-1816 - Prepare 2.0.2 (Kay SR2). 2017-11-27 16:11:21 +01:00
Mark Paluch
344e019143 DATAMONGO-1816 - Updated changelog. 2017-11-27 16:11:15 +01:00
Mark Paluch
918b7e96bb DATAMONGO-1799 - Updated changelog. 2017-11-27 15:58:45 +01:00
Christoph Strobl
fce7a5c1cb DATAMONGO-1818 - Polishing.
Move overlapping/duplicate documentation into one place.

Original Pull Request: #512
2017-11-27 07:53:22 +01:00
Mark Paluch
dbd2de8e0f DATAMONGO-1818 - Reword tailable cursors documentation.
Fix reference to @Tailable annotation. Slightly reword documentation.

Original Pull Request: #512
2017-11-27 07:53:08 +01:00
Mark Paluch
0dbe331ab0 DATAMONGO-1823 - Polishing.
Replace constructor with lombok's RequiredArgsConstructor. Add Nullable annotation. Tiny reformatting. Align license header. Migrate test to AssertJ.

Original pull request: #517.
2017-11-22 14:33:20 +01:00
Christoph Strobl
846ebcd91d DATAMONGO-1823 - Emit ApplicationEvents using projecting find methods.
We now again emit application events when using finder methods that apply projection.

Original pull request: #517.
2017-11-22 14:33:20 +01:00
Oliver Gierke
9e0b5caeac DATAMONGO-1737 - BasicMongoPersistentEntity now correctly initializes comparator.
In BasicMongoPersistentEntity.verify() we now properly call the super method to make sure the comparators that honor the @Field's order value are initialized properly.
2017-11-17 14:55:00 +01:00
Mark Paluch
cf70f5e5eb DATAMONGO-1819 - Polishing.
Use native field names for NamedMongoScript query instead of relying on metadata-based mapping as NamedMongoScript is considered a simple top-level type.

Related pull request: #513.
2017-11-17 13:49:06 +01:00
Mark Paluch
331dc6df6f DATAMONGO-1821 - Fix method ambiguity in tests when compiling against MongoDB 3.6 2017-11-07 12:47:51 +01:00
Mark Paluch
a51dce2c90 DATAMONGO-1820 - Set Mongo's Feature Compatibility flag for TravisCI build to 3.4.
Apply setFeatureCompatibilityVersion to upgrade MongoDB to 3.4 features.
2017-11-06 10:28:10 +01:00
Mark Paluch
c0cf1aa95b DATAMONGO-1817 - Polishing.
Remove blank line.

Original pull request: #510.
2017-11-06 10:02:35 +01:00
Sola
7104ffa543 DATAMONGO-1817 - Align nullability in Kotlin MongoOperationsExtensions with Java API.
Return types in MongoOperationsExtensions are now aligned to the nullability of MongoOperations.

Original pull request: #510.
2017-11-06 10:02:35 +01:00
Oliver Gierke
28d2fb6680 DATAMONGO-1793 - After release cleanups. 2017-10-27 15:50:48 +02:00
Oliver Gierke
140e26946f DATAMONGO-1793 - Prepare next development iteration. 2017-10-27 15:50:45 +02:00
Oliver Gierke
f4e730ce87 DATAMONGO-1793 - Release version 2.0.1 (Kay SR1). 2017-10-27 15:25:11 +02:00
Oliver Gierke
e3a83ebc42 DATAMONGO-1793 - Prepare 2.0.1 (Kay SR1). 2017-10-27 15:24:24 +02:00
Oliver Gierke
f65c1e324e DATAMONGO-1793 - Updated changelog. 2017-10-27 15:24:14 +02:00
Oliver Gierke
1dd0061f03 DATAMONGO-1815 - Adapt API changes in Property in test cases. 2017-10-27 11:13:31 +02:00
Mark Paluch
5ea860700c DATAMONGO-1814 - Update reference documentation for faceted classification.
Original pull request: #426.
Original ticket: DATAMONGO-1552.
2017-10-26 09:44:50 +02:00
Christoph Strobl
3dd653a702 DATAMONGO-1811 - Update documentation of MongoOperations.executeCommand.
Update Javadoc and reference documentation.
2017-10-24 14:59:47 +02:00
Christoph Strobl
f87847407b DATAMONGO-1805 - Update GridFsOperations documentation.
Fix return type in reference documentation and update Javadoc.
2017-10-24 14:59:40 +02:00
Christoph Strobl
433a125c9e DATAMONGO-1806 - Polishing.
Remove unused import, trailing whitespaces and update Javadoc.

Original Pull Request: #506
2017-10-24 14:59:33 +02:00
hartmut
5827cb0971 DATAMONGO-1806 - Fix Javadoc for GridFsResource.
Original Pull Request: #506
2017-10-24 14:59:24 +02:00
Mark Paluch
0109bf6858 DATAMONGO-1809 - Introduce AssertJ assertions for Document.
Original pull request: #508.
2017-10-24 14:45:03 +02:00
Christoph Strobl
49d1555576 DATAMONGO-1809 - Polishing.
Move tests to AssertJ.

Original pull request: #508.
2017-10-24 14:45:03 +02:00
Christoph Strobl
fdbb305b8e DATAMONGO-1809 - Fix positional parameter detection for PropertyPaths.
We now make sure to capture all digits for positional parameters.

Original pull request: #508.
2017-10-24 14:45:03 +02:00
Mark Paluch
49dd03311a DATAMONGO-1696 - Mention appropriate EnableMongoAuditing annotation in reference documentation. 2017-10-20 08:45:33 +02:00
Mark Paluch
a86a3210e1 DATAMONGO-1802 - Polishing.
Reduce converter visibility to MongoConverters' package scope. Tiny alignment in Javadoc wording. Update copyright year, create empty byte array with element count instead of an initializer.

Original pull request: #505.
2017-10-17 14:52:11 +02:00
Christoph Strobl
4b655abfb6 DATAMONGO-1802 - Add Binary to byte array converter.
We now provide and register a Binary to byte[] converter to provide conversion of binary data to a byte array. MongoDB deserializes binary data using the document API to its Binary type. With this converter, we reinstated the previous capability to use byte arrays for binary data within domain types.

Original pull request: #505.
2017-10-17 14:52:11 +02:00
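A sketch of a domain type that relies on the reinstated conversion (names hypothetical); binary data read back as org.bson.types.Binary is mapped onto the declared byte[] property again:

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Image {

    @Id String id;

    // Stored as BSON binary; the registered Binary-to-byte[] converter maps it back on read.
    byte[] content;
}
```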
Oliver Gierke
0963e6cf77 DATAMONGO-1775 - Updated changelog. 2017-10-11 19:03:29 +02:00
Oliver Gierke
3e1b2c4bdb DATAMONGO-1795 - Removed obsolete Kotlin build setup. 2017-10-04 11:05:27 +02:00
Mark Paluch
03e0e0c431 DATAMONGO-1776 - After release cleanups. 2017-10-02 11:38:04 +02:00
Mark Paluch
51900021a1 DATAMONGO-1776 - Prepare next development iteration. 2017-10-02 11:38:03 +02:00
725 changed files with 12978 additions and 5912 deletions


@@ -3,34 +3,33 @@ language: java
jdk:
- oraclejdk8
before_script:
- mongod --version
services:
- mongodb
before_install:
- mkdir -p downloads
- mkdir -p var/db var/log
- if [[ ! -d downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION} ]] ; then cd downloads && wget https://fastdl.mongodb.org/linux/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}.tgz && tar xzf mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}.tgz && cd ..; fi
- downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongod --version
- downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongod --dbpath var/db --replSet rs0 --fork --logpath var/log/mongod.log
- sleep 10
- |-
downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongo --eval "rs.initiate({_id: 'rs0', members:[{_id: 0, host: '127.0.0.1:27017'}]});"
sleep 15
env:
matrix:
- PROFILE=ci
- PROFILE=mongo36-next
global:
- MONGO_VERSION=3.7.9
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script
addons:
apt:
sources:
- mongodb-3.4-precise
packages:
- mongodb-org-server
- mongodb-org-shell
- oracle-java8-installer
- oracle-java8-installer
sudo: false
cache:
directories:
- $HOME/.m2
install: true
- downloads
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

pom.xml

@@ -1,21 +1,21 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.14.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>http://projects.spring.io/spring-data-mongodb</url>
<url>https://projects.spring.io/spring-data-mongodb</url>
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.14.RELEASE</version>
</parent>
<modules>
@@ -27,7 +27,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.0.0.RELEASE</springdata.commons>
<springdata.commons>2.0.14.RELEASE</springdata.commons>
<mongo>3.5.0</mongo>
<mongo.reactivestreams>1.6.0</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
@@ -39,7 +39,7 @@
<name>Oliver Gierke</name>
<email>ogierke at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Project Lead</role>
</roles>
@@ -50,7 +50,7 @@
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -61,7 +61,7 @@
<name>Mark Pollack</name>
<email>mpollack at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -72,7 +72,7 @@
<name>Jon Brisbin</name>
<email>jbrisbin at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -83,7 +83,7 @@
<name>Thomas Darimont</name>
<email>tdarimont at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -94,7 +94,7 @@
<name>Christoph Strobl</name>
<email>cstrobl at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -105,7 +105,7 @@
<name>Mark Paluch</name>
<email>mpaluch at pivotal.io</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.pivotal.io</organizationUrl>
<organizationUrl>https://www.pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -115,38 +115,6 @@
<profiles>
<!-- not-yet available profile>
<id>mongo35-next</id>
<properties>
<mongo>3.5.1-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile -->
<profile>
<id>mongo36-next</id>
<properties>
<mongo>3.6.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>release</id>
<build>
@@ -170,6 +138,24 @@
</modules>
</profile>
<profile>
<id>distribute</id>
<build>
<plugins>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
<configuration>
<attributes>
<mongo-reactivestreams>${mongo.reactivestreams}</mongo-reactivestreams>
<reactor>${reactor}</reactor>
</attributes>
</configuration>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<dependencies>


@@ -1,13 +1,13 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.14.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,12 +1,12 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.14.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -17,6 +17,7 @@
<jpa>2.1.1</jpa>
<hibernate>5.2.1.Final</hibernate>
<java-module-name>spring.data.mongodb.cross.store</java-module-name>
<project.root>${basedir}/..</project.root>
</properties>
<dependencies>
@@ -49,7 +50,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.14.RELEASE</version>
</dependency>
<!-- reactive -->


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,214 +1,214 @@
/*
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.bson.Document;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;
import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
import com.mongodb.client.result.DeleteResult;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
* @deprecated will be removed without replacement.
*/
@Deprecated
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
*/
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final Document dbk = new Document();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for {}", dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {
for (Document dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: {}", key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: {}", key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
return entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: {}", cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final Document dbQuery = new Document();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
final Document dbId = mongoTemplate.execute(collName, new CollectionCallback<Document>() {
public Document doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
Document id = collection.find(dbQuery).first();
return id;
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: {}", dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
DeleteResult dr = collection.deleteMany(dbQuery);
return null;
}
});
} else {
final Document dbDoc = new Document();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: {}", dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
if (dbId != null) {
collection.replaceOne(Filters.eq("_id", dbId.get("_id")), dbDoc);
} else {
if (dbDoc.containsKey("_id") && dbDoc.get("_id") == null) {
dbDoc.remove("_id");
}
collection.insertOne(dbDoc);
}
return null;
}
});
}
}
}
return 0L;
}
/**
* Returns the collection the given entity type shall be persisted to.
*
* @param entityClass must not be {@literal null}.
* @return
*/
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return mongoTemplate.getCollectionName(entityClass);
}
}
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.bson.Document;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;
import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
import com.mongodb.client.result.DeleteResult;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
* @deprecated will be removed without replacement.
*/
@Deprecated
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
*/
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final Document dbk = new Document();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for {}", dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {
for (Document dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: {}", key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: {}", key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
return entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: {}", cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final Document dbQuery = new Document();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
final Document dbId = mongoTemplate.execute(collName, new CollectionCallback<Document>() {
public Document doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
Document id = collection.find(dbQuery).first();
return id;
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: {}", dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
DeleteResult dr = collection.deleteMany(dbQuery);
return null;
}
});
} else {
final Document dbDoc = new Document();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: {}", dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
if (dbId != null) {
collection.replaceOne(Filters.eq("_id", dbId.get("_id")), dbDoc);
} else {
if (dbDoc.containsKey("_id") && dbDoc.get("_id") == null) {
dbDoc.remove("_id");
}
collection.insertOne(dbDoc);
}
return null;
}
});
}
}
}
return 0L;
}
/**
* Returns the collection the given entity type shall be persisted to.
*
* @param entityClass must not be {@literal null}.
* @return
*/
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return mongoTemplate.getCollectionName(entityClass);
}
}
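For orientation, below is a minimal wiring sketch for the deprecated cross-store persister shown above. The @Configuration class, bean names, and the availability of a MongoTemplate and EntityManagerFactory in the application context are assumptions for illustration, not part of this change set.

// Hypothetical wiring sketch only; bean names and surrounding configuration are assumed.
import javax.persistence.EntityManagerFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.MongoChangeSetPersister;

@Configuration
class CrossStoreConfig {

    @Bean
    MongoChangeSetPersister mongoChangeSetPersister(MongoTemplate mongoTemplate,
            EntityManagerFactory entityManagerFactory) {

        MongoChangeSetPersister persister = new MongoChangeSetPersister();
        persister.setMongoTemplate(mongoTemplate); // change set entries are stored in MongoDB collections
        persister.setEntityManagerFactory(entityManagerFactory); // JPA id lookup via PersistenceUnitUtil
        return persister;
    }
}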

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.14.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RELEASE</version>
<version>2.0.14.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -19,6 +19,7 @@
<objenesis>1.3</objenesis>
<equalsverifier>1.7.8</equalsverifier>
<java-module-name>spring.data.mongodb</java-module-name>
<project.root>${basedir}/..</project.root>
</properties>
<dependencies>
@@ -145,7 +146,7 @@
<version>1.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.interceptor</groupId>
<artifactId>javax.interceptor-api</artifactId>
@@ -214,7 +215,6 @@
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson}</version>
<optional>true</optional>
</dependency>
@@ -288,74 +288,9 @@
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>kotlin-maven-plugin</artifactId>
<groupId>org.jetbrains.kotlin</groupId>
<version>${kotlin}</version>
<configuration>
<jvmTarget>${source.level}</jvmTarget>
</configuration>
<executions>
<execution>
<id>compile</id>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
<configuration>
<sourceDirs>
<sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
<sourceDir>${project.basedir}/src/main/java</sourceDir>
</sourceDirs>
</configuration>
</execution>
<execution>
<id>test-compile</id>
<phase>test-compile</phase>
<goals>
<goal>test-compile</goal>
</goals>
<configuration>
<sourceDirs>
<sourceDir>${project.basedir}/src/test/kotlin</sourceDir>
<sourceDir>${project.basedir}/src/test/java</sourceDir>
</sourceDirs>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<executions>
<execution>
<id>default-compile</id>
<phase>none</phase>
</execution>
<execution>
<id>default-testCompile</id>
<phase>none</phase>
</execution>
<execution>
<id>java-compile</id>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>java-test-compile</id>
<phase>test-compile</phase>
<goals>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>apt-maven-plugin</artifactId>
@@ -384,7 +319,6 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12</version>
<configuration>
<useFile>false</useFile>
<includes>
@@ -406,6 +340,8 @@
</properties>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -23,16 +23,16 @@ import com.mongodb.DB;
import com.mongodb.client.MongoDatabase;
/**
* Interface for factories creating {@link DB} instances.
*
* Interface for factories creating {@link MongoDatabase} instances.
*
* @author Mark Pollack
* @author Thomas Darimont
*/
public interface MongoDbFactory {
/**
* Creates a default {@link DB} instance.
*
* Creates a default {@link MongoDatabase} instance.
*
* @return
* @throws DataAccessException
*/
@@ -40,7 +40,7 @@ public interface MongoDbFactory {
/**
* Creates a {@link DB} instance to access the database with the given name.
*
*
* @param dbName must not be {@literal null} or empty.
* @return
* @throws DataAccessException
@@ -49,7 +49,7 @@ public interface MongoDbFactory {
/**
* Exposes a shared {@link MongoExceptionTranslator}.
*
*
* @return will never be {@literal null}.
*/
PersistenceExceptionTranslator getExceptionTranslator();
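A minimal sketch of consuming the factory abstraction described above; the use of SimpleMongoDbFactory and the connection details are assumptions for illustration, the interface itself only exposes the methods shown in this hunk.

// Sketch only: obtaining databases through the MongoDbFactory abstraction.
import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

public class MongoDbFactoryExample {

    public static void main(String[] args) {
        // Assumed setup: a local MongoDB instance with "test" as the default database.
        MongoDbFactory factory = new SimpleMongoDbFactory(new MongoClient("localhost"), "test");

        MongoDatabase defaultDb = factory.getDb();           // the configured default database
        MongoDatabase reporting = factory.getDb("reports");  // a database selected by name

        System.out.println(defaultDb.getName() + " / " + reporting.getName());
    }
}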

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -60,6 +60,7 @@ import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCre
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
@@ -75,6 +76,7 @@ import org.w3c.dom.Element;
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
* @author Zied Yaich
*/
public class MappingMongoConverterParser implements BeanDefinitionParser {
@@ -159,6 +161,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return null;
}
@Nullable
private BeanDefinition potentiallyCreateValidatingMongoEventListener(Element element, ParserContext parserContext) {
String disableValidation = element.getAttribute("disable-validation");
@@ -180,6 +183,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return null;
}
@Nullable
private RuntimeBeanReference getValidator(Object source, ParserContext parserContext) {
if (!JSR_303_PRESENT) {
@@ -197,7 +201,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
}
public static String potentiallyCreateMappingContext(Element element, ParserContext parserContext,
BeanDefinition conversionsDefinition, String converterId) {
@Nullable BeanDefinition conversionsDefinition, @Nullable String converterId) {
String ctxRef = element.getAttribute("mapping-context-ref");
@@ -211,7 +215,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
BeanDefinitionBuilder mappingContextBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoMappingContext.class);
Set<String> classesToAdd = getInititalEntityClasses(element);
Set<String> classesToAdd = getInitialEntityClasses(element);
if (classesToAdd != null) {
mappingContextBuilder.addPropertyValue("initialEntitySet", classesToAdd);
@@ -262,6 +266,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
}
}
@Nullable
private BeanDefinition getCustomConversions(Element element, ParserContext parserContext) {
List<Element> customConvertersElements = DomUtils.getChildElementsByTagName(element, "custom-converters");
@@ -269,7 +274,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
if (customConvertersElements.size() == 1) {
Element customerConvertersElement = customConvertersElements.get(0);
ManagedList<BeanMetadataElement> converterBeans = new ManagedList<BeanMetadataElement>();
ManagedList<BeanMetadataElement> converterBeans = new ManagedList<>();
List<Element> converterElements = DomUtils.getChildElementsByTagName(customerConvertersElement, "converter");
if (converterElements != null) {
@@ -285,9 +290,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
provider.addExcludeFilter(new NegatingFilter(new AssignableTypeFilter(Converter.class),
new AssignableTypeFilter(GenericConverter.class)));
for (BeanDefinition candidate : provider.findCandidateComponents(packageToScan)) {
converterBeans.add(candidate);
}
converterBeans.addAll(provider.findCandidateComponents(packageToScan));
}
BeanDefinitionBuilder conversionsBuilder = BeanDefinitionBuilder.rootBeanDefinition(MongoCustomConversions.class);
@@ -304,7 +307,8 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return null;
}
private static Set<String> getInititalEntityClasses(Element element) {
@Nullable
private static Set<String> getInitialEntityClasses(Element element) {
String basePackage = element.getAttribute(BASE_PACKAGE);
@@ -317,7 +321,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
Set<String> classes = new ManagedSet<String>();
Set<String> classes = new ManagedSet<>();
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
classes.add(candidate.getBeanClassName());
}
@@ -325,6 +329,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return classes;
}
@Nullable
public BeanMetadataElement parseConverter(Element element, ParserContext parserContext) {
String converterRef = element.getAttribute("ref");
@@ -375,7 +380,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
Assert.notNull(filters, "TypeFilters must not be null");
this.delegates = new HashSet<TypeFilter>(Arrays.asList(filters));
this.delegates = new HashSet<>(Arrays.asList(filters));
}
/*

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2012-2014 the original author or authors.
* Copyright 2012-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2016 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.io.UnsupportedEncodingException;
import java.lang.reflect.Method;
import java.net.URLDecoder;
import java.util.ArrayList;
import java.util.Arrays;
@@ -26,6 +27,7 @@ import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.lang.Nullable;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
@@ -35,6 +37,8 @@ import com.mongodb.MongoCredential;
*
* @author Christoph Strobl
* @author Oliver Gierke
* @author Stephen Tyler Conrad
* @author Mark Paluch
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
@@ -98,6 +102,20 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
verifyDatabasePresent(database);
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if ("SCRAM-SHA-256".equals(authMechanism)) {
Method createScramSha256Credential = ReflectionUtils.findMethod(MongoCredential.class,
"createScramSha256Credential", String.class, String.class, char[].class);
if (createScramSha256Credential == null) {
throw new IllegalArgumentException(
"SCRAM-SHA-256 auth mechanism is available as of MongoDB 4 and MongoDB Java Driver 3.8! Please make sure to use at least those versions.");
}
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.class.cast(ReflectionUtils.invokeMethod(createScramSha256Credential, null,
userNameAndPassword[0], database, userNameAndPassword[1].toCharArray())));
} else {
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
@@ -164,7 +182,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static Properties extractOptions(String text) {
int optionsSeparationIndex = text.lastIndexOf(OPTIONS_DELIMITER);
int dbSeparationIndex = text.lastIndexOf(OPTIONS_DELIMITER);
int dbSeparationIndex = text.lastIndexOf(DATABASE_DELIMITER);
if (optionsSeparationIndex == -1 || dbSeparationIndex > optionsSeparationIndex) {
return new Properties();
@@ -173,7 +191,13 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
Properties properties = new Properties();
for (String option : text.substring(optionsSeparationIndex + 1).split(OPTION_VALUE_DELIMITER)) {
String[] optionArgs = option.split("=");
if (optionArgs.length == 1) {
throw new IllegalArgumentException(String.format("Query parameter '%s' has no value!", optionArgs[0]));
}
properties.put(optionArgs[0], optionArgs[1]);
}
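The hunk above tightens option parsing and adds reflective SCRAM-SHA-256 support (requiring MongoDB 4 and Java driver 3.8+). For context, a sketch of feeding the editor a credential string; the string itself and the cast of getValue() to a List are illustrative assumptions.

// Illustrative use of the property editor; the credential string below is invented.
import java.util.List;
import com.mongodb.MongoCredential;
import org.springframework.data.mongodb.config.MongoCredentialPropertyEditor;

public class CredentialParsingSketch {

    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        MongoCredentialPropertyEditor editor = new MongoCredentialPropertyEditor();

        // Assumed format: user:password@database?uri.authMechanism=<mechanism>
        editor.setAsText("jon:warg@snow?uri.authMechanism=SCRAM-SHA-1");

        List<MongoCredential> credentials = (List<MongoCredential>) editor.getValue();
        System.out.println(credentials.get(0).getUserName()); // jon
    }
}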

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -0,0 +1,118 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.AllArgsConstructor;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
/**
* Utility methods to map {@link org.springframework.data.mongodb.core.aggregation.Aggregation} pipeline definitions and
* create type-bound {@link AggregationOperationContext}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.0.8
*/
@AllArgsConstructor
class AggregationUtil {
QueryMapper queryMapper;
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Prepare the {@link AggregationOperationContext} for a given aggregation: return the given context if it
* is not {@literal null}, create a {@link TypeBasedAggregationOperationContext} if the aggregation carries type
* information (i.e. is a {@link TypedAggregation}), or fall back to the {@link Aggregation#DEFAULT_CONTEXT}.
*
* @param aggregation must not be {@literal null}.
* @param context can be {@literal null}.
* @return the root {@link AggregationOperationContext} to use.
*/
AggregationOperationContext prepareAggregationContext(Aggregation aggregation,
@Nullable AggregationOperationContext context) {
if (context != null) {
return context;
}
if (aggregation instanceof TypedAggregation) {
return new TypeBasedAggregationOperationContext(((TypedAggregation) aggregation).getInputType(), mappingContext,
queryMapper);
}
return Aggregation.DEFAULT_CONTEXT;
}
/**
* Extract and map the aggregation pipeline into a {@link List} of {@link Document}.
*
* @param aggregation
* @param context
* @return
*/
Document createPipeline(String collectionName, Aggregation aggregation, AggregationOperationContext context) {
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return aggregation.toDocument(collectionName, context);
}
Document command = aggregation.toDocument(collectionName, context);
command.put("pipeline", mapAggregationPipeline(command.get("pipeline", List.class)));
return command;
}
/**
* Extract the command and map the aggregation pipeline.
*
* @param aggregation
* @param context
* @return
*/
Document createCommand(String collection, Aggregation aggregation, AggregationOperationContext context) {
Document command = aggregation.toDocument(collection, context);
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return command;
}
command.put("pipeline", mapAggregationPipeline(command.get("pipeline", List.class)));
return command;
}
private List<Document> mapAggregationPipeline(List<Document> pipeline) {
return pipeline.stream().map(val -> queryMapper.getMappedObject(val, Optional.empty()))
.collect(Collectors.toList());
}
}
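Since AggregationUtil is package-private, the sketch below assumes it sits in the same package. It only illustrates the call sequence the template follows; the QueryMapper and MappingContext arguments are assumed to be the ones MongoTemplate already holds.

package org.springframework.data.mongodb.core;

// Usage sketch for the package-private helper above; variable and class names are assumed.
import org.bson.Document;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

class AggregationUtilUsageSketch {

    Document toCommand(QueryMapper queryMapper,
            MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext,
            Aggregation aggregation, String collectionName) {

        AggregationUtil util = new AggregationUtil(queryMapper, mappingContext);

        // Type-based context if the aggregation is a TypedAggregation, DEFAULT_CONTEXT otherwise.
        AggregationOperationContext context = util.prepareAggregationContext(aggregation, null);

        // Maps every pipeline stage through the QueryMapper before the command is issued.
        return util.createCommand(collectionName, aggregation, context);
    }
}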

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2002-2016 the original author or authors.
* Copyright 2002-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2014-2017 the original author or authors.
* Copyright 2014-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -43,9 +43,10 @@ import com.mongodb.client.MongoDatabase;
/**
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
*
*
* @author Christoph Strobl
* @author Oliver Gierke
* @author Mark Paluch
* @since 1.7
*/
class DefaultScriptOperations implements ScriptOperations {
@@ -141,7 +142,7 @@ class DefaultScriptOperations implements ScriptOperations {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.exists(query(where("name").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
return mongoOperations.exists(query(where("_id").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
@@ -190,7 +191,7 @@ class DefaultScriptOperations implements ScriptOperations {
* Generate a valid name for the {@literal JavaScript}. MongoDB requires an id of type String for scripts. Calling
* scripts having {@link ObjectId} as id fails. Therefore we create a random UUID without {@code -} (as this won't
* work) and prefix the result with {@link #SCRIPT_NAME_PREFIX}.
*
*
* @return
*/
private static String generateScriptName() {
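The functional change in this file is that scriptExists(…) now matches on _id rather than name, since NamedMongoScript stores its name as the document id. A sketch of exercising it follows; the MongoTemplate argument and the "echo" script are assumptions for illustration.

// Sketch: registering and checking for a stored server-side script by name.
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.ScriptOperations;
import org.springframework.data.mongodb.core.script.NamedMongoScript;

public class ScriptExistsSketch {

    // mongoOperations is assumed to be a fully configured MongoTemplate.
    static boolean echoScriptStored(MongoTemplate mongoOperations) {
        ScriptOperations scripts = mongoOperations.scriptOps();

        scripts.register(new NamedMongoScript("echo", "function(x) { return x; }"));

        // With the change above this lookup matches on the script's _id, i.e. its name.
        return scripts.exists("echo");
    }
}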

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2016 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2017 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
@@ -51,17 +52,17 @@ import com.mongodb.bulk.BulkWriteError;
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(
private static final Set<String> DUPLICATE_KEY_EXCEPTIONS = new HashSet<>(
Arrays.asList("MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<>(
Arrays.asList("MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoInternalException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<>(
Collections.singletonList("MongoInternalException"));
private static final Set<String> DATA_INTEGRETY_EXCEPTIONS = new HashSet<String>(
private static final Set<String> DATA_INTEGRITY_EXCEPTIONS = new HashSet<>(
Arrays.asList("WriteConcernException", "MongoWriteException", "MongoBulkWriteException"));
/*
@@ -79,7 +80,7 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
if (DULICATE_KEY_EXCEPTIONS.contains(exception)) {
if (DUPLICATE_KEY_EXCEPTIONS.contains(exception)) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
@@ -91,7 +92,7 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
if (DATA_INTEGRETY_EXCEPTIONS.contains(exception)) {
if (DATA_INTEGRITY_EXCEPTIONS.contains(exception)) {
if (ex instanceof MongoServerException) {
if (((MongoServerException) ex).getCode() == 11000) {
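A minimal sketch of the lookup-by-short-name technique used by the translator above; the class and method names here are illustrative, not the actual implementation:

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

import org.springframework.util.ClassUtils;

class ShortNameMatchSample {

    private static final Set<String> DUPLICATE_KEY_EXCEPTIONS = new HashSet<>(
            Arrays.asList("MongoException.DuplicateKey", "DuplicateKeyException"));

    // Resolve the user class (unwrapping proxies) and match on its short name,
    // mirroring the dispatch style of translateExceptionIfPossible(..) above.
    static boolean isDuplicateKey(RuntimeException ex) {
        String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
        return DUPLICATE_KEY_EXCEPTIONS.contains(exception);
    }

    public static void main(String[] args) {
        System.out.println(isDuplicateKey(new IllegalStateException("nope"))); // false
    }
}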

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -72,11 +72,11 @@ public interface MongoOperations extends FluentMongoOperations {
String getCollectionName(Class<?> entityClass);
/**
* Execute a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* MongoDB driver to convert the JSON string to a Document. Any errors that result from executing this command will be
* Execute a MongoDB command expressed as a JSON string. Parsing is delegated to {@link Document#parse(String)} to
* obtain the {@link Document} holding the actual command. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy.
*
* @param jsonCommand a MongoDB command expressed as a JSON string.
* @param jsonCommand a MongoDB command expressed as a JSON string. Must not be {@literal null}.
* @return a result object returned by the action.
*/
Document executeCommand(String jsonCommand);
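A hedged usage sketch of the JSON-command variant documented above; the "persons" collection and the surrounding helper are assumptions, only executeCommand(String) is taken from the interface:

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoOperations;

class CommandSample {

    static Document countViaCommand(MongoOperations operations) {
        // The JSON string is parsed into a Document before being sent as a
        // database command; "persons" is a hypothetical collection name.
        return operations.executeCommand("{ \"count\": \"persons\" }");
    }
}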
@@ -710,8 +710,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findById(Object id, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification. Must not be {@literal null}.
@@ -723,8 +723,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification. Must not be {@literal null}.
@@ -737,8 +737,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -754,8 +754,8 @@ public interface MongoOperations extends FluentMongoOperations {
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
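A hedged usage sketch of the findAndModify contract described above; the Person type, field names, and criteria are assumptions:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.FindAndModifyOptions;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Update;

class FindAndModifySample {

    static Person renameAndReturnNew(MongoOperations operations) {
        // Applies the Update to the first document matching the Criteria and,
        // with returnNew(true), returns the modified document.
        return operations.findAndModify(
                query(where("lastname").is("Matthews")),
                new Update().set("firstname", "Dave"),
                FindAndModifyOptions.options().returnNew(true),
                Person.class);
    }

    static class Person {
        String firstname, lastname;
    }
}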
@@ -851,8 +851,8 @@ public interface MongoOperations extends FluentMongoOperations {
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" >
* Spring's Type Conversion"</a> for more details.
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion"</a> for more details.
* <p/>
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
@@ -894,7 +894,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Insert a mixed Collection of objects into a database collection determining the collection name to use based on the
* class.
*
* @param collectionToSave the list of objects to save. Must not be {@literal null}.
* @param objectsToSave the list of objects to save. Must not be {@literal null}.
*/
void insertAll(Collection<? extends Object> objectsToSave);
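A hedged sketch of the id population described above, assuming a hypothetical Person document whose String id is filled in on insert:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;

class InsertSample {

    static String insertAndReturnGeneratedId(MongoOperations operations) {
        Person person = new Person();
        person.firstname = "Dave";

        // insert(...) stores the document and writes the generated ObjectId
        // back into the String id property of the passed-in object.
        operations.insert(person);
        return person.id;
    }

    static class Person {
        @Id String id;
        String firstname;
    }
}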
@@ -908,8 +908,8 @@ public interface MongoOperations extends FluentMongoOperations {
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" >
* Spring's Type Conversion"</a> for more details.
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
*/
@@ -925,8 +925,8 @@ public interface MongoOperations extends FluentMongoOperations {
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's
* Type Conversion"</a> for more details.
* http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in. Must not be {@literal null}.
@@ -1083,6 +1083,7 @@ public interface MongoOperations extends FluentMongoOperations {
* @param query the query document that specifies the criteria used to remove a record.
* @param entityClass class that determines the collection to use.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws IllegalArgumentException when {@literal query} or {@literal entityClass} is {@literal null}.
*/
DeleteResult remove(Query query, Class<?> entityClass);
@@ -1094,6 +1095,8 @@ public interface MongoOperations extends FluentMongoOperations {
* @param entityClass class of the pojo to be operated on. Can be {@literal null}.
* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws IllegalArgumentException when {@literal query}, {@literal entityClass} or {@literal collectionName} is
* {@literal null}.
*/
DeleteResult remove(Query query, Class<?> entityClass, String collectionName);
@@ -1106,6 +1109,7 @@ public interface MongoOperations extends FluentMongoOperations {
* @param query the query document that specifies the criteria used to remove a record.
* @param collectionName name of the collection where the objects will removed, must not be {@literal null} or empty.
* @return the {@link DeleteResult} which lets you access the results of the previous delete.
* @throws IllegalArgumentException when {@literal query} or {@literal collectionName} is {@literal null}.
*/
DeleteResult remove(Query query, String collectionName);
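A hedged usage sketch of the remove variants documented above; the collection name and criteria are assumptions:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;

import com.mongodb.client.result.DeleteResult;

class RemoveSample {

    static long removeInactive(MongoOperations operations) {
        // Deletes every document in the hypothetical "persons" collection whose
        // "active" field is false and reports how many documents were removed.
        DeleteResult result = operations.remove(query(where("active").is(false)), "persons");
        return result.getDeletedCount();
    }
}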

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -18,25 +18,15 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import com.mongodb.client.model.MapReduceAction;
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.*;
import java.util.Map.Entry;
import java.util.Optional;
import java.util.Scanner;
import java.util.Set;
import java.util.concurrent.TimeUnit;
import org.bson.Document;
@@ -215,7 +205,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @param databaseName must not be {@literal null} or empty.
*/
public MongoTemplate(MongoClient mongoClient, String databaseName) {
this(new SimpleMongoDbFactory(mongoClient, databaseName), null);
this(new SimpleMongoDbFactory(mongoClient, databaseName), null);
}
/**
@@ -1594,13 +1584,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
protected <T> DeleteResult doRemove(final String collectionName, final Query query,
@Nullable final Class<T> entityClass) {
Assert.notNull(query, "Query must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null!");
}
final Document queryObject = query.getQueryObject();
final MongoPersistentEntity<?> entity = getPersistentEntity(entityClass);
final Document queryObject = queryMapper.getMappedObject(query.getQueryObject(), entity);
return execute(collectionName, new CollectionCallback<DeleteResult>() {
@@ -1609,7 +1597,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
maybeEmitEvent(new BeforeDeleteEvent<T>(queryObject, entityClass, collectionName));
Document mappedQuery = queryMapper.getMappedObject(queryObject, entity);
Document removeQuery = queryObject;
DeleteOptions options = new DeleteOptions();
query.getCollation().map(Collation::toMongoCollation).ifPresent(options::collation);
@@ -1622,14 +1610,27 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
DeleteResult dr = null;
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.",
new Object[] { serializeToJsonSafely(mappedQuery), collectionName });
new Object[] { serializeToJsonSafely(removeQuery), collectionName });
}
if (query.getLimit() > 0 || query.getSkip() > 0) {
MongoCursor<Document> cursor = new QueryCursorPreparer(query, entityClass)
.prepare(collection.find(removeQuery).projection(new Document(ID_FIELD, 1))).iterator();
Set<Object> ids = new LinkedHashSet<>();
while (cursor.hasNext()) {
ids.add(cursor.next().get(ID_FIELD));
}
removeQuery = new Document(ID_FIELD, new Document("$in", ids));
}
if (writeConcernToUse == null) {
dr = collection.deleteMany(mappedQuery, options);
dr = collection.deleteMany(removeQuery, options);
} else {
dr = collection.withWriteConcern(writeConcernToUse).deleteMany(mappedQuery, options);
dr = collection.withWriteConcern(writeConcernToUse).deleteMany(removeQuery, options);
}
maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass, collectionName));
@@ -1715,18 +1716,32 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
if (!CollectionUtils.isEmpty(mapReduceOptions.getScopeVariables())) {
result = result.scope(new Document(mapReduceOptions.getScopeVariables()));
}
if (mapReduceOptions.getLimit() != null && mapReduceOptions.getLimit().intValue() > 0) {
result = result.limit(mapReduceOptions.getLimit());
}
if (mapReduceOptions.getFinalizeFunction().filter(StringUtils::hasText).isPresent()) {
result = result.finalizeFunction(mapReduceOptions.getFinalizeFunction().get());
}
if (mapReduceOptions.getJavaScriptMode() != null) {
result = result.jsMode(mapReduceOptions.getJavaScriptMode());
}
if (mapReduceOptions.getOutputSharded().isPresent()) {
result = result.sharded(mapReduceOptions.getOutputSharded().get());
}
if (StringUtils.hasText(mapReduceOptions.getOutputCollection()) && !mapReduceOptions.usesInlineOutput()) {
result = result.collectionName(mapReduceOptions.getOutputCollection())
.action(mapReduceOptions.getMapReduceAction());
if (mapReduceOptions.getOutputDatabase().isPresent()) {
result = result.databaseName(mapReduceOptions.getOutputDatabase().get());
}
}
}
result = collation.map(Collation::toMongoCollation).map(result::collation).orElse(result);
@@ -1933,16 +1948,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
Assert.notNull(outputType, "Output type must not be null!");
AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context;
Document command = aggregation.toDocument(collectionName, rootContext);
Document commandResult = new BatchAggregationLoader(this, queryMapper, mappingContext, readPreference,
Integer.MAX_VALUE)
.aggregate(collectionName, aggregation, context);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
}
Document commandResult = executeCommand(command, this.readPreference);
return new AggregationResults<O>(returnPotentiallyMappedResults(outputType, commandResult, collectionName),
return new AggregationResults<>(returnPotentiallyMappedResults(outputType, commandResult, collectionName),
commandResult);
}
@@ -1962,9 +1972,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return Collections.emptyList();
}
DocumentCallback<O> callback = new UnwrapAndReadDocumentCallback<O>(mongoConverter, outputType, collectionName);
DocumentCallback<O> callback = new UnwrapAndReadDocumentCallback<>(mongoConverter, outputType, collectionName);
List<O> mappedResults = new ArrayList<O>();
List<O> mappedResults = new ArrayList<>();
for (Document document : resultSet) {
mappedResults.add(callback.doWith(document));
}
@@ -1979,9 +1989,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
Assert.notNull(outputType, "Output type must not be null!");
AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context;
AggregationUtil aggregationUtil = new AggregationUtil(queryMapper, mappingContext);
AggregationOperationContext rootContext = aggregationUtil.prepareAggregationContext(aggregation, context);
Document command = aggregation.toDocument(collectionName, rootContext);
Document command = aggregationUtil.createCommand(collectionName, aggregation, rootContext);
assertNotExplain(command);
@@ -1989,7 +2000,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
LOGGER.debug("Streaming aggregation: {}", serializeToJsonSafely(command));
}
ReadDocumentCallback<O> readCallback = new ReadDocumentCallback<O>(mongoConverter, outputType, collectionName);
ReadDocumentCallback<O> readCallback = new ReadDocumentCallback<>(mongoConverter, outputType, collectionName);
return execute(collectionName, new CollectionCallback<CloseableIterator<O>>() {
@@ -2013,7 +2024,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
cursor = cursor.collation(options.getCollation().map(Collation::toMongoCollation).get());
}
return new CloseableIterableCursorAdapter<O>(cursor.iterator(), exceptionTranslator, readCallback);
return new CloseableIterableCursorAdapter<>(cursor.iterator(), exceptionTranslator, readCallback);
}
});
}
@@ -2582,6 +2593,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return fields;
}
/**
* Tries to convert the given {@link RuntimeException} into a {@link DataAccessException} but returns the original
* exception if the conversion failed. This allows safe re-throwing of the return value.
@@ -2771,31 +2783,26 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @author Oliver Gierke
* @author Christoph Strobl
*/
@RequiredArgsConstructor
private class ReadDocumentCallback<T> implements DocumentCallback<T> {
private final EntityReader<? super T, Bson> reader;
private final Class<T> type;
private final @NonNull EntityReader<? super T, Bson> reader;
private final @NonNull Class<T> type;
private final String collectionName;
public ReadDocumentCallback(EntityReader<? super T, Bson> reader, Class<T> type, String collectionName) {
Assert.notNull(reader, "EntityReader must not be null!");
Assert.notNull(type, "Entity type must not be null!");
this.reader = reader;
this.type = type;
this.collectionName = collectionName;
}
@Nullable
public T doWith(Document object) {
public T doWith(@Nullable Document object) {
if (null != object) {
maybeEmitEvent(new AfterLoadEvent<T>(object, type, collectionName));
}
T source = reader.read(type, object);
if (null != source) {
maybeEmitEvent(new AfterConvertEvent<T>(object, source, collectionName));
}
return source;
}
}
@@ -2830,10 +2837,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Class<?> typeToRead = targetType.isInterface() || targetType.isAssignableFrom(entityType) ? entityType
: targetType;
if (null != object) {
maybeEmitEvent(new AfterLoadEvent<T>(object, targetType, collectionName));
}
Object source = reader.read(typeToRead, object);
Object result = targetType.isInterface() ? projectionFactory.createProjection(targetType, source) : source;
if (result == null) {
if (null != result) {
maybeEmitEvent(new AfterConvertEvent<>(object, result, collectionName));
}
@@ -2986,7 +2998,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
T doWith = delegate.doWith(content);
return new GeoResult<T>(doWith, new Distance(distance, metric));
return new GeoResult<>(doWith, new Distance(distance, metric));
}
}
@@ -3074,4 +3086,155 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
public MongoDbFactory getMongoDbFactory() {
return mongoDbFactory;
}
/**
* {@link BatchAggregationLoader} is a small helper that processes cursor results returned by an aggregation
* command execution. While the {@code cursor} document indicates a {@literal nextBatch} via a non-zero {@code id}
* field, another {@code getMore} command is executed to read the next batch of documents until all results
* are loaded.
*
* @author Christoph Strobl
* @since 1.10
*/
static class BatchAggregationLoader {
private static final String CURSOR_FIELD = "cursor";
private static final String RESULT_FIELD = "result";
private static final String BATCH_SIZE_FIELD = "batchSize";
private static final String FIRST_BATCH = "firstBatch";
private static final String NEXT_BATCH = "nextBatch";
private static final String SERVER_USED = "serverUsed";
private static final String OK = "ok";
private final MongoTemplate template;
private final QueryMapper queryMapper;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final ReadPreference readPreference;
private final int batchSize;
BatchAggregationLoader(MongoTemplate template, QueryMapper queryMapper,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext,
ReadPreference readPreference, int batchSize) {
this.template = template;
this.queryMapper = queryMapper;
this.mappingContext = mappingContext;
this.readPreference = readPreference;
this.batchSize = batchSize;
}
/**
* Run aggregation command and fetch all results.
*/
Document aggregate(String collectionName, Aggregation aggregation, AggregationOperationContext context) {
Document command = prepareAggregationCommand(collectionName, aggregation, context, batchSize);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
}
return mergeAggregationResults(aggregateBatched(command, collectionName, batchSize));
}
/**
* Pre process the aggregation command sent to the server by adding {@code cursor} options to match execution on
* different server versions.
*/
private Document prepareAggregationCommand(String collectionName, Aggregation aggregation,
@Nullable AggregationOperationContext context, int batchSize) {
AggregationUtil aggregationUtil = new AggregationUtil(queryMapper, mappingContext);
AggregationOperationContext rootContext = aggregationUtil.prepareAggregationContext(aggregation, context);
Document command = aggregationUtil.createCommand(collectionName, aggregation, rootContext);
if (!aggregation.getOptions().isExplain()) {
command.put(CURSOR_FIELD, new Document(BATCH_SIZE_FIELD, batchSize));
}
return command;
}
private List<Document> aggregateBatched(Document command, String collectionName, int batchSize) {
List<Document> results = new ArrayList<>();
Document commandResult = template.executeCommand(command, readPreference);
results.add(postProcessResult(commandResult));
while (hasNext(commandResult)) {
Document getMore = new Document("getMore", getNextBatchId(commandResult)) //
.append("collection", collectionName) //
.append(BATCH_SIZE_FIELD, batchSize);
commandResult = template.executeCommand(getMore, this.readPreference);
results.add(postProcessResult(commandResult));
}
return results;
}
private static Document postProcessResult(Document commandResult) {
if (!commandResult.containsKey(CURSOR_FIELD)) {
return commandResult;
}
Document resultObject = new Document(SERVER_USED, commandResult.get(SERVER_USED));
resultObject.put(OK, commandResult.get(OK));
Document cursor = (Document) commandResult.get(CURSOR_FIELD);
if (cursor.containsKey(FIRST_BATCH)) {
resultObject.put(RESULT_FIELD, cursor.get(FIRST_BATCH));
} else {
resultObject.put(RESULT_FIELD, cursor.get(NEXT_BATCH));
}
return resultObject;
}
private static Document mergeAggregationResults(List<Document> batchResults) {
if (batchResults.size() == 1) {
return batchResults.iterator().next();
}
Document commandResult = new Document();
List<Object> allResults = new ArrayList<>();
for (Document batchResult : batchResults) {
Collection documents = (Collection<?>) batchResult.get(RESULT_FIELD);
if (!CollectionUtils.isEmpty(documents)) {
allResults.addAll(documents);
}
}
// take general info from first batch
commandResult.put(SERVER_USED, batchResults.iterator().next().get(SERVER_USED));
commandResult.put(OK, batchResults.iterator().next().get(OK));
// and append the merged batchResults
commandResult.put(RESULT_FIELD, allResults);
return commandResult;
}
private static boolean hasNext(Document commandResult) {
if (!commandResult.containsKey(CURSOR_FIELD)) {
return false;
}
Object next = getNextBatchId(commandResult);
return next != null && ((Number) next).longValue() != 0L;
}
@Nullable
private static Object getNextBatchId(Document commandResult) {
return ((Document) commandResult.get(CURSOR_FIELD)).get("id");
}
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

Some files were not shown because too many files have changed in this diff.