Compare commits

...

189 Commits

Author SHA1 Message Date
Mark Paluch
43224ccec7 DATAMONGO-2272 - Release version 1.10.23 (Ingalls SR23). 2019-08-05 10:37:40 +02:00
Mark Paluch
54ef3305ae DATAMONGO-2272 - Prepare 1.10.23 (Ingalls SR23). 2019-08-05 10:37:16 +02:00
Mark Paluch
fa2dbf13a9 DATAMONGO-2272 - Updated changelog. 2019-08-05 10:37:13 +02:00
Mark Paluch
7c215a9ebf DATAMONGO-2318 - Fix typo. 2019-07-10 09:46:26 +02:00
Mark Paluch
c9287a2465 DATAMONGO-2280 - Cleanup release profile.
Reuse inherited configuration from parent pom.
2019-07-09 11:02:19 +02:00
Mark Paluch
5f070ee9de DATAMONGO-2318 - Revise readme for a consistent structure. 2019-07-09 11:02:19 +02:00
Greg Turnquist
0b963fd9c6 DATAMONGO-2280 - Polishing. 2019-07-03 17:13:10 -05:00
Greg Turnquist
29849dcd13 DATAMONGO-2280 - Use parent 'artifactory' profile for snapshot releases. 2019-07-03 17:12:44 -05:00
Greg Turnquist
27dc8bcade DATAMONGO-2280 - Only test main branch for upstream triggers. 2019-06-28 19:32:12 -05:00
Greg Turnquist
fe4597f0e6 DATAMONGO-2280 - Set user.name and user.home for CI jobs. 2019-06-25 13:36:22 -05:00
Christoph Strobl
902ea24ded DATAMONGO-2256 - Updated changelog. 2019-06-14 15:17:55 +02:00
Christoph Strobl
2e12f3a22e DATAMONGO-2271 - Updated changelog. 2019-06-14 13:27:17 +02:00
Greg Turnquist
ebc22c8d32 DATAMONGO-2280 - Add maven wrapper. 2019-06-11 15:16:10 -05:00
Greg Turnquist
9e01c4aad5 DATAMONGO-2280 - Introduce Jenkins. 2019-06-11 15:12:02 -05:00
Mark Paluch
bd2b2653c3 DATAMONGO-2270 - After release cleanups. 2019-05-13 18:14:55 +02:00
Mark Paluch
fb3655329b DATAMONGO-2270 - Prepare next development iteration. 2019-05-13 18:14:51 +02:00
Mark Paluch
a5cefddbcc DATAMONGO-2270 - Release version 1.10.22 (Ingalls SR22). 2019-05-13 17:25:16 +02:00
Mark Paluch
fe3f358480 DATAMONGO-2270 - Prepare 1.10.22 (Ingalls SR22). 2019-05-13 17:24:29 +02:00
Mark Paluch
d7a63ac882 DATAMONGO-2270 - Updated changelog. 2019-05-13 17:24:23 +02:00
Mark Paluch
526037e71b DATAMONGO-2269 - Updated changelog. 2019-05-13 14:59:29 +02:00
Mark Paluch
2ba6f48335 DATAMONGO-2260 - Updated changelog. 2019-05-13 12:37:43 +02:00
Oliver Drotbohm
ed22672b5c DATAMONGO-2244 - After release cleanups. 2019-05-10 14:15:20 +02:00
Oliver Drotbohm
7c94718c9b DATAMONGO-2244 - Prepare next development iteration. 2019-05-10 14:15:18 +02:00
Oliver Drotbohm
d56d84924a DATAMONGO-2244 - Release version 1.10.21 (Ingalls SR21). 2019-05-10 13:30:51 +02:00
Oliver Drotbohm
034d3f4d43 DATAMONGO-2244 - Prepare 1.10.21 (Ingalls SR21). 2019-05-10 13:30:15 +02:00
Oliver Drotbohm
c9a6a04fc3 DATAMONGO-2244 - Updated changelog. 2019-05-10 13:30:09 +02:00
Oliver Drotbohm
fb8fd05cf0 DATAMONGO-2246 - Updated changelog. 2019-05-10 12:57:23 +02:00
Christoph Strobl
8c62077d08 DATAMONGO-2222 - Updated changelog. 2019-04-11 12:28:55 +02:00
Oliver Drotbohm
23967107f2 DATAMONGO-2204 - Updated changelog. 2019-04-01 20:56:31 +02:00
Oliver Drotbohm
704a9658a7 DATAMONGO-2186 - Updated changelog. 2019-04-01 19:37:04 +02:00
Oliver Drotbohm
a394d1ea8e DATAMONGO-2243 - After release cleanups. 2019-04-01 18:16:19 +02:00
Oliver Drotbohm
b35e2e8eb8 DATAMONGO-2243 - Prepare next development iteration. 2019-04-01 18:16:18 +02:00
Oliver Drotbohm
26cda42892 DATAMONGO-2243 - Release version 1.10.20 (Ingalls SR20). 2019-04-01 17:40:06 +02:00
Oliver Drotbohm
145fc27bd9 DATAMONGO-2243 - Prepare 1.10.20 (Ingalls SR20). 2019-04-01 17:39:38 +02:00
Oliver Drotbohm
57b541ac86 DATAMONGO-2243 - Updated changelog. 2019-04-01 17:39:33 +02:00
Oliver Drotbohm
a21a539c83 DATAMONGO-2185 - After release cleanups. 2019-04-01 13:31:15 +02:00
Oliver Drotbohm
89fef9306f DATAMONGO-2185 - Prepare next development iteration. 2019-04-01 13:31:14 +02:00
Oliver Drotbohm
b2b67270d1 DATAMONGO-2185 - Release version 1.10.19 (Ingalls SR19). 2019-04-01 13:08:18 +02:00
Oliver Drotbohm
4d72b0b78f DATAMONGO-2185 - Prepare 1.10.19 (Ingalls SR19). 2019-04-01 13:07:51 +02:00
Oliver Drotbohm
cd510d97bd DATAMONGO-2185 - Updated changelog. 2019-04-01 13:07:47 +02:00
Spring Operator
bbcaf361c6 DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (i.e. if using a URL shortener).

# Fixed URLs

## Fixed Success
These URLs were switched to an https URL with a 2xx status. While the status was successful, your review is still recommended.

* [ ] http://www.apache.org/licenses/ with 1 occurrence migrated to:
  https://www.apache.org/licenses/ ([https](https://www.apache.org/licenses/) result 200).
* [ ] http://www.apache.org/licenses/LICENSE-2.0 with 587 occurrences migrated to:
  https://www.apache.org/licenses/LICENSE-2.0 ([https](https://www.apache.org/licenses/LICENSE-2.0) result 200).

Original Pull Request: #696
2019-03-22 09:39:13 +01:00
Spring Operator
4cf922a05b DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (i.e. if using a URL shortener).

# Fixed URLs

## Fixed Success
These URLs were switched to an https URL with a 2xx status. While the status was successful, your review is still recommended.

* http://maven.apache.org/xsd/maven-4.0.0.xsd with 3 occurrences migrated to:
  https://maven.apache.org/xsd/maven-4.0.0.xsd ([https](https://maven.apache.org/xsd/maven-4.0.0.xsd) result 200).
* http://www.gopivotal.com (302) with 6 occurrences migrated to:
  https://pivotal.io ([https](https://www.gopivotal.com) result 200).
* http://maven.apache.org/maven-v4_0_0.xsd with 3 occurrences migrated to:
  https://maven.apache.org/maven-v4_0_0.xsd ([https](https://maven.apache.org/maven-v4_0_0.xsd) result 301).
* http://projects.spring.io/spring-data-mongodb with 1 occurrence migrated to:
  https://projects.spring.io/spring-data-mongodb ([https](https://projects.spring.io/spring-data-mongodb) result 301).

# Ignored
These URLs were intentionally ignored.

* http://maven.apache.org/POM/4.0.0 with 12 occurrences
* http://www.w3.org/2001/XMLSchema-instance with 6 occurrences

Original Pull Request: #662
2019-03-19 12:52:52 +01:00
Christoph Strobl
da07d9e3f8 DATAMONGO-2164 - Updated changelog. 2019-03-07 10:30:12 +01:00
Mark Paluch
e0b0763349 DATAMONGO-2187 - Updated changelog. 2019-02-13 11:47:55 +01:00
Mark Paluch
00ca50cf17 DATAMONGO-2145 - Updated changelog. 2019-01-10 14:15:40 +01:00
Mark Paluch
3a461d2bc6 DATAMONGO-2144 - Updated changelog. 2019-01-10 12:26:38 +01:00
Mark Paluch
df3f91fff4 DATAMONGO-2143 - After release cleanups. 2019-01-10 10:46:20 +01:00
Mark Paluch
5d75c2a921 DATAMONGO-2143 - Prepare next development iteration. 2019-01-10 10:46:19 +01:00
Mark Paluch
d723784925 DATAMONGO-2143 - Release version 1.10.18 (Ingalls SR18). 2019-01-10 09:52:56 +01:00
Mark Paluch
a2426ec98d DATAMONGO-2143 - Prepare 1.10.18 (Ingalls SR18). 2019-01-10 09:52:13 +01:00
Mark Paluch
568f6bb1e0 DATAMONGO-2143 - Updated changelog. 2019-01-10 09:52:09 +01:00
Mark Paluch
c2f94cb840 DATAMONGO-2175 - Update copyright years to 2019. 2019-01-02 14:13:33 +01:00
Christoph Strobl
14981872e6 DATAMONGO-2160 - Updated changelog. 2018-12-11 11:43:16 +01:00
Christoph Strobl
624ec3ce55 DATAMONGO-2149 - Fix $slice in fields projection when pointing to array of DBRefs.
We now no longer try to convert the actual slice parameters into a DBRef.

Original pull request: #623.
2018-11-30 15:19:07 +01:00
Mark Paluch
db9d282313 DATAMONGO-2121 - Updated changelog. 2018-11-27 14:54:04 +01:00
Mark Paluch
10972bfda1 DATAMONGO-2109 - Updated changelog. 2018-11-27 12:36:49 +01:00
Mark Paluch
28454cb554 DATAMONGO-2110 - After release cleanups. 2018-11-27 11:10:58 +01:00
Mark Paluch
f17c4923e6 DATAMONGO-2110 - Prepare next development iteration. 2018-11-27 11:10:56 +01:00
Mark Paluch
4414771aaa DATAMONGO-2110 - Release version 1.10.17 (Ingalls SR17). 2018-11-27 10:15:45 +01:00
Mark Paluch
9acc558ae9 DATAMONGO-2110 - Prepare 1.10.17 (Ingalls SR17). 2018-11-27 10:14:58 +01:00
Mark Paluch
2d946a2c9a DATAMONGO-2110 - Updated changelog. 2018-11-27 10:14:54 +01:00
Oliver Drotbohm
715a2d3da4 DATAMONGO-2135 - Polishing. 2018-11-15 15:54:22 +01:00
Oliver Drotbohm
01c1ec2e5b DATAMONGO-2135 - Default to intermediate List for properties typed to Collection.
We now defensively create a List rather than a LinkedHashSet (which Spring's CollectionFactory.createCollection(…) defaults to) to make sure we're not accidentally dropping values that are considered equal according to their Java class definition.
2018-11-15 15:54:18 +01:00
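A minimal, self-contained sketch of why the intermediate List matters; the duplicate values and collection types are purely illustrative:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.LinkedHashSet;
import java.util.List;

class IntermediateCollectionSketch {

	public static void main(String[] args) {

		// two raw values that are equal according to their Java class definition
		List<String> raw = Arrays.asList("duplicate", "duplicate");

		Collection<String> asSet = new LinkedHashSet<String>(raw); // size 1 - one value is silently dropped
		Collection<String> asList = new ArrayList<String>(raw);    // size 2 - all read values are retained

		System.out.println(asSet.size() + " vs " + asList.size()); // prints "1 vs 2"
	}
}
```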
Mark Paluch
7596f37235 DATAMONGO-2107 - Updated changelog. 2018-10-29 14:30:31 +01:00
Mona Mohamadinia
da4ccaa975 DATAMONGO-2118 - Fix typo in repositories reference documentation.
Original pull request: #611.
2018-10-26 10:08:55 +02:00
Mark Paluch
aed9b4d0da DATAMONGO-2083 - After release cleanups. 2018-10-15 13:55:50 +02:00
Mark Paluch
54a0e34049 DATAMONGO-2083 - Prepare next development iteration. 2018-10-15 13:55:49 +02:00
Mark Paluch
6cb8055601 DATAMONGO-2083 - Release version 1.10.16 (Ingalls SR16). 2018-10-15 12:56:30 +02:00
Mark Paluch
34829fca84 DATAMONGO-2083 - Prepare 1.10.16 (Ingalls SR16). 2018-10-15 12:55:35 +02:00
Mark Paluch
94b2495145 DATAMONGO-2083 - Updated changelog. 2018-10-15 12:55:29 +02:00
Mark Paluch
f2301eaf5b DATAMONGO-2084 - Updated changelog. 2018-10-15 12:46:26 +02:00
Mark Paluch
e3e201edb9 DATAMONGO-2094 - Updated changelog. 2018-10-15 11:37:25 +02:00
Christoph Strobl
c738148b40 DATAMONGO-2096 - Fix target field name for GraphLookup aggregation operation.
We now make sure to use the target field name instead of the alias when processing GraphLookupOperation.

Original pull request: #613.
2018-10-05 15:17:42 +02:00
Mark Paluch
ec5268a862 DATAMONGO-2061 - Updated changelog. 2018-09-21 08:13:12 -04:00
Mark Paluch
9bc073501f DATAMONGO-2034 - Updated changelog. 2018-09-10 14:15:51 +02:00
Mark Paluch
3d00ef1076 DATAMONGO-2035 - After release cleanups. 2018-09-10 10:01:44 +02:00
Mark Paluch
a07ca615be DATAMONGO-2035 - Prepare next development iteration. 2018-09-10 10:01:43 +02:00
Mark Paluch
15501f0424 DATAMONGO-2035 - Release version 1.10.15 (Ingalls SR15). 2018-09-10 09:30:36 +02:00
Mark Paluch
8d0edcd389 DATAMONGO-2035 - Prepare 1.10.15 (Ingalls SR15). 2018-09-10 09:29:51 +02:00
Mark Paluch
02241870d7 DATAMONGO-2035 - Updated changelog. 2018-09-10 09:29:48 +02:00
Oliver Gierke
d76dd90325 DATAMONGO-2076 - Fixed attribute substitution in getting started section. 2018-08-30 09:32:19 +02:00
Oliver Gierke
cc8c3448a8 DATAMONGO-2033 - Updated changelog. 2018-08-20 11:07:54 +02:00
Mark Paluch
3981a8ac92 DATAMONGO-2055 - Polishing.
Move test to UpdateMapperUnitTests.

Original pull request: #600.
2018-08-15 15:59:27 +02:00
Christoph Strobl
b9d72060af DATAMONGO-2055 - Allow position modifier to be negative using push at position on Update.
Original pull request: #600.
2018-08-15 15:54:27 +02:00
Mark Paluch
c1647ed269 DATAMONGO-2050 - Polishing.
Tweak Javadoc.

Original pull request: #596.
2018-08-15 15:28:56 +02:00
Christoph Strobl
9faeb1afe0 DATAMONGO-2050 - Allow to specify the index to use for $geoNear aggregation operation.
Original pull request: #596.
2018-08-15 15:28:56 +02:00
Mark Paluch
bbc7a64956 DATAMONGO-2051 - Polishing.
Use method argument types to avoid false positives with different method signatures.

Original pull request: #597.
Related pull request: #598.
2018-08-14 16:37:07 +02:00
Christoph Strobl
5f32339175 DATAMONGO-2051 - Add support for SCRAM-SHA-256 authentication mechanism to MongoCredentialPropertyEditor.
Original pull request: #597.
Related pull request: #598.
2018-08-14 16:37:04 +02:00
Christoph Strobl
013f56d141 DATAMONGO-2049 - Add support for $ltrim, $rtrim, and $trim.
Original pull request: #594.
2018-08-14 10:55:43 +02:00
Mark Paluch
ff9217b33e DATAMONGO-2048 - Polishing.
Javadoc tweaks.

Original pull request: #595.
2018-08-13 16:11:18 +02:00
Christoph Strobl
a87476f474 DATAMONGO-2048 - Add support for MongoDB 4.0 $convert aggregation operator.
We now support the following type conversion aggregation operators:

* $convert
* $toBool
* $toDate
* $toDecimal
* $toDouble
* $toInt
* $toLong
* $toObjectId
* $toString

Original pull request: #595.
2018-08-13 16:11:18 +02:00
Mark Paluch
df40a4820e DATAMONGO-2057 - Skip MongoDbUtils integration tests when running against MongoDB 4.0.
MongoDB 4.0 digests passwords by default, which does not work with the SCRAM-SHA-256 authentication method, so we skip those tests when running against MongoDB 4.0.
2018-08-13 14:09:20 +02:00
Mark Paluch
ac0aed8449 DATAMONGO-2047 - Polishing.
Retain previous options when calling withTimezone(…)/onNull…(…). Add tests. Javadoc.

Original pull request: #593.
2018-08-13 13:38:40 +02:00
Christoph Strobl
488462d5b3 DATAMONGO-2047 - Update $dateToString and $dateFromString aggregation operators to match MongoDB 4.0 changes.
We added the format and onNull options to DateFromString and changed format to an optional parameter.

Original pull request: #593.
2018-08-13 13:38:37 +02:00
Mark Paluch
639fecc9ca DATAMONGO-2043 - Polishing.
Slightly tweak Javadoc.

Original pull request: #589.
2018-08-08 11:11:41 +02:00
Christoph Strobl
47f9e3c739 DATAMONGO-2043 - Omit type hint when mapping simple types.
Original pull request: #589.
2018-08-08 11:11:35 +02:00
Mark Paluch
25507b995f DATAMONGO-2006 - After release cleanups. 2018-07-27 11:09:41 +02:00
Mark Paluch
1e29687135 DATAMONGO-2006 - Prepare next development iteration. 2018-07-27 11:09:38 +02:00
Mark Paluch
d594ae1bd0 DATAMONGO-2006 - Release version 1.10.14 (Ingalls SR14). 2018-07-27 09:21:39 +02:00
Mark Paluch
441e7fbb39 DATAMONGO-2006 - Prepare 1.10.14 (Ingalls SR14). 2018-07-27 09:20:28 +02:00
Mark Paluch
b9389860ce DATAMONGO-2006 - Updated changelog. 2018-07-27 09:20:21 +02:00
Mark Paluch
58426e0314 DATAMONGO-2007 - Updated changelog. 2018-07-26 16:24:01 +02:00
Mark Paluch
209466a85e DATAMONGO-1982 - Updated changelog. 2018-07-26 14:03:19 +02:00
Oliver Gierke
7c18b7dc0f DATAMONGO-2011 - Port unit test to verify Ingalls is not affected.
Original pull request: #587.
2018-07-13 12:59:38 +02:00
Christoph Strobl
3ff0975ab9 DATAMONGO-2023 - Polishing.
Add tests verifying the behavior when using both typed and untyped aggregation.

Original Pull Request: #585
2018-07-09 20:04:16 +02:00
Mark Paluch
ee9d8768a2 DATAMONGO-2023 - Polishing.
Remove trailing whitespaces.

Original Pull Request: #585
2018-07-09 19:58:07 +02:00
Mark Paluch
b77658e188 DATAMONGO-2023 - Allow usage of $sample in aggregation pipelines.
We now allow usage of $sample as an aggregation framework stage and no longer get in the way of Query by Example. Previously, we identified Example objects using the $sample keyword, which prevented query mapping of aggregation pipelines that contained a sample stage.

We already fixed this issue via DATAMONGO-1325 for the 2.x line.

Original Pull Request: #585
2018-07-09 19:57:35 +02:00
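A hedged sketch of adding a $sample stage with the 1.10.x API, where a custom AggregationOperation renders to a DBObject; the Person type, the "people" collection, and the "active" field are assumed example names:

```java
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.*;
import org.springframework.data.mongodb.core.query.Criteria;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class SampleStageSketch {

	AggregationResults<Person> fiveRandomActivePeople(MongoOperations template) {

		// custom stage rendering the raw $sample document
		AggregationOperation sampleStage = new AggregationOperation() {

			@Override
			public DBObject toDBObject(AggregationOperationContext context) {
				return new BasicDBObject("$sample", new BasicDBObject("size", 5));
			}
		};

		Aggregation aggregation = Aggregation.newAggregation(
				Aggregation.match(Criteria.where("active").is(true)), sampleStage);

		return template.aggregate(aggregation, "people", Person.class);
	}
}
```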
Mark Paluch
0cb0cb700b DATAMONGO-2016 - Polishing.
Fail gracefully if query string parameter has no value. Reformat test.

Original pull request: #578.
2018-07-04 11:25:21 +02:00
Stephen Tyler Conrad
364920e644 DATAMONGO-2016 - Fix username/password extraction in MongoCredentialPropertyEditor.
MongoCredentialPropertyEditor now inspects the connection URI for the appropriate delimiter tokens. Previously, the question mark character was used as the username/password delimiter.

Original pull request: #578.
2018-07-04 11:24:50 +02:00
Mark Paluch
0495dd89ce DATAMONGO-1969 - Updated changelog. 2018-06-13 21:39:51 +02:00
Mark Paluch
19ccdd473a DATAMONGO-1967 - After release cleanups. 2018-06-13 14:42:35 +02:00
Mark Paluch
8fecc92dd4 DATAMONGO-1967 - Prepare next development iteration. 2018-06-13 14:42:33 +02:00
Mark Paluch
7fbb6583a5 DATAMONGO-1967 - Release version 1.10.13 (Ingalls SR13). 2018-06-13 10:48:53 +02:00
Mark Paluch
46550f6830 DATAMONGO-1967 - Prepare 1.10.13 (Ingalls SR13). 2018-06-13 10:48:03 +02:00
Mark Paluch
f076f75866 DATAMONGO-1967 - Updated changelog. 2018-06-13 10:47:57 +02:00
Mark Paluch
d9bf13cc1e DATAMONGO-2003 - Polishing.
Remove superfluous throws declarations and trailing whitespaces.

Original pull request: #570.
2018-06-11 14:29:11 +02:00
Christoph Strobl
626dfa4f9a DATAMONGO-2003 - Fix derived query using regex pattern with options.
We now consider regex pattern options when using the pattern as a derived finder argument.

Original pull request: #570.
2018-06-11 14:29:11 +02:00
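A short sketch of the scenario this fix covers; the finder name and the Person/PersonRepository types are illustrative assumptions:

```java
import java.util.List;
import java.util.regex.Pattern;

import org.springframework.data.repository.CrudRepository;

public interface PersonRepository extends CrudRepository<Person, String> {

	// the Pattern's options (e.g. CASE_INSENSITIVE) are now applied to the derived query
	List<Person> findByFirstname(Pattern firstname);
}

// usage: repository.findByFirstname(Pattern.compile("^oli.*", Pattern.CASE_INSENSITIVE));
```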
Oliver Gierke
26c12214d3 DATAMONGO-2002 - Fixed Criteria.equals(…) for usage with Pattern instances.
For Criteria instances that use regular expressions, we now properly compare the two Pattern instances by also including the pattern flags in the comparison.
2018-06-07 19:12:56 +02:00
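A minimal sketch of the comparison the fix addresses; the field name is illustrative:

```java
import java.util.regex.Pattern;

import org.springframework.data.mongodb.core.query.Criteria;

class CriteriaPatternEqualsSketch {

	boolean compare() {

		Criteria caseInsensitive = Criteria.where("firstname").regex(Pattern.compile("oliver", Pattern.CASE_INSENSITIVE));
		Criteria caseSensitive = Criteria.where("firstname").regex(Pattern.compile("oliver"));

		// the pattern flags are now part of the comparison, so these are not considered equal
		return caseInsensitive.equals(caseSensitive); // false
	}
}
```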
Mark Paluch
f82b791f72 DATAMONGO-1998 - Polishing.
Switch id field name check to equals or to match the last property path segment.

Original pull request: #567.
2018-06-06 11:41:46 +02:00
Christoph Strobl
3785a52676 DATAMONGO-1998 - Fix Querydsl id handling for nested property references using ObjectId hex String representation.
We now follow the conversion rules for id properties with a valid ObjectId representation when parsing Querydsl queries.

Original pull request: #567.
2018-06-06 11:37:36 +02:00
Mark Paluch
41897c7d46 DATAMONGO-1986 - Polishing.
Refactor duplicated code into AggregationUtil.

Original pull request: #564.
2018-06-06 10:36:50 +02:00
Christoph Strobl
f3397e95bc DATAMONGO-1986 - Always provide a typed AggregationOperationContext for TypedAggregation.
We now initialize a TypeBasedAggregationOperationContext for TypedAggregations if no context is provided. This makes sure that potential Criteria objects are run through the QueryMapper.
In case the default context is used, we now also make sure to at least run the aggregation pipeline through the QueryMapper to avoid passing non-MongoDB simple types on to the driver.

Original pull request: #564.
2018-06-06 10:36:50 +02:00
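A hedged example of the TypedAggregation path that now always gets a type-based context; Person and the lastname field are assumed example names:

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;

class TypedAggregationSketch {

	AggregationResults<Person> byLastname(MongoOperations template) {

		// the Criteria below is now run through the QueryMapper (field mapping, ObjectId conversion, ...)
		TypedAggregation<Person> aggregation = newAggregation(Person.class,
				match(where("lastname").is("Gierke")));

		return template.aggregate(aggregation, Person.class);
	}
}
```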
Mark Paluch
5807c5d8bc DATAMONGO-1988 - Polishing.
Match exactly for either top-level properties or leaf properties instead of accepting the property/field name suffix. Downgrade tests to use the DBObject API.

Original pull request: #565.
2018-06-05 11:23:00 +02:00
Christoph Strobl
70783d5806 DATAMONGO-1988 - Fix query creation for id property references using ObjectId hex String representation.
We now follow the conversion rules for id properties with a valid ObjectId representation when creating queries. Prior to this change, String values, for example, would have been turned into ObjectIds when saving a document but not when querying for it.

Original pull request: #565.
2018-06-05 11:15:27 +02:00
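A sketch of the behavior the fix aligns; the Person entity with an @Id String property and its getId() accessor are assumptions:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;

class IdConversionSketch {

	Person saveAndReload(MongoOperations template) {

		Person person = new Person();          // assumed entity with an @Id String property
		template.save(person);                 // the generated hex String is stored as an ObjectId

		// the hex String is now converted to an ObjectId for the query as well
		return template.findOne(query(where("id").is(person.getId())), Person.class);
	}
}
```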
Christoph Strobl
02e3fa1486 DATAMONGO-1927 - Updated changelog. 2018-05-17 10:32:55 +02:00
Victor
f09c956ab2 DATAMONGO-1978 - Fix minor typo in Field.positionKey field name.
Original pull request: #558.
2018-05-15 12:29:44 +02:00
Mark Paluch
6d6201d327 DATAMONGO-1466 - Polishing.
Remove superfluous static keyword in inner enum declarations. Fix typos in method names.

Original pull request: #561.
2018-05-15 11:19:43 +02:00
Christoph Strobl
5217e7a7e0 DATAMONGO-1466 - Add embedded typeinformation-based reading GeoJSON converter.
Original pull request: #561.
2018-05-15 11:19:25 +02:00
Mark Paluch
f56bb16a12 DATAMONGO-1974 - Polishing.
Fix typos, links, and code fences.

Original pull request: #559.
2018-05-11 15:39:18 +02:00
Jay Bryant
0869fdb5b8 DATAMONGO-1974 - Full editing pass for Spring Data MongoDB.
Full editing pass of the Spring Data MongoDB reference guide. I also adjusted index.adoc to work with the changes I made to the build project, so that we get Epub and PDF as well as HTML.

Original pull request: #559.
2018-05-11 15:36:37 +02:00
Mark Paluch
59aa9f96cc DATAMONGO-1918 - Updated changelog. 2018-05-08 15:27:19 +02:00
Mark Paluch
944fabb87d DATAMONGO-1917 - After release cleanups. 2018-05-08 11:56:07 +02:00
Mark Paluch
3cf2cd2d59 DATAMONGO-1917 - Prepare next development iteration. 2018-05-08 11:56:06 +02:00
Mark Paluch
44a4e65a72 DATAMONGO-1917 - Release version 1.10.12 (Ingalls SR12). 2018-05-08 10:50:35 +02:00
Mark Paluch
e378b6f4a6 DATAMONGO-1917 - Prepare 1.10.12 (Ingalls SR12). 2018-05-08 10:49:48 +02:00
Mark Paluch
61667bde17 DATAMONGO-1917 - Updated changelog. 2018-05-08 10:49:41 +02:00
Mark Paluch
4148e38ca9 DATAMONGO-1869 - Updated changelog. 2018-04-13 15:11:27 +02:00
Mark Paluch
e8d3c9e932 DATAMONGO-1893 - Polishing.
Inherit fields from previous operation if at least one field is excluded. Extend FieldsExposingAggregationOperation to conditionally inherit fields.

Backport to Java 6 code.

Original pull request: #538.
2018-04-06 10:45:48 +02:00
Christoph Strobl
fdc8ad8580 DATAMONGO-1893 - Allow exclusion of other fields than _id in aggregation $project.
As of MongoDB 3.4, exclusion of fields other than _id is allowed, so we removed the limitation in our code.

Original pull request: #538.
2018-04-06 10:45:48 +02:00
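A brief sketch of the now-permitted exclusion; the field name is illustrative:

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;

class ProjectExclusionSketch {

	Aggregation withoutLastname() {

		// excluding a field other than _id is valid against MongoDB 3.4+
		return newAggregation(project().andExclude("lastname"));
	}
}
```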
Mark Paluch
e7cc9231ed DATAMONGO-1888 - Updated changelog. 2018-04-04 17:12:53 +02:00
Mark Paluch
2d2953cbc5 DATAMONGO-1857 - After release cleanups. 2018-04-04 14:58:03 +02:00
Mark Paluch
4dab0369ed DATAMONGO-1857 - Prepare next development iteration. 2018-04-04 14:58:02 +02:00
Mark Paluch
8ad0dc87e3 DATAMONGO-1857 - Release version 1.10.11 (Ingalls SR11). 2018-04-04 14:21:29 +02:00
Mark Paluch
16ad3c614f DATAMONGO-1857 - Prepare 1.10.11 (Ingalls SR11). 2018-04-04 14:20:47 +02:00
Mark Paluch
2663428825 DATAMONGO-1857 - Updated changelog. 2018-04-04 14:20:42 +02:00
Mark Paluch
f5d5a813f3 DATAMONGO-1903 - Polishing.
Remove the client-side operating system check, as operating system-dependent constraints depend on the server. Add a check for whitespace. Add author tags. Extend tests. Extract the database name assertion into its own method.

Original pull request: #546.
2018-04-03 13:44:35 +02:00
George Moraitis
b69ddd43ac DATAMONGO-1903 - Align database name check in SimpleMongoDbFactory with MongoDB limitations.
We now test database names against the current (3.6) MongoDB specifications for database names.

Original pull request: #546.
2018-04-03 13:44:33 +02:00
Mark Paluch
9f3dca3be9 DATAMONGO-1834 - Polishing.
Increase visibility of Timezone factory methods. Add missing nullable annotation. Tweak Javadoc. Add tests for Timezone using expressions/field references.

Original Pull Request: #539
2018-03-28 14:29:56 +02:00
Christoph Strobl
49e97b7f47 DATAMONGO-1834 - Polishing.
Remove DateFactory and split up tests.
Introduce dedicated Timezone abstraction and update existing factories to apply the timezone if appropriate. Update builders and align code style.

Original Pull Request: #539
2018-03-28 13:55:53 +02:00
Matt Morrissette
7775746d26 DATAMONGO-1834 - Add support for MongoDB 3.6 DateOperators $dateFromString, $dateFromParts and $dateToParts including timezones.
Original Pull Request: #539
2018-03-28 13:28:43 +02:00
Oliver Gierke
67bae7c813 DATAMONGO-1915 - Removed explicit declaration of Jackson library versions. 2018-03-27 19:35:33 +02:00
Felipe Zanardo Affonso
038ab7d8bc DATAMONGO-1909 - Fix typo on return statement.
Original pull request: #523.
2018-03-21 16:05:53 +01:00
Oliver Gierke
0ac6956807 DATAMONGO-1898 - Fixed broken test cases.
Switched from AssertJ back to Hamcrest for matching.
2018-03-14 10:30:15 +01:00
Oliver Gierke
b1ec25d9be DATAMONGO-1901 - Added project.root configuration to make JavaDoc generation work again.
Related ticket: https://github.com/spring-projects/spring-data-build/issues/527.
2018-03-14 09:44:39 +01:00
Oliver Gierke
0a609d1ac9 DATAMONGO-1898 - Added unit tests for the conversion handling of enums implementing interfaces.
Related tickets: DATACMNS-1278.
2018-03-12 11:08:00 +01:00
Oliver Gierke
e4103eacda DATAMONGO-1896 - SimpleMongoRepository.saveAll(…) now consistently uses aggregate collection for inserts.
We previously used MongoTemplate.insertAll(…), which determines the collection to insert the individual elements into based on their type and which - in cases of entity inheritance - will use dedicated collections for sub-types of the aggregate root. Subsequent lookups of the entities will then fail, as those are executed against the collection the aggregate root is mapped to.

We now use ….insert(Collection, String) instead, handing over the collection of the aggregate root explicitly.
2018-03-09 00:09:38 +01:00
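A hedged sketch of handing over the aggregate root's collection explicitly; the Contact/Person hierarchy and the "contact" collection name are assumed examples:

```java
import java.util.Arrays;
import java.util.List;

import org.springframework.data.mongodb.core.MongoOperations;

class InsertIntoRootCollectionSketch {

	void insertAll(MongoOperations template) {

		// assumed hierarchy: Person extends Contact, and Contact maps to the "contact" collection
		List<Contact> contacts = Arrays.asList(new Person(), new Contact());

		// handing over the collection of the aggregate root keeps all documents together
		template.insert(contacts, "contact");
	}
}
```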
Mark Paluch
bf58e9536b DATAMONGO-1882 - Updated changelog. 2018-02-28 11:17:41 +01:00
Mark Paluch
5fe67339e1 DATAMONGO-1859 - Updated changelog. 2018-02-19 20:30:08 +01:00
Mark Paluch
32a2970e76 DATAMONGO-1870 - Polishing.
Extend copyright license years. Slightly reword documentation.

Original pull request: #531.
Related pull request: #532.
2018-02-15 10:54:11 +01:00
Christoph Strobl
e8c9ac7dd0 DATAMONGO-1870 - Consider skip/limit on MongoOperations.remove(Query, Class).
We now use an _id lookup for remove operations that query with limit or skip parameters. This allows finer-grained control over the documents removed.

Original pull request: #531.
Related pull request: #532.
2018-02-15 10:53:43 +01:00
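A short example of a limited remove as now supported; Person and the lastname field are assumptions:

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

class LimitedRemoveSketch {

	void removeFirstTen(MongoOperations template) {

		// only the first 10 matching documents are removed, resolved via an _id lookup first
		Query query = query(where("lastname").is("Gierke")).limit(10);
		template.remove(query, Person.class);
	}
}
```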
Christoph Strobl
29b4a9dcc3 DATAMONGO-1860 - Retrieve result count via QuerydslMongoPredicateExecutor only for paging.
We now use AbstractMongodbQuery.fetch() instead of AbstractMongodbQuery.fetchResults() to execute MongoDB queries. fetchResults() executes both a find(…) and a count(…) query. Retrieving the record count is an expensive operation in MongoDB, and the count is not always required. For regular find(…) methods the count is ignored; for paging, the count(…) is only required in certain result/request scenarios.

Original Pull Request: #529
2018-02-14 13:58:10 +01:00
Christoph Strobl
7275a10c1e DATAMONGO-1871 - Fix AggregationExpression aliasing.
We now make sure to allow a nested property alias by setting the target.

Original pull request: #533.
2018-02-14 11:04:36 +01:00
Christoph Strobl
61b5977ada DATAMONGO-1794 - Updated changelog. 2018-02-06 11:13:59 +01:00
Mark Paluch
9ff3888a21 DATAMONGO-1830 - Updated changelog. 2018-01-24 13:41:25 +01:00
Mark Paluch
72f461ae30 DATAMONGO-1858 - Fix line endings to LF. 2018-01-24 12:57:54 +01:00
Mark Paluch
2af322a823 DATAMONGO-1829 - After release cleanups. 2018-01-24 12:16:41 +01:00
Mark Paluch
3694761f92 DATAMONGO-1829 - Prepare next development iteration. 2018-01-24 12:16:41 +01:00
Mark Paluch
c3b0e81db2 DATAMONGO-1829 - Release version 1.10.10 (Ingalls SR10). 2018-01-24 12:16:41 +01:00
Mark Paluch
27e68de5aa DATAMONGO-1829 - Prepare 1.10.10 (Ingalls SR10). 2018-01-24 12:16:41 +01:00
Mark Paluch
04ace43046 DATAMONGO-1829 - Updated changelog. 2018-01-24 12:16:32 +01:00
Mark Paluch
65996f3ad8 DATAMONGO-1843 - Polishing.
Typo fixes.

Original pull request: #526.
2018-01-23 10:46:37 +01:00
Christoph Strobl
237eace8e4 DATAMONGO-1843 - Fix parameter shadowing in ArrayOperators reduce.
Original pull request: #526.
2018-01-23 10:42:32 +01:00
Mark Paluch
0b31621d35 DATAMONGO-1824 - Switch to TravisCI-provided MongoDB service.
Original pull request: #521.
2017-12-15 14:30:53 +01:00
Mark Paluch
3f009053fe DATAMONGO-1824 - Polishing.
Move method from AggregationCommandPreparer and AggregationResultPostProcessor to BatchAggregationLoader. Extract field names to constants. Tiny renames to variables. Add unit test for aggregation response without cursor use.

Original pull request: #521.
2017-12-15 14:30:50 +01:00
Christoph Strobl
a2d470cd23 DATAMONGO-1824 - Skip tests no longer applicable for MongoDB 3.6.
Original pull request: #521.
2017-12-15 14:26:34 +01:00
Christoph Strobl
9525cdedfc DATAMONGO-1824 - Fix aggregation execution for MongoDB 3.6.
We now send aggregation commands along with a cursor batch size for compatibility with MongoDB 3.6, which no longer supports aggregations without a cursor. We consume the whole cursor before returning and converting results, which also lifts the 16MB aggregation result limit. For MongoDB versions that do not support aggregation cursors, we return results directly.

Original pull request: #521.
2017-12-15 14:26:12 +01:00
Christoph Strobl
867fcf6df2 DATAMONGO-1831 - Fix array type conversion for empty source.
We now make sure that we convert empty sources to the corresponding target type. This prevents entity instantiation from failing due to incorrect argument types when invoking the constructor.

Original pull request: #520.
2017-12-02 12:10:43 -08:00
Mark Paluch
3ddac744ce DATAMONGO-1816 - Updated changelog. 2017-11-27 16:43:42 +01:00
Mark Paluch
e565d25084 DATAMONGO-1799 - After release cleanups. 2017-11-27 15:56:39 +01:00
Mark Paluch
27623fce01 DATAMONGO-1799 - Prepare next development iteration. 2017-11-27 15:56:37 +01:00
Mark Paluch
c9d471e5d5 DATAMONGO-1799 - Release version 1.10.9 (Ingalls SR9). 2017-11-27 15:17:06 +01:00
Mark Paluch
6641277aaa DATAMONGO-1799 - Prepare 1.10.9 (Ingalls SR9). 2017-11-27 15:16:09 +01:00
Mark Paluch
29eba6e427 DATAMONGO-1799 - Updated changelog. 2017-11-27 15:16:04 +01:00
Oliver Gierke
d10e4afefd DATAMONGO-1737 - BasicMongoPersistentEntity now correctly initializes comparator.
In BasicMongoPersistentEntity.verify() we now call the super method to make sure the comparators that honor @Field's order value are initialized properly.
2017-11-17 14:59:03 +01:00
Oliver Gierke
14bb4b586f DATAMONGO-1793 - Updated changelog. 2017-10-27 16:36:48 +02:00
Christoph Strobl
d795836994 DATAMONGO-1809 - Fix positional parameter detection for PropertyPaths.
We now make sure to capture all digits for positional parameters.

Original pull request: #508.
2017-10-24 14:53:33 +02:00
Mark Paluch
d216fed8db DATAMONGO-1696 - Mention appropriate EnableMongoAuditing annotation in reference documentation. 2017-10-20 08:45:47 +02:00
Oliver Gierke
26b7267737 DATAMONGO-1775 - After release cleanups. 2017-10-11 19:00:15 +02:00
Oliver Gierke
69d749d028 DATAMONGO-1775 - Prepare next development iteration. 2017-10-11 19:00:10 +02:00
621 changed files with 12331 additions and 6138 deletions

BIN
.mvn/wrapper/maven-wrapper.jar vendored Executable file

Binary file not shown.

1
.mvn/wrapper/maven-wrapper.properties vendored Executable file

@@ -0,0 +1 @@
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.5.4/apache-maven-3.5.4-bin.zip

.travis.yml

@@ -6,6 +6,9 @@ jdk:
before_script:
- mongod --version
services:
- mongodb
env:
matrix:
- PROFILE=ci
@@ -23,9 +26,7 @@ env:
addons:
apt:
sources:
- mongodb-upstart
- sourceline: 'deb [arch=amd64] http://repo.mongodb.org/apt/ubuntu precise/mongodb-org/3.4 multiverse'
key_url: 'https://www.mongodb.org/static/pgp/server-3.4.asc'
- mongodb-3.4-precise
packages:
- mongodb-org-server
- mongodb-org-shell
@@ -37,6 +38,8 @@ cache:
directories:
- $HOME/.m2
install: true
install:
- |-
mongo admin --eval "db.adminCommand({setFeatureCompatibilityVersion: '3.4'});"
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

43
CI.adoc Normal file

@@ -0,0 +1,43 @@
= Continuous Integration
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Moore%20(master)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F2.1.x&subject=Lovelace%20(2.1.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F1.10.x&subject=Ingalls%20(1.10.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
== Running CI tasks locally
Since this pipeline is purely Docker-based, it's easy to:
* Debug what went wrong on your local machine.
* Test out a tweak to your test routine before sending it out.
* Experiment against a new image before submitting your pull request.
All of these use cases are great reasons to essentially run what the CI server does on your local machine.
IMPORTANT: To do this you must have Docker installed on your machine.
1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk8-with-mongodb-4.0:latest /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
2. `cd spring-data-mongodb-github`
+
Next, run the tests from inside the container:
+
3. `./mvnw clean dependency:list test -Dsort -Dbundlor.enabled=false -B` (or with whatever profile you need to test out)
Since the container is binding to your source, you can make edits from your IDE and continue to run build jobs.
If you need to package things up, do this:
1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk8-with-mongodb-4.0:latest /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
2. `cd spring-data-mongodb-github`
+
Next, package things from inside the container doing this:
+
3. `./mvnw clean dependency:list package -Dsort -Dbundlor.enabled=false -B`
NOTE: Docker containers can eat up disk space fast! From time to time, run `docker system prune` to clean out old images.

123
Jenkinsfile vendored Normal file

@@ -0,0 +1,123 @@
pipeline {
agent none
triggers {
pollSCM 'H/10 * * * *'
upstream(upstreamProjects: "spring-data-commons/1.13.x", threshold: hudson.model.Result.SUCCESS)
}
options {
disableConcurrentBuilds()
buildDiscarder(logRotator(numToKeepStr: '14'))
}
stages {
stage("Test") {
when {
anyOf {
branch '1.10.x'
not { triggeredBy 'UpstreamCause' }
}
}
parallel {
stage("test: baseline") {
agent {
docker {
image 'springci/spring-data-openjdk8-with-mongodb-4.0:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 30, unit: 'MINUTES') }
steps {
sh 'rm -rf ?'
sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
sh 'mongod --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
sh 'sleep 10'
sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
sh 'sleep 15'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Dsort -B'
}
}
}
}
stage('Release to artifactory') {
when {
branch 'issue/*'
not { triggeredBy 'UpstreamCause' }
}
agent {
docker {
image 'adoptopenjdk/openjdk8:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 20, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
sh 'rm -rf ?'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.staging-repository=libs-snapshot-local " +
"-Dartifactory.build-name=spring-data-mongodb-1.10 " +
"-Dartifactory.build-number=${BUILD_NUMBER} " +
'-Dmaven.test.skip=true clean deploy -B'
}
}
stage('Release to artifactory with docs') {
when {
branch '1.10.x'
}
agent {
docker {
image 'adoptopenjdk/openjdk8:latest'
label 'data'
args '-v $HOME:/tmp/jenkins-home'
}
}
options { timeout(time: 20, unit: 'MINUTES') }
environment {
ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
}
steps {
sh 'rm -rf ?'
sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory ' +
'-Dartifactory.server=https://repo.spring.io ' +
"-Dartifactory.username=${ARTIFACTORY_USR} " +
"-Dartifactory.password=${ARTIFACTORY_PSW} " +
"-Dartifactory.staging-repository=libs-snapshot-local " +
"-Dartifactory.build-name=spring-data-mongodb-1.10 " +
"-Dartifactory.build-number=${BUILD_NUMBER} " +
'-Dmaven.test.skip=true clean deploy -B'
}
}
}
post {
changed {
script {
slackSend(
color: (currentBuild.currentResult == 'SUCCESS') ? 'good' : 'danger',
channel: '#spring-data-dev',
message: "${currentBuild.fullDisplayName} - `${currentBuild.currentResult}`\n${env.BUILD_URL}")
emailext(
subject: "[${currentBuild.fullDisplayName}] ${currentBuild.currentResult}",
mimeType: 'text/html',
recipientProviders: [[$class: 'CulpritsRecipientProvider'], [$class: 'RequesterRecipientProvider']],
body: "<a href=\"${env.BUILD_URL}\">${currentBuild.fullDisplayName} is reported as ${currentBuild.currentResult}</a>")
}
}
}
}

159
README.adoc Normal file

@@ -0,0 +1,159 @@
image:https://spring.io/badges/spring-data-mongodb/ga.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/snapshot.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start]
= Spring Data MongoDB image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Build[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] https://gitter.im/spring-projects/spring-data[image:https://badges.gitter.im/spring-projects/spring-data.svg[Gitter]]
The primary goal of the https://projects.spring.io/spring-data[Spring Data] project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities.
The Spring Data MongoDB project provides integration with the MongoDB document database.
Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB `+Document+` and easily writing a repository style data access layer.
== Code of Conduct
This project is governed by the link:CODE_OF_CONDUCT.adoc[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
== Getting Started
Here is a quick teaser of an application using Spring Data Repositories in Java:
[source,java]
----
public interface PersonRepository extends CrudRepository<Person, Long> {
List<Person> findByLastname(String lastname);
List<Person> findByFirstnameLike(String firstname);
}
@Service
public class MyService {
private final PersonRepository repository;
public MyService(PersonRepository repository) {
this.repository = repository;
}
public void doWork() {
repository.deleteAll();
Person person = new Person();
person.setFirstname("Oliver");
person.setLastname("Gierke");
repository.save(person);
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
}
}
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {
@Override
public MongoClient mongoClient() {
return new MongoClient();
}
@Override
protected String getDatabaseName() {
return "springdata";
}
}
----
=== Maven configuration
Add the Maven dependency:
[source,xml]
----
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.RELEASE</version>
</dependency>
----
If you would rather use the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
[source,xml]
----
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.BUILD-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>https://repo.spring.io/libs-snapshot</url>
</repository>
----
== Getting Help
Having trouble with Spring Data? We'd love to help!
* Check the
https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/[reference documentation], and https://docs.spring.io/spring-data/mongodb/docs/current/api/[Javadocs].
* Learn the Spring basics - Spring Data builds on Spring Framework. Check the https://spring.io[spring.io] web-site for a wealth of reference documentation.
If you are just starting out with Spring, try one of the https://spring.io/guides[guides].
* If you are upgrading, check out the https://docs.spring.io/spring-data/mongodb/docs/current/changelog.txt[changelog] for "`new and noteworthy`" features.
* Ask a question - we monitor https://stackoverflow.com[stackoverflow.com] for questions tagged with https://stackoverflow.com/tags/spring-data[`spring-data-mongodb`].
You can also chat with the community on https://gitter.im/spring-projects/spring-data[Gitter].
* Report bugs with Spring Data MongoDB at https://jira.spring.io/browse/DATAMONGO[jira.spring.io/browse/DATAMONGO].
== Reporting Issues
Spring Data uses JIRA as its issue tracking system to record bugs and feature requests. If you want to raise an issue, please follow the recommendations below:
* Before you log a bug, please search the
https://jira.spring.io/browse/DATAMONGO[issue tracker] to see if someone has already reported the problem.
* If the issue doesn't already exist, https://jira.spring.io/browse/DATAMONGO[create a new issue].
* Please provide as much information as possible with the issue report; we'd like to know the version of Spring Data that you are using and the JVM version.
* If you need to paste code or include a stack trace, use JIRA `{code}…{code}` escapes before and after your text.
* If possible, try to create a test case or project that replicates the issue. Attach a link to your code or a compressed file containing your code.
== Building from Source
You don't need to build from source to use Spring Data (binaries are available in https://repo.spring.io[repo.spring.io]), but if you want to try out the latest and greatest, Spring Data can be easily built with the https://github.com/takari/maven-wrapper[maven wrapper].
You also need JDK 1.8.
[source,bash]
----
$ ./mvnw clean install
----
If you want to build with the regular `mvn` command, you will need https://maven.apache.org/run-maven/index.html[Maven v3.5.0 or above].
_Also see link:CONTRIBUTING.adoc[CONTRIBUTING.adoc] if you wish to submit pull requests, and in particular please sign the https://cla.pivotal.io/sign/spring[Contributor's Agreement] before your first non-trivial change._
=== Building reference documentation
Building the documentation also builds the project without running the tests.
[source,bash]
----
$ ./mvnw clean install -Pdistribute
----
The generated documentation is available from `target/site/reference/html/index.html`.
== Guides
The https://spring.io/[spring.io] site contains several guides that show how to use Spring Data step-by-step:
* https://spring.io/guides/gs/accessing-data-mongodb/[Accessing Data with MongoDB] is a very basic guide that shows you how to create a simple application and how to access data using repositories.
* https://spring.io/guides/gs/accessing-mongodb-data-rest/[Accessing MongoDB Data with REST] is a guide to creating a REST web service exposing data stored in MongoDB through repositories.
== Examples
* https://github.com/spring-projects/spring-data-examples/[Spring Data Examples] contains example projects that explain specific features in more detail.
== License
Spring Data MongoDB is Open Source software released under the https://www.apache.org/licenses/LICENSE-2.0.html[Apache 2.0 license].

150
README.md

@@ -1,150 +0,0 @@
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/ga.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/snapshot.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
# Spring Data MongoDB
The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a repository style data access layer.
## Getting Help
For a comprehensive treatment of all the Spring Data MongoDB features, please refer to:
* the [User Guide](http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/)
* the [JavaDocs](http://docs.spring.io/spring-data/mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://projects.spring.io/spring-data-mongodb) contains links to articles and other resources.
* for more detailed questions, use [Spring Data MongoDB on Stack Overflow](http://stackoverflow.com/questions/tagged/spring-data-mongodb).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://projects.spring.io/).
## Quick Start
### Maven configuration
Add the Maven dependency:
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.RELEASE</version>
</dependency>
```
If you would rather use the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.BUILD-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>http://repo.spring.io/libs-snapshot</url>
</repository>
```
### MongoTemplate
MongoTemplate is the central support class for Mongo database operations. It provides:
* Basic POJO mapping support to and from BSON
* Convenience methods to interact with the store (insert object, update objects) and MongoDB specific ones (geo-spatial operations, upserts, map-reduce etc.)
* Connection affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://docs.spring.io/spring/docs/current/spring-framework-reference/html/dao.html#dao-exceptions).
### Spring Data repositories
To simplify the creation of data repositories, Spring Data MongoDB provides a generic repository programming model. It will automatically create a repository proxy for you that adds implementations of finder methods you specify on an interface.
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and by first names matching a like expression is shown below:
```java
public interface PersonRepository extends CrudRepository<Person, Long> {
List<Person> findByLastname(String lastname);
List<Person> findByFirstnameLike(String firstname);
}
```
The queries issued on execution will be derived from the method name. Extending `CrudRepository` causes CRUD methods to be pulled into the interface so that you can easily save and find single entities and collections of them.
You can have Spring automatically create a proxy for the interface by using the following JavaConfig:
```java
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {
@Override
public Mongo mongo() throws Exception {
return new MongoClient();
}
@Override
protected String getDatabaseName() {
return "springdata";
}
}
```
This sets up a connection to a local MongoDB instance and enables the detection of Spring Data repositories (through `@EnableMongoRepositories`). The same configuration would look like this in XML:
```xml
<bean id="template" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg>
<bean class="com.mongodb.MongoClient">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
</constructor-arg>
<constructor-arg value="database" />
</bean>
<mongo:repositories base-package="com.acme.repository" />
```
This will find the repository interface and register a proxy object in the container. You can use it as shown below:
```java
@Service
public class MyService {
private final PersonRepository repository;
@Autowired
public MyService(PersonRepository repository) {
this.repository = repository;
}
public void doWork() {
repository.deleteAll();
Person person = new Person();
person.setFirstname("Oliver");
person.setLastname("Gierke");
person = repository.save(person);
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
}
}
```
## Contributing to Spring Data
Here are some ways for you to get involved in the community:
* Get involved with the Spring community on Stackoverflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Create [JIRA](https://jira.spring.io/browse/DATAMONGO) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.
Before we accept a non-trivial patch or pull request we will need you to [sign the Contributor License Agreement](https://cla.pivotal.io/sign/spring). Signing the contributors agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. If you forget to do so, you'll be reminded when you submit a pull request. Active contributors might be asked to join the core team, and given the ability to merge pull requests.


@@ -0,0 +1,14 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list
RUN apt-get update
RUN apt-get install -y mongodb-org=4.0.9 mongodb-org-server=4.0.9 mongodb-org-shell=4.0.9 mongodb-org-mongos=4.0.9 mongodb-org-tools=4.0.9
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*


@@ -0,0 +1,14 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 4B7C549A058F8B6B
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.1 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.1.list
RUN apt-get update
RUN apt-get install -y mongodb-org-unstable=4.1.13 mongodb-org-unstable-server=4.1.13 mongodb-org-unstable-shell=4.1.13 mongodb-org-unstable-mongos=4.1.13 mongodb-org-unstable-tools=4.1.13
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*

286
mvnw vendored Executable file

@@ -0,0 +1,286 @@
#!/bin/sh
# ----------------------------------------------------------------------------
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# ----------------------------------------------------------------------------
# ----------------------------------------------------------------------------
# Maven2 Start Up Batch script
#
# Required ENV vars:
# ------------------
# JAVA_HOME - location of a JDK home dir
#
# Optional ENV vars
# -----------------
# M2_HOME - location of maven2's installed home dir
# MAVEN_OPTS - parameters passed to the Java VM when running Maven
# e.g. to debug Maven itself, use
# set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000
# MAVEN_SKIP_RC - flag to disable loading of mavenrc files
# ----------------------------------------------------------------------------
if [ -z "$MAVEN_SKIP_RC" ] ; then
if [ -f /etc/mavenrc ] ; then
. /etc/mavenrc
fi
if [ -f "$HOME/.mavenrc" ] ; then
. "$HOME/.mavenrc"
fi
fi
# OS specific support. $var _must_ be set to either true or false.
cygwin=false;
darwin=false;
mingw=false
case "`uname`" in
CYGWIN*) cygwin=true ;;
MINGW*) mingw=true;;
Darwin*) darwin=true
# Use /usr/libexec/java_home if available, otherwise fall back to /Library/Java/Home
# See https://developer.apple.com/library/mac/qa/qa1170/_index.html
if [ -z "$JAVA_HOME" ]; then
if [ -x "/usr/libexec/java_home" ]; then
export JAVA_HOME="`/usr/libexec/java_home`"
else
export JAVA_HOME="/Library/Java/Home"
fi
fi
;;
esac
if [ -z "$JAVA_HOME" ] ; then
if [ -r /etc/gentoo-release ] ; then
JAVA_HOME=`java-config --jre-home`
fi
fi
if [ -z "$M2_HOME" ] ; then
## resolve links - $0 may be a link to maven's home
PRG="$0"
# need this for relative symlinks
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG="`dirname "$PRG"`/$link"
fi
done
saveddir=`pwd`
M2_HOME=`dirname "$PRG"`/..
# make it fully qualified
M2_HOME=`cd "$M2_HOME" && pwd`
cd "$saveddir"
# echo Using m2 at $M2_HOME
fi
# For Cygwin, ensure paths are in UNIX format before anything is touched
if $cygwin ; then
[ -n "$M2_HOME" ] &&
M2_HOME=`cygpath --unix "$M2_HOME"`
[ -n "$JAVA_HOME" ] &&
JAVA_HOME=`cygpath --unix "$JAVA_HOME"`
[ -n "$CLASSPATH" ] &&
CLASSPATH=`cygpath --path --unix "$CLASSPATH"`
fi
# For Mingw, ensure paths are in UNIX format before anything is touched
if $mingw ; then
[ -n "$M2_HOME" ] &&
M2_HOME="`(cd "$M2_HOME"; pwd)`"
[ -n "$JAVA_HOME" ] &&
JAVA_HOME="`(cd "$JAVA_HOME"; pwd)`"
# TODO classpath?
fi
if [ -z "$JAVA_HOME" ]; then
javaExecutable="`which javac`"
if [ -n "$javaExecutable" ] && ! [ "`expr \"$javaExecutable\" : '\([^ ]*\)'`" = "no" ]; then
# readlink(1) is not available as standard on Solaris 10.
readLink=`which readlink`
if [ ! `expr "$readLink" : '\([^ ]*\)'` = "no" ]; then
if $darwin ; then
javaHome="`dirname \"$javaExecutable\"`"
javaExecutable="`cd \"$javaHome\" && pwd -P`/javac"
else
javaExecutable="`readlink -f \"$javaExecutable\"`"
fi
javaHome="`dirname \"$javaExecutable\"`"
javaHome=`expr "$javaHome" : '\(.*\)/bin'`
JAVA_HOME="$javaHome"
export JAVA_HOME
fi
fi
fi
if [ -z "$JAVACMD" ] ; then
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
else
JAVACMD="`which java`"
fi
fi
if [ ! -x "$JAVACMD" ] ; then
echo "Error: JAVA_HOME is not defined correctly." >&2
echo " We cannot execute $JAVACMD" >&2
exit 1
fi
if [ -z "$JAVA_HOME" ] ; then
echo "Warning: JAVA_HOME environment variable is not set."
fi
CLASSWORLDS_LAUNCHER=org.codehaus.plexus.classworlds.launcher.Launcher
# traverses directory structure from process work directory to filesystem root
# first directory with .mvn subdirectory is considered project base directory
find_maven_basedir() {
if [ -z "$1" ]
then
echo "Path not specified to find_maven_basedir"
return 1
fi
basedir="$1"
wdir="$1"
while [ "$wdir" != '/' ] ; do
if [ -d "$wdir"/.mvn ] ; then
basedir=$wdir
break
fi
# workaround for JBEAP-8937 (on Solaris 10/Sparc)
if [ -d "${wdir}" ]; then
wdir=`cd "$wdir/.."; pwd`
fi
# end of workaround
done
echo "${basedir}"
}
# concatenates all lines of a file
concat_lines() {
if [ -f "$1" ]; then
echo "$(tr -s '\n' ' ' < "$1")"
fi
}
BASE_DIR=`find_maven_basedir "$(pwd)"`
if [ -z "$BASE_DIR" ]; then
exit 1;
fi
##########################################################################################
# Extension to allow automatically downloading the maven-wrapper.jar from Maven-central
# This allows using the maven wrapper in projects that prohibit checking in binary data.
##########################################################################################
if [ -r "$BASE_DIR/.mvn/wrapper/maven-wrapper.jar" ]; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found .mvn/wrapper/maven-wrapper.jar"
fi
else
if [ "$MVNW_VERBOSE" = true ]; then
echo "Couldn't find .mvn/wrapper/maven-wrapper.jar, downloading it ..."
fi
jarUrl="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar"
while IFS="=" read key value; do
case "$key" in (wrapperUrl) jarUrl="$value"; break ;;
esac
done < "$BASE_DIR/.mvn/wrapper/maven-wrapper.properties"
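# The loop above lets .mvn/wrapper/maven-wrapper.properties override the default download
# location via a wrapperUrl entry, e.g. (illustrative value, not part of this change):
#   wrapperUrl=https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar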
if [ "$MVNW_VERBOSE" = true ]; then
echo "Downloading from: $jarUrl"
fi
wrapperJarPath="$BASE_DIR/.mvn/wrapper/maven-wrapper.jar"
if command -v wget > /dev/null; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found wget ... using wget"
fi
wget "$jarUrl" -O "$wrapperJarPath"
elif command -v curl > /dev/null; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found curl ... using curl"
fi
curl -o "$wrapperJarPath" "$jarUrl"
else
if [ "$MVNW_VERBOSE" = true ]; then
echo "Falling back to using Java to download"
fi
javaClass="$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.java"
if [ -e "$javaClass" ]; then
if [ ! -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then
if [ "$MVNW_VERBOSE" = true ]; then
echo " - Compiling MavenWrapperDownloader.java ..."
fi
# Compiling the Java class
("$JAVA_HOME/bin/javac" "$javaClass")
fi
if [ -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then
# Running the downloader
if [ "$MVNW_VERBOSE" = true ]; then
echo " - Running MavenWrapperDownloader.java ..."
fi
("$JAVA_HOME/bin/java" -cp .mvn/wrapper MavenWrapperDownloader "$MAVEN_PROJECTBASEDIR")
fi
fi
fi
fi
##########################################################################################
# End of extension
##########################################################################################
export MAVEN_PROJECTBASEDIR=${MAVEN_BASEDIR:-"$BASE_DIR"}
if [ "$MVNW_VERBOSE" = true ]; then
echo $MAVEN_PROJECTBASEDIR
fi
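# concat_lines folds .mvn/jvm.config (if present) into a single line that is prepended to
# MAVEN_OPTS below; a typical entry would be a JVM flag such as -Xmx1024m (illustrative).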
MAVEN_OPTS="$(concat_lines "$MAVEN_PROJECTBASEDIR/.mvn/jvm.config") $MAVEN_OPTS"
# For Cygwin, switch paths to Windows format before running java
if $cygwin; then
[ -n "$M2_HOME" ] &&
M2_HOME=`cygpath --path --windows "$M2_HOME"`
[ -n "$JAVA_HOME" ] &&
JAVA_HOME=`cygpath --path --windows "$JAVA_HOME"`
[ -n "$CLASSPATH" ] &&
CLASSPATH=`cygpath --path --windows "$CLASSPATH"`
[ -n "$MAVEN_PROJECTBASEDIR" ] &&
MAVEN_PROJECTBASEDIR=`cygpath --path --windows "$MAVEN_PROJECTBASEDIR"`
fi
WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain
exec "$JAVACMD" \
$MAVEN_OPTS \
-classpath "$MAVEN_PROJECTBASEDIR/.mvn/wrapper/maven-wrapper.jar" \
"-Dmaven.home=${M2_HOME}" "-Dmaven.multiModuleProjectDirectory=${MAVEN_PROJECTBASEDIR}" \
${WRAPPER_LAUNCHER} $MAVEN_CONFIG "$@"

mvnw.cmd (vendored, executable file, 161 lines added)

@@ -0,0 +1,161 @@
@REM ----------------------------------------------------------------------------
@REM Licensed to the Apache Software Foundation (ASF) under one
@REM or more contributor license agreements. See the NOTICE file
@REM distributed with this work for additional information
@REM regarding copyright ownership. The ASF licenses this file
@REM to you under the Apache License, Version 2.0 (the
@REM "License"); you may not use this file except in compliance
@REM with the License. You may obtain a copy of the License at
@REM
@REM https://www.apache.org/licenses/LICENSE-2.0
@REM
@REM Unless required by applicable law or agreed to in writing,
@REM software distributed under the License is distributed on an
@REM "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@REM KIND, either express or implied. See the License for the
@REM specific language governing permissions and limitations
@REM under the License.
@REM ----------------------------------------------------------------------------
@REM ----------------------------------------------------------------------------
@REM Maven2 Start Up Batch script
@REM
@REM Required ENV vars:
@REM JAVA_HOME - location of a JDK home dir
@REM
@REM Optional ENV vars
@REM M2_HOME - location of maven2's installed home dir
@REM MAVEN_BATCH_ECHO - set to 'on' to enable the echoing of the batch commands
@REM MAVEN_BATCH_PAUSE - set to 'on' to wait for a key stroke before ending
@REM MAVEN_OPTS - parameters passed to the Java VM when running Maven
@REM e.g. to debug Maven itself, use
@REM set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000
@REM MAVEN_SKIP_RC - flag to disable loading of mavenrc files
@REM ----------------------------------------------------------------------------
@REM Begin all REM lines with '@' in case MAVEN_BATCH_ECHO is 'on'
@echo off
@REM set title of command window
title %0
@REM enable echoing by setting MAVEN_BATCH_ECHO to 'on'
@if "%MAVEN_BATCH_ECHO%" == "on" echo %MAVEN_BATCH_ECHO%
@REM set %HOME% to equivalent of $HOME
if "%HOME%" == "" (set "HOME=%HOMEDRIVE%%HOMEPATH%")
@REM Execute a user defined script before this one
if not "%MAVEN_SKIP_RC%" == "" goto skipRcPre
@REM check for pre script, once with legacy .bat ending and once with .cmd ending
if exist "%HOME%\mavenrc_pre.bat" call "%HOME%\mavenrc_pre.bat"
if exist "%HOME%\mavenrc_pre.cmd" call "%HOME%\mavenrc_pre.cmd"
:skipRcPre
@setlocal
set ERROR_CODE=0
@REM To isolate internal variables from possible post scripts, we use another setlocal
@setlocal
@REM ==== START VALIDATION ====
if not "%JAVA_HOME%" == "" goto OkJHome
echo.
echo Error: JAVA_HOME not found in your environment. >&2
echo Please set the JAVA_HOME variable in your environment to match the >&2
echo location of your Java installation. >&2
echo.
goto error
:OkJHome
if exist "%JAVA_HOME%\bin\java.exe" goto init
echo.
echo Error: JAVA_HOME is set to an invalid directory. >&2
echo JAVA_HOME = "%JAVA_HOME%" >&2
echo Please set the JAVA_HOME variable in your environment to match the >&2
echo location of your Java installation. >&2
echo.
goto error
@REM ==== END VALIDATION ====
:init
@REM Find the project base dir, i.e. the directory that contains the folder ".mvn".
@REM Fallback to current working directory if not found.
set MAVEN_PROJECTBASEDIR=%MAVEN_BASEDIR%
IF NOT "%MAVEN_PROJECTBASEDIR%"=="" goto endDetectBaseDir
set EXEC_DIR=%CD%
set WDIR=%EXEC_DIR%
:findBaseDir
IF EXIST "%WDIR%"\.mvn goto baseDirFound
cd ..
IF "%WDIR%"=="%CD%" goto baseDirNotFound
set WDIR=%CD%
goto findBaseDir
:baseDirFound
set MAVEN_PROJECTBASEDIR=%WDIR%
cd "%EXEC_DIR%"
goto endDetectBaseDir
:baseDirNotFound
set MAVEN_PROJECTBASEDIR=%EXEC_DIR%
cd "%EXEC_DIR%"
:endDetectBaseDir
IF NOT EXIST "%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config" goto endReadAdditionalConfig
@setlocal EnableExtensions EnableDelayedExpansion
for /F "usebackq delims=" %%a in ("%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config") do set JVM_CONFIG_MAVEN_PROPS=!JVM_CONFIG_MAVEN_PROPS! %%a
@endlocal & set JVM_CONFIG_MAVEN_PROPS=%JVM_CONFIG_MAVEN_PROPS%
:endReadAdditionalConfig
SET MAVEN_JAVA_EXE="%JAVA_HOME%\bin\java.exe"
set WRAPPER_JAR="%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.jar"
set WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain
set DOWNLOAD_URL="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar"
FOR /F "tokens=1,2 delims==" %%A IN (%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.properties) DO (
IF "%%A"=="wrapperUrl" SET DOWNLOAD_URL=%%B
)
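@REM A wrapperUrl entry in %MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.properties, if present,
@REM overrides the default DOWNLOAD_URL set above.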
@REM Extension to allow automatically downloading the maven-wrapper.jar from Maven-central
@REM This allows using the maven wrapper in projects that prohibit checking in binary data.
if exist %WRAPPER_JAR% (
echo Found %WRAPPER_JAR%
) else (
echo Couldn't find %WRAPPER_JAR%, downloading it ...
echo Downloading from: %DOWNLOAD_URL%
powershell -Command "(New-Object Net.WebClient).DownloadFile('%DOWNLOAD_URL%', '%WRAPPER_JAR%')"
echo Finished downloading %WRAPPER_JAR%
)
@REM End of extension
%MAVEN_JAVA_EXE% %JVM_CONFIG_MAVEN_PROPS% %MAVEN_OPTS% %MAVEN_DEBUG_OPTS% -classpath %WRAPPER_JAR% "-Dmaven.multiModuleProjectDirectory=%MAVEN_PROJECTBASEDIR%" %WRAPPER_LAUNCHER% %MAVEN_CONFIG% %*
if ERRORLEVEL 1 goto error
goto end
:error
set ERROR_CODE=1
:end
@endlocal & set ERROR_CODE=%ERROR_CODE%
if not "%MAVEN_SKIP_RC%" == "" goto skipRcPost
@REM check for post script, once with legacy .bat ending and once with .cmd ending
if exist "%HOME%\mavenrc_post.bat" call "%HOME%\mavenrc_post.bat"
if exist "%HOME%\mavenrc_post.cmd" call "%HOME%\mavenrc_post.cmd"
:skipRcPost
@REM pause the script if MAVEN_BATCH_PAUSE is set to 'on'
if "%MAVEN_BATCH_PAUSE%" == "on" pause
if "%MAVEN_TERMINATE_CMD%" == "on" exit %ERROR_CODE%
exit /B %ERROR_CODE%

pom.xml (35 lines changed)

@@ -1,21 +1,21 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.8.RELEASE</version>
<version>1.10.23.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>http://projects.spring.io/spring-data-mongodb</url>
<url>https://projects.spring.io/spring-data-mongodb</url>
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.9.8.RELEASE</version>
<version>1.9.23.RELEASE</version>
</parent>
<modules>
@@ -28,7 +28,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.13.8.RELEASE</springdata.commons>
<springdata.commons>1.13.23.RELEASE</springdata.commons>
<mongo>2.14.3</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
<jmh.version>1.19</jmh.version>
@@ -40,7 +40,7 @@
<name>Oliver Gierke</name>
<email>ogierke at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Project Lead</role>
</roles>
@@ -51,7 +51,7 @@
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -62,7 +62,7 @@
<name>Mark Pollack</name>
<email>mpollack at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -73,7 +73,7 @@
<name>Jon Brisbin</name>
<email>jbrisbin at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -84,7 +84,7 @@
<name>Thomas Darimont</name>
<email>tdarimont at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -95,7 +95,7 @@
<name>Christoph Strobl</name>
<email>cstrobl at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -204,19 +204,6 @@
</profile>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.jfrog.buildinfo</groupId>
<artifactId>artifactory-maven-plugin</artifactId>
<inherited>false</inherited>
</plugin>
</plugins>
</build>
</profile>
<profile>
<id>benchmarks</id>
<modules>

View File

@@ -1,13 +1,13 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.8.RELEASE</version>
<version>1.10.23.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,12 +1,12 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.8.RELEASE</version>
<version>1.10.23.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -16,6 +16,7 @@
<properties>
<jpa>2.0.0</jpa>
<hibernate>3.6.10.Final</hibernate>
<project.root>${basedir}/..</project.root>
</properties>
<dependencies>
@@ -48,7 +49,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.10.8.RELEASE</version>
<version>1.10.23.RELEASE</version>
</dependency>
<dependency>

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,197 +1,197 @@
/*
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
*/
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
*/
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final DBObject dbk = new BasicDBObject();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for {}", dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: {}", key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: {}", key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
return entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: {}", cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final DBObject dbQuery = new BasicDBObject();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
DBObject dbId = mongoTemplate.execute(collName, new CollectionCallback<DBObject>() {
public DBObject doInCollection(DBCollection collection) throws MongoException, DataAccessException {
return collection.findOne(dbQuery);
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: {}", dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.remove(dbQuery);
return null;
}
});
} else {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: {}", dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.save(dbDoc);
return null;
}
});
}
}
}
return 0L;
}
/**
* Returns the collection the given entity type shall be persisted to.
*
* @param entityClass must not be {@literal null}.
* @return
*/
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return mongoTemplate.getCollectionName(entityClass);
}
}
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
*/
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
*/
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final DBObject dbk = new BasicDBObject();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for {}", dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: {}", key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: {}", key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
return entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: {}", cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final DBObject dbQuery = new BasicDBObject();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
DBObject dbId = mongoTemplate.execute(collName, new CollectionCallback<DBObject>() {
public DBObject doInCollection(DBCollection collection) throws MongoException, DataAccessException {
return collection.findOne(dbQuery);
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: {}", dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.remove(dbQuery);
return null;
}
});
} else {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: {}", dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.save(dbDoc);
return null;
}
});
}
}
}
return 0L;
}
/**
* Returns the collection the given entity type shall be persisted to.
*
* @param entityClass must not be {@literal null}.
* @return
*/
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return mongoTemplate.getCollectionName(entityClass);
}
}
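For orientation (not part of this changeset), a minimal sketch of how MongoChangeSetPersister might be wired up using the two setters shown above; the configuration class, bean method name, and injected MongoTemplate/EntityManagerFactory beans are assumptions for illustration only.
import javax.persistence.EntityManagerFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.MongoChangeSetPersister;

@Configuration
class CrossStoreConfig {

	@Bean
	MongoChangeSetPersister mongoChangeSetPersister(MongoTemplate mongoTemplate, EntityManagerFactory emf) {
		// Uses the JavaBean-style setters exposed by the class above.
		MongoChangeSetPersister persister = new MongoChangeSetPersister();
		persister.setMongoTemplate(mongoTemplate);
		persister.setEntityManagerFactory(emf);
		return persister;
	}
}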

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.8.RELEASE</version>
<version>1.10.23.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -28,10 +28,6 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>

View File

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.8.RELEASE</version>
<version>1.10.23.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -14,6 +14,7 @@
<properties>
<log4j>1.2.16</log4j>
<project.root>${basedir}/..</project.root>
</properties>
<dependencies>

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2016 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2016 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2017 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -11,13 +11,14 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.8.RELEASE</version>
<version>1.10.23.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
<project.root>${basedir}/..</project.root>
</properties>
<dependencies>
@@ -147,7 +148,6 @@
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson}</version>
<optional>true</optional>
</dependency>

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -23,7 +23,7 @@ import com.mongodb.DB;
/**
* Interface for factories creating {@link DB} instances.
*
*
* @author Mark Pollack
* @author Thomas Darimont
*/
@@ -31,7 +31,7 @@ public interface MongoDbFactory {
/**
* Creates a default {@link DB} instance.
*
*
* @return
* @throws DataAccessException
*/
@@ -39,7 +39,7 @@ public interface MongoDbFactory {
/**
* Creates a {@link DB} instance to access the database with the given name.
*
*
* @param dbName must not be {@literal null} or empty.
* @return
* @throws DataAccessException
@@ -48,7 +48,7 @@ public interface MongoDbFactory {
/**
* Exposes a shared {@link MongoExceptionTranslator}.
*
*
* @return will never be {@literal null}.
*/
PersistenceExceptionTranslator getExceptionTranslator();

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,297 +1,297 @@
/*
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
* @author Christoph Strobl
*/
@Configuration
public abstract class AbstractMongoConfiguration {
/**
* Return the name of the database to connect to.
*
* @return must not be {@literal null}.
*/
protected abstract String getDatabaseName();
/**
* Return the name of the authentication database to use. Defaults to {@literal null} and will turn into the value
* returned by {@link #getDatabaseName()} later on effectively.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected String getAuthenticationDatabaseName() {
return null;
}
/**
* Return the {@link Mongo} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link Mongo} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
* @throws Exception
*/
public abstract Mongo mongo() throws Exception;
/**
* Creates a {@link MongoTemplate}.
*
* @return
* @throws Exception
*/
@Bean
public MongoTemplate mongoTemplate() throws Exception {
return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
/**
* Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link Mongo} instance
* configured in {@link #mongo()}.
*
* @see #mongo()
* @see #mongoTemplate()
* @return
* @throws Exception
*/
@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), getUserCredentials(), getAuthenticationDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration} the base package will be considered {@code com.acme} unless the method is
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Returns the base packages to scan for MongoDB mapped entities at startup. Will return the package name of the
* configuration class' (the concrete class, not this one here) by default. So if you have a
* {@code com.acme.AppConfig} extending {@link AbstractMongoConfiguration} the base package will be considered
* {@code com.acme} unless the method is overridden to implement alternate behavior.
*
* @return the base packages to scan for mapped {@link Document} classes or an empty collection to not enable scanning
* for entities.
* @since 1.10
*/
protected Collection<String> getMappingBasePackages() {
return Collections.singleton(getMappingBasePackage());
}
/**
* Return {@link UserCredentials} to be used when connecting to the MongoDB instance or {@literal null} if none shall
* be used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected UserCredentials getUserCredentials() {
return null;
}
/**
* Creates a {@link MongoMappingContext} equipped with entity classes scanned from the mapping base package.
*
* @see #getMappingBasePackage()
* @return
* @throws ClassNotFoundException
*/
@Bean
public MongoMappingContext mongoMappingContext() throws ClassNotFoundException {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
return mappingContext;
}
/**
* Returns a {@link MappingContextIsNewStrategyFactory} wrapped into a {@link CachingIsNewStrategyFactory}.
*
* @return
* @throws ClassNotFoundException
*/
@Bean
public IsNewStrategyFactory isNewStrategyFactory() throws ClassNotFoundException {
return new CachingIsNewStrategyFactory(new MappingContextIsNewStrategyFactory(mongoMappingContext()));
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and
* {@link #mongoMappingContext()}. Returns an empty {@link CustomConversions} instance by default.
*
* @return must not be {@literal null}.
*/
@Bean
public CustomConversions customConversions() {
return new CustomConversions(Collections.emptyList());
}
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext()
* @see #mongoDbFactory()
* @return
* @throws Exception
*/
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
return converter;
}
/**
* Scans the mapping base package for classes annotated with {@link Document}. By default, it scans for entities in
* all packages returned by {@link #getMappingBasePackages()}.
*
* @see #getMappingBasePackages()
* @return
* @throws ClassNotFoundException
*/
protected Set<Class<?>> getInitialEntitySet() throws ClassNotFoundException {
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
for (String basePackage : getMappingBasePackages()) {
initialEntitySet.addAll(scanForEntities(basePackage));
}
return initialEntitySet;
}
/**
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document} and
* {@link Persistent}.
*
* @param basePackage must not be {@literal null}.
* @return
* @throws ClassNotFoundException
* @since 1.10
*/
protected Set<Class<?>> scanForEntities(String basePackage) throws ClassNotFoundException {
if (!StringUtils.hasText(basePackage)) {
return Collections.emptySet();
}
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
if (StringUtils.hasText(basePackage)) {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
initialEntitySet
.add(ClassUtils.forName(candidate.getBeanClassName(), AbstractMongoConfiguration.class.getClassLoader()));
}
}
return initialEntitySet;
}
/**
* Configures whether to abbreviate field names for domain objects by configuring a
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
* customization needs, consider overriding {@link #mappingMongoConverter()}.
*
* @return
*/
protected boolean abbreviateFieldNames() {
return false;
}
/**
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
*
* @return
* @since 1.5
*/
protected FieldNamingStrategy fieldNamingStrategy() {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
}
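As an aside (not part of this diff), the com.acme.AppConfig scenario described in the Javadoc above could look like the following minimal sketch; the database name and connection host/port are illustrative assumptions.
package com.acme;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

import com.mongodb.Mongo;
import com.mongodb.MongoClient;

@Configuration
public class AppConfig extends AbstractMongoConfiguration {

	@Override
	protected String getDatabaseName() {
		return "acme"; // illustrative database name
	}

	@Override
	public Mongo mongo() throws Exception {
		// Entities annotated with @Document under com.acme are scanned by default,
		// because getMappingBasePackages() derives the package from this class.
		return new MongoClient("localhost", 27017); // illustrative host and port
	}
}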
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
* @author Christoph Strobl
*/
@Configuration
public abstract class AbstractMongoConfiguration {
/**
* Return the name of the database to connect to.
*
* @return must not be {@literal null}.
*/
protected abstract String getDatabaseName();
/**
* Return the name of the authentication database to use. Defaults to {@literal null}, in which case the value
* returned by {@link #getDatabaseName()} is used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected String getAuthenticationDatabaseName() {
return null;
}
/**
* Return the {@link Mongo} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link Mongo} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
* @throws Exception
*/
public abstract Mongo mongo() throws Exception;
/**
* Creates a {@link MongoTemplate}.
*
* @return
* @throws Exception
*/
@Bean
public MongoTemplate mongoTemplate() throws Exception {
return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
/**
* Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link Mongo} instance
* configured in {@link #mongo()}.
*
* @see #mongo()
* @see #mongoTemplate()
* @return
* @throws Exception
*/
@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), getUserCredentials(), getAuthenticationDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s. By default, returns the package name of the concrete
* configuration class (not this abstract one). So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration}, the base package is considered to be {@code com.acme} unless the method is
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Returns the base packages to scan for MongoDB mapped entities at startup. By default, returns the package name of
* the concrete configuration class (not this abstract one). So if you have a
* {@code com.acme.AppConfig} extending {@link AbstractMongoConfiguration}, the base package is considered to be
* {@code com.acme} unless the method is overridden to implement alternate behavior.
*
* @return the base packages to scan for mapped {@link Document} classes or an empty collection to not enable scanning
* for entities.
* @since 1.10
*/
protected Collection<String> getMappingBasePackages() {
return Collections.singleton(getMappingBasePackage());
}
/**
* Return {@link UserCredentials} to be used when connecting to the MongoDB instance or {@literal null} if none shall
* be used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected UserCredentials getUserCredentials() {
return null;
}
/**
* Creates a {@link MongoMappingContext} equipped with entity classes scanned from the mapping base package.
*
* @see #getMappingBasePackage()
* @return
* @throws ClassNotFoundException
*/
@Bean
public MongoMappingContext mongoMappingContext() throws ClassNotFoundException {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
return mappingContext;
}
/**
* Returns a {@link MappingContextIsNewStrategyFactory} wrapped into a {@link CachingIsNewStrategyFactory}.
*
* @return
* @throws ClassNotFoundException
*/
@Bean
public IsNewStrategyFactory isNewStrategyFactory() throws ClassNotFoundException {
return new CachingIsNewStrategyFactory(new MappingContextIsNewStrategyFactory(mongoMappingContext()));
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and
* {@link #mongoMappingContext()}. Returns an empty {@link CustomConversions} instance by default.
*
* @return must not be {@literal null}.
*/
@Bean
public CustomConversions customConversions() {
return new CustomConversions(Collections.emptyList());
}
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext()
* @see #mongoDbFactory()
* @return
* @throws Exception
*/
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
return converter;
}
/**
* Scans the mapping base package for classes annotated with {@link Document}. By default, it scans for entities in
* all packages returned by {@link #getMappingBasePackages()}.
*
* @see #getMappingBasePackages()
* @return
* @throws ClassNotFoundException
*/
protected Set<Class<?>> getInitialEntitySet() throws ClassNotFoundException {
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
for (String basePackage : getMappingBasePackages()) {
initialEntitySet.addAll(scanForEntities(basePackage));
}
return initialEntitySet;
}
/**
* Scans the given base package for entities, i.e. MongoDB-specific types annotated with {@link Document} or
* {@link Persistent}.
*
* @param basePackage must not be {@literal null}.
* @return
* @throws ClassNotFoundException
* @since 1.10
*/
protected Set<Class<?>> scanForEntities(String basePackage) throws ClassNotFoundException {
if (!StringUtils.hasText(basePackage)) {
return Collections.emptySet();
}
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
if (StringUtils.hasText(basePackage)) {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
initialEntitySet
.add(ClassUtils.forName(candidate.getBeanClassName(), AbstractMongoConfiguration.class.getClassLoader()));
}
}
return initialEntitySet;
}
/**
* Configures whether to abbreviate field names for domain objects by configuring a
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
* customization needs, consider overriding {@link #mappingMongoConverter()}.
*
* @return
*/
protected boolean abbreviateFieldNames() {
return false;
}
/**
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
*
* @return
* @since 1.5
*/
protected FieldNamingStrategy fieldNamingStrategy() {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
}
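
For orientation, a minimal sketch of a concrete configuration class built on the contract above. The package, class, host and database names are hypothetical and not part of the change:

package com.example.config;

import java.util.Collection;
import java.util.Collections;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

import com.mongodb.Mongo;
import com.mongodb.MongoClient;

@Configuration
public class ExampleMongoConfig extends AbstractMongoConfiguration {

	@Override
	protected String getDatabaseName() {
		return "example"; // hypothetical database name
	}

	@Override
	public Mongo mongo() throws Exception {
		// Prefer MongoClient; it carries credentials itself, so the deprecated
		// getUserCredentials()/getAuthenticationDatabaseName() hooks can stay untouched.
		return new MongoClient("localhost", 27017);
	}

	@Override
	protected Collection<String> getMappingBasePackages() {
		// Entities annotated with @Document/@Persistent are scanned from this package.
		return Collections.singleton("com.example.domain"); // hypothetical package
	}
}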


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2012-2014 the original author or authors.
* Copyright 2012-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2013-2016 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
@@ -23,15 +24,18 @@ import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
/**
* Parse a {@link String} to a Collection of {@link MongoCredential}.
*
*
* @author Christoph Strobl
* @author Oliver Gierke
* @author Stephen Tyler Conrad
* @author Mark Paluch
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
@@ -95,6 +99,20 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
verifyDatabasePresent(database);
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if ("SCRAM-SHA-256".equals(authMechanism)) {
Method createScramSha256Credential = ReflectionUtils.findMethod(MongoCredential.class,
"createScramSha256Credential", String.class, String.class, char[].class);
if (createScramSha256Credential == null) {
throw new IllegalArgumentException(
"SCRAM-SHA-256 auth mechanism is available as of MongoDB 4 and MongoDB Java Driver 3.8! Please make sure to use at least those versions.");
}
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.class.cast(ReflectionUtils.invokeMethod(createScramSha256Credential, null,
userNameAndPassword[0], database, userNameAndPassword[1].toCharArray())));
} else {
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
@@ -156,7 +174,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static Properties extractOptions(String text) {
int optionsSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
int dbSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
if (optionsSeperationIndex == -1 || dbSeperationIndex > optionsSeperationIndex) {
return new Properties();
@@ -166,6 +184,11 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
for (String option : text.substring(optionsSeperationIndex + 1).split(OPTION_VALUE_DELIMINATOR)) {
String[] optionArgs = option.split("=");
if (optionArgs.length == 1) {
throw new IllegalArgumentException(String.format("Query parameter '%s' has no value!", optionArgs[0]));
}
properties.put(optionArgs[0], optionArgs[1]);
}
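
For context, the reflective guard introduced in this hunk can be expressed as a standalone helper; a rough sketch of the same technique (class name and message wording are illustrative only):

import java.lang.reflect.Method;

import org.springframework.util.ReflectionUtils;

import com.mongodb.MongoCredential;

class ScramSha256Support { // hypothetical helper, mirroring the guard above

	static MongoCredential createScramSha256(String user, String database, char[] password) {
		// Look up the factory method reflectively so the class still links against drivers < 3.8.
		Method factory = ReflectionUtils.findMethod(MongoCredential.class, "createScramSha256Credential",
				String.class, String.class, char[].class);
		if (factory == null) {
			throw new IllegalArgumentException("SCRAM-SHA-256 requires MongoDB 4 and MongoDB Java Driver 3.8 or newer");
		}
		return (MongoCredential) ReflectionUtils.invokeMethod(factory, null, user, database, password);
	}
}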


@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,69 +1,69 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoAdmin;
import org.springframework.data.mongodb.monitor.*;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
public class MongoJmxParser implements BeanDefinitionParser {
public BeanDefinition parse(Element element, ParserContext parserContext) {
String name = element.getAttribute("mongo-ref");
if (!StringUtils.hasText(name)) {
name = "mongo";
}
registerJmxComponents(name, element, parserContext);
return null;
}
protected void registerJmxComponents(String mongoRefName, Element element, ParserContext parserContext) {
Object eleSource = parserContext.extractSource(element);
CompositeComponentDefinition compositeDef = new CompositeComponentDefinition(element.getTagName(), eleSource);
createBeanDefEntry(AssertMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BackgroundFlushingMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BtreeIndexCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ConnectionMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(GlobalLockMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MemoryMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(OperationCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ServerInfo.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MongoAdmin.class, compositeDef, mongoRefName, eleSource, parserContext);
parserContext.registerComponent(compositeDef);
}
protected void createBeanDefEntry(Class<?> clazz, CompositeComponentDefinition compositeDef, String mongoRefName,
Object eleSource, ParserContext parserContext) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(clazz);
builder.getRawBeanDefinition().setSource(eleSource);
builder.addConstructorArgReference(mongoRefName);
BeanDefinition assertDef = builder.getBeanDefinition();
String assertName = parserContext.getReaderContext().registerWithGeneratedName(assertDef);
compositeDef.addNestedComponent(new BeanComponentDefinition(assertDef, assertName));
}
}
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoAdmin;
import org.springframework.data.mongodb.monitor.*;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
public class MongoJmxParser implements BeanDefinitionParser {
public BeanDefinition parse(Element element, ParserContext parserContext) {
String name = element.getAttribute("mongo-ref");
if (!StringUtils.hasText(name)) {
name = "mongo";
}
registerJmxComponents(name, element, parserContext);
return null;
}
protected void registerJmxComponents(String mongoRefName, Element element, ParserContext parserContext) {
Object eleSource = parserContext.extractSource(element);
CompositeComponentDefinition compositeDef = new CompositeComponentDefinition(element.getTagName(), eleSource);
createBeanDefEntry(AssertMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BackgroundFlushingMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BtreeIndexCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ConnectionMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(GlobalLockMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MemoryMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(OperationCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ServerInfo.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MongoAdmin.class, compositeDef, mongoRefName, eleSource, parserContext);
parserContext.registerComponent(compositeDef);
}
protected void createBeanDefEntry(Class<?> clazz, CompositeComponentDefinition compositeDef, String mongoRefName,
Object eleSource, ParserContext parserContext) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(clazz);
builder.getRawBeanDefinition().setSource(eleSource);
builder.addConstructorArgReference(mongoRefName);
BeanDefinition assertDef = builder.getBeanDefinition();
String assertName = parserContext.getReaderContext().registerWithGeneratedName(assertDef);
compositeDef.addNestedComponent(new BeanComponentDefinition(assertDef, assertName));
}
}


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,207 +1,207 @@
/*
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
/**
* Utility methods for {@link BeanDefinitionParser} implementations for MongoDB.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {}
/**
* Parses the mongo replica-set element.
*
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
*/
static void parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
setPropertyValue(mongoBuilder, element, "replica-set", "replicaSetSeeds");
}
/**
* Parses the {@code mongo:options} sub-element. Populates the given attribute factory with the proper attributes.
*
* @return {@literal true} if parsing actually occurred, {@literal false} otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder optionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoOptionsFactoryBean.class);
setPropertyValue(optionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(optionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(optionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(optionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(optionsDefBuilder, optionsElement, "auto-connect-retry", "autoConnectRetry");
setPropertyValue(optionsDefBuilder, optionsElement, "max-auto-connect-retry-time", "maxAutoConnectRetryTime");
setPropertyValue(optionsDefBuilder, optionsElement, "write-number", "writeNumber");
setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
* @since 1.7
*/
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "server-selection-timeout", "serverSelectionTimeout");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
*
* @return
*/
static BeanDefinitionBuilder getWriteConcernPropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.WriteConcern", WriteConcernPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Only a single bean definition should be registered here, but we want the convenience of using
* AbstractSingleBeanDefinitionParser while still having the side effect of registering a 'default' property editor
* with the container.
*/
static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getMongoCredentialPropertyEditor() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.MongoCredential[]", MongoCredentialPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
/**
* Utility methods for {@link BeanDefinitionParser} implementations for MongoDB.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {}
/**
* Parses the mongo replica-set element.
*
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
*/
static void parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
setPropertyValue(mongoBuilder, element, "replica-set", "replicaSetSeeds");
}
/**
* Parses the {@code mongo:options} sub-element. Populates the given attribute factory with the proper attributes.
*
* @return {@literal true} if parsing actually occurred, {@literal false} otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder optionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoOptionsFactoryBean.class);
setPropertyValue(optionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(optionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(optionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(optionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(optionsDefBuilder, optionsElement, "auto-connect-retry", "autoConnectRetry");
setPropertyValue(optionsDefBuilder, optionsElement, "max-auto-connect-retry-time", "maxAutoConnectRetryTime");
setPropertyValue(optionsDefBuilder, optionsElement, "write-number", "writeNumber");
setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
* @since 1.7
*/
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "server-selection-timeout", "serverSelectionTimeout");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
*
* @return
*/
static BeanDefinitionBuilder getWriteConcernPropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.WriteConcern", WriteConcernPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Only a single bean definition should be registered here, but we want the convenience of using
* AbstractSingleBeanDefinitionParser while still having the side effect of registering a 'default' property editor
* with the container.
*/
static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getMongoCredentialPropertyEditor() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.MongoCredential[]", MongoCredentialPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}
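
The builders above only assemble bean definitions for CustomEditorConfigurer; for comparison, a rough JavaConfig sketch that registers the same WriteConcern editor directly (class and method names are hypothetical):

import java.beans.PropertyEditor;
import java.util.HashMap;
import java.util.Map;

import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.WriteConcernPropertyEditor;

import com.mongodb.WriteConcern;

@Configuration
class PropertyEditorConfig {

	@Bean
	static CustomEditorConfigurer writeConcernEditorConfigurer() {
		// Same mapping as getWriteConcernPropertyEditorBuilder(), expressed without a BeanDefinitionBuilder.
		Map<Class<?>, Class<? extends PropertyEditor>> editors = new HashMap<Class<?>, Class<? extends PropertyEditor>>();
		editors.put(WriteConcern.class, WriteConcernPropertyEditor.class);

		CustomEditorConfigurer configurer = new CustomEditorConfigurer();
		configurer.setCustomEditors(editors);
		return configurer;
	}
}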


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -0,0 +1,121 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.AllArgsConstructor;
import java.util.List;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBList;
import com.mongodb.DBObject;
/**
* Utility methods to map {@link org.springframework.data.mongodb.core.aggregation.Aggregation} pipeline definitions and
* create type-bound {@link AggregationOperationContext}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.10.13
*/
@AllArgsConstructor
class AggregationUtil {
QueryMapper queryMapper;
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Prepare the {@link AggregationOperationContext} for a given aggregation by either returning the context itself if
* it is not {@literal null}, creating a {@link TypeBasedAggregationOperationContext} if the aggregation contains type
* information (is a {@link TypedAggregation}), or falling back to the {@link Aggregation#DEFAULT_CONTEXT}.
*
* @param aggregation must not be {@literal null}.
* @param context can be {@literal null}.
* @return the root {@link AggregationOperationContext} to use.
*/
AggregationOperationContext prepareAggregationContext(Aggregation aggregation, AggregationOperationContext context) {
if (context != null) {
return context;
}
if (aggregation instanceof TypedAggregation) {
return new TypeBasedAggregationOperationContext(((TypedAggregation) aggregation).getInputType(), mappingContext,
queryMapper);
}
return Aggregation.DEFAULT_CONTEXT;
}
/**
* Extract and map the aggregation pipeline into a {@link DBObject} containing the mapped pipeline stages.
*
* @param aggregation
* @param context
* @return
*/
DBObject createPipeline(String collectionName, Aggregation aggregation, AggregationOperationContext context) {
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return aggregation.toDbObject(collectionName, context);
}
DBObject command = aggregation.toDbObject(collectionName, context);
command.put("pipeline", mapAggregationPipeline((List) command.get("pipeline")));
return command;
}
/**
* Extract the command and map the aggregation pipeline.
*
* @param aggregation
* @param context
* @return
*/
DBObject createCommand(String collection, Aggregation aggregation, AggregationOperationContext context) {
DBObject command = aggregation.toDbObject(collection, context);
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return command;
}
command.put("pipeline", mapAggregationPipeline((List) command.get("pipeline")));
return command;
}
private BasicDBList mapAggregationPipeline(List<DBObject> pipeline) {
BasicDBList mapped = new BasicDBList();
for (DBObject stage : pipeline) {
mapped.add(queryMapper.getMappedObject(stage, null));
}
return mapped;
}
}
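
A rough usage sketch for the helper above, assuming a MappingMongoConverter is already set up; the Person type, field name and collection name are hypothetical, and since AggregationUtil is package-private this would live in org.springframework.data.mongodb.core:

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.query.Criteria;

import com.mongodb.DBObject;

class AggregationUtilExample { // hypothetical

	static class Person { // hypothetical domain type
		String lastname;
	}

	static DBObject buildCommand(MappingMongoConverter converter) {
		AggregationUtil util = new AggregationUtil(new QueryMapper(converter), converter.getMappingContext());

		TypedAggregation<Person> aggregation = Aggregation.newAggregation(Person.class,
				Aggregation.match(Criteria.where("lastname").is("Doe")));

		// A TypedAggregation plus a null context yields a TypeBasedAggregationOperationContext.
		AggregationOperationContext context = util.prepareAggregationContext(aggregation, null);
		return util.createCommand("people", aggregation, context);
	}
}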


@@ -1,11 +1,11 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,26 +1,26 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DBCollection;
import com.mongodb.MongoException;
import org.springframework.dao.DataAccessException;
public interface CollectionCallback<T> {
T doInCollection(DBCollection collection) throws MongoException, DataAccessException;
}
/*
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DBCollection;
import com.mongodb.MongoException;
import org.springframework.dao.DataAccessException;
public interface CollectionCallback<T> {
T doInCollection(DBCollection collection) throws MongoException, DataAccessException;
}
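
A small usage sketch for the callback above, assuming a configured MongoTemplate; the class and collection names are hypothetical:

import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.DBCollection;
import com.mongodb.MongoException;

class CollectionCallbackExample { // hypothetical

	static long countPeople(MongoTemplate mongoTemplate) {
		return mongoTemplate.execute("people", new CollectionCallback<Long>() {
			public Long doInCollection(DBCollection collection) throws MongoException, DataAccessException {
				// Work directly against the driver collection; exceptions are translated by the template.
				return collection.count();
			}
		});
	}
}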


@@ -1,70 +1,70 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
/**
* Provides a simple wrapper to encapsulate the variety of settings you can use when creating a collection.
*
* @author Thomas Risberg
*/
public class CollectionOptions {
private Integer maxDocuments;
private Integer size;
private Boolean capped;
/**
* Constructs a new <code>CollectionOptions</code> instance.
*
* @param size the collection size in bytes, this data space is preallocated
* @param maxDocuments the maximum number of documents in the collection.
* @param capped true to create a "capped" collection (fixed size with auto-FIFO behavior based on insertion order),
* false otherwise.
*/
public CollectionOptions(Integer size, Integer maxDocuments, Boolean capped) {
super();
this.maxDocuments = maxDocuments;
this.size = size;
this.capped = capped;
}
public Integer getMaxDocuments() {
return maxDocuments;
}
public void setMaxDocuments(Integer maxDocuments) {
this.maxDocuments = maxDocuments;
}
public Integer getSize() {
return size;
}
public void setSize(Integer size) {
this.size = size;
}
public Boolean getCapped() {
return capped;
}
public void setCapped(Boolean capped) {
this.capped = capped;
}
}
/*
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
/**
* Provides a simple wrapper to encapsulate the variety of settings you can use when creating a collection.
*
* @author Thomas Risberg
*/
public class CollectionOptions {
private Integer maxDocuments;
private Integer size;
private Boolean capped;
/**
* Constructs a new <code>CollectionOptions</code> instance.
*
* @param size the collection size in bytes, this data space is preallocated
* @param maxDocuments the maximum number of documents in the collection.
* @param capped true to create a "capped" collection (fixed size with auto-FIFO behavior based on insertion order),
* false otherwise.
*/
public CollectionOptions(Integer size, Integer maxDocuments, Boolean capped) {
super();
this.maxDocuments = maxDocuments;
this.size = size;
this.capped = capped;
}
public Integer getMaxDocuments() {
return maxDocuments;
}
public void setMaxDocuments(Integer maxDocuments) {
this.maxDocuments = maxDocuments;
}
public Integer getSize() {
return size;
}
public void setSize(Integer size) {
this.size = size;
}
public Boolean getCapped() {
return capped;
}
public void setCapped(Boolean capped) {
this.capped = capped;
}
}
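
For illustration, how these options are typically passed to MongoTemplate when creating a capped collection; the collection name and sizes are arbitrary examples:

import org.springframework.data.mongodb.core.CollectionOptions;
import org.springframework.data.mongodb.core.MongoTemplate;

class CappedCollectionExample { // hypothetical

	static void createLogCollection(MongoTemplate mongoTemplate) {
		// 1 MB of preallocated space, at most 1000 documents, capped (FIFO on insertion order).
		CollectionOptions options = new CollectionOptions(1024 * 1024, 1000, Boolean.TRUE);
		mongoTemplate.createCollection("logs", options);
	}
}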


@@ -1,11 +1,11 @@
/*
* Copyright 2002-2010 the original author or authors.
* Copyright 2002-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,25 +1,25 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DB;
import com.mongodb.MongoException;
import org.springframework.dao.DataAccessException;
public interface DbCallback<T> {
T doInDB(DB db) throws MongoException, DataAccessException;
}
/*
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DB;
import com.mongodb.MongoException;
import org.springframework.dao.DataAccessException;
public interface DbCallback<T> {
T doInDB(DB db) throws MongoException, DataAccessException;
}
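
Analogous to the collection callback earlier, a brief sketch of running a database-level operation through this interface, assuming a configured MongoTemplate (class name hypothetical, dbStats used only as an example):

import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.DbCallback;
import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.CommandResult;
import com.mongodb.DB;
import com.mongodb.MongoException;

class DbCallbackExample { // hypothetical

	static CommandResult databaseStats(MongoTemplate mongoTemplate) {
		return mongoTemplate.execute(new DbCallback<CommandResult>() {
			public CommandResult doInDB(DB db) throws MongoException, DataAccessException {
				// Access the low-level DB handle inside Spring's exception translation.
				return db.getStats();
			}
		});
	}
}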


@@ -1,11 +1,11 @@
/*
* Copyright 2015-2016 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2014-2016 the original author or authors.
* Copyright 2014-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015-2016 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,101 +1,101 @@
/*
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Mongo server administration exposed via JMX annotations
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Mark Paluch
*/
@ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations {
private final Mongo mongo;
private String username;
private String password;
private String authenticationDatabaseName;
public MongoAdmin(Mongo mongo) {
Assert.notNull(mongo, "Mongo must not be null!");
this.mongo = mongo;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#dropDatabase(java.lang.String)
*/
@ManagedOperation
public void dropDatabase(String databaseName) {
getDB(databaseName).dropDatabase();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#createDatabase(java.lang.String)
*/
@ManagedOperation
public void createDatabase(String databaseName) {
getDB(databaseName);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#getDatabaseStats(java.lang.String)
*/
@ManagedOperation
public String getDatabaseStats(String databaseName) {
return getDB(databaseName).getStats().toString();
}
/**
* Sets the username to use to connect to the Mongo database
*
* @param username The username to use
*/
public void setUsername(String username) {
this.username = username;
}
/**
* Sets the password to use to authenticate with the Mongo database.
*
* @param password The password to use
*/
public void setPassword(String password) {
this.password = password;
}
/**
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
*
* @param authenticationDatabaseName The authenticationDatabaseName to use.
*/
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
this.authenticationDatabaseName = authenticationDatabaseName;
}
DB getDB(String databaseName) {
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
}
}
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Mongo server administration exposed via JMX annotations
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Mark Paluch
*/
@ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations {
private final Mongo mongo;
private String username;
private String password;
private String authenticationDatabaseName;
public MongoAdmin(Mongo mongo) {
Assert.notNull(mongo, "Mongo must not be null!");
this.mongo = mongo;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#dropDatabase(java.lang.String)
*/
@ManagedOperation
public void dropDatabase(String databaseName) {
getDB(databaseName).dropDatabase();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#createDatabase(java.lang.String)
*/
@ManagedOperation
public void createDatabase(String databaseName) {
getDB(databaseName);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#getDatabaseStats(java.lang.String)
*/
@ManagedOperation
public String getDatabaseStats(String databaseName) {
return getDB(databaseName).getStats().toString();
}
/**
* Sets the username to use to connect to the Mongo database
*
* @param username The username to use
*/
public void setUsername(String username) {
this.username = username;
}
/**
* Sets the password to use to authenticate with the Mongo database.
*
* @param password The password to use
*/
public void setPassword(String password) {
this.password = password;
}
/**
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
*
* @param authenticationDatabaseName The authenticationDatabaseName to use.
*/
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
this.authenticationDatabaseName = authenticationDatabaseName;
}
DB getDB(String databaseName) {
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
}
}
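Not part of the diff: an illustrative wiring sketch for the MongoAdmin bean above, assuming Spring's annotation-driven JMX export and a locally running mongod; the bean names and host/port are made up.

import com.mongodb.MongoClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.EnableMBeanExport;
import org.springframework.data.mongodb.core.MongoAdmin;

// Illustrative only: exposes the @ManagedResource above via Spring's MBean exporter,
// so dropDatabase/createDatabase/getDatabaseStats become operations in a JMX console.
@Configuration
@EnableMBeanExport
public class MongoJmxConfig {

    @Bean
    public MongoClient mongoClient() {
        return new MongoClient("localhost", 27017);
    }

    @Bean
    public MongoAdmin mongoAdmin(MongoClient mongoClient) {
        return new MongoAdmin(mongoClient);
    }
}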


@@ -1,34 +1,34 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.jmx.export.annotation.ManagedOperation;
/**
* @author Mark Pollack
* @author Oliver Gierke
*/
public interface MongoAdminOperations {
@ManagedOperation
void dropDatabase(String databaseName);
@ManagedOperation
void createDatabase(String databaseName);
@ManagedOperation
String getDatabaseStats(String databaseName);
}
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.jmx.export.annotation.ManagedOperation;
/**
* @author Mark Pollack
* @author Oliver Gierke
*/
public interface MongoAdminOperations {
@ManagedOperation
void dropDatabase(String databaseName);
@ManagedOperation
void createDatabase(String databaseName);
@ManagedOperation
String getDatabaseStats(String databaseName);
}


@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -649,8 +649,8 @@ public interface MongoOperations {
<T> T findById(Object id, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
@@ -661,8 +661,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
@@ -674,8 +674,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -688,8 +688,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -821,7 +821,7 @@ public interface MongoOperations {
* Insert a mixed Collection of objects into a database collection determining the collection name to use based on the
* class.
*
* @param collectionToSave the list of objects to save.
* @param objectsToSave the list of objects to save.
*/
void insertAll(Collection<? extends Object> objectsToSave);
@@ -994,8 +994,9 @@ public interface MongoOperations {
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param query
* @param entityClass
* @param query must not be {@literal null}.
* @param entityClass must not be {@literal null}.
* @throws IllegalArgumentException when {@literal query} or {@literal entityClass} is {@literal null}.
*/
WriteResult remove(Query query, Class<?> entityClass);
@@ -1006,6 +1007,8 @@ public interface MongoOperations {
* @param query
* @param entityClass
* @param collectionName
* @throws IllegalArgumentException when {@literal query}, {@literal entityClass} or {@literal collectionName} is
* {@literal null}.
*/
WriteResult remove(Query query, Class<?> entityClass, String collectionName);
@@ -1017,6 +1020,7 @@ public interface MongoOperations {
*
* @param query the query document that specifies the criteria used to remove a record
* @param collectionName name of the collection where the objects will be removed
* @throws IllegalArgumentException when {@literal query} or {@literal collectionName} is {@literal null}.
*/
WriteResult remove(Query query, String collectionName);
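Not part of the diff: a short sketch of the findAndModify and remove contracts whose javadoc is touched above. 'operations' is an assumed, injected MongoOperations; Person and its field names are made up for the example.

import org.springframework.data.mongodb.core.FindAndModifyOptions;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

// Illustrative only: Person is an assumed mapped document class.
public class FindAndModifyExample {

    static class Person {
        String name;
        int visits;
    }

    private final MongoOperations operations;

    public FindAndModifyExample(MongoOperations operations) {
        this.operations = operations;
    }

    public Person incrementVisits(String name) {
        Query query = Query.query(Criteria.where("name").is(name));
        Update update = new Update().inc("visits", 1);

        // returnNew(true) returns the document as it looks after the update was applied.
        return operations.findAndModify(query, update,
                FindAndModifyOptions.options().returnNew(true), Person.class);
    }

    public void removeByName(String name) {
        // As documented above, query and entityClass must not be null,
        // otherwise an IllegalArgumentException is thrown.
        operations.remove(Query.query(Criteria.where("name").is(name)), Person.class);
    }
}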


@@ -1,11 +1,11 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2012-2014 the original author or authors.
* Copyright 2012-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -19,17 +19,8 @@ import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.*;
import java.util.Map.Entry;
import java.util.Scanner;
import java.util.Set;
import org.bson.types.ObjectId;
import org.slf4j.Logger;
@@ -108,21 +99,7 @@ import org.springframework.util.ObjectUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.Bytes;
import com.mongodb.CommandResult;
import com.mongodb.Cursor;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
import com.mongodb.MapReduceCommand;
import com.mongodb.MapReduceOutput;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.WriteResult;
import com.mongodb.*;
import com.mongodb.util.JSON;
import com.mongodb.util.JSONParseException;
@@ -1324,35 +1301,46 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected <T> WriteResult doRemove(final String collectionName, final Query query, final Class<T> entityClass) {
if (query == null) {
throw new InvalidDataAccessApiUsageException("Query passed in to remove can't be null!");
}
Assert.notNull(query, "Query must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
final DBObject queryObject = query.getQueryObject();
final MongoPersistentEntity<?> entity = getPersistentEntity(entityClass);
final DBObject queryObject = queryMapper.getMappedObject(query.getQueryObject(), entity);
return execute(collectionName, new CollectionCallback<WriteResult>() {
public WriteResult doInCollection(DBCollection collection) throws MongoException, DataAccessException {
maybeEmitEvent(new BeforeDeleteEvent<T>(queryObject, entityClass, collectionName));
DBObject dboq = queryMapper.getMappedObject(queryObject, entity);
DBObject removeQuery = queryObject;
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.REMOVE, collectionName,
entityClass, null, queryObject);
entityClass, null, removeQuery);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.",
new Object[] { serializeToJsonSafely(dboq), collectionName });
new Object[] { serializeToJsonSafely(removeQuery), collectionName });
}
WriteResult wr = writeConcernToUse == null ? collection.remove(dboq)
: collection.remove(dboq, writeConcernToUse);
if (query.getLimit() > 0 || query.getSkip() > 0) {
handleAnyWriteResultErrors(wr, dboq, MongoActionOperation.REMOVE);
DBCursor cursor = new QueryCursorPreparer(query, entityClass)
.prepare(collection.find(removeQuery, new BasicDBObject(ID_FIELD, 1)));
Set<Object> ids = new LinkedHashSet<Object>();
Iterator<DBObject> it = cursor.iterator();
while (it.hasNext()) {
ids.add(it.next().get(ID_FIELD));
}
removeQuery = new BasicDBObject(ID_FIELD, new BasicDBObject("$in", ids));
}
WriteResult wr = writeConcernToUse == null ? collection.remove(removeQuery)
: collection.remove(removeQuery, writeConcernToUse);
handleAnyWriteResultErrors(wr, removeQuery, MongoActionOperation.REMOVE);
maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass, collectionName));
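Not part of the diff: a sketch of what the revised doRemove above means for a caller. With this change a Query carrying limit/skip no longer deletes every match: the template first loads the matching _id values through the QueryCursorPreparer (so limit, skip and sort apply) and then removes only those ids via an $in query. Collection and field names below are illustrative.

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

// Illustrative only: Order is an assumed mapped document class.
public class LimitedRemoveExample {

    static class Order {}

    private final MongoOperations operations;

    public LimitedRemoveExample(MongoOperations operations) {
        this.operations = operations;
    }

    public void removeOldestProcessedBatch() {
        Query query = Query.query(Criteria.where("processed").is(true))
                .with(new Sort("createdAt"))
                .limit(100);

        // Only the 100 oldest matching Order documents are removed; the rest stay.
        operations.remove(query, Order.class);
    }
}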
@@ -1566,15 +1554,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
Assert.notNull(outputType, "Output type must not be null!");
AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context;
DBObject command = aggregation.toDbObject(collectionName, rootContext);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
}
CommandResult commandResult = executeCommand(command, this.readPreference);
handleCommandError(commandResult, command);
DBObject commandResult = new BatchAggregationLoader(this, queryMapper, mappingContext, readPreference,
Integer.MAX_VALUE)
.aggregate(collectionName, aggregation, context);
return new AggregationResults<O>(returnPotentiallyMappedResults(outputType, commandResult, collectionName),
commandResult);
@@ -1587,7 +1569,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @param commandResult
* @return
*/
private <O> List<O> returnPotentiallyMappedResults(Class<O> outputType, CommandResult commandResult,
private <O> List<O> returnPotentiallyMappedResults(Class<O> outputType, DBObject commandResult,
String collectionName) {
@SuppressWarnings("unchecked")
@@ -2094,7 +2076,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @param result must not be {@literal null}.
* @param source must not be {@literal null}.
*/
private void handleCommandError(CommandResult result, DBObject source) {
private static void handleCommandError(CommandResult result, DBObject source) {
try {
result.throwOnError();
@@ -2553,4 +2535,156 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
}
/**
* {@link BatchAggregationLoader} is a little helper that can process cursor results returned by an aggregation
* command execution. On presence of a {@literal nextBatch} indicated by presence of an {@code id} field in the
* {@code cursor} another {@code getMore} command gets executed reading the next batch of documents until all results
* are loaded.
*
* @author Christoph Strobl
* @since 1.10
*/
static class BatchAggregationLoader {
private static final String CURSOR_FIELD = "cursor";
private static final String RESULT_FIELD = "result";
private static final String BATCH_SIZE_FIELD = "batchSize";
private static final String FIRST_BATCH = "firstBatch";
private static final String NEXT_BATCH = "nextBatch";
private static final String SERVER_USED = "serverUsed";
private static final String OK = "ok";
private final MongoTemplate template;
private final QueryMapper queryMapper;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final ReadPreference readPreference;
private final int batchSize;
BatchAggregationLoader(MongoTemplate template, QueryMapper queryMapper,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext,
ReadPreference readPreference, int batchSize) {
this.template = template;
this.queryMapper = queryMapper;
this.mappingContext = mappingContext;
this.readPreference = readPreference;
this.batchSize = batchSize;
}
/**
* Run aggregation command and fetch all results.
*/
DBObject aggregate(String collectionName, Aggregation aggregation, AggregationOperationContext context) {
DBObject command = prepareAggregationCommand(collectionName, aggregation, context, batchSize);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
}
return mergeAggregationResults(aggregateBatched(command, collectionName, batchSize));
}
/**
* Pre process the aggregation command sent to the server by adding {@code cursor} options to match execution on
* different server versions.
*/
private DBObject prepareAggregationCommand(String collectionName, Aggregation aggregation,
AggregationOperationContext context, int batchSize) {
AggregationUtil aggregationUtil = new AggregationUtil(queryMapper, mappingContext);
AggregationOperationContext rootContext = aggregationUtil.prepareAggregationContext(aggregation, context);
DBObject command = aggregationUtil.createCommand(collectionName, aggregation, rootContext);
if (!aggregation.getOptions().isExplain()) {
command.put(CURSOR_FIELD, new BasicDBObject(BATCH_SIZE_FIELD, batchSize));
}
return command;
}
private List<DBObject> aggregateBatched(DBObject command, String collectionName, int batchSize) {
List<DBObject> results = new ArrayList<DBObject>();
CommandResult commandResult = template.executeCommand(command, readPreference);
results.add(postProcessResult(command, commandResult));
while (hasNext(commandResult)) {
DBObject getMore = new BasicDBObject("getMore", getNextBatchId(commandResult)) //
.append("collection", collectionName) //
.append(BATCH_SIZE_FIELD, batchSize);
commandResult = template.executeCommand(getMore, this.readPreference);
results.add(postProcessResult(command, commandResult));
}
return results;
}
private static DBObject postProcessResult(DBObject command, CommandResult commandResult) {
handleCommandError(commandResult, command);
if (!commandResult.containsField(CURSOR_FIELD)) {
return commandResult;
}
DBObject resultObject = new BasicDBObject(SERVER_USED, commandResult.get(SERVER_USED));
resultObject.put(OK, commandResult.get(OK));
DBObject cursor = (DBObject) commandResult.get(CURSOR_FIELD);
if (cursor.containsField(FIRST_BATCH)) {
resultObject.put(RESULT_FIELD, cursor.get(FIRST_BATCH));
} else {
resultObject.put(RESULT_FIELD, cursor.get(NEXT_BATCH));
}
return resultObject;
}
private static DBObject mergeAggregationResults(List<DBObject> batchResults) {
if (batchResults.size() == 1) {
return batchResults.iterator().next();
}
DBObject commandResult = new BasicDBObject(3);
List<Object> allResults = new ArrayList<Object>();
for (DBObject batchResult : batchResults) {
Collection documents = (Collection<?>) batchResult.get(RESULT_FIELD);
if (!CollectionUtils.isEmpty(documents)) {
allResults.addAll(documents);
}
}
// take general info from first batch
commandResult.put(SERVER_USED, batchResults.iterator().next().get(SERVER_USED));
commandResult.put(OK, batchResults.iterator().next().get(OK));
// and append the merged batchResults
commandResult.put(RESULT_FIELD, allResults);
return commandResult;
}
private static boolean hasNext(DBObject commandResult) {
if (!commandResult.containsField(CURSOR_FIELD)) {
return false;
}
Object next = getNextBatchId(commandResult);
return next != null && ((Number) next).longValue() != 0L;
}
private static Object getNextBatchId(DBObject commandResult) {
return ((DBObject) commandResult.get(CURSOR_FIELD)).get("id");
}
}
}
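Not part of the diff: a sketch of the two command shapes the BatchAggregationLoader above exchanges with the server; the collection name, pipeline and cursor id are made-up values.

import java.util.Collections;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

// Illustrative only: mirrors the documents built in prepareAggregationCommand(..)
// and aggregateBatched(..), with invented values.
public class AggregationCursorCommands {

    public static void main(String[] args) {
        // Initial aggregate command with the cursor/batchSize hint appended.
        DBObject aggregate = new BasicDBObject("aggregate", "orders")
                .append("pipeline", Collections.singletonList(
                        new BasicDBObject("$match", new BasicDBObject("processed", true))))
                .append("cursor", new BasicDBObject("batchSize", Integer.MAX_VALUE));

        // Follow-up command issued while the returned cursor id is non-zero.
        DBObject getMore = new BasicDBObject("getMore", 1234567L)
                .append("collection", "orders")
                .append("batchSize", Integer.MAX_VALUE);

        System.out.println(aggregate);
        System.out.println(getMore);
    }
}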


@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2014-2015 the original author or authors.
* Copyright 2014-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,235 +1,243 @@
/*
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.net.UnknownHostException;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoException;
import com.mongodb.MongoURI;
import com.mongodb.WriteConcern;
/**
* Factory to create {@link DB} instances from a {@link Mongo} instance.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
private final Mongo mongo;
private final String databaseName;
private final boolean mongoInstanceCreated;
private final UserCredentials credentials;
private final PersistenceExceptionTranslator exceptionTranslator;
private final String authenticationDatabaseName;
private WriteConcern writeConcern;
/**
* Create an instance of {@link SimpleMongoDbFactory} given the {@link Mongo} instance and database name.
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName database name, not be {@literal null} or empty.
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClient, String)}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
this(mongo, databaseName, null);
}
/**
* Create an instance of SimpleMongoDbFactory given the Mongo instance, database name, and username/password
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {
this(mongo, databaseName, credentials, false, null);
}
/**
* Create an instance of SimpleMongoDbFactory given the Mongo instance, database name, and username/password
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @param authenticationDatabaseName the database name to use for authentication
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
this(mongo, databaseName, credentials, false, authenticationDatabaseName);
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoURI}.
*
* @param uri must not be {@literal null}.
* @throws MongoException
* @throws UnknownHostException
* @see MongoURI
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClientURI)} instead.
*/
@Deprecated
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), true,
uri.getDatabase());
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}.
*
* @param uri must not be {@literal null}.
* @throws UnknownHostException
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClientURI uri) throws UnknownHostException {
this(new MongoClient(uri), uri.getDatabase(), true);
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null}.
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false);
}
private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
boolean mongoInstanceCreated, String authenticationDatabaseName) {
if (mongo instanceof MongoClient && (credentials != null && !UserCredentials.NO_CREDENTIALS.equals(credentials))) {
throw new InvalidDataAccessApiUsageException(
"Usage of 'UserCredentials' with 'MongoClient' is no longer supported. Please use 'MongoCredential' for 'MongoClient' or just 'Mongo'.");
}
Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(databaseName, "Database name must not be empty");
Assert.isTrue(databaseName.matches("[\\w-]+"),
"Database name must only contain letters, numbers, underscores and dashes!");
this.mongo = mongo;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.credentials = credentials == null ? UserCredentials.NO_CREDENTIALS : credentials;
this.exceptionTranslator = new MongoExceptionTranslator();
this.authenticationDatabaseName = StringUtils.hasText(authenticationDatabaseName) ? authenticationDatabaseName
: databaseName;
Assert.isTrue(this.authenticationDatabaseName.matches("[\\w-]+"),
"Authentication database name must only contain letters, numbers, underscores and dashes!");
}
/**
* @param client
* @param databaseName
* @param mongoInstanceCreated
* @since 1.7
*/
private SimpleMongoDbFactory(MongoClient client, String databaseName, boolean mongoInstanceCreated) {
Assert.notNull(client, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
this.mongo = client;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.exceptionTranslator = new MongoExceptionTranslator();
this.credentials = UserCredentials.NO_CREDENTIALS;
this.authenticationDatabaseName = databaseName;
}
/**
* Configures the {@link WriteConcern} to be used on the {@link DB} instance being created.
*
* @param writeConcern the writeConcern to set
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb()
*/
public DB getDb() throws DataAccessException {
return getDb(databaseName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/
@SuppressWarnings("deprecation")
public DB getDb(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty.");
DB db = MongoDbUtils.getDB(mongo, dbName, credentials, authenticationDatabaseName);
if (writeConcern != null) {
db.setWriteConcern(writeConcern);
}
return db;
}
/**
* Clean up the Mongo instance if it was created by the factory itself.
*
* @see DisposableBean#destroy()
*/
public void destroy() throws Exception {
if (mongoInstanceCreated) {
mongo.close();
}
}
private static String parseChars(char[] chars) {
return chars == null ? null : String.valueOf(chars);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator;
}
}
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.net.UnknownHostException;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoException;
import com.mongodb.MongoURI;
import com.mongodb.WriteConcern;
/**
* Factory to create {@link DB} instances from a {@link Mongo} instance.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author George Moraitis
* @author Mark Paluch
*/
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
private final Mongo mongo;
private final String databaseName;
private final boolean mongoInstanceCreated;
private final UserCredentials credentials;
private final PersistenceExceptionTranslator exceptionTranslator;
private final String authenticationDatabaseName;
private WriteConcern writeConcern;
/**
* Create an instance of {@link SimpleMongoDbFactory} given the {@link Mongo} instance and database name.
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName database name, not be {@literal null} or empty.
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClient, String)}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
this(mongo, databaseName, null);
}
/**
* Create an instance of SimpleMongoDbFactory given the Mongo instance, database name, and username/password
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {
this(mongo, databaseName, credentials, false, null);
}
/**
* Create an instance of SimpleMongoDbFactory given the Mongo instance, database name, and username/password
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @param authenticationDatabaseName the database name to use for authentication
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
this(mongo, databaseName, credentials, false, authenticationDatabaseName);
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoURI}.
*
* @param uri must not be {@literal null}.
* @throws MongoException
* @throws UnknownHostException
* @see MongoURI
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClientURI)} instead.
*/
@Deprecated
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), true,
uri.getDatabase());
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}.
*
* @param uri must not be {@literal null}.
* @throws UnknownHostException
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClientURI uri) throws UnknownHostException {
this(new MongoClient(uri), uri.getDatabase(), true);
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null}.
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false);
}
private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
boolean mongoInstanceCreated, String authenticationDatabaseName) {
if (mongo instanceof MongoClient && (credentials != null && !UserCredentials.NO_CREDENTIALS.equals(credentials))) {
throw new InvalidDataAccessApiUsageException(
"Usage of 'UserCredentials' with 'MongoClient' is no longer supported. Please use 'MongoCredential' for 'MongoClient' or just 'Mongo'.");
}
Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(databaseName, "Database name must not be empty");
assertDatabaseName(databaseName);
this.mongo = mongo;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.credentials = credentials == null ? UserCredentials.NO_CREDENTIALS : credentials;
this.exceptionTranslator = new MongoExceptionTranslator();
this.authenticationDatabaseName = StringUtils.hasText(authenticationDatabaseName) ? authenticationDatabaseName
: databaseName;
Assert.isTrue(this.authenticationDatabaseName.matches("[\\w-]+"),
"Authentication database name must only contain letters, numbers, underscores and dashes!");
}
/**
* @param client
* @param databaseName
* @param mongoInstanceCreated
* @since 1.7
*/
private SimpleMongoDbFactory(MongoClient client, String databaseName, boolean mongoInstanceCreated) {
Assert.notNull(client, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
assertDatabaseName(databaseName);
this.mongo = client;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.exceptionTranslator = new MongoExceptionTranslator();
this.credentials = UserCredentials.NO_CREDENTIALS;
this.authenticationDatabaseName = databaseName;
}
/**
* Configures the {@link WriteConcern} to be used on the {@link DB} instance being created.
*
* @param writeConcern the writeConcern to set
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb()
*/
public DB getDb() throws DataAccessException {
return getDb(databaseName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/
@SuppressWarnings("deprecation")
public DB getDb(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty.");
DB db = MongoDbUtils.getDB(mongo, dbName, credentials, authenticationDatabaseName);
if (writeConcern != null) {
db.setWriteConcern(writeConcern);
}
return db;
}
/**
* Clean up the Mongo instance if it was created by the factory itself.
*
* @see DisposableBean#destroy()
*/
public void destroy() throws Exception {
if (mongoInstanceCreated) {
mongo.close();
}
}
private static String parseChars(char[] chars) {
return chars == null ? null : String.valueOf(chars);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator;
}
private static void assertDatabaseName(String databaseName) {
Assert.isTrue(databaseName.matches("[^/\\\\.$\"\\s]+"),
"Database name must not contain slashes, dots, spaces, quotes, or dollar signs!");
}
}
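Not part of the diff: a small sketch of the relaxed database-name validation added above. The old check required the name to match [\w-]+, while the new assertDatabaseName(..) only rejects characters MongoDB itself forbids (slashes, dots, spaces, quotes, dollar signs); the sample names are illustrative.

// Illustrative only: reproduces the regular expression used by assertDatabaseName(..).
public class DatabaseNameCheck {

    private static final String VALID_NAME = "[^/\\\\.$\"\\s]+";

    public static void main(String[] args) {
        System.out.println("my-db_2019".matches(VALID_NAME));        // true
        System.out.println("reporting.archive".matches(VALID_NAME)); // false, contains a dot
        System.out.println("sales data".matches(VALID_NAME));        // false, contains a space
    }
}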


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

Some files were not shown because too many files have changed in this diff.