Compare commits


363 Commits

Author SHA1 Message Date
Oliver Gierke
d7f70e219b DATAMONGO-1409 - Release version 1.10 M1 (Ingalls). 2016-07-27 13:52:12 +02:00
Oliver Gierke
728cc390f6 DATAMONGO-1409 - Prepare 1.10 M1 (Ingalls). 2016-07-27 13:51:38 +02:00
Oliver Gierke
ddcc3914ff DATAMONGO-1409 - Updated changelog. 2016-07-27 13:51:32 +02:00
Oliver Gierke
9a385599af DATAMONGO-1394 - Polishing.
Some internal refactorings to avoid deeply nested if-clauses.

Original pull request: #373.
2016-07-27 10:54:21 +02:00
Christoph Strobl
c14c42fb0c DATAMONGO-1394 - Support identifier references on Querydsl expressions for DBRefs.
We now allow direct usage of path.eq(…) on id properties of db-referenced objects. This allows writing the query as person.coworker.id.eq(coworker.getId()) instead of person.coworker.eq(coworker), so the query can be built using just the plain id without having to create a new object wrapping it.

Original pull request: #373.
2016-07-27 10:54:21 +02:00
Oliver Gierke
5a8e4f3dae DATAMONGO-1194 - Polishing.
Some missing JavaDoc and slight code polish.

Original pull request: #377.
2016-07-26 15:30:31 +02:00
Christoph Strobl
5d50155d81 DATAMONGO-1194 - Improve DBRef resolution for maps.
We bulk load maps of referenced objects as long as they are stored in the same collection. This reduces database roundtrips and network traffic.

Original pull request: #377.
2016-07-26 15:30:31 +02:00
Christoph Strobl
babab54ffd DATAMONGO-1194 - Improve DBRef resolution for collections.
We now bulk load collections of referenced objects as long as they are stored in the same collection. This reduces database roundtrips and network traffic.

Original pull request: #377.
2016-07-26 15:30:30 +02:00
Mark Paluch
1ba137b98a DATAMONGO-1464 - Polishing.
Added JavaDoc. Simplified if-check in MongoQueryExecution.isListOfGeoResult(…).

Original pull request: #379.
2016-07-26 14:51:55 +02:00
Mark Paluch
353b836a77 DATAMONGO-1464 - Optimize query execution for pagination queries.
We now execute paged queries in an optimized way. The data is obtained for each paged execution but the count query is deferred. We determine the total from the pageable and the results whenever we don't hit the page size bounds (i.e. the results are less than a full page without offset, or the results are greater than 0 and less than a full page with offset). In all other cases we issue an additional count query.

Original pull request: #379.
2016-07-26 14:51:29 +02:00
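The count-deferral rule described in the commit above can be sketched in plain Java (a simplified illustration of the decision logic, not the actual MongoQueryExecution code; names are hypothetical):

```java
// Sketch of the rule: derive the total from the page request and the fetched
// results where possible, fall back to an extra count query otherwise.
final class PageTotal {

    /** Returns the derived total, or -1 when a count query must be issued. */
    static long deriveTotal(long offset, int pageSize, int resultCount) {
        if (offset == 0 && resultCount < pageSize) {
            // First page, not filled completely: the results are all there is.
            return resultCount;
        }
        if (offset > 0 && resultCount > 0 && resultCount < pageSize) {
            // Partially filled page with an offset: the total ends at the last result.
            return offset + resultCount;
        }
        return -1; // full page, or empty page with offset: count separately
    }
}
```

deriveTotal(…) returning -1 marks the cases in which the optimization cannot apply and the additional count query is issued.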
Oliver Gierke
325bcd11b9 DATAMONGO-1431 - Added MongoOperations.stream(…) with explicit collection. 2016-07-20 15:59:52 +02:00
Mark Paluch
9db2dde19b DATAMONGO-1463 - Upgrade to mongo-java-driver 2.14.3.
Upgrade mongo-java-driver 2.14.3 and upgrade the mongo33 profile to use 3.3.0 (release).
2016-07-20 10:33:23 +02:00
Mark Paluch
318ba53e2f DATAMONGO-1462 - Integrate version badge from spring.io.
Add version badge from spring.io and replace fixed version numbers with a placeholder.
2016-07-20 08:43:37 +02:00
Mark Paluch
3db30bd4a6 DATAMONGO-1460 - Use placeholder property for JSR-303 API. 2016-07-15 12:48:37 +02:00
Oliver Gierke
dd1fbfeb66 DATAMONGO-1459 - Polishing.
Added missing @Overrides to MongoRepository interface and polished non-JavaDoc references.
2016-07-14 15:52:32 +02:00
Oliver Gierke
3fa17272bb DATAMONGO-1459 - Added support for any-match mode in Query-by-example.
MongoExampleMapper now $or-concatenates the predicates derived from the example in case the ExampleMatcher expresses any-match binding to be desired.

Moved integration tests for Query-by-example to the appropriate package and polished the code a little.

Related ticket: DATACMNS-879.
2016-07-14 15:52:32 +02:00
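For illustration, an any-matching example with two populated properties is mapped to an $or query roughly like the following (field names and values hypothetical):

```javascript
// Example.of(person) with firstname = "Dave" and lastname = "Matthews",
// using an any-matching ExampleMatcher
{ "$or" : [ { "firstname" : "Dave" }, { "lastname" : "Matthews" } ] }
```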
Christoph Strobl
9361fc3c71 DATAMONGO-1455, DATAMONGO-1456 - Polishing.
Use static imports for org.junit.Assert and org.hamcrest.core. Fix spelling.

Original pull request: #375.
2016-07-12 15:25:49 +02:00
Christoph Strobl
ac55f5e77f DATAMONGO-1455, DATAMONGO-1456 - Add support for $caseSensitive and $diacriticSensitive to $text.
We added methods to set values for $caseSensitive and $diacriticSensitive when using TextCriteria. Both operators are optional and will not be used until explicitly set.

    // { "$text" : { "$search" : "coffee" , "$caseSensitive" : true } }
    TextCriteria.forDefaultLanguage().matching("coffee").caseSensitive(true);

    // { "$text" : { "$search" : "coffee" , "$diacriticSensitive" : true } }
    TextCriteria.forDefaultLanguage().matching("coffee").diacriticSensitive(true);

Original pull request: #375.
2016-07-12 15:25:49 +02:00
Christoph Strobl
116baf9a92 DATAMONGO-832 - Polishing.
Moved newly introduced types into order. Added missing @since tag and additional test.
Updated reference documentation for update operators and added $slice operator to "what’s new" section.

Original Pull Request: #374
2016-07-11 10:02:22 +02:00
Mark Paluch
026dce2612 DATAMONGO-832 - Add support for $slice in Update.push.
We now support $slice in Update operations via the PushOperatorBuilder.

    new Update().push("key").slice(5).each(Arrays.asList("one", "two", "three"));

Original Pull Request: #374
2016-07-11 09:49:45 +02:00
Mark Paluch
eae32be568 DATAMONGO-1457 - Polishing.
Add missing Javadoc to size operator. Mention Array Aggregation Operators in reference docs. Fix typos in reference docs.

Original pull request: #372.
2016-07-08 10:34:08 +02:00
Christoph Strobl
9d51ea4c01 DATAMONGO-1457 - Add support for $slice in aggregation.
We now support $slice in aggregation projections via the ProjectionOperationBuilder.

    Aggregation.project().and("field").slice(10, 20)

Original pull request: #372.
2016-07-08 10:08:29 +02:00
Mark Paluch
f4a5482005 DATAMONGO-1418 - Polishing.
Added ticket references. Simplified code.

 Original pull request: #361.
2016-06-24 15:32:58 +02:00
Nikolai Bogdanov
0db36aff8f DATAMONGO-1418 - Add support for $out operator in aggregations.
We now support the $out operator via Aggregation.out(…) to store aggregation results in a collection. Using the $out operator returns an empty list in AggregationResults.

Original pull request: #361.
CLA: 172720160413124705 (Nikolai Bogdanov)
2016-06-24 15:28:45 +02:00
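As a sketch of the underlying MongoDB operator (collection name hypothetical), $out is appended as the final pipeline stage and writes the aggregation result to the named collection:

```javascript
// Aggregation.out("backupCollection") roughly translates to the final stage
{ "$out" : "backupCollection" }
```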
Christoph Strobl
ba8ece334a DATAMONGO-1453 - Fix GeoJson conversion when coordinates are Integers.
We now use Number instead of Double for reading "coordinates" from GeoJSON representations.

Original pull request: #369.
2016-06-24 14:04:58 +02:00
Oliver Gierke
a5394074c5 DATAMONGO-1410 - Updated changelog. 2016-06-15 14:31:49 +02:00
Kevin Dosey
fe5bb515b7 DATAMONGO-1449 - Switched to foreach loop in collection handling of MappingMongoConverter.
This should result in minor to moderate performance improvement for iteration on Collections/Arrays during DBObject to object mapping.

Original pull request: #368.
2016-06-11 18:29:32 +02:00
Mark Paluch
c84bfbccf4 DATAMONGO-1437 - Polishing.
Renamed test. Added JavaDoc. Simplify throws declaration.

Original pull request: #367.
2016-06-02 08:55:09 +02:00
Christoph Strobl
d147f80a39 DATAMONGO-1437 - Preserve non-translatable Exception cause when lazily resolving DBRef.
We now preserve the cause of Exceptions that cannot be translated into DataAccessExceptions when an error occurs while lazily loading DBRefs.

Original pull request: #367.
2016-06-02 08:52:53 +02:00
Christoph Strobl
7b8dadeb74 DATAMONGO-1271 - Polishing.
Removed non-Java 6 language features, reworked and added a few tests.

Original Pull Request: #322
2016-05-27 07:53:53 +02:00
Jordi Llach Fernandez
d1251c42ca DATAMONGO-1271 - Provide lifecycle events for DBRefs.
We now publish lifecycle events when loading DBRefs.

Original Pull Request: #322
CLA: 121620150519031801 (Jordi Llach Fernandez)
2016-05-27 07:53:45 +02:00
Oliver Gierke
4140dd573f DATAMONGO-1423 - Polishing.
Original pull request: #365.
2016-05-25 17:26:31 +02:00
Christoph Strobl
0e60630393 DATAMONGO-1423 - Map keys now get registered conversions applied for Updates.
We now pipe map keys through the potentially registered conversions when mapping Updates.

Original pull request: #365.
2016-05-25 17:26:30 +02:00
Oliver Gierke
9bc35512fd DATAMONGO-1416 - Polishing.
Just use instanceOf(…) from Hamcrest's Matchers class instead of dedicated class.

Original pull request: #362.
2016-05-24 15:57:21 +02:00
Christoph Strobl
b626c2f82b DATAMONGO-1416 - Get rid of the warnings for Atomic… type conversions.
We now use explicit converters instead of a ConverterFactory. This reduces noise in the log when registering converters.

Original pull request: #362.
2016-05-24 15:57:21 +02:00
Mark Paluch
c3e894ee8d DATAMONGO-1424 - Polishing.
Remove EndingWith from NotLike. Remove superfluous white-spaces. Split combined highlighted keywords into individually highlighted ones.

Original pull request: #364.
2016-05-09 11:06:44 +02:00
Christoph Strobl
2f713bede5 DATAMONGO-1424 - Add support for NOT_LIKE.
We now support `notLike` and `isNotLike` in query derivation.

Original pull request: #364.
2016-05-09 11:05:44 +02:00
Mark Paluch
e03520d2fb DATAMONGO-1425 - Polishing.
Add NotContaining to documentation. Add integration test for Containing/NotContaining on collection properties.

Original pull request: #363.
2016-05-09 10:42:19 +02:00
Christoph Strobl
3829d58dc2 DATAMONGO-1425 - Fix query derivation for notContaining on String properties.
We now correctly build up the criteria for derived queries using notContaining keyword on String properties.

Original pull request: #363.
2016-05-09 10:42:15 +02:00
Mark Paluch
7b87fa9509 DATAMONGO-1412 - Fix backticks and code element highlighting.
Fixed broken highlighting using backticks followed by chars/single quotes. Convert single quote emphasis of id to backtick code fences. Add missing spaces between words and backticks.

Original Pull Request: #359
2016-05-02 13:20:29 +02:00
Mark Paluch
b2b9f3406a DATAMONGO-1412 - Document mapping rules for Java types to MongoDB representation.
Original Pull Request: #359
Related pull request: #353
Related ticket: DATAMONGO-1404
2016-05-02 13:20:22 +02:00
Mark Paluch
d610761019 DATAMONGO-1404 - Polishing.
Add author and since tags. Update license headers. Reformat code. Replace FQCN with import and simple class name. Remove final keyword in test methods. Add tests for numeric values. Update documentation.

Original pull request: #353.
2016-05-02 13:20:10 +02:00
Alexey Plotnik
8983bd26ce DATAMONGO-1404 - Add support for $min and $max update operators.
Original pull request: #353.
CLA: 169820160330091912 (Alexey Plotnik)
2016-05-02 13:19:31 +02:00
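For reference, the MongoDB update operators behind this change (field names and values hypothetical): $min only writes when the given value is lower than the stored one, $max only when it is higher:

```javascript
// new Update().min("lowScore", 150) / new Update().max("highScore", 950)
// roughly map to
{ "$min" : { "lowScore" : 150 } }
{ "$max" : { "highScore" : 950 } }
```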
Christoph Strobl
5485f2fcd4 DATAMONGO-1391 - Polishing.
Removed white spaces, updated Javadoc and return early when using non-3.2 $unwind options.

Original Pull Request: #355
2016-05-02 12:56:38 +02:00
Mark Paluch
f8681fec66 DATAMONGO-1391 - Support Mongo 3.2 syntax for $unwind in aggregation.
We now support both the simple {$unwind: path} and the MongoDB 3.2 {$unwind: {…}} syntax.

Original Pull Request: #355
2016-05-02 12:52:39 +02:00
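For reference, the two $unwind syntaxes mentioned above (field names hypothetical):

```javascript
// simple form
{ "$unwind" : "$items" }

// MongoDB 3.2 form with options
{ "$unwind" : { "path" : "$items", "includeArrayIndex" : "itemIndex", "preserveNullAndEmptyArrays" : true } }
```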
Mark Paluch
0dd904894d DATAMONGO-1399 - Polishing.
Update since version to 1.10. Remove trailing whitespaces.

Original pull request: #352.
2016-05-02 09:44:37 +02:00
Christoph Strobl
7d70a8677e DATAMONGO-1399 - Allow adding hole to GeoJSON Polygon.
We now allow creation of GeoJsonPolygon having an outer and multiple inner rings.

Original pull request: #352.
2016-05-02 09:40:44 +02:00
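For reference, a GeoJSON Polygon with one hole as defined by the GeoJSON format: the first ring is the exterior, every further ring is a hole, and rings are closed by repeating the first position (coordinates hypothetical):

```javascript
{
  "type" : "Polygon",
  "coordinates" : [
    [ [ 0, 0 ], [ 10, 0 ], [ 10, 10 ], [ 0, 10 ], [ 0, 0 ] ],
    [ [ 2, 2 ], [ 4, 2 ], [ 4, 4 ], [ 2, 4 ], [ 2, 2 ] ]
  ]
}
```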
Mark Paluch
13a52b5ac9 DATAMONGO-1403 - Add maxExecutionTimeMs alias for @Meta(maxExcecutionTime).
We added maxExecutionTimeMs as an alias for maxExcecutionTime which has been deprecated due to spelling issues.

Original pull request: #356.
2016-04-26 12:43:37 +02:00
Mark Paluch
f5cfcda673 DATAMONGO-1411 - Enable build on TravisCI.
We now start the MongoDB server via apt-get instead of relying on the TravisCI-managed 2.4.2 installation.
In doing so we altered tests to only check the port and not the host part of the URIs.

Additionally we upgraded build profiles, removed promoted snapshot versions, renamed mongo32-next to mongo32 and added a mongo33-next build profile.

Original pull request: #358
2016-04-26 10:40:29 +02:00
Raja Dilip Kolli
cf44a7105f DATAMONGO-1420 - Update version numbers in Github readme.
Bumped to the latest versions available.

Original pull request: #354
2016-04-15 08:30:32 +02:00
Oliver Gierke
0228255d2b DATAMONGO-1356 - AuditingEventListener now has an explicit order.
AuditingEventListener now has a fixed ordering of 100. This allows other listeners to be registered to be executed before or after it.
2016-04-14 22:27:46 +02:00
Oliver Gierke
50e37355d4 DATAMONGO-1419 - Removed deprecated methods of AbstractMongoEventListener. 2016-04-14 22:27:42 +02:00
Oliver Gierke
a15dababfa DATAMONGO-1408 - Updated changelog. 2016-04-06 23:14:25 +02:00
Oliver Gierke
9942451017 DATAMONGO-1405 - After release cleanups. 2016-04-06 16:37:03 +02:00
Oliver Gierke
e144c29316 DATAMONGO-1405 - Prepare next development iteration. 2016-04-06 16:36:59 +02:00
Oliver Gierke
64d4880983 DATAMONGO-1405 - Release version 1.9 GA (Hopper). 2016-04-06 16:35:59 +02:00
Oliver Gierke
47c348e03a DATAMONGO-1405 - Prepare 1.9 GA (Hopper). 2016-04-06 16:34:45 +02:00
Oliver Gierke
dea86535c1 DATAMONGO-1405 - Updated changelog. 2016-04-06 16:34:39 +02:00
Artur Konczak
eee6b62589 DATAMONGO-1407 - Updated JIRA link to point to the correct project on JIRA.
Original pull request: #357.
2016-04-05 14:13:28 +02:00
Mark Paluch
771ca8d84c DATAMONGO-1407 - Add pull request template. 2016-04-05 09:50:18 +02:00
Christoph Strobl
8f5b334951 DATAMONGO-1398 - Mention QBE and add links.
Original Pull Request: #349
2016-03-31 21:00:27 +02:00
Mark Paluch
0dc6169282 DATAMONGO-1398, DATAMONGO-1395 - Update Lifecycle Events examples in Reference Documentation.
Replace deprecated methods by the supported API.

Original Pull Request: #349
2016-03-31 20:59:53 +02:00
Mark Paluch
abe78f0428 DATAMONGO-1398 - Updated what's new section and general improvements.
Update Spring Framework documentation links to point always to the Spring Framework version specified in the pom, where possible. Mention $lookup in aggregation.

Original Pull Request: #349
2016-03-31 20:59:16 +02:00
Christoph Strobl
9930ec2d19 DATAMONGO-1401 - Fix error when updating entity with both GeoJsonPoint and Version property.
We now ignore property reference exceptions when resolving field values that have already been mapped, e.g. in case of an already mapped update extracted from an actual domain type instance.

Original pull request: #351.
2016-03-31 09:15:29 +02:00
Oliver Gierke
83d7f4477e DATAMONGO-1392 - After release cleanups. 2016-03-18 11:16:07 +01:00
Oliver Gierke
18c3704c2e DATAMONGO-1392 - Prepare next development iteration. 2016-03-18 11:15:51 +01:00
Oliver Gierke
bef581caa5 DATAMONGO-1392 - Release version 1.9 RC1 (Hopper). 2016-03-18 11:15:00 +01:00
Oliver Gierke
2f0abe0604 DATAMONGO-1392 - Prepare 1.9 RC1 (Hopper). 2016-03-18 11:06:57 +01:00
Oliver Gierke
4235b44c47 DATAMONGO-1392 - Updated changelog. 2016-03-18 11:06:52 +01:00
Oliver Gierke
f318185ad0 DATAMONGO-1400 - Adapt to rename of Tuple to Pair in Spring Data Commons.
Related tickets: DATACMNS-818.
2016-03-18 09:59:06 +01:00
Oliver Gierke
43b496287c DATAMONGO-1245 - Final tweaks to Query by Example documentation.
Tweaked section anchor to match conventions. Use level offsets to accommodate changes in Spring Data Commons.
2016-03-18 09:29:46 +01:00
Oliver Gierke
9d0c8ecdc3 DATAMONGO-1245 - Polishing.
Adapt to API changes in Spring Data Commons.

Related tickets: DATACMNS-810.
Original pull request: #341.
2016-03-17 18:39:18 +01:00
Mark Paluch
5a78d99af0 DATAMONGO-1245 - Initial documentation for Query by Example.
Adopt changes from query by example API refactoring.

Related tickets: DATACMNS-810.
Original pull request: #341.
2016-03-17 18:39:18 +01:00
Christoph Strobl
693f5ddf6e DATAMONGO-1245 - Add support for Query By Example.
An explorative approach to QBE trying to find possibilities and limitations. We now support querying documents by providing a sample of the given object holding compare values. For the sake of partial matching we flatten out nested structures so we can create different queries for matching like:

{ _id : 1, nested : { value : "conflux" } }
{ _id : 1, "nested.value" : "conflux" }

This is useful when you want to search using an only partially filled nested document. String matching can be configured to wrap strings with $regex, which creates { firstname : { $regex : "^foo", $options: "i" } } when using StringMatchMode.STARTING along with the ignore-case option. DBRefs and geo structures such as Point or GeoJsonPoint are converted to their corresponding structure.

Related tickets: DATACMNS-810.
Original pull request: #341.
2016-03-17 18:39:18 +01:00
Christoph Strobl
ece655f67d DATAMONGO-1387 - Polishing.
Added a few more tests and append values if present on Query.

Original Pull Request: #345
2016-03-17 13:10:51 +01:00
John Willemin
119692c979 DATAMONGO-1387 - Fix BasicQuery getFieldsObject() inconsistency.
We changed BasicQuery to consider its parent getFieldsObject() when not given an explicit fields DBObject.

Original Pull Request: #345
CLA: 165520160303021604 (John Willemin)
2016-03-17 13:09:58 +01:00
Oliver Gierke
6068f3243a DATAMONGO-1397 - Polishing.
Switched to Slf4J-native placeholder replacement in debug logging for MongoTemplate.

Original pull request: #348.
2016-03-16 17:28:04 +01:00
Mark Paluch
a7cda2e793 DATAMONGO-1397 - Log command, entity and collection name in MongoTemplate.geoNear(…).
Original pull request: #348.
2016-03-16 17:28:03 +01:00
Oliver Gierke
2687cb85f0 DATAMONGO-1373 - Polishing.
Added method, field and annotation target to @Field annotation explicitly. Fixed copyright date ranges where needed.

Tweaked formatting in test cases.

Original pull request: #347.
Related ticket: DATACMNS-825.
2016-03-15 15:35:10 +01:00
Mark Paluch
b2ce1700d2 DATAMONGO-1373 - Allow usage of @AliasFor with mapping and indexing annotations.
We now support @AliasFor to build composed annotations with: @Document, @Id, @Field, @Indexed, @CompoundIndex, @GeoSpatialIndexed, @TextIndexed, @Query, and @Meta. Added missing license header to @Field.

Original pull request: #347.
Related tickets: DATACMNS-825.
2016-03-15 15:20:45 +01:00
Christoph Strobl
0b634f8340 DATAMONGO-1373 - Allow usage of @AliasFor for composed @Document annotation.
We now resolve aliased attribute values when reading @Document on entity types. This allows creation of composed annotations like:

@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.TYPE })
@Document
static @interface ComposedDocumentAnnotation {

  @AliasFor(annotation = Document.class, attribute = "collection")
  String name() default "custom-collection-name";
}

Original pull request: #347.
Related issue: DATACMNS-825.
2016-03-15 15:20:22 +01:00
Mark Paluch
9a078b743f DATAMONGO-1326 - Support field inheritance for $lookup aggregation operator.
We now distinguish between aggregation operations that replace fields in the aggregation pipeline and those which inherit fields from previous operations. InheritsFieldsAggregationOperation, a nested interface of FieldsExposingAggregationOperation, is a marker to look up fields along the aggregation context chain. Added unit and integration tests. Mentioned the lookup operator in the docs.

Original pull request: #344.
2016-03-08 09:00:55 +01:00
Christoph Strobl
65b6576cfc DATAMONGO-1326 - Add Builder, update javadoc and remove additional interface.
Updated javadoc and formatting. Added tests and removed marker interface.

Original Pull Request: #344
2016-03-08 08:59:54 +01:00
Alessio Fachechi
78e99e6df2 DATAMONGO-1326 - Add support for $lookup to aggregation.
Original Pull Request: #344
CLA: 164120160221125037 (Alessio Fachechi)
2016-03-07 14:20:14 +01:00
Oliver Gierke
bb0a42733d DATAMONGO-1389 - Fixed test case to verify type predicting bean registration.
Related ticket: DATACMNS-821.
2016-03-01 15:02:18 +01:00
Oliver Gierke
a2ae08e263 DATAMONGO-1381 - Updated changelog. 2016-02-23 14:27:23 +01:00
Oliver Gierke
eaa9d6c7e6 DATAMONGO-1366 - After release cleanups. 2016-02-12 15:43:57 +01:00
Oliver Gierke
8900695153 DATAMONGO-1366 - Prepare next development iteration. 2016-02-12 15:43:39 +01:00
Oliver Gierke
bfe548d573 DATAMONGO-1366 - Release version 1.9 M1 (Hopper). 2016-02-12 15:42:47 +01:00
Oliver Gierke
7ab4002771 DATAMONGO-1366 - Prepare 1.9 M1 (Hopper). 2016-02-12 15:36:19 +01:00
Oliver Gierke
6eace856aa DATAMONGO-1366 - Updated changelog. 2016-02-12 15:36:11 +01:00
Oliver Gierke
f10e5a19c5 DATAMONGO-1345 - Finalized application of projections in query methods.
Refactored the query execution out of AbstractMongoQuery into MongoQueryExecution. Made sure the streaming execution lazily applies the projections, too.

Added a DtoInstantiatingConverter to be able to copy data from created entities into DTOs as we cannot hand the DTO type into the MongoTemplate execution in the first place as it's going to be used for the query mapping currently.
2016-02-12 14:14:54 +01:00
Uxío Fuentefría
90a4a63776 DATAMONGO-1378 - Update reference documentation: Change Query.sort() to Query.with(Sort sort).
sort() is not a method of Query; to sort a query you have to use with().

Original pull request: #320.
CLA: 162620160211060822 (Uxío Fuentefría)
2016-02-11 20:22:36 +01:00
Oliver Gierke
0f14e35ba3 DATAMONGO-1288 - Polishing.
Some JavaDoc here and there. Moved converter factory registration into MongoConverters.getConvertersToRegister() for consistency with others.

Original pull request: #331.
2016-02-11 14:08:31 +01:00
Christoph Strobl
ad0c4207d6 DATAMONGO-1288 - Add conversion for AtomicInteger & AtomicLong.
We now convert AtomicInteger and AtomicLong to the required Number target type by calling get() followed by the actual conversion. This allows these types to be used directly, e.g. as part of an Update: new Update().set("intValue", new AtomicInteger(10));

Original pull request: #331.
2016-02-11 14:08:19 +01:00
Mark Paluch
97da43645a DATAMONGO-1380 - Polishing.
Add credits, use message formatting instead of string concatenation.

Original pull request: #317.
2016-02-11 12:02:09 +01:00
Alex Vengrovsk
42b7c42617 DATAMONGO-1380 - Improve logging in MongoChangeSetPersister.
Add a check whether debug logging is enabled in the getPersistentId method.

Original pull request: #317.
2016-02-11 11:53:15 +01:00
Timo Kockert
bd81e25e6b DATAMONGO-1270 - Update documentation to reflect deprecation of MongoFactoryBean.
Original pull request: #315.
2016-02-10 15:57:15 +01:00
Thomas Dudouet
debe6aa649 DATAMONGO-1377 - Update JavaDoc: Use @EnableMongoRepositories instead of @EnableJpaRepositories.
The JavaDoc description references the EnableJpaRepositories annotation instead of the EnableMongoRepositories annotation.

Original pull request: #340.
2016-02-10 15:13:24 +01:00
Oliver Gierke
6f433902f0 DATAMONGO-1376 - Moved away from SimpleTypeInformationMapper.INSTANCE.
Related tickets: DATACMNS-815.
2016-02-09 14:31:05 +01:00
Martin Macko
ba902e7f8e DATAMONGO-1375 - Fix typo in MongoOperations JavaDoc.
Original pull request: #343.
2016-02-09 11:29:30 +01:00
Oliver Gierke
7e8ec21684 DATAMONGO-1372 - Polishing.
Tiny formattings, collapsed if-clause into ternary operation.

Original pull request: #342.
2016-02-04 15:19:51 +01:00
Christoph Strobl
b7131b7efc DATAMONGO-1372 - Add and register Converters for java.util.Currency.
We now support conversion from currency into ISO 4217 String and back.

Original pull request: #342.
2016-02-04 15:19:48 +01:00
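The conversion round-trip described above can be sketched with plain JDK types (a minimal illustration, not the actual converter classes; names are hypothetical):

```java
import java.util.Currency;

// Sketch of a Currency <-> ISO 4217 code round-trip using java.util.Currency.
final class CurrencyCodes {

    // Currency -> ISO 4217 String, e.g. "EUR", "USD"
    static String write(Currency currency) {
        return currency.getCurrencyCode();
    }

    // ISO 4217 String -> Currency; getInstance throws for unknown codes
    static Currency read(String isoCode) {
        return Currency.getInstance(isoCode);
    }
}
```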
Oliver Gierke
ace99c3464 DATAMONGO-1371 - Added code of conduct.
Moved to Asciidoctor for the CONTRIBUTING file.
2016-02-02 09:42:48 +01:00
Oliver Gierke
83fc5bc113 DATAMONGO-1366 - Declare Artifactory Maven plugin to be able to distribute build artifacts. 2016-01-28 14:51:55 +01:00
Oliver Gierke
160de0adf6 DATAMONGO-1361 - Guard command result statistics evaluation against changes in MongoDB 3.2.
MongoDB 3.2 RC1 decided to remove fields from statistics JSON documents returned in case no result was found for a geo near query. The avgDistance field is unfortunately missing as of that version.

Introduced a value object to encapsulate the mitigation behavior and make client code unaware of that.
2016-01-21 12:45:10 +01:00
Oliver Gierke
b4753f3a83 DATAMONGO-1360 - Query instances contained in a Near Query now get mapped during geoNear(…) execution.
A Query instance which might be part of a NearQuery definition is now passed through the QueryMapper to make sure complex types contained in it or even in more general types that have custom conversions registered are mapped correctly before the near command is actually executed.
2016-01-20 13:10:50 +01:00
Oliver Gierke
bce6e2c78c DATAMONGO-1163 - Polishing.
Fixed indentation changes in IndexingIntegrationTests. Separated test cases from each other.

Original pull request: #325.
2015-12-27 12:05:19 +01:00
Jordi Llach
b5ea0eccd2 DATAMONGO-1163 - Allow usage of @Indexed as meta-annotation.
@Indexed can now be used as meta-annotation so that user annotations can be annotated with it and the index creation facilities still pick up the configuration information.

Original pull request: #325.
2015-12-27 12:05:17 +01:00
Oliver Gierke
87865b9761 DATAMONGO-1355 - Updated changelog. 2015-12-18 10:55:56 +01:00
Christoph Strobl
13fa4703c0 DATAMONGO-1334 - Map-reduce operations now honor MapReduceOptions.limit.
We now also consider the limit set via MapReduceOptions when executing mapReduce operations via MongoTemplate.mapReduce(…).

MapReduceOptions.limit(…) supersedes a potential limit set via the Query itself. This change also allows defining a limit even when no explicit Query is used.

Original pull request: #338.
2015-12-16 11:57:44 +01:00
Christoph Strobl
5a21e00322 DATAMONGO-1317 - Assert compatibility with mongo-java-driver 3.2.
We now do a defensive check against the actual WObject of WriteConcern to avoid the IllegalStateException raised by the new java-driver in case _w is null or not an Integer. This allows us to run against recent 2.13, 2.14, 3.0, 3.1 and the latest 3.2.0.

Original pull request: #337.
2015-12-16 11:49:01 +01:00
Oliver Gierke
3feed2bc5a DATAMONGO-1289 - Polishing.
Some additional JavaDoc and comment removal.

Original pull request: #333.
2015-12-16 11:38:31 +01:00
Christoph Strobl
501b9501e0 DATAMONGO-1289 - MappingMongoEntityInformation now uses fallback identifier type derived from repository declaration.
We now use RepositoryMetadata.getIdType() to provide a fallback identifier type in case the entity information does not hold an id property, which is perfectly valid for MongoDB.

Original pull request: #333.
2015-12-16 11:37:51 +01:00
Oliver Gierke
727271e68c DATAMONGO-1345 - Added support for projections on repository methods.
Related tickets: DATACMNS-89.
2015-12-14 19:56:42 +01:00
Christoph Strobl
63a619dddf DATAMONGO-1349 - Upgrade to mongo-java-driver 2.14.0. 2015-12-11 10:38:36 +01:00
Oliver Gierke
113566a6ab DATAMONGO-1346 - Update.pullAll(…) now registers multiple invocations correctly.
Previously calling the method multiple times overrode the result of previous calls. We now use addMultiFieldOperation(…) to make sure already existing values are kept.
2015-12-10 15:38:40 +01:00
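For illustration (field names and values hypothetical), multiple pullAll(…) invocations now accumulate in one $pullAll operator instead of the second call overriding the first:

```javascript
// new Update().pullAll("colors", new Object[] { "red", "green" })
//             .pullAll("sizes", new Object[] { "XL" })
// now yields
{ "$pullAll" : { "colors" : [ "red", "green" ], "sizes" : [ "XL" ] } }
```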
Oliver Gierke
7862841b48 DATAMONGO-934 - Polishing.
Polished JavaDoc and implementation as well as tests. Extracted Tuple to Spring Data Commons. Moved exception translation into MongoExceptionTranslator.

Changed implementation of DefaultBulkOperations to consider the WriteConcernResolver of the underlying MongoTemplate to avoid exposing the WriteConcern on execution.

Original pull request: #327.
Related tickets: DATACMNS-790.
2015-11-26 17:56:39 +01:00
Tobias Trelle
fe6cbaa03d DATAMONGO-934 - Added support for bulk operations.
Introduced BulkOperations, obtainable via MongoOperations, to register operations that are eventually executed in a bulk.

Original pull request: #327.
2015-11-26 17:56:35 +01:00
Oliver Gierke
9ef1fc7304 DATAMONGO-1337 - Another round of polishes on SonarQube complaints. 2015-11-26 12:27:22 +01:00
Oliver Gierke
cf3a9d3ced DATAMONGO-1337 - Reverted making some of the loggers static.
The logger instance in AbstractMonitor is supposed to pick up the type of the actual implementation class and thus cannot be static.

Related pull request: #336.
2015-11-26 12:00:40 +01:00
Christian Ivan
1d1c80db7b DATAMONGO-1337 - General code quality improvements.
A round of code polish regarding the PMD and Squid rules referred to in the ticket.

Original pull request: #336.
2015-11-26 11:53:06 +01:00
Oliver Gierke
eeb37e9104 DATAMONGO-1342 - Fixed potential NullPointerException in MongoQueryCreator.
MongoQueryCreator.nextAsArray(…) now returns a single element object array in case null is handed to the method. It previously failed with a NullPointerException.
2015-11-25 17:23:15 +01:00
Oliver Gierke
18bf0daee7 DATAMONGO-1335 - DBObjectAccessor now writes all nested fields correctly.
Previously, DBObjectAccessor has always reset the in-between values when traversing nested properties. This caused previously written values to be erased if subsequent values are written. We now reuse an already existing BasicDBObject if present.
2015-11-25 16:06:52 +01:00
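A sketch of the fixed behavior (field names and values hypothetical): writing two nested fields that share a path now merges them into one sub-document instead of the second write resetting the first:

```javascript
// after writing "address.street" = "Hauptstrasse" and "address.city" = "Linz"
{ "address" : { "street" : "Hauptstrasse", "city" : "Linz" } }
// previously the second write reset the sub-document, losing "street":
{ "address" : { "city" : "Linz" } }
```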
Oliver Gierke
1e9189aee7 DATAMONGO-1341 - Moved MongoDbErrorCodes into utility package.
This resolves a package cycle introduced by MongoPersistentEntityIndexCreator referring to error codes now.

Updated Sonargraph architecture description along the way.
2015-11-25 15:36:22 +01:00
Oliver Gierke
95f6dfafdd DATAMONGO-1287 - Optimizations in reading associations as constructor arguments.
As per the discussion on the ticket we now omit looking up the value for an association being used as constructor argument, as the simple check whether the currently handled property is a constructor argument is sufficient to potentially skip handling the value.

Related pull requests: #335, #322.
2015-11-23 11:13:07 +01:00
Christoph Strobl
bedaae8a90 DATAMONGO-1287 - Fix double fetching for lazy DbRefs used in entity constructor.
We now check properties for their usage as constructor arguments, that might already have been resolved, before setting the actual value. This prevents turning already eagerly fetched DBRefs back into LazyLoadingProxies.

Original pull request: #335.
Related pull request: #322.
2015-11-20 13:39:00 +01:00
Oliver Gierke
7bfa3fe7fd DATAMONGO-1290 - Polishing.
Removed a level of indentation from ExpressionEvaluationParameterBinder.replacePlaceholders(…). Polished JavaDoc.

Original pull request: #332.
2015-11-20 13:20:11 +01:00
Christoph Strobl
143b0b73b9 DATAMONGO-1290 - Move parameter binding for String based queries.
Moved parameter binding for string based queries into separate class.

Original pull request: #332.
2015-11-20 13:20:09 +01:00
Christoph Strobl
cbfc46270e DATAMONGO-1290 - Convert byte[] parameter in @Query to $binary representation.
We now convert non-quoted binary parameters to the $binary format. This allows using them along with the @Query annotation.

Original pull request: #332.
2015-11-20 13:06:22 +01:00
Christoph Strobl
b31efb46ec DATAMONGO-1204 - ObjectPath now uses raw id values to track resolved objects.
We now use the native id within ObjectPath for checking whether a DBRef has already been resolved. This is required as the MongoDB Java driver 3 generation changed ObjectId.equals(…), which now performs a type check.

Original pull request: #334.
Related pull request: #288.
2015-11-20 12:47:52 +01:00
Oliver Gierke
ef3477098f DATAMONGO-1324 - Register ObjectId converters unconditionally to make sure they really get used.
The presence of ObjectToObjectConverter in a DefaultConversionService causes the guard trying to register converters for ObjectIds in AbstractMongoConverter to not trigger the registration. This in turn caused ObjectId conversions to be executed via reflection instead of straightforward method calls and thus a drop in performance for such operations.

We now unconditionally register the converters to make sure they really get applied.

Related tickets: SPR-13703.
2015-11-19 12:02:41 +01:00
Oliver Gierke
9dce117555 DATAMONGO-1238 - Upgraded to Querydsl 4. 2015-11-17 13:42:38 +01:00
Oliver Gierke
e66e1e0502 DATAMONGO-1316 - Updated changelog. 2015-11-16 08:31:45 +01:00
Christoph Strobl
19e1e9daeb DATAMONGO-1297 - Allow @Indexed annotation on DBRef.
We now also treat references as a source of a potential index. This enforces index creation for objects like:

@Document
class WithDbRef {

  @Indexed
  @DBRef
  ReferencedObject reference;
}

Combining @TextIndexed or @GeoSpatialIndexed with a DBRef will lead to a MappingException.

Original pull request: #329.
2015-11-13 17:54:42 +01:00
Christoph Strobl
ec8a948f3f DATAMONGO-1302 - Allow ConverterFactory to be registered in CustomConversions.
We now allow registration of a ConverterFactory within CustomConversions by inspecting the generic type arguments to determine the conversion source and target types.

Original pull request: #330.
2015-11-10 14:37:02 +01:00
Ilho Ahn
38fc7641a0 DATAMONGO-1314 - Fix typo in Exception message.
Original Pull Request: #265
2015-11-09 20:37:26 +01:00
Christoph Strobl
ddc3925659 DATAMONGO-1291 - Made @Document usable as meta-annotation.
We now use Spring's AnnotationUtils.findAnnotation(…) for @Document lookup which enables the full power of Spring 4.2's composable annotations.

Original pull request: #326.
2015-11-06 14:34:43 +01:00
Christoph Strobl
f8416edf8f DATAMONGO-1293 - Polishing.
Moved the configuration parsing error into the method actually responsible for reading the uri/client-uri attributes.

Original Pull Request: #328
2015-10-29 12:47:16 +01:00
Viktor Khoroshko
4f94f37ce8 DATAMONGO-1293 - Allowed id attribute in addition to client-uri attribute in MongoDbFactoryParser.
We now allow write-concern and id to be configured along with the uri or client-uri attribute of <mongo:db-factory.

Original Pull Request: #328
CLA: 140120150929074128 (Viktor Khoroshko)
2015-10-29 12:47:08 +01:00
Oliver Gierke
528de58418 DATAMONGO-1276 - Fixed potential NullPointerExceptions in MongoTemplate.
Triggering data access exception translation could lead to a NullPointerException in cases where the PersistenceExceptionTranslator returned null because the original exception couldn't be translated and the result was used directly from a throw clause.

This is now fixed by consistently using the potentiallyConvertRuntimeException(…) method, which was made static to be able to refer to it from nested static classes.

Refactored Scanner usage to actually close the Scanner instance to prevent a resource leak.
2015-10-21 15:04:12 +02:00
Oliver Gierke
e6ea34aed8 DATAMONGO-1304 - Updated changelog. 2015-10-14 13:46:21 +02:00
Oliver Gierke
f171938b00 DATAMONGO-1303 - Added build profiles for MongoDB Java driver 3.1 and 3.2 snapshots.
Added new build profiles mongod31 and mongo32-next to build the project against the latest MongoDB 3.1 driver as well as upcoming snapshots of the 3.2 generation.
2015-10-12 15:41:30 +02:00
Oliver Gierke
7b27368d2d DATAMONGO-1282 - After release cleanups. 2015-09-01 12:11:02 +02:00
Spring Buildmaster
f754df51bc DATAMONGO-1282 - Prepare next development iteration. 2015-09-01 02:12:29 -07:00
Spring Buildmaster
77dce53c7a DATAMONGO-1282 - Release version 1.8.0.RELEASE (Gosling GA). 2015-09-01 02:12:26 -07:00
Oliver Gierke
73f268e7c4 DATAMONGO-1282 - Prepare 1.8.0.RELEASE (Gosling GA). 2015-09-01 09:44:21 +02:00
Oliver Gierke
075d7d8131 DATAMONGO-1282 - Updated changelog. 2015-09-01 09:44:11 +02:00
Christoph Strobl
206337044a DATAMONGO-1280 - Updated "What’s new" section in reference documentation.
Original pull request: #319.
2015-08-31 12:55:30 +02:00
Christoph Strobl
55b44ff7aa DATAMONGO-1275 - Fixed broken links in reference documentation.
Original pull request: #318.
2015-08-22 13:16:49 +02:00
Christoph Strobl
ae48639ae9 DATAMONGO-1275 - Added documentation for optimistic locking.
Original pull request: #318.
2015-08-22 13:16:45 +02:00
Oliver Gierke
6b5e78f810 DATAMONGO-1256 - Polishing.
Minor Javadoc polishing.

Original pull request: #316.
2015-08-07 14:04:49 +02:00
Christoph Strobl
3e485e0a88 DATAMONGO-1256 - MongoMappingEvents now expose the collection name they're issued for.
We now directly expose the collection name via MongoMappingEvent.getCollectionName(). Therefore we added new constructors to all the events, deprecating the previous ones. 

Several overloads have been added to MongoEventListener, deprecating previous API. We’ll call the deprecated methods from the new ones until their removal.

Original pull request: #316.
2015-08-07 14:04:47 +02:00
Oliver Gierke
335c78f908 DATAMONGO-1269 - Polishing.
Original pull request: #314.
2015-08-06 11:00:36 +02:00
Christoph Strobl
b103e4eaf6 DATAMONGO-1269 - Retain position parameter in property path.
We now retain positional parameters in paths used in queries when mapping the field name. This allows mapping "list.1.name" to the name property of the list element at position 1.

The change also fixes a glitch in mapping java.util.Map like structures having numeric keys.
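The positional handling can be sketched in plain Java (hypothetical helper, not the actual UpdateMapper code; lower-casing stands in for the real field-name mapping):

```java
// Hypothetical sketch: numeric path segments (array positions) are retained
// as-is while non-numeric segments go through field-name mapping.
public class PositionalPath {

    static boolean isPositional(String segment) {
        return !segment.isEmpty() && segment.chars().allMatch(Character::isDigit);
    }

    static String mapPath(String path) {
        StringBuilder mapped = new StringBuilder();
        for (String segment : path.split("\\.")) {
            if (mapped.length() > 0) {
                mapped.append('.');
            }
            // positional segments pass through untouched
            mapped.append(isPositional(segment) ? segment : segment.toLowerCase());
        }
        return mapped.toString();
    }

    public static void main(String[] args) {
        System.out.println(mapPath("list.1.Name")); // positional "1" retained
    }
}
```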

Original pull request: #314.
2015-08-06 11:00:36 +02:00
Oliver Gierke
c4a6c63d23 DATAMONGO-1268 - After release cleanups. 2015-08-04 14:09:20 +02:00
Spring Buildmaster
4a4f10f97b DATAMONGO-1268 - Prepare next development iteration. 2015-08-04 04:37:14 -07:00
Spring Buildmaster
a5712daab7 DATAMONGO-1268 - Release version 1.8.0.RC1 (Gosling RC1). 2015-08-04 04:37:12 -07:00
Oliver Gierke
28cb1ef106 DATAMONGO-1268 - Prepare 1.8.0.RC1 (Gosling RC1). 2015-08-04 13:08:49 +02:00
Oliver Gierke
0d99a3e527 DATAMONGO-1268 - Updated changelog. 2015-08-04 13:08:49 +02:00
Christoph Strobl
9da43263ce DATAMONGO-1263 - Index resolver considers generic type argument of collection elements.
We now consider the potential generic type argument of collection elements. 
Prior to this change an index within List<GenericWrapper<ConcreteWithIndex>> would not have been resolved.

Original pull request: #312.
2015-08-04 08:48:57 +02:00
Oliver Gierke
784e199068 DATAMONGO-1266 - Fixed domain type lookup for methods returning primitives.
If a repository query method returned a primitive, that primitive was exposed as the domain type, which e.g. caused deleteBy…(…) methods returning void to fail.

We now shortcut the MongoEntityMetadata lookup in MongoQueryMethod to use the repository's domain type if a primitive or wrapper is returned.
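The shortcut condition can be sketched as follows (hypothetical helper names, an assumption about the shape of the check rather than the actual MongoQueryMethod code):

```java
// Hypothetical sketch: a query method returning a primitive or its wrapper
// (including void/Void) should fall back to the repository's domain type.
public class ReturnTypeCheck {

    static boolean isPrimitiveOrWrapper(Class<?> type) {
        return type.isPrimitive()
                || type == Void.class || type == Boolean.class || type == Byte.class
                || type == Character.class || type == Short.class || type == Integer.class
                || type == Long.class || type == Float.class || type == Double.class;
    }

    public static void main(String[] args) {
        System.out.println(isPrimitiveOrWrapper(void.class));    // primitive void
        System.out.println(isPrimitiveOrWrapper(Integer.class)); // wrapper
        System.out.println(isPrimitiveOrWrapper(String.class));  // neither
    }
}
```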
2015-08-03 11:53:10 +02:00
Oliver Gierke
1ffee802c0 DATAMONGO-1261 - Updated changelog. 2015-07-28 16:42:58 +02:00
Christoph Strobl
6f0ac7f0c2 DATAMONGO-1254 - Grouping after projection in aggregation now uses correct aliased field name.
We now push the aliased field name down the aggregation pipeline for projections including operations. This allows referencing them in a later stage. Prior to this change the field reference was potentially resolved to the target field of the operation, which did not result in an error but led to incorrect results.

Original pull request: #311.
2015-07-27 14:15:33 +02:00
Christoph Strobl
941d4d8985 DATAMONGO-1260 - Prevent accidental authentication misconfiguration on SimpleMongoDbFactory.
We now reject configuration using MongoClient along with UserCredentials in SimpleMongoDbFactory. This move favors the native authentication mechanism provided via MongoCredential.

<mongo:mongo-client id="mongo-client-with-credentials" credentials="jon:warg@snow?uri.authMechanism=PLAIN" />

Original pull request: #309.
2015-07-27 14:08:42 +02:00
Oliver Gierke
44c76d8ffb DATAMONGO-1257 - We now hint to credential quoting from the XSD.
The namespace XSD now mentions the capability of quoting more complex credentials in case they validly contain a comma.
2015-07-27 13:47:11 +02:00
Oliver Gierke
df9a9f5fb6 DATAMONGO-1257 - Polishing.
Made internal helper methods in MongoCredentialPropertyEditor static where possible.

Original pull request: #310.
2015-07-24 18:40:57 +02:00
Christoph Strobl
bebd0fa0e6 DATAMONGO-1257 - <mongo:mongo-client /> element now supports usernames with a comma.
We now allow grouping credentials by enclosing them in single quotes like this:

credentials='CN=myName,OU=myOrgUnit,O=myOrg,L=myLocality,ST=myState,C=myCountry?uri.authMechanism=MONGODB-X509'

We also changed the required argument checks to be more authentication-mechanism specific, which means the pattern is now username[:password@database][?options].
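The pattern can be sketched with a simplified regular expression (a hypothetical stand-in; the actual parsing in MongoCredentialPropertyEditor is more involved):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch of username[:password@database][?options]:
// group 1 = username, 2 = password, 3 = database, 4 = options.
public class CredentialPattern {

    private static final Pattern CREDENTIAL =
            Pattern.compile("([^:?]+)(?::([^@]+)@([^?]+))?(?:\\?(.*))?");

    static String[] parse(String credential) {
        Matcher m = CREDENTIAL.matcher(credential);
        if (!m.matches()) {
            throw new IllegalArgumentException("Invalid credential: " + credential);
        }
        return new String[] { m.group(1), m.group(2), m.group(3), m.group(4) };
    }

    public static void main(String[] args) {
        String[] parts = parse("jon:warg@snow?uri.authMechanism=PLAIN");
        System.out.println(String.join("|", parts));
    }
}
```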

Original pull request: #310.
2015-07-24 18:40:56 +02:00
Oliver Gierke
594e90789d DATAMONGO-1244 - Polishing.
Minor reformattings and extracted a method to improve digestibility.

Original pull request: #306.
2015-07-08 10:18:35 +02:00
Thomas Darimont
f2ab42cb80 DATAMONGO-1244 - Improved handling of expression parameters in StringBasedMongoQuery.
Replaced regex-based parsing of dynamic expression-based parameters with custom parsing to make sure we also support complex nested expression objects.
Previously we only supported simple named or positional expressions. Since MongoDB's JSON-based query language uses deeply nested objects to express queries, we needed to improve the handling here.

Manual parsing is tedious and more verbose than regex based parsing but it gives us more control over the whole parsing process.

We also dynamically adjust the quoting so that we only output quoted parameters if necessary.

This enables expressing complex filtering queries that use Spring Security constructs like:
```
@Query("{id: ?#{ hasRole('ROLE_ADMIN') ? {$exists:true} : principal.id}}")
List<User> findAllForCurrentUserById();
```

Original pull request: #306.
2015-07-08 10:18:27 +02:00
Christoph Strobl
3224fa8ce7 DATAMONGO-1251 - Fixed potential NullPointerException in UpdateMapper.
We now explicitly handle the case where the source object for which a type hint needs to be calculated is null.
2015-07-07 09:57:46 +02:00
Oliver Gierke
ce156c1344 DATAMONGO-1250 - Fixed inline code formatting in reference docs. 2015-07-04 19:07:09 +02:00
Oliver Gierke
434e553022 DATAMONGO-1250 - Fixed accidental duplicate invocation of value conversion in UpdateMapper.
UpdateMapper.getMappedObjectForField(…) invoked the very same method of the super class but handed in an already mapped value, so that value conversion was invoked twice.

This was especially problematic in cases where a dedicated converter had been registered for an object that is already a Mongo-storable one (e.g. an enum-to-string converter and back) without indicating which of the two converters is the reading or the writing one. This basically caused the source value to be converted back and forth during the update mapping, creating the impression the value wasn't converted at all.

This is now fixed by removing the superfluous mapping.
2015-07-04 19:00:21 +02:00
Oliver Gierke
de5b5ee4b0 DATAMONGO-1246 - Updated changelog. 2015-07-01 10:00:16 +02:00
Oliver Gierke
60636bf56d DATAMONGO-1247 - Updated changelog. 2015-07-01 07:48:31 +02:00
Oliver Gierke
1ca71f93e9 DATAMONGO-1248 - Updated changelog. 2015-06-30 13:58:37 +02:00
Oliver Gierke
63ff39bed6 DATAMONGO-1236 - Polishing.
Removed the creation of a BasicMongoPersistentEntity in favor of always handing ClassTypeInformation.OBJECT into the converter in case no entity can be found.

This makes sure type information is written for updates on properties of type Object (which essentially leads to no PersistentEntity being available).

Original pull request: #301.
2015-06-30 09:54:53 +02:00
Christoph Strobl
cb0b9604d4 DATAMONGO-1236 - Updates now include type hints correctly.
We now use property type information when mapping fields affected by an update in case we do not have proper entity information within the context. This allows more precise type resolution required for determining the need to write type hints for a given property.

Original pull request: #301.
2015-06-30 09:54:53 +02:00
Christoph Strobl
1dbe3b62d7 DATAMONGO-1125 - Improve exception message for index creation errors.
We now use MongoExceptionTranslator to potentially convert exceptions during index creation into Spring's DataAccessException hierarchy. In case we encounter an error code indicating a DataIntegrityViolation we try to fetch existing index data and append it to the exception's message.

Original pull request: #302.
2015-06-24 20:28:23 +02:00
Christoph Strobl
5c0707d221 DATAMONGO-1232 - IgnoreCase in criteria now escapes the query.
We now quote the original criteria before actually wrapping it inside a regular expression for case-insensitive search. This applies not only to case-insensitive is, startsWith and endsWith criteria but also to those using like. In that case we quote the part between the leading and trailing wildcard if required.
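The quoting idea can be sketched in plain Java (likeIgnoreCase is a hypothetical helper; Pattern.quote stands in for the actual escaping):

```java
import java.util.regex.Pattern;

// Hypothetical sketch: the user-supplied fragment is escaped before being
// wrapped in a case-insensitive regex, so characters like '*' match literally.
public class IgnoreCaseQuoting {

    static boolean likeIgnoreCase(String value, String fragment) {
        Pattern p = Pattern.compile(".*" + Pattern.quote(fragment) + ".*",
                Pattern.CASE_INSENSITIVE);
        return p.matcher(value).matches();
    }

    public static void main(String[] args) {
        System.out.println(likeIgnoreCase("C*olumbia", "c*o")); // '*' matched literally
        System.out.println(likeIgnoreCase("Columbia", "c*o"));  // no literal '*' present
    }
}
```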

Original pull request: #301.
2015-06-22 12:50:05 +02:00
Christoph Strobl
c4ffc37dd5 DATAMONGO-1166 - ReadPreference is now used for aggregations.
We now use MongoTemplate.readPreference(…) when executing commands such as geoNear(…) and aggregate(…).

Original pull request: #303.
2015-06-22 08:21:23 +02:00
Christoph Strobl
aaf93b0f6f DATAMONGO-1157 - Throw meaningful exception when @DbRef is used with unsupported types.
We now eagerly check DBRef properties for invalid definitions such as final classes or arrays. In that case we throw a MappingException when verify is called.
2015-06-19 15:54:19 +02:00
Thomas Darimont
23eab1e84f DATAMONGO-1242 - Update MongoDB Java driver to 3.0.2 in mongo3 profile.
Update mongo driver.

Original pull request: #304.
2015-06-19 15:37:47 +02:00
Oliver Gierke
218f32e552 DATAMONGO-1229 - Fixed application of ignore case flag on nested properties.
Previously we tried to apply the ignore case settings found in the PartTree to the root PropertyPath we handle in MongoQueryCreator.create(). This is now changed to work on the leaf property of the PropertyPath.
2015-06-05 06:49:03 +02:00
Eddú Meléndez
62fbe4d08c DATAMONGO-1234 - Fix typos in JavaDoc. 2015-06-05 06:37:22 +02:00
Oliver Gierke
41ffd00619 DATAMONGO-1228 - After release cleanups. 2015-06-02 11:58:11 +02:00
Spring Buildmaster
98b9a604cf DATAMONGO-1228 - Prepare next development iteration. 2015-06-02 01:29:04 -07:00
Spring Buildmaster
01468b640a DATAMONGO-1228 - Release version 1.8.0.M1 (Gosling M1). 2015-06-02 01:29:01 -07:00
Oliver Gierke
4d96b036a2 DATAMONGO-1228 - Prepare 1.8.0.M1 (Gosling M1). 2015-06-02 09:29:53 +02:00
Oliver Gierke
2d1ac15e24 DATAMONGO-1228 - Updated changelog. 2015-06-02 08:24:47 +02:00
Oliver Gierke
2c27e8576f DATAMONGO-990 - Polishing.
Removed EvaluationExpressionContext from all AbstractMongoQuery implementations that don't actually need it and from AbstractMongoQuery itself, too. Cleaned up test cases after that.

Moved SpEL related tests into AbstractPersonRepositoryIntegrationTests to make sure they're executed for all sub-types. JavaDoc and assertion polishes.

Original pull request: #285.
2015-06-01 17:27:58 +02:00
Thomas Darimont
67f638d953 DATAMONGO-990 - Add support for SpEL expressions in @Query.
Ported and adapted support for SpEL expressions @Query annotations from Spring Data JPA. StringBasedMongoQuery can now evaluate SpEL fragments in queries with the help of the given EvaluationContextProvider. Introduced EvaluationContextProvider to AbstractMongoQuery. Exposed access to actual parameter values in MongoParameterAccessor.

Original pull request: #285.
2015-06-01 17:27:58 +02:00
Oliver Gierke
ea5bd5f7d3 DATAMONGO-1210 - Polishing.
Moved getTypeHint(…) method to Field class.

Original pull request: #292.
2015-06-01 13:21:07 +02:00
Christoph Strobl
394f695416 DATAMONGO-1210 - Fixed type hints for usage with findAndModify(…).
We now inspect the actual field type during update mapping and provide a type hint accordingly. Simple, non-interface and non-abstract types will no longer be decorated with the _class attribute. We now honor positional parameters when trying to map paths to properties. This allows more precise type mapping since we now have access to the meta model, which allows us to check whether the presence of a type hint (aka _class) is required.

We now add a special type hint indicating nested types to the converter. This allows more fine-grained removal of the _class property without the need to break the contract of MongoWriter.convertToMongoType(…).
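The decision described above can be sketched as follows (a simplified assumption about the check, not the actual mapper logic; TypeHints and needsTypeHint are hypothetical names):

```java
import java.lang.reflect.Modifier;

// Hypothetical sketch: a simple, non-interface, non-abstract declared type
// whose actual value type matches needs no _class attribute.
public class TypeHints {

    static boolean needsTypeHint(Class<?> declaredType, Class<?> actualType) {
        if (declaredType.isInterface() || Modifier.isAbstract(declaredType.getModifiers())) {
            return true;
        }
        return !declaredType.equals(actualType);
    }

    public static void main(String[] args) {
        System.out.println(needsTypeHint(String.class, String.class));  // concrete, matching
        System.out.println(needsTypeHint(Number.class, Integer.class)); // abstract declared type
    }
}
```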

Original pull request: #292.
2015-06-01 13:21:07 +02:00
Stefan Ganzer
e4db466ab9 DATAMONGO-1210 - Add breaking test case for findAndModify/addToSet/each.
The problem stems from the inconsistent handling of type hints: MongoTemplate.save(…) does not add a type hint, but findAndModify(…) does. The same values are then treated differently by MongoDB, depending on whether they have a type hint or not. To verify this behavior, you can manually add the (superfluous) type hint to the saved object - findAndModify will then work as expected.

Additional tests demonstrate that findAndModify(…) removes type hints from complex documents in collections that are either nested in another collection or in a document, or doesn't add them in the first place.

Original pull requests: #290, #291.
Related pull request: #292.
CLA: 119820150506013701 (Stefan Ganzer)
2015-06-01 13:21:01 +02:00
Christoph Strobl
ee04c014c9 DATAMONGO-1134 - Add support for $geoIntersects.
We now support $geoIntersects via Criteria.intersects(…) using GeoJSON types.

Original pull request: #295.
2015-06-01 12:36:20 +02:00
Christoph Strobl
ea84f08de8 DATAMONGO-1216 - Skip authentication via AuthDB for MongoClient.
We now skip authentication via an explicit AuthDB when requesting a DB via a MongoClient instance.

Related ticket: DATACMNS-1218
Original pull request: #296.
2015-06-01 12:10:14 +02:00
Christoph Strobl
7d8a2b2d56 DATAMONGO-1218 - Deprecate non-MongoClient related configuration options in XML namespace.
We added deprecation hints to the description sections of elements and attributes within the spring-mongo.xsd of 1.7. Also, we’ve added (for 1.8) a configuration attribute to db-factory that allows setting a client-uri, creating a MongoClientURI instead of a MongoURI to be passed on to the MongoDbFactory. Just as 'uri', 'client-uri' does not allow additional configuration options like username and password next to it.

Original pull request: #296
2015-06-01 12:10:14 +02:00
Christoph Strobl
995d1e5aac DATAMONGO-1202 - Polishing.
Moved and renamed types into test class.
Added collection cleanup and missing author information.

Original pull request: #293.
2015-06-01 09:23:35 +02:00
Thomas Darimont
3b918492ae DATAMONGO-1202 - More robust type inspection for @Indexed properties.
We now use TypeInformation in IndexResolver to look up the root PersistentEntity for resolving @Indexed properties, to ensure that we retrieve the same PersistentEntity that was stored. Previously we used the Class to look up the PersistentEntity, which yielded a partially processed result.

Original pull request: #293.
2015-06-01 09:08:31 +02:00
Christoph Strobl
66b419163c DATAMONGO-1193 - Prevent unnecessary database lookups when resolving DBRefs on 2.x driver.
We now check against the used driver version before requesting a DB instance from the factory. Potential improvements on the fetch strategy for MongoDB Java Driver 3 will be handled in DATAMONGO-1194.

Related tickets: DATAMONGO-1194.
Original pull request: #286.
2015-06-01 08:09:50 +02:00
Oliver Gierke
52bff39c22 DATAMONGO-1224 - Ensure Spring Framework 4.2 compatibility.
Removed obsolete generics in MongoPersistentEntityIndexCreator to make sure MappingContextEvents are delivered to the listener on Spring 4.2 which applies more strict generics handling to ApplicationEvents.

Tweaked PersonBeforeSaveListener in test code to actually reflect how an ApplicationEventListener for MongoDB would be implemented.

Removed deprecated (and now removed) usage of ConversionServiceFactory in AbstractMongoConverter. Added MongoMappingEventPublisher.publishEvent(Object) as NoOp.
2015-05-25 13:12:47 +02:00
Domenique Tilleuil
d151a13e87 DATAMONGO-1208 - Use QueryCursorPreparer for streaming in MongoTemplate.
We now use the QueryCursorPreparer to honor skip, limit, sort, etc. for streaming.

Original pull request: #297.
Polishing pull request: #298.
2015-05-21 09:00:33 +02:00
Oliver Gierke
5e7e7d3598 DATAMONGO-1221 - Removed <relativePath /> element from parent POM declaration. 2015-05-15 15:07:30 +02:00
Oliver Gierke
356248bd05 DATAMONGO-1213 - Included section on dependency management in reference documentation.
Related ticket: DATACMNS-687.
2015-05-04 14:51:34 +02:00
Oliver Gierke
73a60153f6 DATAMONGO-1211 - Adapt to changes in Spring Data Commons.
Tweaked method signatures in MongoRepositoryFactory after some signature changes in Spring Data Commons. Use the newly introduced getTargetRepositoryViaReflection(…) to obtain the repository instance via the super class.

Added repositoryBaseClass() attribute to @EnableMongoRepositories.

Related tickets: DATACMNS-542.
2015-05-02 14:49:31 +02:00
Oliver Gierke
67cf0e62a7 DATAMONGO-1207 - Fixed potential NPE in MongoTemplate.doInsertAll(…).
If a collection containing null values is handed to MongoTemplate.insertAll(…), a NullPointerException was caused by the unguarded attempt to look up the class of the element. We now explicitly handle this case and skip the element.

Some code cleanups in MongoTemplate.doInsertAll(…).
2015-05-02 14:49:31 +02:00
Oliver Gierke
21fbcc3e67 DATAMONGO-1196 - Upgraded build profiles after MongoDB 3.0 Java driver GA release. 2015-04-01 17:11:55 +02:00
Oliver Gierke
0d63ff92a0 DATAMONGO-1192 - Switched to Spring 4.1's CollectionFactory. 2015-03-31 17:16:44 +02:00
Oliver Gierke
983645e222 DATAMONGO-1189 - After release cleanups. 2015-03-23 14:00:52 +01:00
Spring Buildmaster
d2805bfa47 DATAMONGO-1189 - Prepare next development iteration. 2015-03-23 13:03:26 +01:00
Spring Buildmaster
3f16b30631 DATAMONGO-1189 - Release version 1.7.0.RELEASE (Fowler GA). 2015-03-23 13:03:07 +01:00
Oliver Gierke
8ebcbe3c5c DATAMONGO-1189 - Prepare 1.7.0.RELEASE (Fowler GA). 2015-03-23 12:34:49 +01:00
Oliver Gierke
363bed5c37 DATAMONGO-1189 - Updated changelog. 2015-03-23 12:03:56 +01:00
Christoph Strobl
1547a646dd DATAMONGO-1189 - DATAJPA-692 - Polish reference docs before release.
Add repository query return types to reference doc.
Fall back to locally available Spring Data Commons reference docs as the remote variant doesn't seem to work currently.
2015-03-23 11:17:25 +01:00
Oliver Gierke
1408d51065 DATAMONGO-979 - Polishing.
Minor JavaDoc and code style polishes.

Original pull request: #272.
2015-03-23 09:32:52 +01:00
Thomas Darimont
f5c319f18f DATAMONGO-979 - Add support for $size expression in project and group aggregation pipeline.
Introduced AggregationExpression interface to be able to represent arbitrary MongoDB expressions that can be used in projection and group operations. Supported function expressions are provided via the AggregationFunctionExpressions enum.

Original pull request: #272.
2015-03-23 09:32:26 +01:00
Christoph Strobl
a3c29054d0 DATAMONGO-1124 - Switch log level for cyclic reference index warnings to INFO.
Reduce log level from warn to info to avoid noise during application startup.

Original pull request: #282.
2015-03-23 09:00:24 +01:00
Oliver Gierke
01533ca34c DATAMONGO-1181 - Register GeoJsonModule with @EnableSpringDataWebSupport.
Added the necessary configuration infrastructure to automatically register the GeoJsonModule as Spring bean when @EnableSpringDataWebSupport is used. This is implemented by exposing a configuration class annotated with @SpringDataWebConfigurationMixin.

Added Spring WebMVC as test dependency to be able to write an integration test. Polished GeoJsonModule to hide the actual serializers.

Original pull request: #283.
Related ticket: DATACMNS-660.
2015-03-17 19:40:57 +01:00
Christoph Strobl
a1f6dc6db4 DATAMONGO-1181 - Add Jackson Module for GeoJSON types.
Added GeoJsonModule providing JsonDeserializers for GeoJsonPoint, GeoJsonMultiPoint, GeoJsonLineString, GeoJsonMultiLineString, GeoJsonPolygon and GeoJsonMultiPolygon.

Original pull request: #283.
2015-03-17 19:40:57 +01:00
Oliver Gierke
37d53d936d DATAMONGO-1179 - Polishing. 2015-03-10 14:29:22 +01:00
Christoph Strobl
bc0a2df653 DATAMONGO-1179 - Update reference documentation.
Added new-features section. Updated links and requirements. Added section for GeoJSON support. Updated Script Operations section. Added return type Stream to repositories section. Updated keyword list.

Original pull request: #281.
2015-03-10 14:29:22 +01:00
Oliver Gierke
7e50fd8273 DATAMONGO-1180 - Polishing.
Fixed copyright ranges in license headers. Added unit test to PartTreeMongoQueryUnitTests to verify the root exception being propagated correctly.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:20:53 +01:00
Thomas Darimont
ba560ffbad DATAMONGO-1180 - Fixed incorrect exception message creation in PartTreeMongoQuery.
The JSONParseException caught in PartTreeMongoQuery is now passed to the IllegalStateException we throw from the method. Previously it was passed to the String.format(…) varargs. Verified by manually throwing a JSONParseException in the debugger.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:20:53 +01:00
Oliver Gierke
50ca32c8b9 DATAMONGO-1173 - After release cleanups. 2015-03-05 19:41:05 +01:00
Spring Buildmaster
bdfe3af505 DATAMONGO-1173 - Prepare next development iteration. 2015-03-05 07:47:13 -08:00
Spring Buildmaster
798b56055d DATAMONGO-1173 - Release version 1.7.0.RC1. 2015-03-05 07:47:11 -08:00
Oliver Gierke
ce68e4a070 DATAMONGO-1173 - Prepare 1.7.0.RC1 (Fowler RC1). 2015-03-05 16:31:00 +01:00
Oliver Gierke
5da3130d26 DATAMONGO-1173 - Updated changelog. 2015-03-05 15:54:23 +01:00
Oliver Gierke
6687cdc101 DATAMONGO-1110 - Polishing.
Moved to newly introduced Range type in Spring Data Commons to more safely bind minimum and maximum distances. Changed internal APIs to always use a Range<Distance> which gets populated based on the method signature's characteristics: if only one Distance parameter is found it's interpreted as a range with upper bound only.

Removed invalid testcase for minDistance on 2D index.

Original pull request: #277.
2015-03-05 15:35:42 +01:00
Christoph Strobl
7e74ec6b62 DATAMONGO-1110 - Add support for $minDistance.
We now support $minDistance for NearQuery and Criteria. Please keep in mind that minDistance is only available for MongoDB 2.6 and later and can only be combined with the $near or $nearSphere operator depending on the defined index type. Usage of $minDistance with NearQuery is only possible when a 2dsphere index is present. We also make sure the $minDistance operator gets correctly nested when using GeoJSON types.

It is now possible to use a Range<Distance> parameter within repository queries. This allows defining near queries like:

findByLocationNear(Point point, Range<Distance> distances);

The lower bound of the range is treated as the minimum distance while the upper one defines the maximum distance from the given point. In case a Distance parameter is provided it will serve as maxDistance.
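The range semantics can be sketched in plain Java (a hypothetical helper illustrating the operator mapping, not the Spring Data rendering code):

```java
import java.util.Locale;

// Hypothetical sketch: the lower bound of the range maps to $minDistance,
// the upper bound to $maxDistance.
public class NearRange {

    static String toOperators(double min, double max) {
        // Locale.ROOT keeps the decimal separator a '.' regardless of platform locale.
        return String.format(Locale.ROOT,
                "{ \"$minDistance\" : %.1f, \"$maxDistance\" : %.1f }", min, max);
    }

    public static void main(String[] args) {
        System.out.println(toOperators(2.0, 10.0));
    }
}
```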

Original pull request: #277.
2015-03-05 15:34:45 +01:00
Thomas Darimont
b887fa70a5 DATAMONGO-1133 - Fixed broken tests.
AggregationTests.shouldHonorFieldAliasesForFieldReferences() now correctly sets up 3 different instances of MeterData and correctly calculates the aggregated counter values.

Original pull request: #279.
2015-03-05 15:30:35 +01:00
Oliver Gierke
1c6ab25253 DATAMONGO-1135 - Polishing.
A few polishing changes to the GeoConverters.
2015-03-05 14:28:11 +01:00
Christoph Strobl
1c43a3d1ee DATAMONGO-1135 - Add support for GeoJson.
We’ve added special types representing GeoJson structures. This allows using those types within both queries and domain types.

GeoJson types should only be used in combination with a 2dsphere index as the 2d index is not able to handle the structure. Though legacy coordinate pairs and GeoJson types can be mixed inside MongoDB, we currently do not support conversion of legacy coordinates to GeoJson types.
2015-03-05 14:28:11 +01:00
Thomas Darimont
60ca1b3509 DATAMONGO-1133 - Assert that field aliasing is honored in aggregation operations.
Added tests to show that field aliases are honored during object rendering in aggregation operations.

Original pull request: #279.
2015-03-05 12:21:12 +01:00
Oliver Gierke
39d9312005 DATAMONGO-479 - Polishing.
Removed the ServersideJavaScript abstraction as we still had to resort to instanceof checks and it created more ambiguities than it helped (e.g. in a script with name and code, which of the two gets executed?). We now have an ExecutableMongoScript, which is code only, and a NamedMongoScript, which basically is the former assigned to a name. Execution can be triggered on the former or a name.

ScriptOperations.exists(…) now returns a primitive boolean to avoid null checks. JavaDoc.

Original pull request: #254.
2015-03-04 15:18:46 +01:00
Christoph Strobl
a0e42f5dfe DATAMONGO-479 - Add support for calling functions.
We added ScriptOperations to MongoTemplate. Those allow storage and execution of JavaScript functions directly on the MongoDB server instance. Having ScriptOperations in place builds the foundation for annotation-driven support in the repository layer.

Original pull request: #254.
2015-03-04 15:18:40 +01:00
Oliver Gierke
7a3aff12a5 DATAMONGO-1165 - Polishing.
Renamed MongoOperations executeAsStream(…) to stream(…). Make use of Spring Data Commons StreamUtils in AbstractMongoQuery's StreamExecution. Moved test case from PersonRepositoryIntegrationTests to AbstractPersonRepositoryIntegrationTests to make sure they're executed for all sub-types.

Original pull request: #274.
2015-03-03 22:33:33 +01:00
Thomas Darimont
d4f1ef8704 DATAMONGO-1165 - Add support for Java 8 Stream as return type for repository methods.
Added support for a MongoDB Cursor backed Iterator that allows the usage of a Java 8 Stream at the repository level.

Original pull request: #274.
2015-03-03 20:56:47 +01:00
Oliver Gierke
a86d704bec DATAMONGO-1158 - Polishing.
MongoFactoryBean, MongoOptionsFactoryBean, MongoClientFactoryBean and MongoClientOptionsFactoryBean now extend AbstractFactoryBean to get a lot of the lifecycle callbacks without further code.

Added non-null assertions to newly introduced methods on MongoOperations/MongoTemplate.

Moved MongoClientVersion into util package. Introduced static imports for ReflectionUtils and MongoClientVersion for all references in the newly introduced Invoker types.

Some formatting, JavaDoc polishes, suppress deprecation warnings. Added build profile for MongoDB Java driver 3.0 as well as the following snapshot.

Original pull request: #273.
2015-03-02 21:50:27 +01:00
Christoph Strobl
57ab27aa5b DATAMONGO-1158 - Add Support for MongoDB Java driver 3.0.
We now support mongo-java-driver version 2.x and 3.0 along with MongoDB Server 2.6.7 and 3.0.0.

Please note that some of the configuration options might no longer be valid when used with version 3 of the MongoDB Java driver. Have a look at the table below to see some of the major differences between version 2.x and 3.0:

                      | 2.x                  | 3.0
----------------------+----------------------+-----------------------------------------------
default WriteConcern  | NONE                 | UNACKNOWLEDGED
----------------------+----------------------+-----------------------------------------------
option for slaveOk    | available            | ignored
----------------------+----------------------+-----------------------------------------------
option for autoConnect| available            | ignored
----------------------+----------------------+-----------------------------------------------
write result checking | available            | ignored (errors are exceptions anyway)
----------------------+----------------------+-----------------------------------------------
rest index cache      | available            | throws UnsupportedOperationException
----------------------+----------------------+-----------------------------------------------
DBRef resolution      | via DBRef.fetch      | via collection.findOne
----------------------+----------------------+-----------------------------------------------
MapReduce Options     | applied              | ignored
----------------------+----------------------+-----------------------------------------------
authentication        | via UserCredentials  | via MongoClient
----------------------+----------------------+-----------------------------------------------
WriteConcernException | not available        | translated to DataIntegrityViolationException
----------------------+----------------------+-----------------------------------------------
executeInSession      | available            | requestStart/requestDone commands ignored.
----------------------+----------------------+-----------------------------------------------
index creation        | via ensureIndex      | via createIndex
----------------------+----------------------+-----------------------------------------------

We need to soften the exception validation a bit since the message is slightly different when using different storage engines in a MongoDB 3.0 environment.

Added an explicit <mongo-client /> element and <client-options /> to the configuration schema. These elements will replace the existing <mongo /> and <options /> elements in a subsequent release. Added a credentials attribute to <mongo-client /> which allows defining a set of credentials used to set up the MongoClient with authentication data. We now reject <mongo-options /> configuration when using MongoDB Java driver generation 3.0 and above.

Original pull request: #273.
2015-03-02 20:26:50 +01:00
Thomas Darimont
909cc8b5d3 DATAMONGO-1081 - Improve documentation on field mapping semantics.
Added table with examples to identifier field mapping section.

Original pull request: #276.
2015-03-02 18:15:18 +01:00
Thomas Darimont
b7acbc4347 DATAMONGO-1167 - Added QueryDslPredicateExecutor.findAll(Predicate, Sort).
We now support findAll on QueryDslMongoRepository that accepts a Querydsl Predicate and a Sort and returns a List<T>.

Original pull request: #275.
2015-02-24 09:51:39 +01:00
Oliver Gierke
d276306ddc DATAMONGO-1162 - Adapt to API changes in Spring Data Commons. 2015-02-06 12:48:26 +01:00
Oliver Gierke
25b98b7ad2 DATAMONGO-1154 - Upgraded to MongoDB Java driver 2.13.0. 2015-01-29 20:54:39 +01:00
Oliver Gierke
819b424142 DATAMONGO-1153 - Fix documentation build.
Moved jconsole.png to the images folder. Extracted MongoDB-specific auditing documentation into a separate file for inclusion after the general auditing docs.
2015-01-29 14:04:16 +01:00
Oliver Gierke
5d0328ba4b DATAMONGO-1144 - Updated changelog. 2015-01-28 20:46:19 +01:00
Oliver Gierke
b219cff29c DATAMONGO-1143 - Updated changelog. 2015-01-28 10:00:59 +01:00
Oliver Gierke
409eeaf962 DATAMONGO-1148 - Favor EclipseLink’s JPA over the Hibernate one. 2015-01-27 21:43:46 +01:00
Oliver Gierke
4e5e8bd026 DATAMONGO-1146 - Polishing.
Added missing @Override annotations to QueryDslMongoRepository methods.

Related tickets: DATACMNS-636.
Original pull request: #270.
2015-01-26 11:52:14 +01:00
Thomas Darimont
b91ec53ae0 DATAMONGO-1146 - Added QueryDslMongoRepository.exists(…) which accepts a Querydsl predicate.
Added explicit test case for QueryDslMongoRepository.

Related tickets: DATACMNS-636.
Original pull request: #270.
2015-01-26 11:52:06 +01:00
Oliver Gierke
ce0624b8b0 DATAMONGO-712 - Another round of performance improvements.
Refactored CustomConversions to unify locked access to the cached types. Added a cache for raw-write-targets so that they’re cached, too.

DBObjectAccessor now avoids expensive code paths for both reads and writes in case of simple field names.

MappingMongoConverter now eagerly skips conversions of simple types in case the value is already assignable to the target type.

QueryMapper now checks the ConversionService and only triggers a conversion if it’s actually capable of doing so instead of catching a more expensive exception.

CachingMongoPersistentProperty now also caches usePropertyAccess() and isTransient() as they’re used quite frequently.

Related ticket: DATACMNS-637.
2015-01-25 18:57:56 +01:00
alex-on-java
b4de2769cf DATAMONGO-1147 - Remove manual array copy.
Remove manual array copying by using Arrays.copyOf(values, values.length).

Original pull request: #258.
2015-01-23 17:51:44 +01:00
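The change above can be illustrated with a plain-JDK sketch (the surrounding class and sample data are made up for illustration):

```java
import java.util.Arrays;

public class CopyExample {
    public static void main(String[] args) {
        Object[] values = {"a", "b", "c"};

        // Before: manual element-by-element copy
        Object[] manual = new Object[values.length];
        for (int i = 0; i < values.length; i++) {
            manual[i] = values[i];
        }

        // After: a single call producing an equal, independent array
        Object[] copied = Arrays.copyOf(values, values.length);

        System.out.println(Arrays.equals(manual, copied)); // prints "true"
    }
}
```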
Thomas Darimont
3f7b0f1eb6 DATAMONGO-1082 - Improved documentation of alias usage in aggregation framework.
Added missing JavaDoc and added short note to the reference documentation.

Original pull request: #268.
2015-01-22 08:47:15 +01:00
Thomas Darimont
4055365c57 DATAMONGO-1127 - Add support for geoNear queries with distance information.
Made unit tests more robust to small differences in distance calculations between MongoDB versions.
2015-01-20 19:04:01 +01:00
Thomas Darimont
db7f782ca6 DATAMONGO-1127 - Add support for geoNear queries with distance information.
We now support geoNear queries in Aggregations. Exposed a GeoNearOperation factory method in Aggregation. Introduced a new distanceField property on NearQuery since it is required for geoNear queries in Aggregations.

Original pull request: #261.
2015-01-20 18:16:12 +01:00
Christoph Strobl
cde9d8d23a DATAMONGO-1121 - Fix false positive when checking for potential cycles.
We now only check for cycles on entity types and explicitly exclude simple types.

Original pull request: #267.
2015-01-20 12:17:13 +01:00
Oliver Gierke
3dd9b0a2b6 DATAMONGO-1136 - Polishing.
Polished equals(…) / hashCode() methods in GeoCommand.

Original pull request: #263.
2015-01-12 19:42:50 +01:00
Christoph Strobl
59e54cecd2 DATAMONGO-1136 - Use $geoWithin instead of $within for geo queries.
We now use the $geoWithin operator for geospatial criteria, which requires at least MongoDB 2.4.

Original pull request: #263.
2015-01-12 19:42:15 +01:00
Oliver Gierke
5ed7e8efc2 DATAMONGO-1139 - MongoQueryCreator now only uses $nearSphere if a non-neutral Metric is used.
Fixed the evaluation of the Distance for a near clause handed into a query method. Previously we evaluated against null, which never results in true as Distance returns Metrics.NEUTRAL by default.
2015-01-12 19:10:10 +01:00
Oliver Gierke
fa85adfe0b DATAMONGO-1123 - Improve JavaDoc of MongoOperations.geoNear(…).
The JavaDoc of the geoNear(…) methods in MongoOperations now contains a hint that MongoDB limits the number of results by default and that an explicit limit on the NearQuery can be used to change that.
2015-01-07 15:02:32 +01:00
Oliver Gierke
a3e4f44a64 DATAMONGO-1118 - Polishing.
Created dedicated prepareMapKey(…) method to chain calls to potentiallyConvertMapKey(…) and potentiallyEscapeMapKey(…) and make sure they always get applied in combination.

Fixed initial map creation for DBRefs to apply the fixed behavior, too.

Original pull request: #260.
2015-01-06 15:45:30 +01:00
Thomas Darimont
4a7a485e62 DATAMONGO-1118 - Simplified potentiallyConvertMapKey in MappingMongoConverter.
Fixed typos in CustomConversions.

Original pull request: #260.
2015-01-06 15:45:30 +01:00
Christoph Strobl
c353e02b3e DATAMONGO-1118 - MappingMongoConverter now uses custom conversions for Map keys, too.
We now allow conversions of map keys using custom Converter implementations if the conversion target type is a String.

Original pull request: #260.
2015-01-06 15:45:26 +01:00
Christophe Fargette
1c2964cab4 DATACMNS-1132 - Fixed keyword translation table in the reference documentation.
Original pull request: #262.
2015-01-06 13:29:37 +01:00
Oliver Gierke
47e083280a DATACMNS-1131 - We now register the ThreeTen backport converters by default.
Related ticket: DATACMNS-628.
2015-01-05 19:11:54 +01:00
Oliver Gierke
7db003100b DATAMONGO-1129 - Upgraded to MongoDB Java driver 2.12.4.
Added Travis build configuration, too.
2014-12-31 14:20:52 +01:00
Oliver Gierke
f814b1ef47 DATAMONGO-1128 - Added test cases to validate Optional mapping.
Added test cases to make sure Optional instances are handled correctly and the converters are actually applied to the nested value.
2014-12-31 13:59:43 +01:00
Oliver Gierke
f3d2ae366e DATAMONGO-1120 - Fix execution of query methods using pagination and field mapping customizations.
Repository queries that used pagination and referred to a field that was customized were failing as the count query executed was not mapped correctly in MongoOperations.

This results from the fix for DATAMONGO-1080 which removed the premature field name translation from AbstractMongoQuery and thus led to unmapped field names being used for the count query.

We now expose the previously existing but non-public count(…) method on MongoOperations that takes both an entity type and an explicit collection name, to be able to count-query a dedicated collection while still getting the query mapping applied for a certain type.

Related ticket: DATAMONGO-1080.
2014-12-18 15:47:54 +01:00
Oliver Gierke
b6ecce3aa2 DATAMONGO-1096 - Polishing.
Fixed formatting for changes introduced with DATAMONGO-1096.
2014-12-17 18:37:33 +01:00
Oliver Gierke
c5235be9a7 DATAMONGO-1106 - After release cleanups. 2014-12-01 13:44:58 +01:00
Spring Buildmaster
23300de9d4 DATAMONGO-1106 - Prepare next development iteration. 2014-12-01 13:36:36 +01:00
Spring Buildmaster
41dc57c84f DATAMONGO-1106 - Release version 1.7.0.M1 (Fowler M1). 2014-12-01 13:36:32 +01:00
Oliver Gierke
85d1fe1ce6 DATAMONGO-1106 - Prepare 1.7.0.M1 (Fowler M1). 2014-12-01 12:26:52 +01:00
Oliver Gierke
ac6067ad53 DATAMONGO-1106 - Updated changelog. 2014-12-01 12:25:53 +01:00
Thomas Darimont
173a62b5ce DATAMONGO-1085 - Fixed sorting with Querydsl in QueryDslMongoRepository.
We now translate QSort's OrderSpecifiers into appropriate sort criteria.
Previously the OrderSpecifiers were not correctly translated to appropriate property path expressions.

We now override findAll(Pageable) and findAll(Sort) in QueryDslMongoRepository to apply special QSort handling.

Original pull request: #236.
2014-12-01 12:09:14 +01:00
Oliver Gierke
cbbafce73d DATAMONGO-1043 - Make sure we dynamically lookup SpEL based collection names for query execution.
Changed SimpleMongoEntityMetadata to keep a reference to the collection entity instead of the eagerly resolved collection name. This is to make sure the name gets re-evaluated for every query execution to support dynamically changing collections defined via SpEL expressions.

Related pull request: #238.
2014-11-28 20:26:23 +01:00
Oliver Gierke
2e74c19995 DATAMONGO-1054 - Polishing.
Tweaked JavaDoc of the APIs to be less specific about implementation internals and rather point to the save(…) methods. Changed SimpleMongoRepository.save(…) methods to inspect the given entity/entities and use the optimized insert(All)-calls if all entities are considered new.

Original pull request: #253.
2014-11-28 18:33:19 +01:00
Thomas Darimont
a212b7566c DATAMONGO-1054 - Add support for fast insertion via MongoRepository.insert(..).
Introduced new insert(..) method variants on MongoRepositories that delegate to MongoTemplate.insert(..). This bypasses ID population, save event generation and version checking, and allows for fast insertion of bulk data.

Original pull request: #253.
2014-11-28 18:33:18 +01:00
Oliver Gierke
08faa52ef4 DATAMONGO-1108 - Performance improvements in BasicMongoPersistentEntity.
BasicMongoPersistentEntity.getCollection() now avoids repeated SpEL-parsing and evaluating in case no SpEL expression is used. Parsing is happening at most once now. Evaluation is skipped entirely if the configured collection String is not or does not contain an expression.
2014-11-28 16:24:25 +01:00
Oliver Gierke
33bc4fffd9 DATAMONGO-1079 - Updated changelog. 2014-11-28 12:06:05 +01:00
Christoph Strobl
eca2108e15 DATAMONGO-1087 - Fix index resolver detecting cycles for partial match.
We now check for presence of a dot path to verify that we’ve detected a cycle.

Original pull request: #240.
2014-11-28 12:03:38 +01:00
Christoph Strobl
dab6034eb9 DATAMONGO-943 - Add support for $position to Update $push $each.
We now support $position on update.push.

Original pull request: #248.
2014-11-28 11:41:51 +01:00
Christoph Strobl
461e7d05d7 DATAMONGO-1092 - Ensure compatibility with MongoDB 2.8.0.rc0 and java driver 2.13.0-rc0.
We updated GroupByResults to allow working with the changed data types returned for count and keys, and fixed the assertion on the error message for duplicate keys.
Using java-driver 2.12.x when connecting to a 2.8.0-rc0 instance is likely to cause trouble with authentication. This is the intended behavior.

2.8.0-rc0 throws an error when removing elements from a collection that does not yet exist, which is different from what 2.6.x does.

The java-driver 2.13.0-rc0 works perfectly fine with a 2.6.x server instance.
We deprecated Index.Duplicates#DROP since it has been removed in MongoDB 2.8.

Original pull request: #246.
2014-11-28 11:33:12 +01:00
Oliver Gierke
10c37b101d DATAMONGO-1105 - Added implementation of QueryDslPredicateExecutor.findAll(OrderSpecifier<?>... orders).
Renamed QuerydslRepositorySupportUnitTests to QuerydslRepositorySupportTests as it's an integration test.
2014-11-28 10:37:21 +01:00
Christoph Strobl
81f2c910f7 DATAMONGO-1075 - Containing keyword is now correctly translated for collection properties.
We now inspect the property's type when creating criteria for the CONTAINS keyword: if the target property is of type String we use a regular expression, and if the property is collection-like we try to find an exact match within the collection using $in.

Added support for NotContaining along the way.

Original pull request: #241.
2014-11-27 17:06:39 +01:00
Thomas Darimont
1fd97713c1 DATAMONGO-1093 - Added hashCode() and equals(…) in BasicQuery.
We now have equals(…) and hashCode() methods on BasicQuery. Previously we solely relied on Query.hashCode()/equals(…), which didn't consider the fields of BasicQuery.

Introduced equals verifier library to automatically test equals contracts.
Added some additional test cases to BasicQueryUnitTests.

Original pull request: #252.
2014-11-27 16:45:35 +01:00
Oliver Gierke
2d3eeed9ec DATAMONGO-1102 - Added support for Java 8 date/time types.
We're now able to persist and read non-time-zoned JDK 8 date/time types (LocalDate, LocalTime, LocalDateTime) to and from Date instances.
2014-11-27 16:28:36 +01:00
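The mapping described above boils down to converting the non-time-zoned JDK 8 types to java.util.Date and back. A minimal plain-JDK sketch of such a round trip, independent of Spring Data's actual converters (class and method names are invented):

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.util.Date;

public class Jsr310Conversions {

    // LocalDateTime -> Date, interpreting the value in the system time zone
    static Date toDate(LocalDateTime ldt) {
        return Date.from(ldt.atZone(ZoneId.systemDefault()).toInstant());
    }

    // Date -> LocalDateTime, again via the system time zone
    static LocalDateTime toLocalDateTime(Date date) {
        return LocalDateTime.ofInstant(date.toInstant(), ZoneId.systemDefault());
    }

    public static void main(String[] args) {
        LocalDateTime value = LocalDateTime.of(2014, 11, 27, 16, 28, 36);
        Date persisted = toDate(value);                    // what would be stored
        System.out.println(toLocalDateTime(persisted).equals(value)); // prints "true"
    }
}
```

Note that Date only has millisecond precision, so values carrying finer-grained nanoseconds would not survive the round trip exactly.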
Christoph Strobl
b22eb6f12f DATAMONGO-1101 - Add support for $bit to Update.
We now support bitwise and/or/xor operations for Update.
2014-11-26 11:26:56 +01:00
Mikhail Mikhaylenko
dfb0a2a368 DATAMONGO-1096 - Use null-safe toString representation of query for debug logging.
We now use the null-safe serializeToJsonSafely(…) to avoid potential RuntimeExceptions during debug query printing in MongoTemplate.

Based on original PR: #247.

Original pull request: #251.
2014-11-26 09:40:30 +01:00
Thomas Darimont
03bcc56429 DATAMONGO-1094 - Fixed ambiguous field mapping error message in BasicMongoPersistentEntity.
Original pull request: #245.
2014-11-25 17:32:16 +01:00
Christoph Strobl
457fda3fc3 DATAMONGO-1097 - Add support for $mul to Update.
We now support multiply on Update, allowing the value of the given key to be multiplied by a multiplier.
2014-11-24 20:38:44 +01:00
Oliver Gierke
54cee64610 DATAMONGO-1100 - Upgrade to new PersistentPropertyAccessor API. 2014-11-20 15:12:25 +01:00
Christoph Strobl
477499248a DATAMONGO-1086 - Mapping fails for collection with two embedded types that extend a generic abstract class.
We now use the type information of the raw property type to check if we need to include _class.
2014-11-20 15:12:25 +01:00
Oliver Gierke
3b70b6aeee DATAMONGO-1078 - Polishing.
Polished test cases. Simplified equals(…)/hashCode() for sample entity and its identifier type.

Original pull request: #239.
2014-11-10 16:38:03 +01:00
Christoph Strobl
163762e99e DATAMONGO-1078 - @Query annotated repository method fails for complex Id when used with Collection type.
Remove object type hint defaulting.
2014-11-10 16:37:56 +01:00
Oliver Gierke
b99833df75 DATAMONGO-1080 - AbstractMongoQuery now refrains from eagerly post-processing the query execution results.
To properly support general post processing of query execution results (in QueryExecutorMethodInterceptor) we need to remove the eager post-processing of query execution results in AbstractMongoQuery.

Removed the usage of the local ConversionService all together.
2014-10-30 11:35:51 +01:00
Thomas Darimont
4be6231426 DATAMONGO-1076 - Avoid resolving lazy-loading proxy for DBRefs during finalize.
We now handle intercepted finalize method invocations by not resolving the proxy. Previously the LazyLoadingProxy tried to resolve the proxy during finalization which could lead to unnecessary database accesses.

Original pull request: #234.
2014-10-29 10:16:12 +01:00
Christoph Strobl
4673e3d511 DATAMONGO-1077 - Fix Update removing $ operator for DBRef.
We now retain the positional parameter "$" when mapping field names for associations.

Original pull request: #235.
2014-10-28 14:28:22 +01:00
Christoph Strobl
00e48cc424 DATAMONGO-1050 - Explicitly annotated Field should not be considered Id.
We changed the id resolution to skip properties having an explicit name set via @Field unless they are marked with @Id. This means that

@Field(“id”) String id;

will be stored as “id” within MongoDB. Prior to this change the field name would have been changed to “_id”.
Added tests to ensure proper field mapping for various "id" field variants.

Original pull request: #225.
2014-10-23 11:39:17 +02:00
Christoph Strobl
f8453825fb DATAMONGO-1072 - Fix annotated query placeholders not replaced correctly.
We now also check field names for potential placeholder matches to ensure those are registered for binding parameters.

Original pull request: #233.
2014-10-22 13:55:50 +02:00
Christoph Strobl
6cda9ab939 DATAMONGO-1068 - Fix getCriteriaObject returning an empty DBO when no key is defined.
We now check for the presence of a Criteria key.

Original pull request: #232.
2014-10-21 11:36:15 +02:00
Oliver Gierke
831d667896 DATAMONGO-1070 - Fixed a few glitches in DBRef binding for repository query methods.
Fixed the query mapping for derived repository queries pointing to the identifier of the referenced document. We now reduce the query field's key from reference.id to reference so that the generated DBRef is applied correctly, and also take care that the ids are converted to ObjectIds where necessary. This is mainly achieved by using the AssociationConverter pulled up from UpdateMapper in ObjectMapper.getMappedKey().

MongoQueryCreator now refrains from translating the field keys as that would prevent the QueryMapper from correctly detecting id properties.

Fixed DBRef handling for StringBasedMongoQuery which previously didn't parse the DBRef instance created after JSON parsing for placeholders.
2014-10-15 10:13:53 +02:00
Christoph Strobl
17c342895a DATAMONGO-1063 - Fix application of Querydsl's any().in() throwing an Exception.
We now only convert paths that point to either a property or variable.

Original pull request: #230.
2014-10-10 11:35:45 +02:00
Christoph Strobl
6ef518e6a0 DATAMONGO-1053 - Type check is now only performed on explicit language properties.
We now only perform a type check on language properties explicitly defined via @Language. Prior to this change, non-String properties named language caused errors on entity validation.

Original pull request: #228.
2014-10-10 11:31:30 +02:00
Oliver Gierke
ddee2fbb12 DATAMONGO-1057 - Polishing.
Slightly tweaked the changes in SlicedExecution to simplify the implementation. We now apply the given pageable but tweak the limit the query uses to peek into the next page.

Original pull request: #226.
2014-10-08 07:06:18 +02:00
Christoph Strobl
6512c2cdfb DATAMONGO-1057 - Fix SliceExecution skipping elements.
We now directly set the offset to use instead of reading it from the used pageable. This ensures that every single element is read from the store.
Prior to this change the altered page size led to an unintended increase of the number of elements to skip.

Original pull request: #226.
2014-10-08 07:06:14 +02:00
Oliver Gierke
0eee05adaa DATAMONGO-1062 - Polishing.
Removed exploded static imports. Updated copyright header.

Original pull request: #229.
2014-10-07 15:32:18 +02:00
Christoph Strobl
17e0154ff3 DATAMONGO-1058 - DBRef should respect explicit field name.
We now use property.getFieldName() for mapping DBRefs. This ensures we also capture explicitly defined names set via @Field.

Original pull request: #227.
2014-10-01 10:06:22 +02:00
Thomas Darimont
2780f60c65 DATAMONGO-1062 - Fix failing test in ServerAddressPropertyEditorUnitTests.
The test rejectsAddressConfigWithoutASingleParsableServerAddress fails because the supposedly non-existing hostname "bar" now resolves to a real host address.

The addresses "gugu.nonexistant.example.org, gaga.nonexistant.example.org" shouldn't be resolvable.

Original pull request: #229.
2014-09-30 12:55:24 +02:00
Christoph Strobl
7dd3450362 DATAMONGO-1049 - Check for explicitly declared language field.
We now check for an explicitly declared language field for setting language_override within a text index. Therefore the attribute (even if named with the reserved keyword language) has to be explicitly marked with @Language. Prior to this change having:

@Language String lang;
String language;

would have caused trouble when trying to resolve index structures, as one cannot set the language override on more than one property.

Original pull request: #224.
2014-09-25 12:40:26 +02:00
Oliver Gierke
ca4b2a61b8 DATAMONGO-1046 - After release cleanups. 2014-09-15 14:30:23 +02:00
Oliver Gierke
d2ecd65ca5 DATAMONGO-1046 - After release cleanups. 2014-09-05 14:27:21 +02:00
Spring Buildmaster
03bd49f6c8 DATAMONGO-1046 - Prepare next development iteration. 2014-09-05 03:12:04 -07:00
Spring Buildmaster
51607c5ed8 DATAMONGO-1046 - Release version 1.6.0.RELEASE (Evans GA). 2014-09-05 03:12:02 -07:00
Oliver Gierke
e2cbd3ee28 DATAMONGO-1046 - Prepare 1.6.0.RELEASE (Evans GA). 2014-09-05 11:43:58 +02:00
Oliver Gierke
5944e6b57e DATAMONGO-1046 - Updated changelog. 2014-09-05 09:23:31 +02:00
Oliver Gierke
efd46498ef DATAMONGO-1033 - Updated changelog. 2014-09-05 07:31:54 +02:00
Christoph Strobl
3d705a737f DATAMONGO-1040 - Derived delete should respect collection name.
Adding collection metadata allows fine-grained removal of entities from specific collections using derived delete queries.

Original pull request: #223.
2014-09-04 15:47:13 +02:00
Christoph Strobl
996c57bccf DATAMONGO-1039 - Polish db clean hook implementation.
- Refactored internal structure.
- Updated documentation.
- Added some tests.

Original pull request: #222.
2014-09-04 11:21:55 +02:00
Oliver Gierke
a31e72ff06 DATAMONGO-1045 - Tweak AspectJ setup in cross-store module to be able to build against Spring 4.1.
Added an aop.xml to only compile explicitly listed aspects in the cross-store module. This is needed as Spring 4.1 includes a new aspect for JavaEE 7 JCache support that has optional dependencies which we don't have in the classpath. Trying to compile all aspects contained in spring-aspects will result in ClassNotFoundExceptions for the aspects with missing dependencies.
2014-09-04 08:51:31 +02:00
Mark Paluch
f07d8fca8c DATAMONGO-1036 - Improved detection of custom implementations for CDI repositories.
Adapted to API changes in CDI extension.

Related ticket: DATACMNS-565.
2014-09-01 13:51:20 +02:00
Christoph Strobl
69dbdee01f DATAMONGO-1038 - Assert Mongo instances cleaned up properly after test runs.
Add JUnit rule and RunListener taking care of clean up task.

Original pull request: #221.
2014-08-27 11:12:39 +02:00
Oliver Gierke
dedb9f3dc0 DATAMONGO-1034 - Explicitly reject incompatible types in MappingMongoConverter.
Improved the exception message that occurs if the source document contains a BasicDBList but has to be converted into a complex object. We now explicitly hint at using a custom Converter for manual conversion.

Improved toString() method on ObjectPath to create more helpful output.
2014-08-26 20:07:46 +02:00
Oliver Gierke
7d69b840fe DATAMONGO-1030 - Projections now work on single-entity query method executions.
We now correctly forward the domain type collection to the query executing a query for a projection type.
2014-08-26 15:16:18 +02:00
Christoph Strobl
4eaef300cb DATAMONGO-1025 - Fix creation of nested named index.
Deprecated collection attribute for @Indexed, @CompoundIndex, @GeoSpatialIndexed. Removed deprecated attribute `expireAfterSeconds` from @CompoundIndex.

Original pull request: #219.
2014-08-26 14:33:47 +02:00
Christoph Strobl
ec1a6b5edd DATAMONGO-1025 - Fix creation of nested named index.
We now prefix explicitly named indexes on nested types (e.g. for embedded properties) with the path pointing to the property. This avoids errors from equally named index definitions on different paths pointing to the same type within one collection.

Along the way we harmonized index naming for geospatial index definitions, where only the property's field name was taken into account although it should have been the full property path.

Original pull request: #219.
2014-08-26 14:33:47 +02:00
Oliver Gierke
adc5485c09 DATAMONGO-1032 - Polished Asciidoctor documentation. 2014-08-26 14:24:51 +02:00
Oliver Gierke
f622b2916d DATAMONGO-1021 - After release cleanups. 2014-08-13 16:32:42 +02:00
Spring Buildmaster
26be0cf948 DATAMONGO-1021 - Prepare next development iteration. 2014-08-13 07:02:43 -07:00
Spring Buildmaster
e27c01fe5b DATAMONGO-1021 - Release version 1.6.0.RC1 (Evans RC1). 2014-08-13 07:02:41 -07:00
Oliver Gierke
d639e58fb9 DATAMONGO-1021 - Prepare 1.6.0.RC1 (Evans RC1). 2014-08-13 15:37:48 +02:00
Oliver Gierke
0195c2cb48 DATAMONGO-1021 - Updated changelog. 2014-08-13 15:37:44 +02:00
Oliver Gierke
068e2ec49b DATAMONGO-1024 - Upgraded to MongoDB Java driver 2.12.3. 2014-08-13 15:36:08 +02:00
Christoph Strobl
a9306b99ec DATAMONGO-957 - Add support for query modifiers.
Using Meta allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a query. When executed we add the meta information to the cursor in use.

We’ve introduced the @Meta annotation that allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a repository finder method.
Added tests to verify proper invocation of template methods.
Use DBCursor.copy() for CursorPreparer.

Original pull request: #216.
2014-08-13 14:56:10 +02:00
Thomas Darimont
3597194742 DATAMONGO-1012 - Improved identifier initialization on DBRef proxies.
Identifier initialization is now only triggered if field access is used. Before that, the id initialization would've resolved the proxy eagerly, as the getter access performed by the BeanWrapper would've been intercepted by the proxy and is indistinguishable from a normal method call. This would've rendered the entire use case of creating proxies absurd.

Added test case to check for non-initialization in the property access scenario.
2014-08-13 14:34:38 +02:00
Oliver Gierke
6f06ccec8e DATAMONGO-1012 - Identifier initialization for lazy DBRef proxies with field access.
We now initialize the ID property for proxies created for lazily initialized DBRefs. This will allow the lookup of ID properties for types that use field access without initializing the entire proxy.
2014-08-13 14:34:15 +02:00
Oliver Gierke
6fe7f220f9 DATAMONGO-1007 - Updated changelog. 2014-08-13 10:56:02 +02:00
Christoph Strobl
45e70d493d DATAMONGO-1016 - Remove deprecations in geospatial area.
Removed:
 - Box
 - Circle
 - CustomMetric
 - Distance
 - GeoPage
 - GeoResult
 - GeoResults
 - Metric
 - Metrics
 - Point
 - Polygon
 - Shape

Updated api doc.
Removed deprecation warnings.
2014-08-13 09:52:02 +02:00
Thomas Darimont
ce71ab83f2 DATAMONGO-1020 - LimitOperation is now a public class.
Original pull request: #218.
2014-08-12 12:30:41 +02:00
Oliver Gierke
bf85d8facd DATAMONGO-1005 - Polishing introduction of ObjectPath.
Simplified implementation of ObjectPath to use a static root instance and hand the path further down until final resolution in MappingMongoConverter.readValue(…). This removes a bit of boxing and unboxing code both in ObjectPath and the converter.

Introduced ObjectPath.getPathItem(…) to internalize the iteration to find a potentially already resolved object.

Renamed parameters and fields of type ObjectPath to path consistently. Removed obsolete method in MappingMongoConverter.

Original pull request: #209.
2014-08-12 08:12:31 +02:00
Thomas Darimont
c5ff7cdb2b DATAMONGO-1005 - Improve cycle-detection for DbRef's.
Introduced ObjectPath that collects the target objects while converting a DBObject to a domain object. If we detect that a potentially nested DBRef points to an object that is already under construction we simply return a reference to that object in order to avoid StackOverFlowErrors.

Original pull request: #209.
2014-08-12 08:10:47 +02:00
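The ObjectPath idea described above can be sketched in plain Java; the class and method names below are invented for illustration and are not Spring Data's actual types:

```java
import java.util.HashMap;
import java.util.Map;

public class CyclePathSketch {

    // Hypothetical document type with a reference, standing in for a nested DBRef
    static class Doc {
        final String id;
        Doc ref;
        Doc(String id) { this.id = id; }
    }

    // Objects currently under construction, keyed by id (the "path")
    private final Map<String, Doc> underConstruction = new HashMap<>();

    Doc read(String id, String refId) {
        Doc existing = underConstruction.get(id);
        if (existing != null) {
            // Already being converted: return that instance instead of
            // recursing forever (this is what avoids the StackOverflowError)
            return existing;
        }
        Doc doc = new Doc(id);
        underConstruction.put(id, doc);
        doc.ref = read(refId, id); // resolve the (possibly cyclic) reference
        underConstruction.remove(id);
        return doc;
    }

    public static void main(String[] args) {
        Doc a = new CyclePathSketch().read("a", "b"); // cycle: a -> b -> a
        System.out.println(a.ref.ref == a); // prints "true"
    }
}
```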
Mark Paluch
f9ccf4f532 DATAMONGO-1017 - Add support for custom implementations in CDI repositories.
Original pull request: #215.
2014-08-11 07:47:57 +02:00
Greg Turnquist
ab731f40a7 DATAMONGO-1019 - Corrected examples in reference documentation.
Examples were not properly converted. One table got dropped, so I added it back. Fixed IMPORTANT notes.

Original pull requests: #214.
2014-08-10 16:04:45 +02:00
Oliver Gierke
d8434fffa8 DATAMONGO-1015 - Fixed link to Spring Data Commons reference docs. 2014-08-10 15:55:13 +02:00
Christoph Strobl
151b1d4510 DATAMONGO-973 - Add support for deriving full-text queries.
Added support to execute full-text queries on repositories. Query methods can now have a parameter of type TextCriteria which will trigger a text search clause for the property annotated with @TextScore.

Retrieving the document score and sorting by score is only possible if the entity holds a property annotated with @TextScore. If present, any find execution will be enriched to ensure the corresponding { $meta : textScore } field is loaded. The sort object will only be mapped if the sort property already exists; in that case we replace the existing expression for the property with its $meta representation.

This allows for example the following:

TextCriteria criteria = TextCriteria.forDefaultLanguage().matching("term");

repository.findAllBy(criteria, new Sort("score"));
repository.findAllBy(criteria, new PageRequest(0, 10, Direction.DESC, "score"));
repository.findByFooOrderByScoreDesc("foo", criteria);

For more details and examples see the "Full text search queries" section in the reference manual.
2014-08-06 22:25:38 +02:00
Greg Turnquist
168cf3e1f6 DATAMONGO-1015 - Migrate reference documentation from Docbook to Asciidoctor. 2014-08-06 21:38:46 +02:00
Oliver Gierke
52dab0fa20 DATAMONGO-1008 - Polishing.
Slightly changed the implementation of the 2dsphere check, Minor refactorings in the test case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Christoph Strobl
9257bab06e DATAMONGO-1008 - DefaultIndexOperations now considers 2dsphere, too.
We now also check for 2dsphere when inspecting index keys and create a geo IndexField in that case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Oliver Gierke
27f0a6f27a DATAMONGO-1008 - Added repository type based checks to strict matching algorithm.
Repositories extending MongoRepository are now considered strict matches as well.

Related ticket: DATACMNS-526.
2014-07-31 16:20:26 +02:00
Oliver Gierke
5bedbef2f2 DATAMONGO-1009 - Adapt to new multi-store configuration detection.
We now consider repositories managing domain types annotated with @Document MongoDB specific ones.

Related ticket: DATACMNS-526.
2014-07-28 20:15:40 +02:00
Christoph Strobl
51e7be8aa0 DATAMONGO-1001 - Renamed LazyLoadingProxy.initialize() to getTarget().
Original pull request: #208.
2014-07-24 13:29:27 +02:00
Christoph Strobl
6c85bb39a3 DATAMONGO-1001 - Fix saving lazy loaded object.
We now resolve the target type for CGLib-proxied objects and initialize lazily loaded ones before saving. As it turns out, CustomConversions already knows how to deal with proxies correctly. We added an explicit test to assert that.

Original pull request: #208.
2014-07-24 13:28:36 +02:00
Oliver Gierke
07f7247707 DATAMONGO-1002 - Update.toString() now uses SerializationUtils.
A simple call of toString() on a DBObject might result in an exception if the DBObject contains objects that are non-native MongoDB types (i.e. types that need to be converted prior to persistence).

We now use SerializationUtils.serializeToJsonSafely(…) to avoid exceptions.
2014-07-23 12:36:15 +02:00
Thomas Darimont
f669711670 DATAMONGO-995 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser, which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted, avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
Related ticket: DATAMONGO-420.
2014-07-21 20:15:18 +02:00
Oliver Gierke
5f3671f349 DATAMONGO-996 - Fixed boundary detection in pagination.
The fix for DATAMONGO-950 introduced a tiny glitch so that retrieving pages after the first one was broken in the repository query execution. We now correctly use the previously detected number of elements to decide whether the given Pageable is out of scope.

Related ticket: DATAMONGO-950.
2014-07-18 19:01:44 +02:00
Thomas Darimont
1335cb699b DATAMONGO-420 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser, which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted, avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
2014-07-17 15:27:46 +02:00
Oliver Gierke
84414b87c0 DATAMONGO-987 - Some polishing in MappingMongoConverter.
Let getValueInternal(…) use the provided SpELExpressionEvaluator instead of relying on the MongoDbPropertyValueProvider to create a new one. Removed the obsolete constructor in MongoDbPropertyValueProvider.
2014-07-17 15:18:21 +02:00
Thomas Darimont
a1ecd4a501 DATAMONGO-987 - Avoid creation of lazy-loading proxies for null-values.
We now avoid creating a lazy-loading proxy if we detect that the property-value in the backing DbObject for a @Lazy(true) annotated field is null.

Original pull request: #207.
2014-07-17 15:18:21 +02:00
Thomas Darimont
d7e6f2ee41 DATAMONGO-989 - MatchOperation should accept CriteriaDefinition.
We replaced the constructor that accepted a Criteria with one that accepts a CriteriaDefinition, so as not to force clients to extend Criteria.
Original pull request: #206.
2014-07-17 09:21:29 +02:00
Oliver Gierke
04870fb8b3 DATAMONGO-991 - Adapted to deprecation removals in Spring Data Commons.
Related ticket: DATACMNS-469.
2014-07-16 12:04:10 +02:00
Oliver Gierke
9d196b78f7 DATAMONGO-981 - After release cleanups. 2014-07-10 20:44:20 +02:00
Spring Buildmaster
4229525928 DATAMONGO-981 - Prepare next development iteration. 2014-07-10 10:38:58 -07:00
376 changed files with 35266 additions and 10319 deletions

9
.github/PULL_REQUEST_TEMPLATE.md vendored Normal file
View File

@@ -0,0 +1,9 @@
Thank you for proposing a pull request. This template will guide you through the essential steps necessary for a pull request.
Make sure that:
- [ ] You have read the [Spring Data contribution guidelines](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc).
- [ ] There is a ticket in the bug tracker for the project in our [JIRA](https://jira.spring.io/browse/DATAMONGO).
- [ ] You use the code formatters provided [here](https://github.com/spring-projects/spring-data-build/tree/master/etc/ide) and have them applied to your changes. Don't submit any formatting-related changes.
- [ ] You submit test cases (unit or integration tests) that back your changes.
- [ ] You added yourself as author in the headers of the classes you touched. Amend the date range in the Apache license header if needed. For new types, add the license header (copy from another file and set the current year only).
- [ ] You provide your full name and an email address registered with your GitHub account. If you're a first-time submitter, make sure you have completed the [Contributor's License Agreement form](https://support.springsource.com/spring_committer_signup).

37
.travis.yml Normal file
View File

@@ -0,0 +1,37 @@
language: java
jdk:
- oraclejdk8
before_script:
- mongod --version
env:
matrix:
- PROFILE=ci
- PROFILE=mongo-next
- PROFILE=mongo3
- PROFILE=mongo3-next
- PROFILE=mongo31
- PROFILE=mongo32
- PROFILE=mongo33
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script
addons:
apt:
sources:
- mongodb-3.2-precise
packages:
- mongodb-org-server
- mongodb-org-shell
sudo: false
cache:
directories:
- $HOME/.m2
install: true
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

27
CODE_OF_CONDUCT.adoc Normal file
View File

@@ -0,0 +1,27 @@
= Contributor Code of Conduct
As contributors and maintainers of this project, and in the interest of fostering an open and welcoming community, we pledge to respect all people who contribute through reporting issues, posting feature requests, updating documentation, submitting pull requests or patches, and other activities.
We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, or nationality.
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery
* Personal attacks
* Trolling or insulting/derogatory comments
* Public or private harassment
* Publishing others' private information, such as physical or electronic addresses,
without explicit permission
* Other unethical or unprofessional conduct
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
By adopting this Code of Conduct, project maintainers commit themselves to fairly and consistently applying these principles to every aspect of managing this project. Project maintainers who do not follow or enforce the Code of Conduct may be permanently removed from the project team.
This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community.
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting a project maintainer at spring-code-of-conduct@pivotal.io.
All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances.
Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident.
This Code of Conduct is adapted from the http://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at http://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].

View File

@@ -1 +0,0 @@
You find the contribution guidelines for Spring Data projects [here](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.md).

3
CONTRIBUTING.adoc Normal file
View File

@@ -0,0 +1,3 @@
= Spring Data contribution guidelines
You find the contribution guidelines for Spring Data projects https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc[here].

View File

@@ -1,3 +1,6 @@
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/ga.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/snapshot.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
# Spring Data MongoDB # Spring Data MongoDB
The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services. The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
@@ -26,7 +29,7 @@ Add the Maven dependency:
<dependency> <dependency>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId> <artifactId>spring-data-mongodb</artifactId>
<version>1.5.0.RELEASE</version> <version>${version}.RELEASE</version>
</dependency> </dependency>
``` ```
@@ -36,7 +39,7 @@ If you'd rather like the latest snapshots of the upcoming major version, use our
<dependency> <dependency>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId> <artifactId>spring-data-mongodb</artifactId>
<version>1.6.0.BUILD-SNAPSHOT</version> <version>${version}.BUILD-SNAPSHOT</version>
</dependency> </dependency>
<repository> <repository>

80
pom.xml
View File

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?> <?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.M1</version> <version>1.10.0.M1</version>
<packaging>pom</packaging> <packaging>pom</packaging>
<name>Spring Data MongoDB</name> <name>Spring Data MongoDB</name>
@@ -15,8 +15,7 @@
<parent> <parent>
<groupId>org.springframework.data.build</groupId> <groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId> <artifactId>spring-data-parent</artifactId>
<version>1.5.0.M1</version> <version>1.9.0.M1</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
</parent> </parent>
<modules> <modules>
@@ -29,9 +28,9 @@
<properties> <properties>
<project.type>multi</project.type> <project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id> <dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.9.0.M1</springdata.commons> <springdata.commons>1.13.0.M1</springdata.commons>
<mongo>2.12.1</mongo> <mongo>2.14.3</mongo>
<mongo.osgi>2.12.1</mongo.osgi> <mongo.osgi>2.13.0</mongo.osgi>
</properties> </properties>
<developers> <developers>
@@ -108,7 +107,7 @@
<id>mongo-next</id> <id>mongo-next</id>
<properties> <properties>
<mongo>2.12.3-SNAPSHOT</mongo> <mongo>2.15.0-SNAPSHOT</mongo>
</properties> </properties>
<repositories> <repositories>
@@ -122,9 +121,18 @@
<profile> <profile>
<id>mongo-3-next</id> <id>mongo3</id>
<properties> <properties>
<mongo>3.0.0-SNAPSHOT</mongo> <mongo>3.0.4</mongo>
</properties>
</profile>
<profile>
<id>mongo3-next</id>
<properties>
<mongo>3.0.5-SNAPSHOT</mongo>
</properties> </properties>
<repositories> <repositories>
@@ -135,6 +143,54 @@
</repositories> </repositories>
</profile> </profile>
<profile>
<id>mongo31</id>
<properties>
<mongo>3.1.1</mongo>
</properties>
</profile>
<profile>
<id>mongo32</id>
<properties>
<mongo>3.2.2</mongo>
</properties>
</profile>
<profile>
<id>mongo33</id>
<properties>
<mongo>3.3.0</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.jfrog.buildinfo</groupId>
<artifactId>artifactory-maven-plugin</artifactId>
<inherited>false</inherited>
</plugin>
</plugins>
</build>
</profile>
</profiles> </profiles>
<dependencies> <dependencies>
@@ -149,14 +205,14 @@
<repositories> <repositories>
<repository> <repository>
<id>spring-libs-milestone</id> <id>spring-libs-milestone</id>
<url>http://repo.spring.io/libs-milestone</url> <url>https://repo.spring.io/libs-milestone</url>
</repository> </repository>
</repositories> </repositories>
<pluginRepositories> <pluginRepositories>
<pluginRepository> <pluginRepository>
<id>spring-plugins-release</id> <id>spring-plugins-release</id>
<url>http://repo.spring.io/plugins-release</url> <url>https://repo.spring.io/plugins-release</url>
</pluginRepository> </pluginRepository>
</pluginRepositories> </pluginRepositories>

View File

@@ -0,0 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<aspectj>
<aspects>
<aspect name="org.springframework.beans.factory.aspectj.AnnotationBeanConfigurerAspect" />
<aspect name="org.springframework.data.mongodb.crossstore.MongoDocumentBacking" />
</aspects>
</aspectj>

View File

@@ -2,22 +2,22 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd"> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
<parent> <parent>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.M1</version> <version>1.10.0.M1</version>
<relativePath>../pom.xml</relativePath> <relativePath>../pom.xml</relativePath>
</parent> </parent>
<artifactId>spring-data-mongodb-cross-store</artifactId> <artifactId>spring-data-mongodb-cross-store</artifactId>
<name>Spring Data MongoDB - Cross-Store Support</name> <name>Spring Data MongoDB - Cross-Store Support</name>
<properties> <properties>
<jpa>1.0.0.Final</jpa> <jpa>2.0.0</jpa>
<hibernate>3.6.10.Final</hibernate> <hibernate>3.6.10.Final</hibernate>
</properties> </properties>
<dependencies> <dependencies>
<!-- Spring --> <!-- Spring -->
@@ -48,7 +48,7 @@
<dependency> <dependency>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId> <artifactId>spring-data-mongodb</artifactId>
<version>1.6.0.M1</version> <version>1.10.0.M1</version>
</dependency> </dependency>
<dependency> <dependency>
@@ -59,8 +59,8 @@
<!-- JPA --> <!-- JPA -->
<dependency> <dependency>
<groupId>org.hibernate.javax.persistence</groupId> <groupId>org.eclipse.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId> <artifactId>javax.persistence</artifactId>
<version>${jpa}</version> <version>${jpa}</version>
<optional>true</optional> <optional>true</optional>
</dependency> </dependency>
@@ -81,7 +81,7 @@
<dependency> <dependency>
<groupId>javax.validation</groupId> <groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId> <artifactId>validation-api</artifactId>
<version>1.0.0.GA</version> <version>${validation}</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
<dependency> <dependency>
@@ -126,10 +126,11 @@
<groupId>org.springframework</groupId> <groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId> <artifactId>spring-aspects</artifactId>
</aspectLibrary> </aspectLibrary>
</aspectLibraries> </aspectLibraries>
<complianceLevel>${source.level}</complianceLevel> <complianceLevel>${source.level}</complianceLevel>
<source>${source.level}</source> <source>${source.level}</source>
<target>${source.level}</target> <target>${source.level}</target>
<xmlConfigured>aop.xml</xmlConfigured>
</configuration> </configuration>
</plugin> </plugin>
</plugins> </plugins>

View File

@@ -1,5 +1,5 @@
/* /*
* Copyright 2011-2014 the original author or authors. * Copyright 2011-2016 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -37,6 +37,8 @@ import com.mongodb.MongoException;
/** /**
* @author Thomas Risberg * @author Thomas Risberg
* @author Oliver Gierke * @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
*/ */
public class MongoChangeSetPersister implements ChangeSetPersister<Object> { public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
@@ -45,7 +47,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_FIELD_NAME = "_entity_field_name"; private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class"; private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Logger log = LoggerFactory.getLogger(getClass()); private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate; private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory; private EntityManagerFactory entityManagerFactory;
@@ -76,25 +78,25 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
dbk.put(ENTITY_ID, id); dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName()); dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for " + dbk); log.debug("Loading MongoDB data for {}", dbk);
} }
mongoTemplate.execute(collName, new CollectionCallback<Object>() { mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException { public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) { for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME); String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Processing key: " + key); log.debug("Processing key: {}", key);
} }
if (!changeSet.getValues().containsKey(key)) { if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS); String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) { if (className == null) {
throw new DataIntegrityViolationException("Unble to convert property " + key + ": Invalid metadata, " throw new DataIntegrityViolationException(
+ ENTITY_FIELD_CLASS + " not available"); "Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
} }
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader()); Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo); Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: " + key); log.debug("Adding to ChangeSet: {}", key);
} }
changeSet.set(key, value); changeSet.set(key, value);
} }
@@ -109,9 +111,9 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet) * @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/ */
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException { public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on " + entity); log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) { if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null"); throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
} }
@@ -130,7 +132,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
} }
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Flush: changeset: " + cs.getValues()); log.debug("Flush: changeset: {}", cs.getValues());
} }
String collName = getCollectionNameForEntity(entity.getClass()); String collName = getCollectionNameForEntity(entity.getClass());
@@ -152,7 +154,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
}); });
if (value == null) { if (value == null) {
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Flush: removing: " + dbQuery); log.debug("Flush: removing: {}", dbQuery);
} }
mongoTemplate.execute(collName, new CollectionCallback<Object>() { mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException { public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
@@ -164,7 +166,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
final DBObject dbDoc = new BasicDBObject(); final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery); dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Flush: saving: " + dbQuery); log.debug("Flush: saving: {}", dbQuery);
} }
mongoTemplate.getConverter().write(value, dbDoc); mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName()); dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());

View File

@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb-distribution</artifactId> <artifactId>spring-data-mongodb-distribution</artifactId>
<packaging>pom</packaging> <packaging>pom</packaging>
@@ -13,10 +13,10 @@
<parent> <parent>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.M1</version> <version>1.10.0.M1</version>
<relativePath>../pom.xml</relativePath> <relativePath>../pom.xml</relativePath>
</parent> </parent>
<properties> <properties>
<project.root>${basedir}/..</project.root> <project.root>${basedir}/..</project.root>
<dist.key>SDMONGO</dist.key> <dist.key>SDMONGO</dist.key>
@@ -32,6 +32,10 @@
<groupId>org.codehaus.mojo</groupId> <groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId> <artifactId>wagon-maven-plugin</artifactId>
</plugin> </plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
</plugin>
</plugins> </plugins>
</build> </build>

View File

@@ -5,7 +5,7 @@
<parent> <parent>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>1.6.0.M1</version> <version>1.10.0.M1</version>
<relativePath>../pom.xml</relativePath> <relativePath>../pom.xml</relativePath>
</parent> </parent>

View File

@@ -160,7 +160,7 @@ public class MongoLog4jAppender extends AppenderSkeleton {
// Copy properties into document // Copy properties into document
Map<Object, Object> props = event.getProperties(); Map<Object, Object> props = event.getProperties();
if (null != props && props.size() > 0) { if (null != props && !props.isEmpty()) {
BasicDBObject propsDbo = new BasicDBObject(); BasicDBObject propsDbo = new BasicDBObject();
for (Map.Entry<Object, Object> entry : props.entrySet()) { for (Map.Entry<Object, Object> entry : props.entrySet()) {
propsDbo.put(entry.getKey().toString(), entry.getValue().toString()); propsDbo.put(entry.getKey().toString(), entry.getValue().toString());

View File

@@ -39,7 +39,7 @@ public class MongoLog4jAppenderIntegrationTests {
static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName(); static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName();
Logger log = Logger.getLogger(NAME); private static final Logger log = Logger.getLogger(NAME);
Mongo mongo; Mongo mongo;
DB db; DB db;
String collection; String collection;

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?> <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.1.9.205"> <context version="7.2.2.230">
<scope type="Project" name="spring-data-mongodb"> <scope type="Project" name="spring-data-mongodb">
<element type="TypeFilterReferenceOverridden" name="Filter"> <element type="TypeFilterReferenceOverridden" name="Filter">
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/> <element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
@@ -32,8 +32,15 @@
<element type="IncludeTypePattern" name="**.config.**"/> <element type="IncludeTypePattern" name="**.config.**"/>
</element> </element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/> <dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/> <dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
</element> </element>
<element type="Subsystem" name="CDI">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.cdi.**"/>
</element>
<stereotype name="Unrestricted"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/> <dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/> <dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
</element> </element>
@@ -75,6 +82,11 @@
</element> </element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/> <dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
</element> </element>
<element type="Subsystem" name="Script">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.script.**"/>
</element>
</element>
<element type="Subsystem" name="Conversion"> <element type="Subsystem" name="Conversion">
<element type="TypeFilter" name="Assignment"> <element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.convert.**"/> <element type="IncludeTypePattern" name="**.core.convert.**"/>
@@ -82,6 +94,7 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/> <dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/> <dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/> <dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Script" type="AllowedDependency"/>
</element> </element>
<element type="Subsystem" name="SpEL"> <element type="Subsystem" name="SpEL">
<element type="TypeFilter" name="Assignment"> <element type="TypeFilter" name="Assignment">
@@ -104,6 +117,11 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/> <dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/> <dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
</element> </element>
<element type="Subsystem" name="MapReduce">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.mapreduce.**"/>
</element>
</element>
<element type="Subsystem" name="Core">
<element type="TypeFilter" name="Assignment">
<element type="WeakTypePattern" name="**.core.**"/>
@@ -112,8 +130,10 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|MapReduce" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Script" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Util">
<element type="TypeFilter" name="Assignment">
@@ -168,7 +188,32 @@
</element>
<element type="Subsystem" name="Querydsl">
<element type="TypeFilter" name="Assignment">
-<element type="IncludeTypePattern" name="com.mysema.query.**"/>
+<element type="IncludeTypePattern" name="com.querydsl.**"/>
</element>
</element>
<element type="Subsystem" name="Slf4j">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.slf4j.**"/>
</element>
</element>
<element type="Subsystem" name="Jackson">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.fasterxml.jackson.**"/>
</element>
</element>
<element type="Subsystem" name="DOM">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.w3c.dom.**"/>
</element>
</element>
<element type="Subsystem" name="AOP Alliance">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.aopalliance.**"/>
</element>
</element>
<element type="Subsystem" name="Guava">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.google.common.**"/>
</element>
</element>
</architecture>


@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB - Core</name>
@@ -11,18 +11,18 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
-<version>1.6.0.M1</version>
+<version>1.10.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<validation>1.0.0.GA</validation>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
</properties>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
@@ -50,7 +50,7 @@
<artifactId>spring-expression</artifactId>
</dependency>
<!-- Spring Data -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-commons</artifactId>
@@ -58,14 +58,14 @@
</dependency>
<dependency>
-<groupId>com.mysema.querydsl</groupId>
+<groupId>com.querydsl</groupId>
<artifactId>querydsl-mongodb</artifactId>
<version>${querydsl}</version>
<optional>true</optional>
</dependency>
<dependency>
-<groupId>com.mysema.querydsl</groupId>
+<groupId>com.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
<scope>provided</scope>
@@ -77,7 +77,7 @@
<version>1.0</version>
<optional>true</optional>
</dependency>
<!-- CDI -->
<dependency>
<groupId>javax.enterprise</groupId>
@@ -86,21 +86,21 @@
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>javax.el</groupId>
<artifactId>el-api</artifactId>
<version>${cdi}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans.test</groupId>
<artifactId>cditest-owb</artifactId>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
@@ -115,7 +115,7 @@
<version>${validation}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.objenesis</groupId>
<artifactId>objenesis</artifactId>
@@ -129,23 +129,50 @@
<version>4.2.0.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${jodatime}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>${threetenbp}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
<version>${equalsverifier}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
@@ -155,7 +182,7 @@
<version>${apt}</version>
<dependencies>
<dependency>
-<groupId>com.mysema.querydsl</groupId>
+<groupId>com.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
</dependency>
@@ -189,9 +216,14 @@
<systemPropertyVariables>
<java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
</systemPropertyVariables>
<properties>
<property>
<name>listener</name>
<value>org.springframework.data.mongodb.test.util.CleanMongoDBJunitRunListener</value>
</property>
</properties>
</configuration>
</plugin>
</plugins>
</build>
</project>


@@ -0,0 +1,61 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.util.List;
import org.springframework.dao.DataAccessException;
import com.mongodb.BulkWriteError;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteResult;
/**
* Is thrown when errors occur during bulk operations.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
*/
public class BulkOperationException extends DataAccessException {
private static final long serialVersionUID = 73929601661154421L;
private final List<BulkWriteError> errors;
private final BulkWriteResult result;
/**
* Creates a new {@link BulkOperationException} with the given message and source {@link BulkWriteException}.
*
* @param message must not be {@literal null}.
* @param source must not be {@literal null}.
*/
public BulkOperationException(String message, BulkWriteException source) {
super(message, source);
this.errors = source.getWriteErrors();
this.result = source.getWriteResult();
}
public List<BulkWriteError> getErrors() {
return errors;
}
public BulkWriteResult getResult() {
return result;
}
}


@@ -1,5 +1,5 @@
/*
- * Copyright 2011-2014 the original author or authors.
+ * Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -46,6 +46,7 @@ import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig.
@@ -54,6 +55,7 @@ import com.mongodb.Mongo;
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
* @author Christoph Strobl
*/
@Configuration
public abstract class AbstractMongoConfiguration {
@@ -70,7 +72,10 @@ public abstract class AbstractMongoConfiguration {
* returned by {@link #getDatabaseName()} later on effectively.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected String getAuthenticationDatabaseName() {
return null;
}
@@ -129,7 +134,10 @@ public abstract class AbstractMongoConfiguration {
* be used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected UserCredentials getUserCredentials() {
return null;
}


@@ -0,0 +1,85 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.MongoClientFactoryBean;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* Parser for {@code mongo-client} definitions.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
String id = element.getAttribute("id");
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoClientFactoryBean.class);
ParsingUtils.setPropertyValue(builder, element, "port", "port");
ParsingUtils.setPropertyValue(builder, element, "host", "host");
ParsingUtils.setPropertyValue(builder, element, "credentials", "credentials");
MongoParsingUtils.parseMongoClientOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(writeConcernEditor);
BeanComponentDefinition readPreferenceEditor = helper.getComponent(MongoParsingUtils
.getReadPreferencePropertyEditorBuilder());
parserContext.registerBeanComponent(readPreferenceEditor);
BeanComponentDefinition credentialsEditor = helper.getComponent(MongoParsingUtils
.getMongoCredentialPropertyEditor());
parserContext.registerBeanComponent(credentialsEditor);
parserContext.popAndRegisterContainingComponent();
return mongoComponent.getBeanDefinition();
}
}
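
For orientation, this is a hedged sketch (not taken from this changeset) of the kind of XML the new `mongo-client` element parser accepts, assuming the usual `mongo` namespace declaration; the attribute names `host`, `port`, and `credentials` match the properties the parser binds above, while the concrete values are illustrative only:

```xml
<!-- illustrative sketch; values are placeholders -->
<mongo:mongo-client id="mongoClient" host="localhost" port="27017"
                    credentials="jdoe:secret@admin" />
```

If `id` is omitted, the parser falls back to the default bean name (`BeanNames.MONGO_BEAN_NAME`).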


@@ -0,0 +1,198 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
/**
* Parse a {@link String} to a Collection of {@link MongoCredential}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final Pattern GROUP_PATTERN = Pattern.compile("(\\\\?')(.*?)\\1");
private static final String AUTH_MECHANISM_KEY = "uri.authMechanism";
private static final String USERNAME_PASSWORD_DELIMINATOR = ":";
private static final String DATABASE_DELIMINATOR = "@";
private static final String OPTIONS_DELIMINATOR = "?";
private static final String OPTION_VALUE_DELIMINATOR = "&";
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String text) throws IllegalArgumentException {
if (!StringUtils.hasText(text)) {
return;
}
List<MongoCredential> credentials = new ArrayList<MongoCredential>();
for (String credentialString : extractCredentialsString(text)) {
String[] userNameAndPassword = extractUserNameAndPassword(credentialString);
String database = extractDB(credentialString);
Properties options = extractOptions(credentialString);
if (!options.isEmpty()) {
if (options.containsKey(AUTH_MECHANISM_KEY)) {
String authMechanism = options.getProperty(AUTH_MECHANISM_KEY);
if (MongoCredential.GSSAPI_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createGSSAPICredential(userNameAndPassword[0]));
} else if (MongoCredential.MONGODB_CR_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createMongoCRCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.MONGODB_X509_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createMongoX509Credential(userNameAndPassword[0]));
} else if (MongoCredential.PLAIN_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createPlainCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.SCRAM_SHA_1_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else {
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
}
}
} else {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(
MongoCredential.createCredential(userNameAndPassword[0], database, userNameAndPassword[1].toCharArray()));
}
}
setValue(credentials);
}
private List<String> extractCredentialsString(String source) {
Matcher matcher = GROUP_PATTERN.matcher(source);
List<String> list = new ArrayList<String>();
while (matcher.find()) {
String value = StringUtils.trimLeadingCharacter(matcher.group(), '\'');
list.add(StringUtils.trimTrailingCharacter(value, '\''));
}
if (!list.isEmpty()) {
return list;
}
return Arrays.asList(source.split(","));
}
private static String[] extractUserNameAndPassword(String text) {
int index = text.lastIndexOf(DATABASE_DELIMINATOR);
index = index != -1 ? index : text.lastIndexOf(OPTIONS_DELIMINATOR);
return index == -1 ? new String[] {} : text.substring(0, index).split(USERNAME_PASSWORD_DELIMINATOR);
}
private static String extractDB(String text) {
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
if (dbSeperationIndex == -1) {
return "";
}
String tmp = text.substring(dbSeperationIndex + 1);
int optionsSeperationIndex = tmp.lastIndexOf(OPTIONS_DELIMINATOR);
return optionsSeperationIndex > -1 ? tmp.substring(0, optionsSeperationIndex) : tmp;
}
private static Properties extractOptions(String text) {
int optionsSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
if (optionsSeperationIndex == -1 || dbSeperationIndex > optionsSeperationIndex) {
return new Properties();
}
Properties properties = new Properties();
for (String option : text.substring(optionsSeperationIndex + 1).split(OPTION_VALUE_DELIMINATOR)) {
String[] optionArgs = option.split("=");
properties.put(optionArgs[0], optionArgs[1]);
}
return properties;
}
private static void verifyUsernameAndPasswordPresent(String[] source) {
verifyUserNamePresent(source);
if (source.length != 2) {
throw new IllegalArgumentException(
"Credentials need to specify username and password like in 'username:password@database'!");
}
}
private static void verifyDatabasePresent(String source) {
if (!StringUtils.hasText(source)) {
throw new IllegalArgumentException("Credentials need to specify database like in 'username:password@database'!");
}
}
private static void verifyUserNamePresent(String[] source) {
if (source.length == 0 || !StringUtils.hasText(source[0])) {
throw new IllegalArgumentException("Credentials need to specify username!");
}
}
}
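
To illustrate the string format this editor parses, here is a hedged sketch (the attribute values are placeholders, not part of this changeset). Credentials take the form `username:password@database`, multiple entries are comma-separated, single quotes protect values containing commas or `@`, and an auth mechanism can be appended via the `uri.authMechanism` option:

```xml
<!-- illustrative sketch; values are placeholders -->
<mongo:mongo-client credentials="jdoe:secret@database" />
<mongo:mongo-client credentials="'jdoe:s@cr,et@database',other:secret@admin" />
<mongo:mongo-client credentials="jdoe:secret@database?uri.authMechanism=PLAIN" />
```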


@@ -1,5 +1,5 @@
/*
- * Copyright 2011-2014 by the original author(s).
+ * Copyright 2011-2015 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,10 @@ package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
@@ -34,6 +38,7 @@ import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
import com.mongodb.Mongo;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoURI;
/**
@@ -42,9 +47,22 @@ import com.mongodb.MongoURI;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author Viktor Khoroshko
*/
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
private static final Set<String> MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES;
static {
Set<String> mongoUriAllowedAdditionalAttributes = new HashSet<String>();
mongoUriAllowedAdditionalAttributes.add("id");
mongoUriAllowedAdditionalAttributes.add("write-concern");
MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES = Collections.unmodifiableSet(mongoUriAllowedAdditionalAttributes);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
@@ -64,29 +82,25 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String uri = element.getAttribute("uri");
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
setPropertyValue(dbFactoryBuilder, element, "write-concern", "writeConcern");
-if (StringUtils.hasText(uri)) {
+BeanDefinition mongoUri = getMongoUri(element, parserContext);
if (StringUtils.hasText(mongoRef) || StringUtils.hasText(dbname) || userCredentials != null) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!", source);
}
-dbFactoryBuilder.addConstructorArgValue(getMongoUri(uri));
+if (mongoUri != null) {
dbFactoryBuilder.addConstructorArgValue(mongoUri);
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Defaulting
if (StringUtils.hasText(mongoRef)) {
dbFactoryBuilder.addConstructorArgReference(mongoRef);
@@ -147,14 +161,42 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
}
/**
-* Creates a {@link BeanDefinition} for a {@link MongoURI}.
+* Creates a {@link BeanDefinition} for a {@link MongoURI} or {@link MongoClientURI} depending on configured
* attributes. <br />
* Errors when configured element contains {@literal uri} or {@literal client-uri} along with other attributes except
* {@literal write-concern} and/or {@literal id}.
*
-* @param uri
+* @param element must not be {@literal null}.
-* @return
+* @param parserContext
* @return {@literal null} in case no client-/uri defined.
*/
-private BeanDefinition getMongoUri(String uri) {
+private BeanDefinition getMongoUri(Element element, ParserContext parserContext) {
-BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoURI.class);
+boolean hasClientUri = element.hasAttribute("client-uri");
if (!hasClientUri && !element.hasAttribute("uri")) {
return null;
}
int allowedAttributesCount = 1;
for (String attribute : MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES) {
if (element.hasAttribute(attribute)) {
allowedAttributesCount++;
}
}
if (element.getAttributes().getLength() > allowedAttributesCount) {
parserContext.getReaderContext().error(
"Configure either " + (hasClientUri ? "Mongo Client URI" : "Mongo URI") + " or details individually!",
parserContext.extractSource(element));
}
Class<?> type = hasClientUri ? MongoClientURI.class : MongoURI.class;
String uri = hasClientUri ? element.getAttribute("client-uri") : element.getAttribute("uri");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(type);
builder.addConstructorArgValue(uri);
return builder.getBeanDefinition();
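
The `client-uri` path added above can be sketched in XML as follows; this is an illustrative fragment (placeholder values, usual `mongo` namespace assumed), showing that only `id` and `write-concern` may accompany a URI attribute:

```xml
<!-- illustrative sketch; values are placeholders -->
<mongo:db-factory id="mongoDbFactory"
                  client-uri="mongodb://jdoe:secret@localhost:27017/database"
                  write-concern="SAFE" />
```

Combining `client-uri` (or `uri`) with `mongo-ref`, `dbname`, or credential attributes triggers the parser error shown above.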


@@ -1,5 +1,5 @@
/*
- * Copyright 2011-2014 the original author or authors.
+ * Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,7 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
*
* @author Oliver Gierke
* @author Martin Baumgartner
* @author Christoph Strobl
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
@@ -33,6 +34,7 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("mongo-client", new MongoClientParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());
registerBeanDefinitionParser("auditing", new MongoAuditingBeanDefinitionParser());


@@ -1,5 +1,5 @@
/*
- * Copyright 2011-2014 the original author or authors.
+ * Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,14 +15,10 @@
*/
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
@@ -36,6 +32,7 @@ import org.w3c.dom.Element;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParser implements BeanDefinitionParser {
@@ -64,7 +61,8 @@ public class MongoParser implements BeanDefinitionParser {
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
-BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(registerServerAddressPropertyEditor());
+BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernPropertyEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
@@ -75,19 +73,4 @@ public class MongoParser implements BeanDefinitionParser {
return mongoComponent.getBeanDefinition();
}
/**
* One should only register one bean definition but want to have the convenience of using
* AbstractSingleBeanDefinitionParser but have the side effect of registering a 'default' property editor with the
* container.
*/
private BeanDefinitionBuilder registerServerAddressPropertyEditor() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}


@@ -1,5 +1,5 @@
 /*
- * Copyright 2011-2013 the original author or authors.
+ * Copyright 2011-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -24,6 +24,7 @@ import org.springframework.beans.factory.config.CustomEditorConfigurer;
 import org.springframework.beans.factory.support.BeanDefinitionBuilder;
 import org.springframework.beans.factory.support.ManagedMap;
 import org.springframework.beans.factory.xml.BeanDefinitionParser;
+import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
 import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
 import org.springframework.util.xml.DomUtils;
 import org.w3c.dom.Element;
@@ -33,13 +34,13 @@ import org.w3c.dom.Element;
  *
  * @author Mark Pollack
  * @author Oliver Gierke
  * @author Thomas Darimont
+ * @author Christoph Strobl
  */
+@SuppressWarnings("deprecation")
 abstract class MongoParsingUtils {
-	private MongoParsingUtils() {
-	}
+	private MongoParsingUtils() {}
 	/**
 	 * Parses the mongo replica-set element.
@@ -54,12 +55,14 @@ abstract class MongoParsingUtils {
 	}
 	/**
-	 * Parses the mongo:options sub-element. Populates the given attribute factory with the proper attributes.
+	 * Parses the {@code mongo:options} sub-element. Populates the given attribute factory with the proper attributes.
 	 *
-	 * @return true if parsing actually occured, false otherwise
+	 * @return true if parsing actually occured, {@literal false} otherwise
 	 */
 	static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
 		Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
 		if (optionsElement == null) {
 			return false;
 		}
@@ -80,13 +83,58 @@ abstract class MongoParsingUtils {
 		setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
 		setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
 		setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
 		setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
 		setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
 		mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
 		return true;
 	}
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
* @since 1.7
*/
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true;
}
 	/**
 	 * Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
 	 * {@link WriteConcernPropertyEditor}.
@@ -103,4 +151,56 @@ abstract class MongoParsingUtils {
 		return builder;
 	}
/**
* One should only register one bean definition but want to have the convenience of using
* AbstractSingleBeanDefinitionParser but have the side effect of registering a 'default' property editor with the
* container.
*/
static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getMongoCredentialPropertyEditor() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.MongoCredential[]", MongoCredentialPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
 }
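Every builder method added above follows the same recipe: create a `CustomEditorConfigurer` bean definition whose `customEditors` map binds a target type name to a property editor. A stripped-down sketch of that map-building step in plain Java, with no Spring container involved (the class name is invented for the sketch):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class EditorRegistrySketch {

    /**
     * Mirrors the customEditors map each builder method populates:
     * target type name -> property editor (editor class name, for the sketch).
     */
    static Map<String, String> serverAddressEditors() {
        Map<String, String> customEditors = new LinkedHashMap<>();
        customEditors.put("com.mongodb.ServerAddress[]",
                "org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
        return customEditors;
    }
}
```

The `CustomEditorConfigurer` then registers each entry with the container, which is why a single parser run has the side effect of installing "default" editors for types like `ServerAddress[]`.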


@@ -0,0 +1,66 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import com.mongodb.ReadPreference;
/**
* Parse a {@link String} to a {@link ReadPreference}.
*
* @author Christoph Strobl
* @since 1.7
*/
public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String readPreferenceString) throws IllegalArgumentException {
if (readPreferenceString == null) {
return;
}
ReadPreference preference = null;
try {
preference = ReadPreference.valueOf(readPreferenceString);
} catch (IllegalArgumentException ex) {
// ignore this one and try to map it differently
}
if (preference != null) {
setValue(preference);
} else if ("PRIMARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primary());
} else if ("PRIMARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primaryPreferred());
} else if ("SECONDARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondary());
} else if ("SECONDARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondaryPreferred());
} else if ("NEAREST".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.nearest());
} else {
throw new IllegalArgumentException(String.format("Cannot find matching ReadPreference for %s",
readPreferenceString));
}
}
}
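The editor above resolves a string in two steps: first it tries the driver's camel-case names via `ReadPreference.valueOf(…)`, then it falls back to case-insensitive constant-style names such as `PRIMARY_PREFERRED`. A minimal plain-Java sketch of that two-step matching, using strings in place of the driver's `ReadPreference` objects (the class and the name list are assumptions for the sketch):

```java
import java.util.Arrays;
import java.util.List;

public class ReadPreferenceNameResolver {

    // Camel-case names the driver's valueOf(...) accepts (assumed for this sketch).
    private static final List<String> DRIVER_NAMES = Arrays.asList(
            "primary", "primaryPreferred", "secondary", "secondaryPreferred", "nearest");

    /**
     * Resolves a user-supplied string to its canonical camel-case name:
     * exact driver name first, then a case-insensitive match on the
     * constant-style spelling with underscores stripped.
     */
    public static String resolve(String input) {
        if (input == null) {
            return null;
        }
        // Step 1: exact match, as ReadPreference.valueOf(...) would find it.
        if (DRIVER_NAMES.contains(input)) {
            return input;
        }
        // Step 2: case-insensitive fallback, e.g. "PRIMARY_PREFERRED".
        String normalized = input.replace("_", "").toLowerCase();
        for (String name : DRIVER_NAMES) {
            if (name.toLowerCase().equals(normalized)) {
                return name;
            }
        }
        throw new IllegalArgumentException(
                String.format("Cannot find matching ReadPreference for %s", input));
    }
}
```

Unknown inputs fail with the same `IllegalArgumentException` the editor throws, so misconfigured XML is rejected at container startup rather than at query time.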


@@ -0,0 +1,145 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import com.mongodb.BulkWriteResult;
/**
 * Bulk operations for insert/update/remove actions on a collection. Bulk operations are available as of MongoDB
 * 2.6 and make use of low-level bulk commands on the protocol level. This interface defines a fluent API to add
 * multiple single operations or lists of similar operations in sequence, which can then be executed by calling
 * {@link #execute()}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
*/
public interface BulkOperations {
/**
* Mode for bulk operation.
**/
public enum BulkMode {
/** Perform bulk operations in sequence. The first error will cancel processing. */
ORDERED,
/** Perform bulk operations in parallel. Processing will continue on errors. */
UNORDERED
};
/**
* Add a single insert to the bulk operation.
*
	 * @param document the document to insert, must not be {@literal null}.
	 * @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
	 */
	BulkOperations insert(Object document);
/**
* Add a list of inserts to the bulk operation.
*
* @param documents List of documents to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
BulkOperations insert(List<? extends Object> documents);
/**
* Add a single update to the bulk operation. For the update request, only the first matching document is updated.
*
* @param query update criteria, must not be {@literal null}.
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(Query query, Update update);
/**
* Add a list of updates to the bulk operation. For each update request, only the first matching document is updated.
*
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(List<Pair<Query, Update>> updates);
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(Query query, Update update);
/**
* Add a list of updates to the bulk operation. For each update request, all matching documents are updated.
*
* @param updates Update operations to perform.
	 * @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(List<Pair<Query, Update>> updates);
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param query Update criteria.
* @param update Update operation to perform.
	 * @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(Query query, Update update);
/**
* Add a list of upserts to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param updates Updates/insert operations to perform.
	 * @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(List<Pair<Query, Update>> updates);
/**
* Add a single remove operation to the bulk operation.
*
* @param remove the {@link Query} to select the documents to be removed, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
BulkOperations remove(Query remove);
/**
* Add a list of remove operations to the bulk operation.
*
* @param removes the remove operations to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
BulkOperations remove(List<Query> removes);
/**
* Execute all bulk operations using the default write concern.
*
* @return Result of the bulk operation providing counters for inserts/updates etc.
	 * @throws BulkOperationException if an error occurred during bulk processing.
*/
BulkWriteResult execute();
}
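The `BulkMode` contract above (ORDERED cancels processing at the first error, UNORDERED continues on errors) can be illustrated with a small plain-Java simulation. No MongoDB connection is involved and all names are invented for the sketch:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.IntPredicate;

public class BulkModeDemo {

    enum BulkMode { ORDERED, UNORDERED }

    /**
     * Applies the given operation to each item, honoring BulkMode semantics:
     * ORDERED stops after the first failure, UNORDERED records the failure
     * and keeps processing the remaining items.
     */
    static List<Integer> executeBulk(BulkMode mode, int[] items, IntPredicate op) {
        List<Integer> failed = new ArrayList<>();
        for (int item : items) {
            if (!op.test(item)) {
                failed.add(item);
                if (mode == BulkMode.ORDERED) {
                    break; // first error cancels the remaining operations
                }
            }
        }
        return failed;
    }
}
```

Running both modes over the same input makes the difference visible: ORDERED reports only the first failing item, while UNORDERED reports every failure it encountered.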


@@ -25,7 +25,7 @@ import com.mongodb.DBCursor;
 interface CursorPreparer {
 	/**
-	 * Prepare the given cursor (apply limits, skips and so on). Returns th eprepared cursor.
+	 * Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
 	 *
 	 * @param cursor
 	 */


@@ -0,0 +1,327 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import org.springframework.util.Assert;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteOperation;
import com.mongodb.BulkWriteRequestBuilder;
import com.mongodb.BulkWriteResult;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
/**
* Default implementation for {@link BulkOperations}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
private final MongoOperations mongoOperations;
private final BulkMode bulkMode;
private final String collectionName;
private final Class<?> entityType;
private PersistenceExceptionTranslator exceptionTranslator;
private WriteConcernResolver writeConcernResolver;
private WriteConcern defaultWriteConcern;
private BulkWriteOperation bulk;
/**
* Creates a new {@link DefaultBulkOperations} for the given {@link MongoOperations}, {@link BulkMode}, collection
* name and {@link WriteConcern}.
*
* @param mongoOperations The underlying {@link MongoOperations}, must not be {@literal null}.
* @param bulkMode must not be {@literal null}.
* @param collectionName Name of the collection to work on, must not be {@literal null} or empty.
* @param entityType the entity type, can be {@literal null}.
*/
DefaultBulkOperations(MongoOperations mongoOperations, BulkMode bulkMode, String collectionName,
Class<?> entityType) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.notNull(bulkMode, "BulkMode must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.mongoOperations = mongoOperations;
this.bulkMode = bulkMode;
this.collectionName = collectionName;
this.entityType = entityType;
this.exceptionTranslator = new MongoExceptionTranslator();
this.writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
this.bulk = initBulkOperation();
}
/**
* Configures the {@link PersistenceExceptionTranslator} to be used. Defaults to {@link MongoExceptionTranslator}.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? new MongoExceptionTranslator() : exceptionTranslator;
}
/**
* Configures the {@link WriteConcernResolver} to be used. Defaults to {@link DefaultWriteConcernResolver}.
*
* @param writeConcernResolver can be {@literal null}.
*/
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) {
this.writeConcernResolver = writeConcernResolver == null ? DefaultWriteConcernResolver.INSTANCE
: writeConcernResolver;
}
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
* @param defaultWriteConcern can be {@literal null}.
*/
public void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
this.defaultWriteConcern = defaultWriteConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.lang.Object)
*/
@Override
public BulkOperations insert(Object document) {
Assert.notNull(document, "Document must not be null!");
bulk.insert((DBObject) mongoOperations.getConverter().convertToMongoType(document));
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.util.List)
*/
@Override
public BulkOperations insert(List<? extends Object> documents) {
Assert.notNull(documents, "Documents must not be null!");
for (Object document : documents) {
insert(document);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateOne(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateOne(Arrays.asList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(java.util.List)
*/
@Override
public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, false);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateMulti(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateMulti(Arrays.asList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(java.util.List)
*/
@Override
public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, true);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
public BulkOperations upsert(Query query, Update update) {
return update(query, update, true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(java.util.List)
*/
@Override
public BulkOperations upsert(List<Pair<Query, Update>> updates) {
for (Pair<Query, Update> update : updates) {
upsert(update.getFirst(), update.getSecond());
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public BulkOperations remove(Query query) {
Assert.notNull(query, "Query must not be null!");
bulk.find(query.getQueryObject()).remove();
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(java.util.List)
*/
@Override
public BulkOperations remove(List<Query> removes) {
Assert.notNull(removes, "Removals must not be null!");
for (Query query : removes) {
remove(query);
}
return this;
}
/*
* (non-Javadoc)
	 * @see org.springframework.data.mongodb.core.BulkOperations#execute()
*/
@Override
public BulkWriteResult execute() {
MongoAction action = new MongoAction(defaultWriteConcern, MongoActionOperation.BULK, collectionName, entityType,
null, null);
WriteConcern writeConcern = writeConcernResolver.resolve(action);
try {
return writeConcern == null ? bulk.execute() : bulk.execute(writeConcern);
} catch (BulkWriteException o_O) {
DataAccessException toThrow = exceptionTranslator.translateExceptionIfPossible(o_O);
throw toThrow == null ? o_O : toThrow;
} finally {
this.bulk = initBulkOperation();
}
}
/**
* Performs update and upsert bulk operations.
*
* @param query the {@link Query} to determine documents to update.
* @param update the {@link Update} to perform, must not be {@literal null}.
* @param upsert whether to upsert.
* @param multi whether to issue a multi-update.
* @return the {@link BulkOperations} with the update registered.
*/
private BulkOperations update(Query query, Update update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
BulkWriteRequestBuilder builder = bulk.find(query.getQueryObject());
if (upsert) {
if (multi) {
builder.upsert().update(update.getUpdateObject());
} else {
builder.upsert().updateOne(update.getUpdateObject());
}
} else {
if (multi) {
builder.update(update.getUpdateObject());
} else {
builder.updateOne(update.getUpdateObject());
}
}
return this;
}
private final BulkWriteOperation initBulkOperation() {
DBCollection collection = mongoOperations.getCollection(collectionName);
switch (bulkMode) {
case ORDERED:
return collection.initializeOrderedBulkOperation();
case UNORDERED:
return collection.initializeUnorderedBulkOperation();
}
throw new IllegalStateException("BulkMode was null!");
}
}
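The private `update(…)` method above dispatches on the `upsert` and `multi` flags to one of four driver calls on the `BulkWriteRequestBuilder`. A tiny plain-Java table of that dispatch, returning the chain of driver methods the implementation would invoke (the helper class is invented for the sketch):

```java
public class UpdateDispatch {

    /**
     * Returns the driver call chain DefaultBulkOperations.update(...)
     * issues for the given flag combination.
     */
    static String dispatch(boolean upsert, boolean multi) {
        if (upsert) {
            return multi ? "upsert().update" : "upsert().updateOne";
        }
        return multi ? "update" : "updateOne";
    }
}
```

This is why `updateOne` and `updateMulti` funnel into the same private method with different flag values, and why `upsert(…)` is simply `update(query, update, true, true)`.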


@@ -1,5 +1,5 @@
 /*
- * Copyright 2011-2014 the original author or authors.
+ * Copyright 2011-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -18,6 +18,8 @@ package org.springframework.data.mongodb.core;
 import static org.springframework.data.domain.Sort.Direction.*;
 import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
 import java.util.List;
 import org.springframework.dao.DataAccessException;
@@ -42,6 +44,7 @@ public class DefaultIndexOperations implements IndexOperations {
 	private static final Double ONE = Double.valueOf(1);
 	private static final Double MINUS_ONE = Double.valueOf(-1);
+	private static final Collection<String> TWO_D_IDENTIFIERS = Arrays.asList("2d", "2dsphere");
 	private final MongoOperations mongoOperations;
 	private final String collectionName;
@@ -70,9 +73,9 @@ public class DefaultIndexOperations implements IndexOperations {
 			public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
 				DBObject indexOptions = indexDefinition.getIndexOptions();
 				if (indexOptions != null) {
-					collection.ensureIndex(indexDefinition.getIndexKeys(), indexOptions);
+					collection.createIndex(indexDefinition.getIndexKeys(), indexOptions);
 				} else {
-					collection.ensureIndex(indexDefinition.getIndexKeys());
+					collection.createIndex(indexDefinition.getIndexKeys());
 				}
 				return null;
 			}
@@ -105,10 +108,12 @@ public class DefaultIndexOperations implements IndexOperations {
 	 * (non-Javadoc)
 	 * @see org.springframework.data.mongodb.core.IndexOperations#resetIndexCache()
 	 */
+	@Deprecated
 	public void resetIndexCache() {
 		mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
 			public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
-				collection.resetIndexCache();
+				ReflectiveDBCollectionInvoker.resetIndexCache(collection);
 				return null;
 			}
 		});
@@ -141,7 +146,7 @@ public class DefaultIndexOperations implements IndexOperations {
 				Object value = keyDbObject.get(key);
-				if ("2d".equals(value)) {
+				if (TWO_D_IDENTIFIERS.contains(value)) {
 					indexFields.add(IndexField.geo(key));
 				} else if ("text".equals(value)) {


@@ -0,0 +1,188 @@
/*
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static java.util.UUID.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.MongoException;
/**
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class DefaultScriptOperations implements ScriptOperations {
private static final String SCRIPT_COLLECTION_NAME = "system.js";
private static final String SCRIPT_NAME_PREFIX = "func_";
private final MongoOperations mongoOperations;
/**
* Creates new {@link DefaultScriptOperations} using given {@link MongoOperations}.
*
* @param mongoOperations must not be {@literal null}.
*/
public DefaultScriptOperations(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.ExecutableMongoScript)
*/
@Override
public NamedMongoScript register(ExecutableMongoScript script) {
return register(new NamedMongoScript(generateScriptName(), script));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.NamedMongoScript)
*/
@Override
public NamedMongoScript register(NamedMongoScript script) {
Assert.notNull(script, "Script must not be null!");
mongoOperations.save(script, SCRIPT_COLLECTION_NAME);
return script;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#execute(org.springframework.data.mongodb.core.script.ExecutableMongoScript, java.lang.Object[])
*/
@Override
public Object execute(final ExecutableMongoScript script, final Object... args) {
Assert.notNull(script, "Script must not be null!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(script.getCode(), convertScriptArgs(args));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#call(java.lang.String, java.lang.Object[])
*/
@Override
public Object call(final String scriptName, final Object... args) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(String.format("%s(%s)", scriptName, convertAndJoinScriptArgs(args)));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#exists(java.lang.String)
*/
@Override
public boolean exists(String scriptName) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.exists(query(where("name").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#getScriptNames()
*/
@Override
public Set<String> getScriptNames() {
List<NamedMongoScript> scripts = mongoOperations.findAll(NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
if (CollectionUtils.isEmpty(scripts)) {
return Collections.emptySet();
}
Set<String> scriptNames = new HashSet<String>();
for (NamedMongoScript script : scripts) {
scriptNames.add(script.getName());
}
return scriptNames;
}
private Object[] convertScriptArgs(Object... args) {
if (ObjectUtils.isEmpty(args)) {
return args;
}
List<Object> convertedValues = new ArrayList<Object>(args.length);
for (Object arg : args) {
convertedValues.add(arg instanceof String ? String.format("'%s'", arg) : this.mongoOperations.getConverter()
.convertToMongoType(arg));
}
return convertedValues.toArray();
}
private String convertAndJoinScriptArgs(Object... args) {
return ObjectUtils.isEmpty(args) ? "" : StringUtils.arrayToCommaDelimitedString(convertScriptArgs(args));
}
/**
 * Generates a valid name for the {@literal JavaScript}. MongoDB requires an id of type String for scripts. Calling
 * scripts having {@link ObjectId} as id fails. Therefore we create a random UUID without {@code -} (as this won't
 * work) and prefix the result with {@link #SCRIPT_NAME_PREFIX}.
 *
 * @return a generated script name, never {@literal null}.
*/
private static String generateScriptName() {
return SCRIPT_NAME_PREFIX + randomUUID().toString().replaceAll("-", "");
}
}
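
The naming and argument-quoting logic above can be exercised without a MongoDB instance. A minimal stand-alone sketch (hypothetical class name, standard library only — not part of Spring Data MongoDB):

```java
import java.util.UUID;

// Hypothetical sketch of generateScriptName() and the String branch of
// convertScriptArgs(...) from DefaultScriptOperations above.
public class ScriptNamingSketch {

	static final String SCRIPT_NAME_PREFIX = "func_";

	// A random UUID without dashes (ObjectId and dashed ids fail as script names),
	// prefixed so the name is a valid identifier.
	static String generateScriptName() {
		return SCRIPT_NAME_PREFIX + UUID.randomUUID().toString().replaceAll("-", "");
	}

	// String arguments are wrapped in single quotes before being embedded into
	// the eval call; other values would be converted by the MongoConverter.
	static Object convertArg(Object arg) {
		return arg instanceof String ? String.format("'%s'", arg) : arg;
	}

	public static void main(String[] args) {
		String name = generateScriptName();
		if (!name.startsWith(SCRIPT_NAME_PREFIX)) throw new AssertionError();
		if (name.contains("-")) throw new AssertionError();
		if (!"'echo'".equals(convertArg("echo"))) throw new AssertionError();
		System.out.println("ok");
	}
}
```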


@@ -1,11 +1,11 @@
 /*
- * Copyright 2011-2014 the original author or authors.
+ * Copyright 2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
  * You may obtain a copy of the License at
  *
  * http://www.apache.org/licenses/LICENSE-2.0
  *
  * Unless required by applicable law or agreed to in writing, software
  * distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,15 +13,20 @@
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
-package org.springframework.data.mongodb.core.geo;
+package org.springframework.data.mongodb.core;
+
+import com.mongodb.WriteConcern;
 
 /**
- * Interface for {@link Metric}s that can be applied to a base scale.
+ * Default {@link WriteConcernResolver} resolving the {@link WriteConcern} from the given {@link MongoAction}.
  *
- * @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
- *             removed in the next major release.
  * @author Oliver Gierke
+ * @author Thomas Darimont
  */
-@Deprecated
-public interface Metric extends org.springframework.data.geo.Metric {}
+enum DefaultWriteConcernResolver implements WriteConcernResolver {
+
+	INSTANCE;
+
+	public WriteConcern resolve(MongoAction action) {
+		return action.getDefaultWriteConcern();
+	}
+}


@@ -0,0 +1,72 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Value object to mitigate different representations of geo command execution results in MongoDB.
*
* @author Oliver Gierke
* @soundtrack Fruitcake - Jeff Coffin (The Inside of the Outside)
*/
class GeoCommandStatistics {
private static final GeoCommandStatistics NONE = new GeoCommandStatistics(new BasicDBObject());
private final DBObject source;
/**
* Creates a new {@link GeoCommandStatistics} instance with the given source document.
*
* @param source must not be {@literal null}.
*/
private GeoCommandStatistics(DBObject source) {
Assert.notNull(source, "Source document must not be null!");
this.source = source;
}
/**
* Creates a new {@link GeoCommandStatistics} from the given command result extracting the statistics.
*
* @param commandResult must not be {@literal null}.
 * @return the statistics, never {@literal null}.
*/
public static GeoCommandStatistics from(DBObject commandResult) {
Assert.notNull(commandResult, "Command result must not be null!");
Object stats = commandResult.get("stats");
return stats == null ? NONE : new GeoCommandStatistics((DBObject) stats);
}
/**
 * Returns the average distance reported by the command result. Mitigates the removal of the field (introduced in
 * MongoDB 3.2 RC1) in case the command didn't return any results.
 *
 * @return the average distance, or {@link Double#NaN} if the field is not present.
 * @see <a href="https://jira.mongodb.org/browse/SERVER-21024">SERVER-21024</a>
 */
public double getAverageDistance() {
Object averageDistance = source.get("avgDistance");
return averageDistance == null ? Double.NaN : (Double) averageDistance;
}
}
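
The NaN fallback in getAverageDistance() above can be sketched without the MongoDB driver, using a plain Map in place of the DBObject (hypothetical class name, assumption for illustration only):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the avgDistance fallback in GeoCommandStatistics above.
public class GeoStatsSketch {

	// Returns the reported average distance, or NaN when the server omitted the
	// field (as MongoDB 3.2 RC1 does when the command returned no results).
	static double averageDistance(Map<String, Object> stats) {
		Object avg = stats.get("avgDistance");
		return avg == null ? Double.NaN : ((Number) avg).doubleValue();
	}

	public static void main(String[] args) {
		Map<String, Object> stats = new HashMap<>();
		if (!Double.isNaN(averageDistance(stats))) throw new AssertionError();
		stats.put("avgDistance", 2.5);
		if (averageDistance(stats) != 2.5) throw new AssertionError();
		System.out.println("ok");
	}
}
```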


@@ -0,0 +1,34 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.geo.GeoJsonModule;
import org.springframework.data.web.config.SpringDataWebConfigurationMixin;
/**
* Configuration class to expose {@link GeoJsonModule} as a Spring bean.
*
* @author Oliver Gierke
*/
@SpringDataWebConfigurationMixin
public class GeoJsonConfiguration {
@Bean
public GeoJsonModule geoJsonModule() {
return new GeoJsonModule();
}
}


@@ -1,5 +1,5 @@
 /*
- * Copyright 2011 the original author or authors.
+ * Copyright 2011-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -20,12 +20,12 @@ import java.util.List;
 import org.springframework.data.mongodb.core.index.IndexDefinition;
 import org.springframework.data.mongodb.core.index.IndexInfo;
 
 /**
  * Index operations on a collection.
  *
  * @author Mark Pollack
  * @author Oliver Gierke
+ * @author Christoph Strobl
  */
 public interface IndexOperations {
@@ -51,7 +51,11 @@ public interface IndexOperations {
 	/**
 	 * Clears all indices that have not yet been applied to this collection.
+	 *
+	 * @deprecated since 1.7. The MongoDB Java driver version 3.0 no longer supports resetting the index cache.
+	 * @throws {@link UnsupportedOperationException} when used with MongoDB Java driver version 3.0.
 	 */
+	@Deprecated
 	void resetIndexCache();
 
 	/**


@@ -49,7 +49,7 @@ public class MongoAction {
 	 * @param collectionName the collection name, must not be {@literal null} or empty.
 	 * @param entityType the POJO that is being operated against
 	 * @param document the converted DBObject from the POJO or Spring Update object
-	 * @param query the converted DBOjbect from the Spring Query object
+	 * @param query the converted DBObject from the Spring Query object
 	 */
 	public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
 			String collectionName, Class<?> entityType, DBObject document, DBObject query) {


@@ -1,5 +1,5 @@
 /*
- * Copyright 2011-2012 the original author or authors.
+ * Copyright 2011-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -25,5 +25,5 @@ package org.springframework.data.mongodb.core;
  */
 public enum MongoActionOperation {
-	REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE
+	REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE, BULK;
 }


@@ -0,0 +1,189 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.net.UnknownHostException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
/**
* Convenient factory for configuring MongoDB.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private MongoClientOptions mongoClientOptions;
private String host;
private Integer port;
private List<ServerAddress> replicaSetSeeds;
private List<MongoCredential> credentials;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/**
* Set the {@link MongoClientOptions} to be used when creating {@link MongoClient}.
*
* @param mongoClientOptions
*/
public void setMongoClientOptions(MongoClientOptions mongoClientOptions) {
this.mongoClientOptions = mongoClientOptions;
}
/**
* Set the list of credentials to be used when creating {@link MongoClient}.
*
* @param credentials can be {@literal null}.
*/
public void setCredentials(MongoCredential[] credentials) {
this.credentials = filterNonNullElementsAsList(credentials);
}
/**
* Set the list of {@link ServerAddress} to build up a replica set for.
*
* @param replicaSetSeeds can be {@literal null}.
*/
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
}
/**
* Configures the host to connect to.
*
* @param host
*/
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends Mongo> getObjectType() {
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected Mongo createInstance() throws Exception {
if (mongoClientOptions == null) {
mongoClientOptions = MongoClientOptions.builder().build();
}
if (credentials == null) {
credentials = Collections.emptyList();
}
return createMongoClient();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override
protected void destroyInstance(Mongo instance) throws Exception {
instance.close();
}
private MongoClient createMongoClient() throws UnknownHostException {
if (!CollectionUtils.isEmpty(replicaSetSeeds)) {
return new MongoClient(replicaSetSeeds, credentials, mongoClientOptions);
}
return new MongoClient(createConfiguredOrDefaultServerAddress(), credentials, mongoClientOptions);
}
private ServerAddress createConfiguredOrDefaultServerAddress() throws UnknownHostException {
ServerAddress defaultAddress = new ServerAddress();
return new ServerAddress(StringUtils.hasText(host) ? host : defaultAddress.getHost(),
port != null ? port.intValue() : defaultAddress.getPort());
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
 * @param elements the elements to filter, can be {@literal null}.
 * @return a new unmodifiable {@link List} from the given elements without {@literal null}s.
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
}
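
The filterNonNullElementsAsList(...) helper above is pure collection logic and can be sketched stand-alone (hypothetical class name, standard library only):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Hypothetical stand-alone version of filterNonNullElementsAsList(...) from
// MongoClientFactoryBean above: null array becomes an empty list, null
// elements are dropped, the result is unmodifiable.
public class NullFilterSketch {

	static <T> List<T> filterNonNullElementsAsList(T[] elements) {

		if (elements == null) {
			return Collections.emptyList();
		}

		List<T> candidateElements = new ArrayList<T>();
		for (T element : elements) {
			if (element != null) {
				candidateElements.add(element);
			}
		}

		return Collections.unmodifiableList(candidateElements);
	}

	public static void main(String[] args) {
		List<String> result = filterNonNullElementsAsList(new String[] { "a", null, "b" });
		if (result.size() != 2 || !"a".equals(result.get(0))) throw new AssertionError();
		if (!filterNonNullElementsAsList((String[]) null).isEmpty()) throw new AssertionError();
		System.out.println("ok");
	}
}
```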


@@ -0,0 +1,295 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import javax.net.SocketFactory;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.MongoDbFactory;
import com.mongodb.DBDecoderFactory;
import com.mongodb.DBEncoderFactory;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
/**
* A factory bean for construction of a {@link MongoClientOptions} instance.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClientOptions> {
private static final MongoClientOptions DEFAULT_MONGO_OPTIONS = MongoClientOptions.builder().build();
private String description = DEFAULT_MONGO_OPTIONS.getDescription();
private int minConnectionsPerHost = DEFAULT_MONGO_OPTIONS.getMinConnectionsPerHost();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int maxConnectionIdleTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionIdleTime();
private int maxConnectionLifeTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionLifeTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private ReadPreference readPreference = DEFAULT_MONGO_OPTIONS.getReadPreference();
private DBDecoderFactory dbDecoderFactory = DEFAULT_MONGO_OPTIONS.getDbDecoderFactory();
private DBEncoderFactory dbEncoderFactory = DEFAULT_MONGO_OPTIONS.getDbEncoderFactory();
private WriteConcern writeConcern = DEFAULT_MONGO_OPTIONS.getWriteConcern();
private SocketFactory socketFactory = DEFAULT_MONGO_OPTIONS.getSocketFactory();
private boolean cursorFinalizerEnabled = DEFAULT_MONGO_OPTIONS.isCursorFinalizerEnabled();
private boolean alwaysUseMBeans = DEFAULT_MONGO_OPTIONS.isAlwaysUseMBeans();
private int heartbeatFrequency = DEFAULT_MONGO_OPTIONS.getHeartbeatFrequency();
private int minHeartbeatFrequency = DEFAULT_MONGO_OPTIONS.getMinHeartbeatFrequency();
private int heartbeatConnectTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatConnectTimeout();
private int heartbeatSocketTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatSocketTimeout();
private String requiredReplicaSetName = DEFAULT_MONGO_OPTIONS.getRequiredReplicaSetName();
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
/**
* Set the {@link MongoClient} description.
*
* @param description
*/
public void setDescription(String description) {
this.description = description;
}
/**
* Set the minimum number of connections per host.
*
* @param minConnectionsPerHost
*/
public void setMinConnectionsPerHost(int minConnectionsPerHost) {
this.minConnectionsPerHost = minConnectionsPerHost;
}
/**
 * Set the number of connections allowed per host. Threads will block once the pool is exhausted. Default is 10. The
 * system property {@code MONGO.POOLSIZE} can override it.
*
* @param connectionsPerHost
*/
public void setConnectionsPerHost(int connectionsPerHost) {
this.connectionsPerHost = connectionsPerHost;
}
/**
 * Set the multiplier for connectionsPerHost for the number of threads that can block waiting for a connection.
 * Default is 5. If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5, then up to 50
 * threads can wait for a connection; beyond that an exception will be thrown.
*
* @param threadsAllowedToBlockForConnectionMultiplier
*/
public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
this.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
}
/**
 * Set the max wait time of a blocking thread for a connection. Default is 120,000 ms (2 minutes).
*
* @param maxWaitTime
*/
public void setMaxWaitTime(int maxWaitTime) {
this.maxWaitTime = maxWaitTime;
}
/**
* The maximum idle time for a pooled connection.
*
* @param maxConnectionIdleTime
*/
public void setMaxConnectionIdleTime(int maxConnectionIdleTime) {
this.maxConnectionIdleTime = maxConnectionIdleTime;
}
/**
* Set the maximum life time for a pooled connection.
*
* @param maxConnectionLifeTime
*/
public void setMaxConnectionLifeTime(int maxConnectionLifeTime) {
this.maxConnectionLifeTime = maxConnectionLifeTime;
}
/**
* Set the connect timeout in milliseconds. 0 is default and infinite.
*
* @param connectTimeout
*/
public void setConnectTimeout(int connectTimeout) {
this.connectTimeout = connectTimeout;
}
/**
* Set the socket timeout. 0 is default and infinite.
*
* @param socketTimeout
*/
public void setSocketTimeout(int socketTimeout) {
this.socketTimeout = socketTimeout;
}
/**
* Set the keep alive flag, controls whether or not to have socket keep alive timeout. Defaults to false.
*
* @param socketKeepAlive
*/
public void setSocketKeepAlive(boolean socketKeepAlive) {
this.socketKeepAlive = socketKeepAlive;
}
/**
* Set the {@link ReadPreference}.
*
* @param readPreference
*/
public void setReadPreference(ReadPreference readPreference) {
this.readPreference = readPreference;
}
/**
* Set the {@link WriteConcern} that will be the default value used when asking the {@link MongoDbFactory} for a DB
* object.
*
* @param writeConcern
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/**
 * Set the {@link SocketFactory} to use.
 *
 * @param socketFactory
 */
public void setSocketFactory(SocketFactory socketFactory) {
this.socketFactory = socketFactory;
}
/**
* Set the frequency that the driver will attempt to determine the current state of each server in the cluster.
*
* @param heartbeatFrequency
*/
public void setHeartbeatFrequency(int heartbeatFrequency) {
this.heartbeatFrequency = heartbeatFrequency;
}
/**
* In the event that the driver has to frequently re-check a server's availability, it will wait at least this long
* since the previous check to avoid wasted effort.
*
* @param minHeartbeatFrequency
*/
public void setMinHeartbeatFrequency(int minHeartbeatFrequency) {
this.minHeartbeatFrequency = minHeartbeatFrequency;
}
/**
* Set the connect timeout for connections used for the cluster heartbeat.
*
* @param heartbeatConnectTimeout
*/
public void setHeartbeatConnectTimeout(int heartbeatConnectTimeout) {
this.heartbeatConnectTimeout = heartbeatConnectTimeout;
}
/**
* Set the socket timeout for connections used for the cluster heartbeat.
*
* @param heartbeatSocketTimeout
*/
public void setHeartbeatSocketTimeout(int heartbeatSocketTimeout) {
this.heartbeatSocketTimeout = heartbeatSocketTimeout;
}
/**
* Configures the name of the replica set.
*
* @param requiredReplicaSetName
*/
public void setRequiredReplicaSetName(String requiredReplicaSetName) {
this.requiredReplicaSetName = requiredReplicaSetName;
}
/**
 * This controls whether the driver should use an SSL connection. Defaults to {@literal false}.
*
* @param ssl
*/
public void setSsl(boolean ssl) {
this.ssl = ssl;
}
/**
* Set the {@link SSLSocketFactory} to use for the {@literal SSL} connection. If none is configured here,
* {@link SSLSocketFactory#getDefault()} will be used.
*
* @param sslSocketFactory
*/
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
this.sslSocketFactory = sslSocketFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoClientOptions createInstance() throws Exception {
SocketFactory socketFactoryToUse = ssl ? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory
.getDefault()) : this.socketFactory;
return MongoClientOptions.builder() //
.alwaysUseMBeans(this.alwaysUseMBeans) //
.connectionsPerHost(this.connectionsPerHost) //
.connectTimeout(connectTimeout) //
.cursorFinalizerEnabled(cursorFinalizerEnabled) //
.dbDecoderFactory(dbDecoderFactory) //
.dbEncoderFactory(dbEncoderFactory) //
.description(description) //
.heartbeatConnectTimeout(heartbeatConnectTimeout) //
.heartbeatFrequency(heartbeatFrequency) //
.heartbeatSocketTimeout(heartbeatSocketTimeout) //
.maxConnectionIdleTime(maxConnectionIdleTime) //
.maxConnectionLifeTime(maxConnectionLifeTime) //
.maxWaitTime(maxWaitTime) //
.minConnectionsPerHost(minConnectionsPerHost) //
.minHeartbeatFrequency(minHeartbeatFrequency) //
.readPreference(readPreference) //
.requiredReplicaSetName(requiredReplicaSetName) //
.socketFactory(socketFactoryToUse) //
.socketKeepAlive(socketKeepAlive) //
.socketTimeout(socketTimeout) //
.threadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier) //
.writeConcern(writeConcern).build();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<?> getObjectType() {
return MongoClientOptions.class;
}
}
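
The socket-factory selection in createInstance() above is worth isolating: an explicit SSLSocketFactory wins when ssl is enabled, the JVM default SSL factory is the fallback, and the plain factory is used otherwise. A minimal sketch (hypothetical class name, standard library only):

```java
import javax.net.SocketFactory;
import javax.net.ssl.SSLSocketFactory;

// Hypothetical sketch of the socketFactoryToUse expression in
// MongoClientOptionsFactoryBean.createInstance() above.
public class SocketFactorySelectionSketch {

	static SocketFactory select(boolean ssl, SSLSocketFactory sslSocketFactory, SocketFactory plain) {
		// ssl enabled: explicit SSL factory if configured, else the JVM default;
		// ssl disabled: whatever plain factory was configured.
		return ssl ? (sslSocketFactory != null ? sslSocketFactory : (SocketFactory) SSLSocketFactory.getDefault())
				: plain;
	}

	public static void main(String[] args) {
		SocketFactory plain = SocketFactory.getDefault();
		if (select(false, null, plain) != plain) throw new AssertionError();
		if (!(select(true, null, plain) instanceof SSLSocketFactory)) throw new AssertionError();
		System.out.println("ok");
	}
}
```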


@@ -1,5 +1,5 @@
/* /*
* Copyright 2010-2013 the original author or authors. * Copyright 2010-2015 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -18,12 +18,13 @@ package org.springframework.data.mongodb.core;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import org.springframework.data.authentication.UserCredentials; import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException; import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.transaction.support.TransactionSynchronizationManager; import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import com.mongodb.DB; import com.mongodb.DB;
import com.mongodb.Mongo; import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/** /**
* Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the * Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the
@@ -34,6 +35,7 @@ import com.mongodb.Mongo;
* @author Oliver Gierke * @author Oliver Gierke
* @author Randy Watler * @author Randy Watler
* @author Thomas Darimont * @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0 * @since 1.0
*/ */
public abstract class MongoDbUtils { public abstract class MongoDbUtils {
@@ -43,9 +45,7 @@ public abstract class MongoDbUtils {
/** /**
* Private constructor to prevent instantiation. * Private constructor to prevent instantiation.
*/ */
private MongoDbUtils() { private MongoDbUtils() {}
}
/** /**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name * Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
@@ -65,11 +65,24 @@ public abstract class MongoDbUtils {
* @param databaseName the database name, must not be {@literal null} or empty. * @param databaseName the database name, must not be {@literal null} or empty.
* @param credentials the credentials to use, must not be {@literal null}. * @param credentials the credentials to use, must not be {@literal null}.
* @return the {@link DB} connection * @return the {@link DB} connection
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/ */
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials) { public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials) {
return getDB(mongo, databaseName, credentials, databaseName); return getDB(mongo, databaseName, credentials, databaseName);
} }
/**
* @param mongo
* @param databaseName
* @param credentials
* @param authenticationDatabaseName
* @return
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
 	public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials,
 			String authenticationDatabaseName) {
@@ -109,22 +122,9 @@ public abstract class MongoDbUtils {
 		LOGGER.debug("Getting Mongo Database name=[{}]", databaseName);
 		DB db = mongo.getDB(databaseName);
-		boolean credentialsGiven = credentials.hasUsername() && credentials.hasPassword();
-		DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
-
-		synchronized (authDb) {
-
-			if (credentialsGiven && !authDb.isAuthenticated()) {
-
-				String username = credentials.getUsername();
-				String password = credentials.hasPassword() ? credentials.getPassword() : null;
-
-				if (!authDb.authenticate(username, password == null ? null : password.toCharArray())) {
-					throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
-							+ credentials.toString(), databaseName, credentials);
-				}
-			}
+		if (!(mongo instanceof MongoClient) && requiresAuthDbAuthentication(credentials)) {
+			ReflectiveDbInvoker.authenticate(mongo, db, credentials, authenticationDatabaseName);
 		}
 
 		// TX sync active, bind new database to thread
@@ -181,16 +181,36 @@ public abstract class MongoDbUtils {
 	 * Perform actual closing of the Mongo DB object, catching and logging any cleanup exceptions thrown.
 	 *
 	 * @param db the DB to close (may be <code>null</code>)
+	 * @deprecated since 1.7. The main use case for this method is to ensure that applications can read their own
+	 *             unacknowledged writes, but this is no longer so prevalent since the MongoDB Java driver version 3
+	 *             started defaulting to acknowledged writes.
 	 */
+	@Deprecated
 	public static void closeDB(DB db) {
 
 		if (db != null) {
 			LOGGER.debug("Closing Mongo DB object");
 			try {
-				db.requestDone();
+				ReflectiveDbInvoker.requestDone(db);
 			} catch (Throwable ex) {
 				LOGGER.debug("Unexpected exception on closing Mongo DB object", ex);
 			}
 		}
 	}
+
+	/**
+	 * Check if credentials are present. When using mongo-java-driver version 3 or above there is no need for explicit
+	 * authentication, as the auth data has to be provided within the {@code MongoClient}.
+	 *
+	 * @param credentials
+	 * @return
+	 */
+	private static boolean requiresAuthDbAuthentication(UserCredentials credentials) {
+
+		if (credentials == null || !credentials.hasUsername()) {
+			return false;
+		}
+
+		return !MongoClientVersion.isMongo3Driver();
+	}
 }
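The hunk above gates legacy authentication on two conditions: credentials actually carry a username, and the application still runs a pre-3.0 driver (with driver 3, auth data travels inside the `MongoClient` itself). A minimal, self-contained sketch of that decision logic, where `UserCredentials` and the `isMongo3Driver` flag are simplified stand-ins for the real Spring Data classes:

```java
// Sketch of the auth-gating decision in MongoDbUtils.getDB(...).
// UserCredentials and the driver-version probe are simplified stand-ins,
// not the real Spring Data MongoDB classes.
public class AuthGateSketch {

    static class UserCredentials {
        final String username;
        final String password;

        UserCredentials(String username, String password) {
            this.username = username;
            this.password = password;
        }

        boolean hasUsername() {
            return username != null && !username.isEmpty();
        }
    }

    // In the real code this is MongoClientVersion.isMongo3Driver(), detected
    // via classpath inspection; here it is a plain flag for demonstration.
    static boolean isMongo3Driver = false;

    // Legacy authentication is only required when credentials carry a
    // username AND we are still on a pre-3.0 driver.
    static boolean requiresAuthDbAuthentication(UserCredentials credentials) {
        if (credentials == null || !credentials.hasUsername()) {
            return false;
        }
        return !isMongo3Driver;
    }

    public static void main(String[] args) {
        UserCredentials creds = new UserCredentials("admin", "secret");

        isMongo3Driver = false;
        System.out.println(requiresAuthDbAuthentication(creds)); // legacy driver: authenticate
        System.out.println(requiresAuthDbAuthentication(null));  // no credentials: skip

        isMongo3Driver = true;
        System.out.println(requiresAuthDbAuthentication(creds)); // driver 3: MongoClient handles auth
    }
}
```

The same shape explains why the real method short-circuits before ever touching the driver version: absent credentials never require authentication, regardless of driver.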

View File

@@ -1,5 +1,5 @@
 /*
- * Copyright 2010-2013 the original author or authors.
+ * Copyright 2010-2015 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -15,23 +15,25 @@
 */
 package org.springframework.data.mongodb.core;
 
+import java.util.Arrays;
+import java.util.HashSet;
+import java.util.Set;
+
 import org.springframework.dao.DataAccessException;
 import org.springframework.dao.DataAccessResourceFailureException;
+import org.springframework.dao.DataIntegrityViolationException;
 import org.springframework.dao.DuplicateKeyException;
 import org.springframework.dao.InvalidDataAccessApiUsageException;
 import org.springframework.dao.InvalidDataAccessResourceUsageException;
+import org.springframework.dao.PermissionDeniedDataAccessException;
 import org.springframework.dao.support.PersistenceExceptionTranslator;
+import org.springframework.data.mongodb.BulkOperationException;
 import org.springframework.data.mongodb.UncategorizedMongoDbException;
+import org.springframework.data.mongodb.util.MongoDbErrorCodes;
+import org.springframework.util.ClassUtils;
 
-import com.mongodb.MongoCursorNotFoundException;
+import com.mongodb.BulkWriteException;
 import com.mongodb.MongoException;
-import com.mongodb.MongoException.CursorNotFound;
-import com.mongodb.MongoException.DuplicateKey;
-import com.mongodb.MongoException.Network;
-import com.mongodb.MongoInternalException;
-import com.mongodb.MongoServerSelectionException;
-import com.mongodb.MongoSocketException;
-import com.mongodb.MongoTimeoutException;
 
 /**
 * Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
@@ -40,9 +42,23 @@ import com.mongodb.MongoTimeoutException;
 *
 * @author Oliver Gierke
 * @author Michal Vich
+ * @author Christoph Strobl
 */
 public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
 
+	private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(
+			Arrays.asList("MongoException.DuplicateKey", "DuplicateKeyException"));
+
+	private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(
+			Arrays.asList("MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
+					"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
+
+	private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<String>(
+			Arrays.asList("MongoInternalException"));
+
+	private static final Set<String> DATA_INTEGRETY_EXCEPTIONS = new HashSet<String>(
+			Arrays.asList("WriteConcernException"));
+
 	/*
 	 * (non-Javadoc)
 	 * @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
@@ -51,41 +67,42 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
 		// Check for well-known MongoException subclasses.
 
+		String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
+
-		if (ex instanceof DuplicateKey || ex instanceof DuplicateKeyException) {
+		if (DULICATE_KEY_EXCEPTIONS.contains(exception)) {
 			return new DuplicateKeyException(ex.getMessage(), ex);
 		}
 
-		if (ex instanceof Network || ex instanceof MongoSocketException) {
+		if (RESOURCE_FAILURE_EXCEPTIONS.contains(exception)) {
 			return new DataAccessResourceFailureException(ex.getMessage(), ex);
 		}
 
-		if (ex instanceof CursorNotFound || ex instanceof MongoCursorNotFoundException) {
-			return new DataAccessResourceFailureException(ex.getMessage(), ex);
-		}
-
-		if (ex instanceof MongoServerSelectionException) {
-			return new DataAccessResourceFailureException(ex.getMessage(), ex);
-		}
-
-		if (ex instanceof MongoTimeoutException) {
-			return new DataAccessResourceFailureException(ex.getMessage(), ex);
-		}
-
-		if (ex instanceof MongoInternalException) {
+		if (RESOURCE_USAGE_EXCEPTIONS.contains(exception)) {
 			return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
 		}
 
+		if (DATA_INTEGRETY_EXCEPTIONS.contains(exception)) {
+			return new DataIntegrityViolationException(ex.getMessage(), ex);
+		}
+
+		if (ex instanceof BulkWriteException) {
+			return new BulkOperationException(ex.getMessage(), (BulkWriteException) ex);
+		}
+
 		// All other MongoExceptions
 		if (ex instanceof MongoException) {
 
 			int code = ((MongoException) ex).getCode();
 
-			if (code == 11000 || code == 11001) {
+			if (MongoDbErrorCodes.isDuplicateKeyCode(code)) {
 				throw new DuplicateKeyException(ex.getMessage(), ex);
-			} else if (code == 12000 || code == 13440) {
+			} else if (MongoDbErrorCodes.isDataAccessResourceFailureCode(code)) {
 				throw new DataAccessResourceFailureException(ex.getMessage(), ex);
-			} else if (code == 10003 || code == 12001 || code == 12010 || code == 12011 || code == 12012) {
+			} else if (MongoDbErrorCodes.isInvalidDataAccessApiUsageCode(code) || code == 10003 || code == 12001
+					|| code == 12010 || code == 12011 || code == 12012) {
 				throw new InvalidDataAccessApiUsageException(ex.getMessage(), ex);
+			} else if (MongoDbErrorCodes.isPermissionDeniedCode(code)) {
+				throw new PermissionDeniedDataAccessException(ex.getMessage(), ex);
 			}
 
 			return new UncategorizedMongoDbException(ex.getMessage(), ex);
 		}
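The refactoring above replaces long `instanceof` chains with set lookups keyed on the exception's short class name (obtained via `ClassUtils.getShortName`), which decouples the translator from driver classes that may not be on the classpath. A small self-contained sketch of that dispatch style, with a hand-rolled `shortName` standing in for Spring's `ClassUtils` and a hypothetical exception class whose name alone drives categorization:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Hypothetical stand-in for a driver exception; only the class NAME matters
// to the lookup below.
class MongoSocketException extends RuntimeException {}

// Sketch of the name-based dispatch MongoExceptionTranslator switches to:
// look up the exception's short class name in category sets instead of
// chaining instanceof checks against concrete driver types.
public class TranslatorSketch {

    static final Set<String> DUPLICATE_KEY = new HashSet<>(
            Arrays.asList("MongoException.DuplicateKey", "DuplicateKeyException"));
    static final Set<String> RESOURCE_FAILURE = new HashSet<>(
            Arrays.asList("MongoSocketException", "MongoTimeoutException"));

    // Simplified stand-in for ClassUtils.getShortName:
    // "com.mongodb.MongoTimeoutException" -> "MongoTimeoutException";
    // nested classes come out as "Outer.Inner", matching keys like
    // "MongoException.DuplicateKey".
    static String shortName(Class<?> type) {
        String name = type.getName();
        return name.substring(name.lastIndexOf('.') + 1).replace('$', '.');
    }

    static String categorize(RuntimeException ex) {
        String exception = shortName(ex.getClass());
        if (DUPLICATE_KEY.contains(exception)) {
            return "DuplicateKeyException";
        }
        if (RESOURCE_FAILURE.contains(exception)) {
            return "DataAccessResourceFailureException";
        }
        return "UncategorizedMongoDbException"; // fallback category
    }
}
```

The design choice here is that name-based lookup tolerates classes removed between driver generations (e.g. `MongoException.Network` versus `MongoSocketException`): both names can sit in the same set without compile-time dependencies on either.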

View File

@@ -1,5 +1,5 @@
 /*
- * Copyright 2010-2013 the original author or authors.
+ * Copyright 2010-2015 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -20,9 +20,7 @@ import java.util.Collection;
 import java.util.Collections;
 import java.util.List;
 
-import org.springframework.beans.factory.DisposableBean;
-import org.springframework.beans.factory.FactoryBean;
-import org.springframework.beans.factory.InitializingBean;
+import org.springframework.beans.factory.config.AbstractFactoryBean;
 import org.springframework.dao.DataAccessException;
 import org.springframework.dao.support.PersistenceExceptionTranslator;
 import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
@@ -40,12 +38,14 @@ import com.mongodb.WriteConcern;
 * @author Graeme Rocher
 * @author Oliver Gierke
 * @author Thomas Darimont
+ * @author Christoph Strobl
 * @since 1.0
+ * @deprecated since 1.7. Please use {@link MongoClientFactoryBean} instead.
 */
-public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, DisposableBean,
-		PersistenceExceptionTranslator {
+@Deprecated
+public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
 
-	private Mongo mongo;
+	private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
 
 	private MongoOptions mongoOptions;
 	private String host;
@@ -53,9 +53,11 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
 	private WriteConcern writeConcern;
 	private List<ServerAddress> replicaSetSeeds;
 	private List<ServerAddress> replicaPair;
+	private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
 
-	private PersistenceExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
-
+	/**
+	 * @param mongoOptions
+	 */
 	public void setMongoOptions(MongoOptions mongoOptions) {
 		this.mongoOptions = mongoOptions;
 	}
@@ -66,7 +68,6 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
 	/**
 	 * @deprecated use {@link #setReplicaSetSeeds(ServerAddress[])} instead
-	 *
 	 * @param replicaPair
 	 */
 	@Deprecated
@@ -75,30 +76,19 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
 	}
 
 	/**
-	 * @param elements the elements to filter <T>
-	 * @return a new unmodifiable {@link List#} from the given elements without nulls
+	 * Configures the host to connect to.
+	 *
+	 * @param host
 	 */
-	private <T> List<T> filterNonNullElementsAsList(T[] elements) {
-
-		if (elements == null) {
-			return Collections.emptyList();
-		}
-
-		List<T> candidateElements = new ArrayList<T>();
-
-		for (T element : elements) {
-			if (element != null) {
-				candidateElements.add(element);
-			}
-		}
-
-		return Collections.unmodifiableList(candidateElements);
-	}
-
 	public void setHost(String host) {
 		this.host = host;
 	}
 
+	/**
+	 * Configures the port to connect to.
+	 *
+	 * @param port
+	 */
 	public void setPort(int port) {
 		this.port = port;
 	}
@@ -112,12 +102,13 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
 		this.writeConcern = writeConcern;
 	}
 
+	/**
+	 * Configures the {@link PersistenceExceptionTranslator} to use.
+	 *
+	 * @param exceptionTranslator can be {@literal null}.
+	 */
 	public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
-		this.exceptionTranslator = exceptionTranslator;
+		this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
 	}
-
-	public Mongo getObject() throws Exception {
-		return mongo;
-	}
 
 	/*
@@ -128,14 +119,6 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
 		return Mongo.class;
 	}
 
-	/*
-	 * (non-Javadoc)
-	 * @see org.springframework.beans.factory.FactoryBean#isSingleton()
-	 */
-	public boolean isSingleton() {
-		return true;
-	}
-
 	/*
 	 * (non-Javadoc)
 	 * @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
@@ -146,10 +129,10 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
 	/*
 	 * (non-Javadoc)
-	 * @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
+	 * @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
 	 */
-	@SuppressWarnings("deprecation")
-	public void afterPropertiesSet() throws Exception {
+	@Override
+	protected Mongo createInstance() throws Exception {
 
 		Mongo mongo;
 		ServerAddress defaultOptions = new ServerAddress();
@@ -175,18 +158,42 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
 			mongo.setWriteConcern(writeConcern);
 		}
 
-		this.mongo = mongo;
+		return mongo;
 	}
-
-	private boolean isNullOrEmpty(Collection<?> elements) {
-		return elements == null || elements.isEmpty();
-	}
 
 	/*
 	 * (non-Javadoc)
-	 * @see org.springframework.beans.factory.DisposableBean#destroy()
+	 * @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
 	 */
-	public void destroy() throws Exception {
-		this.mongo.close();
+	@Override
+	protected void destroyInstance(Mongo mongo) throws Exception {
+		mongo.close();
 	}
+
+	private static boolean isNullOrEmpty(Collection<?> elements) {
+		return elements == null || elements.isEmpty();
+	}
+
+	/**
+	 * Returns the given array as {@link List} with all {@literal null} elements removed.
+	 *
+	 * @param elements the elements to filter <T>
+	 * @return a new unmodifiable {@link List} from the given elements without nulls
+	 */
+	private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
+
+		if (elements == null) {
+			return Collections.emptyList();
+		}
+
+		List<T> candidateElements = new ArrayList<T>();
+
+		for (T element : elements) {
+			if (element != null) {
+				candidateElements.add(element);
+			}
+		}
+
+		return Collections.unmodifiableList(candidateElements);
+	}
 }
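The rewrite above moves `MongoFactoryBean` off hand-rolled `FactoryBean`/`InitializingBean`/`DisposableBean` plumbing onto `AbstractFactoryBean`, shrinking the subclass to `createInstance()`/`destroyInstance(...)`. A minimal sketch of that template-method shape, using plain stand-in classes rather than Spring's actual hierarchy (the real `AbstractFactoryBean` creates the singleton eagerly on initialization; this sketch simplifies to lazy creation):

```java
// Sketch of the AbstractFactoryBean template pattern the diff migrates to.
// The base class is a simplified stand-in for
// org.springframework.beans.factory.config.AbstractFactoryBean.
abstract class SingletonFactorySketch<T> {

    private T instance;

    // Caches the singleton, mirroring the afterPropertiesSet()/getObject() pair
    // the old code managed by hand.
    public final T getObject() throws Exception {
        if (instance == null) {
            instance = createInstance();
        }
        return instance;
    }

    // Mirrors DisposableBean#destroy() delegating to destroyInstance(...).
    public final void destroy() throws Exception {
        if (instance != null) {
            destroyInstance(instance);
            instance = null;
        }
    }

    protected abstract T createInstance() throws Exception;

    protected void destroyInstance(T instance) throws Exception {}
}

// Hypothetical subclass; StringBuilder stands in for the Mongo instance.
class ResourceFactory extends SingletonFactorySketch<StringBuilder> {

    boolean destroyed = false;

    @Override
    protected StringBuilder createInstance() {
        return new StringBuilder("connected");
    }

    @Override
    protected void destroyInstance(StringBuilder instance) {
        destroyed = true; // analogous to mongo.close()
    }
}
```

The payoff visible in the diff is exactly this: `getObject()`, `isSingleton()`, and the lifecycle bookkeeping disappear from the subclass, leaving only the Mongo-specific creation and close logic.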

View File

@@ -1,5 +1,5 @@
 /*
- * Copyright 2011-2014 the original author or authors.
+ * Copyright 2011-2015 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -19,11 +19,12 @@ import java.util.Collection;
 import java.util.List;
 import java.util.Set;
 
+import org.springframework.data.geo.GeoResults;
+import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
 import org.springframework.data.mongodb.core.aggregation.Aggregation;
 import org.springframework.data.mongodb.core.aggregation.AggregationResults;
 import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
 import org.springframework.data.mongodb.core.convert.MongoConverter;
-import org.springframework.data.mongodb.core.geo.GeoResults;
 import org.springframework.data.mongodb.core.mapreduce.GroupBy;
 import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
 import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -33,10 +34,14 @@ import org.springframework.data.mongodb.core.query.Criteria;
 import org.springframework.data.mongodb.core.query.NearQuery;
 import org.springframework.data.mongodb.core.query.Query;
 import org.springframework.data.mongodb.core.query.Update;
+import org.springframework.data.util.CloseableIterator;
 
 import com.mongodb.CommandResult;
+import com.mongodb.Cursor;
+import com.mongodb.DB;
 import com.mongodb.DBCollection;
 import com.mongodb.DBObject;
+import com.mongodb.ReadPreference;
 import com.mongodb.WriteResult;
 
 /**
@@ -52,7 +57,6 @@ import com.mongodb.WriteResult;
 * @author Christoph Strobl
 * @author Thomas Darimont
 */
-@SuppressWarnings("deprecation")
 public interface MongoOperations {
 
 	/**
@@ -86,9 +90,23 @@ public interface MongoOperations {
 	 *
 	 * @param command a MongoDB command
 	 * @param options query options to use
+	 * @deprecated since 1.7. Please use {@link #executeCommand(DBObject, ReadPreference)}, as the MongoDB Java driver
+	 *             version 3 no longer supports this operation.
 	 */
+	@Deprecated
 	CommandResult executeCommand(DBObject command, int options);
 
+	/**
+	 * Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's data
+	 * access exception hierarchy.
+	 *
+	 * @param command a MongoDB command, must not be {@literal null}.
+	 * @param readPreference read preferences to use, can be {@literal null}.
+	 * @return
+	 * @since 1.7
+	 */
+	CommandResult executeCommand(DBObject command, ReadPreference readPreference);
+
 	/**
 	 * Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
 	 *
@@ -144,9 +162,41 @@ public interface MongoOperations {
 	 * @param <T> return type
 	 * @param action callback that specified the MongoDB actions to perform on the DB instance
 	 * @return a result object returned by the action or <tt>null</tt>
+	 * @deprecated since 1.7, as the MongoDB Java driver version 3 no longer supports request boundaries via
+	 *             {@link DB#requestStart()} and {@link DB#requestDone()}.
 	 */
+	@Deprecated
 	<T> T executeInSession(DbCallback<T> action);
 
+	/**
+	 * Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB
+	 * {@link Cursor}.
+	 * <p>
+	 * Returns a {@link CloseableIterator} that wraps a Mongo DB {@link Cursor} that needs to be closed.
+	 *
+	 * @param <T> element return type
+	 * @param query must not be {@literal null}.
+	 * @param entityType must not be {@literal null}.
+	 * @return will never be {@literal null}.
+	 * @since 1.7
	 */
+	<T> CloseableIterator<T> stream(Query query, Class<T> entityType);
+
+	/**
+	 * Executes the given {@link Query} on the entity collection of the specified {@code entityType} and collection backed
+	 * by a Mongo DB {@link Cursor}.
+	 * <p>
+	 * Returns a {@link CloseableIterator} that wraps a Mongo DB {@link Cursor} that needs to be closed.
+	 *
+	 * @param <T> element return type
+	 * @param query must not be {@literal null}.
+	 * @param entityType must not be {@literal null}.
+	 * @param collectionName must not be {@literal null} or empty.
+	 * @return will never be {@literal null}.
+	 * @since 1.10
+	 */
+	<T> CloseableIterator<T> stream(Query query, Class<T> entityType, String collectionName);
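The new `stream(...)` methods hand back a `CloseableIterator` wrapping a driver `Cursor` that the caller must close to release server-side resources. A small self-contained sketch of that contract, with a `List`-backed stand-in for the cursor; the interface shape here is an assumption modeled on Spring Data's `org.springframework.data.util.CloseableIterator`:

```java
import java.io.Closeable;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Sketch of the CloseableIterator contract behind MongoOperations.stream(...).
// "CursorSketch" stands in for a com.mongodb.Cursor; real code must close it
// when iteration ends.
public class StreamSketch {

    interface CloseableIterator<T> extends Iterator<T>, Closeable {
        @Override
        void close(); // narrowed override: no checked IOException
    }

    static class CursorSketch<T> {
        private final Iterator<T> delegate;
        boolean closed = false;

        CursorSketch(List<T> docs) {
            this.delegate = docs.iterator();
        }

        boolean hasNext() { return delegate.hasNext(); }
        T next() { return delegate.next(); }
        void close() { closed = true; }
    }

    // Wrap the cursor so callers can iterate with try-with-resources and the
    // underlying cursor is closed deterministically.
    static <T> CloseableIterator<T> stream(CursorSketch<T> cursor) {
        return new CloseableIterator<T>() {
            public boolean hasNext() { return cursor.hasNext(); }
            public T next() { return cursor.next(); }
            public void close() { cursor.close(); }
        };
    }

    public static void main(String[] args) {
        CursorSketch<String> cursor = new CursorSketch<>(Arrays.asList("a", "b"));
        try (CloseableIterator<String> it = stream(cursor)) {
            while (it.hasNext()) {
                System.out.println(it.next());
            }
        }
        System.out.println(cursor.closed); // the wrapper closed the cursor
    }
}
```

Extending `Closeable` is what makes the try-with-resources usage above possible, which is the intended calling pattern for the real `stream(...)` methods as well.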
 	/**
 	 * Create an uncapped collection with a name based on the provided entity class.
 	 *
@@ -156,7 +206,7 @@ public interface MongoOperations {
 	<T> DBCollection createCollection(Class<T> entityClass);
 
 	/**
-	 * Create a collect with a name based on the provided entity class using the options.
+	 * Create a collection with a name based on the provided entity class using the options.
 	 *
 	 * @param entityClass class that determines the collection to create
 	 * @param collectionOptions options to use when creating the collection.
@@ -173,7 +223,7 @@ public interface MongoOperations {
 	DBCollection createCollection(String collectionName);
 
 	/**
-	 * Create a collect with the provided name and options.
+	 * Create a collection with the provided name and options.
 	 *
 	 * @param collectionName name of the collection
 	 * @param collectionOptions options to use when creating the collection.
@@ -250,6 +300,42 @@ public interface MongoOperations {
 	 */
 	IndexOperations indexOps(Class<?> entityClass);
 
+	/**
+	 * Returns the {@link ScriptOperations} that can be performed on {@link com.mongodb.DB} level.
+	 *
+	 * @return
+	 * @since 1.7
+	 */
+	ScriptOperations scriptOps();
+
+	/**
+	 * Returns a new {@link BulkOperations} for the given collection.
+	 *
+	 * @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
+	 * @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
+	 * @return {@link BulkOperations} on the named collection
+	 */
+	BulkOperations bulkOps(BulkMode mode, String collectionName);
+
+	/**
+	 * Returns a new {@link BulkOperations} for the given entity type.
+	 *
+	 * @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
+	 * @param entityType the name of the entity class, must not be {@literal null}.
+	 * @return {@link BulkOperations} on the named collection associated with the given entity class.
+	 */
+	BulkOperations bulkOps(BulkMode mode, Class<?> entityType);
+
+	/**
+	 * Returns a new {@link BulkOperations} for the given entity type and collection name.
+	 *
+	 * @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
+	 * @param entityType the name of the entity class, must not be {@literal null}.
+	 * @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
+	 * @return {@link BulkOperations} on the named collection associated with the given entity class.
+	 */
+	BulkOperations bulkOps(BulkMode mode, Class<?> entityType, String collectionName);
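The `bulkOps(...)` methods introduced above batch writes into a single driver round-trip, parameterized by a `BulkMode`. In MongoDB's bulk semantics, an ordered bulk stops at the first failing operation while an unordered bulk attempts every operation and reports failures afterwards. A minimal sketch of that ordered-versus-unordered behavior over an in-memory "unique id" collection (everything except the `BulkMode` names is a hypothetical stand-in, not the Spring Data API):

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch of BulkMode semantics: ORDERED aborts on the first failed insert,
// UNORDERED attempts every operation and reports all failures. The id set is
// an in-memory duplicate-key analogue, not MongoDB.
public class BulkSketch {

    enum BulkMode { ORDERED, UNORDERED }

    static final Set<String> ids = new HashSet<>();

    // Attempts all inserts in one "batch"; returns the ids that failed.
    static List<String> executeBulk(BulkMode mode, List<String> inserts) {
        List<String> failed = new ArrayList<>();
        for (String id : inserts) {
            if (!ids.add(id)) {       // duplicate: this operation fails
                failed.add(id);
                if (mode == BulkMode.ORDERED) {
                    break;            // ordered bulks stop at the first error
                }
            }
        }
        return failed;
    }
}
```

With `ids` pre-seeded with `"b"`, an ORDERED bulk of `[a, b, c]` inserts `a`, fails on `b`, and never attempts `c`; an UNORDERED bulk inserts both `a` and `c` and still reports `b` as failed.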
 	/**
 	 * Query for a list of objects of type T from the collection used by the entity class.
 	 * <p/>
@@ -416,7 +502,9 @@ public interface MongoOperations {
 	/**
 	 * Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Will consider entity mapping
-	 * information to determine the collection the query is ran against.
+	 * information to determine the collection the query is ran against. Note that MongoDB limits the number of results
+	 * by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a particular number of
+	 * results.
 	 *
 	 * @param near must not be {@literal null}.
 	 * @param entityClass must not be {@literal null}.
@@ -425,7 +513,9 @@ public interface MongoOperations {
 	<T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass);
 
 	/**
-	 * Returns {@link GeoResults} for all entities matching the given {@link NearQuery}.
+	 * Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Note that MongoDB limits the
+	 * number of results by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a
+	 * particular number of results.
 	 *
 	 * @param near must not be {@literal null}.
 	 * @param entityClass must not be {@literal null}.
@@ -554,8 +644,8 @@ public interface MongoOperations {
 	<T> T findById(Object id, Class<T> entityClass, String collectionName);
 
 	/**
	 * Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
 	 * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
 	 *
 	 * @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
 	 *          fields specification.
@@ -566,8 +656,8 @@ public interface MongoOperations {
 	<T> T findAndModify(Query query, Update update, Class<T> entityClass);
 
 	/**
	 * Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
 	 * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
 	 *
 	 * @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
 	 *          fields specification.
@@ -579,8 +669,8 @@ public interface MongoOperations {
 	<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
 
 	/**
	 * Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
 	 * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
 	 * {@link FindAndModifyOptions} into account.
 	 *
 	 * @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -593,8 +683,8 @@ public interface MongoOperations {
 	<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
 
 	/**
	 * Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
 	 * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
 	 * {@link FindAndModifyOptions} into account.
 	 *
 	 * @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -653,14 +743,28 @@ public interface MongoOperations {
 	long count(Query query, Class<?> entityClass);
 
 	/**
-	 * Returns the number of documents for the given {@link Query} querying the given collection.
+	 * Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
+	 * must solely consist of document field references as we lack type information to map potential property references
+	 * onto document fields. To make sure the query gets mapped, use {@link #count(Query, Class, String)}.
 	 *
 	 * @param query
 	 * @param collectionName must not be {@literal null} or empty.
 	 * @return
+	 * @see #count(Query, Class, String)
 	 */
 	long count(Query query, String collectionName);
 
+	/**
+	 * Returns the number of documents for the given {@link Query} by querying the given collection using the given
+	 * entity class to map the given {@link Query}.
+	 *
+	 * @param query
+	 * @param entityClass must not be {@literal null}.
+	 * @param collectionName must not be {@literal null} or empty.
+	 * @return
+	 */
+	long count(Query query, Class<?> entityClass, String collectionName);
+
 	/**
 	 * Insert the object into the collection for the entity type of the object to save.
 	 * <p/>
@@ -668,9 +772,9 @@ public interface MongoOperations {
 	 * <p/>
 	 * If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
 	 * String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
 	 * property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
 	 * <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">
 	 * Spring's Type Conversion</a> for more details.
 	 * <p/>
 	 * Insert is used to initially store the object into the database. To update an existing object use the save method.
@@ -725,9 +829,9 @@ public interface MongoOperations {
 	 * <p/>
 	 * If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
 	 * String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
 	 * property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
 	 * <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">
 	 * Spring's Type Conversion</a> for more details.
 	 *
 	 * @param objectToSave the object to store in the collection
 	 */


@@ -1,5 +1,5 @@
/*
- * Copyright 2010-2014 the original author or authors.
+ * Copyright 2010-2015 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -17,41 +17,48 @@ package org.springframework.data.mongodb.core;
import javax.net.ssl.SSLSocketFactory;

-import org.springframework.beans.factory.FactoryBean;
-import org.springframework.beans.factory.InitializingBean;
+import org.springframework.beans.factory.config.AbstractFactoryBean;
+import org.springframework.data.mongodb.util.MongoClientVersion;

import com.mongodb.MongoOptions;

/**
- * A factory bean for construction of a {@link MongoOptions} instance.
+ * A factory bean for construction of a {@link MongoOptions} instance. When used with MongoDB Java driver version 3,
+ * properties not supported by the driver will be ignored.
 *
 * @author Graeme Rocher
 * @author Mark Pollack
 * @author Mike Saavedra
 * @author Thomas Darimont
+ * @author Christoph Strobl
+ * @deprecated since 1.7. Please use {@link MongoClientOptionsFactoryBean} instead.
 */
-@SuppressWarnings("deprecation")
-public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, InitializingBean {
+@Deprecated
+public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {

	private static final MongoOptions DEFAULT_MONGO_OPTIONS = new MongoOptions();

-	private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.connectionsPerHost;
-	private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier;
-	private int maxWaitTime = DEFAULT_MONGO_OPTIONS.maxWaitTime;
-	private int connectTimeout = DEFAULT_MONGO_OPTIONS.connectTimeout;
-	private int socketTimeout = DEFAULT_MONGO_OPTIONS.socketTimeout;
-	private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.socketKeepAlive;
-	private boolean autoConnectRetry = DEFAULT_MONGO_OPTIONS.autoConnectRetry;
-	private long maxAutoConnectRetryTime = DEFAULT_MONGO_OPTIONS.maxAutoConnectRetryTime;
-	private int writeNumber = DEFAULT_MONGO_OPTIONS.w;
-	private int writeTimeout = DEFAULT_MONGO_OPTIONS.wtimeout;
-	private boolean writeFsync = DEFAULT_MONGO_OPTIONS.fsync;
-	private boolean slaveOk = DEFAULT_MONGO_OPTIONS.slaveOk;
+	private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
+	private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
+			.getThreadsAllowedToBlockForConnectionMultiplier();
+	private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
+	private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
+	private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
+	private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
+	private int writeNumber = DEFAULT_MONGO_OPTIONS.getW();
+	private int writeTimeout = DEFAULT_MONGO_OPTIONS.getWtimeout();
+	private boolean writeFsync = DEFAULT_MONGO_OPTIONS.isFsync();
+
+	private boolean autoConnectRetry = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
+			.getAutoConnectRetry(DEFAULT_MONGO_OPTIONS) : false;
+	private long maxAutoConnectRetryTime = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
+			.getMaxAutoConnectRetryTime(DEFAULT_MONGO_OPTIONS) : -1;
+	private boolean slaveOk = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
+			.getSlaveOk(DEFAULT_MONGO_OPTIONS) : false;

	private boolean ssl;
	private SSLSocketFactory sslSocketFactory;
-	private MongoOptions options;
/**
 * Configures the maximum number of connections allowed per host until we will block.
 *
@@ -144,7 +151,10 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
/**
 * Configures whether or not the system retries automatically on a failed connect. This defaults to {@literal false}.
+ *
+ * @deprecated since 1.7.
 */
+@Deprecated
public void setAutoConnectRetry(boolean autoConnectRetry) {
	this.autoConnectRetry = autoConnectRetry;
}
@@ -154,7 +164,9 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
 * defaults to {@literal 0}, which means to use the default {@literal 15s} if {@link #autoConnectRetry} is on.
 *
 * @param maxAutoConnectRetryTime the maxAutoConnectRetryTime to set
+ * @deprecated since 1.7
 */
+@Deprecated
public void setMaxAutoConnectRetryTime(long maxAutoConnectRetryTime) {
	this.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
}
@@ -163,7 +175,9 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
 * Specifies if the driver is allowed to read from secondaries or slaves. Defaults to {@literal false}.
 *
 * @param slaveOk true if the driver should read from secondaries or slaves.
+ * @deprecated since 1.7
 */
+@Deprecated
public void setSlaveOk(boolean slaveOk) {
	this.slaveOk = slaveOk;
}
@@ -194,40 +208,41 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
	this.sslSocketFactory = sslSocketFactory;
}

/*
 * (non-Javadoc)
- * @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
+ * @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
 */
-public void afterPropertiesSet() {
+@Override
+protected MongoOptions createInstance() throws Exception {
+
+	if (MongoClientVersion.isMongo3Driver()) {
+		throw new IllegalArgumentException(String.format(
+				"Usage of 'mongo-options' is no longer supported for MongoDB Java driver version 3 and above. Please use 'mongo-client-options' and refer to chapter 'MongoDB 3.0 Support' for details."));
+	}

	MongoOptions options = new MongoOptions();

-	options.connectionsPerHost = connectionsPerHost;
-	options.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
-	options.maxWaitTime = maxWaitTime;
-	options.connectTimeout = connectTimeout;
-	options.socketTimeout = socketTimeout;
-	options.socketKeepAlive = socketKeepAlive;
-	options.autoConnectRetry = autoConnectRetry;
-	options.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
-	options.slaveOk = slaveOk;
-	options.w = writeNumber;
-	options.wtimeout = writeTimeout;
-	options.fsync = writeFsync;
+	options.setConnectionsPerHost(connectionsPerHost);
+	options.setThreadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier);
+	options.setMaxWaitTime(maxWaitTime);
+	options.setConnectTimeout(connectTimeout);
+	options.setSocketTimeout(socketTimeout);
+	options.setSocketKeepAlive(socketKeepAlive);
+	options.setW(writeNumber);
+	options.setWtimeout(writeTimeout);
+	options.setFsync(writeFsync);

	if (ssl) {
		options.setSocketFactory(sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault());
	}

-	this.options = options;
-}
-
-/*
- * (non-Javadoc)
- * @see org.springframework.beans.factory.FactoryBean#getObject()
- */
-public MongoOptions getObject() {
-	return this.options;
+	ReflectiveMongoOptionsInvoker.setAutoConnectRetry(options, autoConnectRetry);
+	ReflectiveMongoOptionsInvoker.setMaxAutoConnectRetryTime(options, maxAutoConnectRetryTime);
+	ReflectiveMongoOptionsInvoker.setSlaveOk(options, slaveOk);
+
+	return options;
}

/*
@@ -237,12 +252,4 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
public Class<?> getObjectType() {
	return MongoOptions.class;
}
-
-/*
- * (non-Javadoc)
- * @see org.springframework.beans.factory.FactoryBean#isSingleton()
- */
-public boolean isSingleton() {
-	return true;
-}
}


@@ -0,0 +1,109 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* {@link ReflectiveDBCollectionInvoker} provides reflective access to {@link DBCollection} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveDBCollectionInvoker {
private static final Method GEN_INDEX_NAME_METHOD;
private static final Method RESET_INDEX_CHACHE_METHOD;
static {
GEN_INDEX_NAME_METHOD = findMethod(DBCollection.class, "genIndexName", DBObject.class);
RESET_INDEX_CHACHE_METHOD = findMethod(DBCollection.class, "resetIndexCache");
}
private ReflectiveDBCollectionInvoker() {}
/**
* Convenience method to generate an index name from the set of fields it is over. Will fall back to a MongoDB Java
* driver version 2 compatible way of generating index name in case of {@link MongoClientVersion#isMongo3Driver()}.
*
* @param keys the names of the fields used in this index
* @return
*/
public static String generateIndexName(DBObject keys) {
if (isMongo3Driver()) {
return genIndexName(keys);
}
return (String) invokeMethod(GEN_INDEX_NAME_METHOD, null, keys);
}
/**
* In case of MongoDB Java driver version 2 all indices that have not yet been applied to this collection will be
* cleared. Since this method is not available for the MongoDB Java driver version 3 the operation will throw
* {@link UnsupportedOperationException}.
*
* @param dbCollection
* @throws UnsupportedOperationException
*/
public static void resetIndexCache(DBCollection dbCollection) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException("The MongoDB Java driver 3 no longer supports resetIndexCache!");
}
invokeMethod(RESET_INDEX_CHACHE_METHOD, dbCollection);
}
/**
* Borrowed from MongoDB Java driver version 2. See <a
* href="http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754"
* >http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754</a>
*
* @param keys
* @return
*/
private static String genIndexName(DBObject keys) {
StringBuilder name = new StringBuilder();
for (String s : keys.keySet()) {
if (name.length() > 0) {
name.append('_');
}
name.append(s).append('_');
Object val = keys.get(s);
if (val instanceof Number || val instanceof String) {
name.append(val.toString().replace(' ', '_'));
}
}
return name.toString();
}
}
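The borrowed genIndexName(…) routine above is self-contained, so its behavior is easy to verify outside the driver. Below is a minimal sketch of the same algorithm; the class name IndexNameSketch and the use of a plain Map in place of DBObject are illustrative assumptions, not part of the original code.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hedged sketch: reproduces the driver-2-style index name generation shown in
// ReflectiveDBCollectionInvoker.genIndexName(…), using a plain Map instead of DBObject.
public class IndexNameSketch {

	static String genIndexName(Map<String, Object> keys) {
		StringBuilder name = new StringBuilder();
		for (Map.Entry<String, Object> entry : keys.entrySet()) {
			if (name.length() > 0) {
				name.append('_'); // separator between key specs
			}
			name.append(entry.getKey()).append('_');
			Object val = entry.getValue();
			// only numeric directions (1/-1) and string index types ("text", "2dsphere") contribute
			if (val instanceof Number || val instanceof String) {
				name.append(val.toString().replace(' ', '_'));
			}
		}
		return name.toString();
	}

	public static void main(String[] args) {
		Map<String, Object> keys = new LinkedHashMap<>();
		keys.put("lastname", 1);
		keys.put("location", "2dsphere");
		System.out.println(genIndexName(keys)); // lastname_1_location_2dsphere
	}
}
```

Note that the iteration order of the key map determines the resulting name, which is why an order-preserving map stands in for DBObject here.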


@@ -0,0 +1,134 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* {@link ReflectiveDbInvoker} provides reflective access to {@link DB} API that is not consistently available for
* various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveDbInvoker {
private static final Method DB_IS_AUTHENTICATED_METHOD;
private static final Method DB_AUTHENTICATE_METHOD;
private static final Method DB_REQUEST_DONE_METHOD;
private static final Method DB_ADD_USER_METHOD;
private static final Method DB_REQUEST_START_METHOD;
static {
DB_IS_AUTHENTICATED_METHOD = findMethod(DB.class, "isAuthenticated");
DB_AUTHENTICATE_METHOD = findMethod(DB.class, "authenticate", String.class, char[].class);
DB_REQUEST_DONE_METHOD = findMethod(DB.class, "requestDone");
DB_ADD_USER_METHOD = findMethod(DB.class, "addUser", String.class, char[].class);
DB_REQUEST_START_METHOD = findMethod(DB.class, "requestStart");
}
private ReflectiveDbInvoker() {}
/**
* Authenticate against database using provided credentials in case of a MongoDB Java driver version 2.
*
* @param mongo must not be {@literal null}.
* @param db must not be {@literal null}.
* @param credentials must not be {@literal null}.
* @param authenticationDatabaseName
*/
public static void authenticate(Mongo mongo, DB db, UserCredentials credentials, String authenticationDatabaseName) {
String databaseName = db.getName();
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
synchronized (authDb) {
Boolean isAuthenticated = (Boolean) invokeMethod(DB_IS_AUTHENTICATED_METHOD, authDb);
if (!isAuthenticated) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
Boolean authenticated = (Boolean) invokeMethod(DB_AUTHENTICATE_METHOD, authDb, username,
password == null ? null : password.toCharArray());
if (!authenticated) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
}
}
/**
* Starts a new 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java driver
* version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestStart(DB db) {
if (isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_START_METHOD, db);
}
/**
 * Ends the current 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java
 * driver version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestDone(DB db) {
if (MongoClientVersion.isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_DONE_METHOD, db);
}
/**
* @param db
* @param username
* @param password
* @throws UnsupportedOperationException
*/
public static void addUser(DB db, String username, char[] password) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Please use DB.command(…) to call either the createUser or updateUser command!");
}
invokeMethod(DB_ADD_USER_METHOD, db, username, password);
}
}


@@ -0,0 +1,62 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.util.Assert;
import com.mongodb.MapReduceCommand;
/**
* {@link ReflectiveMapReduceInvoker} provides reflective access to {@link MapReduceCommand} API that is not
* consistently available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveMapReduceInvoker {
private static final Method ADD_EXTRA_OPTION_METHOD;
static {
ADD_EXTRA_OPTION_METHOD = findMethod(MapReduceCommand.class, "addExtraOption", String.class, Object.class);
}
private ReflectiveMapReduceInvoker() {}
/**
 * Sets the extra option for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3.
 *
 * @param cmd can be {@literal null} for MongoDB Java driver version 3.
* @param key
* @param value
*/
public static void addExtraOption(MapReduceCommand cmd, String key, Object value) {
if (isMongo3Driver()) {
return;
}
Assert.notNull(cmd, "MapReduceCommand must not be null!");
invokeMethod(ADD_EXTRA_OPTION_METHOD, cmd, key, value);
}
}


@@ -0,0 +1,158 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.beans.DirectFieldAccessor;
import org.springframework.util.ReflectionUtils;
import com.mongodb.MongoOptions;
/**
* {@link ReflectiveMongoOptionsInvoker} provides reflective access to {@link MongoOptions} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
@SuppressWarnings("deprecation")
class ReflectiveMongoOptionsInvoker {
private static final Method GET_AUTO_CONNECT_RETRY_METHOD;
private static final Method SET_AUTO_CONNECT_RETRY_METHOD;
private static final Method GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
private static final Method SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
static {
SET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils
.findMethod(MongoOptions.class, "setAutoConnectRetry", boolean.class);
GET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils.findMethod(MongoOptions.class, "isAutoConnectRetry");
SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"setMaxAutoConnectRetryTime", long.class);
GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"getMaxAutoConnectRetryTime");
}
private ReflectiveMongoOptionsInvoker() {}
/**
* Sets the retry connection flag for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param autoConnectRetry
*/
public static void setAutoConnectRetry(MongoOptions options, boolean autoConnectRetry) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_AUTO_CONNECT_RETRY_METHOD, options, autoConnectRetry);
}
/**
* Sets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java
* driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param maxAutoConnectRetryTime
*/
public static void setMaxAutoConnectRetryTime(MongoOptions options, long maxAutoConnectRetryTime) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options, maxAutoConnectRetryTime);
}
/**
* Sets the slaveOk attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param slaveOk
*/
public static void setSlaveOk(MongoOptions options, boolean slaveOk) {
if (isMongo3Driver()) {
return;
}
new DirectFieldAccessor(options).setPropertyValue("slaveOk", slaveOk);
}
/**
* Gets the slaveOk attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException} for
* MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getSlaveOk(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for slaveOk which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) new DirectFieldAccessor(options).getPropertyValue("slaveOk")).booleanValue();
}
/**
* Gets the autoConnectRetry attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException}
* for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getAutoConnectRetry(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for autoConnectRetry which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) invokeMethod(GET_AUTO_CONNECT_RETRY_METHOD, options)).booleanValue();
}
/**
* Gets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Throws
* {@link UnsupportedOperationException} for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static long getMaxAutoConnectRetryTime(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for maxAutoConnectRetryTime which has been removed in MongoDB Java driver version 3.");
}
return ((Long) invokeMethod(GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options)).longValue();
}
}
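The Reflective*Invoker classes above all follow the same guard-and-invoke pattern: look a possibly absent method up once at class-load time, then no-op (or throw) when the current library version lacks it. Here is a minimal, driver-free sketch of that pattern using only JDK reflection; ReflectiveGuardSketch and trimOrFallback are hypothetical names, and String.trim merely stands in for a version-dependent API.

```java
import java.lang.reflect.Method;

// Hedged sketch of the guard-and-invoke pattern used by the Reflective*Invoker classes.
public class ReflectiveGuardSketch {

	// Mirrors ReflectionUtils.findMethod(…): returns null instead of throwing when absent.
	static Method findMethod(Class<?> type, String name, Class<?>... params) {
		try {
			return type.getMethod(name, params);
		} catch (NoSuchMethodException e) {
			return null;
		}
	}

	private static final Method TRIM_METHOD = findMethod(String.class, "trim");
	private static final Method MISSING_METHOD = findMethod(String.class, "noSuchMethodHere");

	static String trimOrFallback(String value) {
		if (TRIM_METHOD == null) {
			return value; // method absent in this version: degrade gracefully
		}
		try {
			return (String) TRIM_METHOD.invoke(value);
		} catch (ReflectiveOperationException e) {
			throw new IllegalStateException(e);
		}
	}

	public static void main(String[] args) {
		System.out.println(MISSING_METHOD == null); // true: lookup degraded to null
		System.out.println(trimOrFallback(" a "));  // a
	}
}
```

The one-time static lookup keeps the per-call cost to a null check plus a reflective invoke, which is why the invoker classes resolve their Method handles in static initializers.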


@@ -0,0 +1,48 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import org.springframework.beans.DirectFieldAccessor;
import com.mongodb.WriteConcern;
/**
* {@link ReflectiveWriteConcernInvoker} provides reflective access to {@link WriteConcern} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveWriteConcernInvoker {
private static final WriteConcern NONE_OR_UNACKNOWLEDGED;
static {
NONE_OR_UNACKNOWLEDGED = isMongo3Driver() ? WriteConcern.UNACKNOWLEDGED : (WriteConcern) new DirectFieldAccessor(
new WriteConcern()).getPropertyValue("NONE");
}
/**
* @return {@link WriteConcern#NONE} for MongoDB Java driver version 2, otherwise {@link WriteConcern#UNACKNOWLEDGED}.
*/
public static WriteConcern noneOrUnacknowledged() {
return NONE_OR_UNACKNOWLEDGED;
}
}


@@ -0,0 +1,67 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import com.mongodb.MongoException;
import com.mongodb.WriteResult;
/**
* {@link ReflectiveWriteResultInvoker} provides reflective access to {@link WriteResult} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveWriteResultInvoker {
private static final Method GET_ERROR_METHOD;
private static final Method WAS_ACKNOWLEDGED_METHOD;
private ReflectiveWriteResultInvoker() {}
static {
GET_ERROR_METHOD = findMethod(WriteResult.class, "getError");
WAS_ACKNOWLEDGED_METHOD = findMethod(WriteResult.class, "wasAcknowledged");
}
/**
* @param writeResult can be {@literal null} for MongoDB Java driver version 3.
* @return null in case of MongoDB Java driver version 3 since errors are thrown as {@link MongoException}.
*/
public static String getError(WriteResult writeResult) {
if (isMongo3Driver()) {
return null;
}
return (String) invokeMethod(GET_ERROR_METHOD, writeResult);
}
/**
* @param writeResult
 * @return always {@literal true} in case of MongoDB Java driver version 2.
*/
public static boolean wasAcknowledged(WriteResult writeResult) {
return isMongo3Driver() ? ((Boolean) invokeMethod(WAS_ACKNOWLEDGED_METHOD, writeResult)).booleanValue() : true;
}
}


@@ -0,0 +1,84 @@
/*
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Set;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import com.mongodb.DB;
/**
* Script operations on {@link com.mongodb.DB} level. Allows interaction with server side JavaScript functions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public interface ScriptOperations {
/**
 * Stores the given {@link ExecutableMongoScript}, generating a synthetic name under which it can be called
 * subsequently.
*
* @param script must not be {@literal null}.
* @return {@link NamedMongoScript} with name under which the {@code JavaScript} function can be called.
*/
NamedMongoScript register(ExecutableMongoScript script);
/**
* Registers the given {@link NamedMongoScript} in the database.
*
* @param script the {@link NamedMongoScript} to be registered.
* @return
*/
NamedMongoScript register(NamedMongoScript script);
/**
* Executes the {@literal script} by either calling it via its {@literal name} or directly sending it.
*
* @param script must not be {@literal null}.
* @param args arguments to pass on for script execution.
* @return the script evaluation result.
* @throws org.springframework.dao.DataAccessException
*/
Object execute(ExecutableMongoScript script, Object... args);
/**
* Call the {@literal JavaScript} by its name.
*
* @param scriptName must not be {@literal null} or empty.
* @param args
* @return
*/
Object call(String scriptName, Object... args);
/**
* Checks {@link DB} for existence of {@link ServerSideJavaScript} with given name.
*
* @param scriptName must not be {@literal null} or empty.
* @return false if no {@link ServerSideJavaScript} with given name exists.
*/
boolean exists(String scriptName);
/**
* Returns names of {@literal JavaScript} functions that can be called.
*
* @return empty {@link Set} if no scripts found.
*/
Set<String> getScriptNames();
}
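As a rough illustration of the register/call contract described above, here is an in-memory analogy that swaps MongoDB's server-side script storage for a plain map; ScriptRegistrySketch and the Function-based script type are illustrative assumptions, not the actual Spring Data API.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import java.util.function.Function;

// Hedged, in-memory analogy of the ScriptOperations register/call contract.
public class ScriptRegistrySketch {

	private final Map<String, Function<Object[], Object>> scripts = new HashMap<>();
	private int counter = 0;

	// Like register(ExecutableMongoScript): generate a synthetic name for the script.
	public String register(Function<Object[], Object> script) {
		String name = "script_" + (++counter);
		scripts.put(name, script);
		return name;
	}

	// Like call(String, Object...): invoke a previously registered script by name.
	public Object call(String scriptName, Object... args) {
		Function<Object[], Object> script = scripts.get(scriptName);
		if (script == null) {
			throw new IllegalArgumentException("No script named " + scriptName);
		}
		return script.apply(args);
	}

	public boolean exists(String scriptName) {
		return scripts.containsKey(scriptName);
	}

	public Set<String> getScriptNames() {
		return scripts.keySet();
	}

	public static void main(String[] args) {
		ScriptRegistrySketch ops = new ScriptRegistrySketch();
		String name = ops.register(a -> ((Integer) a[0]) + ((Integer) a[1]));
		System.out.println(ops.exists(name)); // true
		System.out.println(ops.call(name, 1, 2)); // 3
	}
}
```

A real implementation executes the registered JavaScript on the server; the sketch only mirrors the naming and lookup semantics of the interface.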


@@ -1,5 +1,5 @@
/*
- * Copyright 2011-2013 the original author or authors.
+ * Copyright 2011-2015 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -19,6 +19,7 @@ import java.net.UnknownHostException;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
+import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
@@ -27,6 +28,8 @@ import org.springframework.util.StringUtils;
import com.mongodb.DB; import com.mongodb.DB;
import com.mongodb.Mongo; import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoException; import com.mongodb.MongoException;
import com.mongodb.MongoURI; import com.mongodb.MongoURI;
import com.mongodb.WriteConcern; import com.mongodb.WriteConcern;
@@ -37,6 +40,7 @@ import com.mongodb.WriteConcern;
* @author Mark Pollack * @author Mark Pollack
* @author Oliver Gierke * @author Oliver Gierke
* @author Thomas Darimont * @author Thomas Darimont
* @author Christoph Strobl
*/ */
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory { public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
@@ -54,7 +58,9 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* *
* @param mongo Mongo instance, must not be {@literal null}. * @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName database name, not be {@literal null} or empty. * @param databaseName database name, not be {@literal null} or empty.
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClient, String)}.
*/ */
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName) { public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
this(mongo, databaseName, null); this(mongo, databaseName, null);
} }
@@ -65,7 +71,9 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @param mongo Mongo instance, must not be {@literal null}. * @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty. * @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password. * @param credentials username and password.
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/ */
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) { public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {
this(mongo, databaseName, credentials, false, null); this(mongo, databaseName, credentials, false, null);
} }
@@ -77,7 +85,9 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @param databaseName Database name, must not be {@literal null} or empty. * @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password. * @param credentials username and password.
* @param authenticationDatabaseName the database name to use for authentication * @param authenticationDatabaseName the database name to use for authentication
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/ */
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials, public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) { String authenticationDatabaseName) {
this(mongo, databaseName, credentials, false, authenticationDatabaseName); this(mongo, databaseName, credentials, false, authenticationDatabaseName);
@@ -90,16 +100,44 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @throws MongoException * @throws MongoException
* @throws UnknownHostException * @throws UnknownHostException
* @see MongoURI * @see MongoURI
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClientURI)} instead.
*/ */
@SuppressWarnings("deprecation") @Deprecated
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException { public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), true,
true, uri.getDatabase()); uri.getDatabase());
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}.
*
* @param uri must not be {@literal null}.
* @throws UnknownHostException
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClientURI uri) throws UnknownHostException {
this(new MongoClient(uri), uri.getDatabase(), true);
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null}.
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false);
} }
private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials, private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
boolean mongoInstanceCreated, String authenticationDatabaseName) { boolean mongoInstanceCreated, String authenticationDatabaseName) {
if (mongo instanceof MongoClient && (credentials != null && !UserCredentials.NO_CREDENTIALS.equals(credentials))) {
throw new InvalidDataAccessApiUsageException(
"Usage of 'UserCredentials' with 'MongoClient' is no longer supported. Please use 'MongoCredential' for 'MongoClient' or just 'Mongo'.");
}
Assert.notNull(mongo, "Mongo must not be null"); Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(databaseName, "Database name must not be empty"); Assert.hasText(databaseName, "Database name must not be empty");
Assert.isTrue(databaseName.matches("[\\w-]+"), Assert.isTrue(databaseName.matches("[\\w-]+"),
@@ -117,6 +155,25 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
"Authentication database name must only contain letters, numbers, underscores and dashes!"); "Authentication database name must only contain letters, numbers, underscores and dashes!");
} }
/**
* @param client
* @param databaseName
* @param mongoInstanceCreated
* @since 1.7
*/
private SimpleMongoDbFactory(MongoClient client, String databaseName, boolean mongoInstanceCreated) {
Assert.notNull(client, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
this.mongo = client;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.exceptionTranslator = new MongoExceptionTranslator();
this.credentials = UserCredentials.NO_CREDENTIALS;
this.authenticationDatabaseName = databaseName;
}
/** /**
* Configures the {@link WriteConcern} to be used on the {@link DB} instance being created. * Configures the {@link WriteConcern} to be used on the {@link DB} instance being created.
* *
@@ -138,6 +195,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String) * @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/ */
@SuppressWarnings("deprecation")
public DB getDb(String dbName) throws DataAccessException { public DB getDb(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty."); Assert.hasText(dbName, "Database name must not be empty.");

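The guard added in the diff above rejects explicit `UserCredentials` once a `MongoClient` is in play, because a `MongoClient` already carries its own `MongoCredential` list. A minimal plain-Java sketch of that validation rule, with simplified stand-ins for the driver types (the `CredentialsGuard` name and boolean flag are hypothetical, not part of the real API):

```java
// Sketch of the guard above: when the client type already carries its own
// credentials, a separately supplied username is an API-usage error.
class CredentialsGuard {

	static void verify(boolean isMongoClient, String username) {
		boolean hasCredentials = username != null && !username.isEmpty();
		if (isMongoClient && hasCredentials) {
			// the real code throws InvalidDataAccessApiUsageException
			throw new IllegalStateException(
					"Usage of 'UserCredentials' with 'MongoClient' is no longer supported.");
		}
	}
}
```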

@@ -1,5 +1,5 @@
 /*
- * Copyright 2013-2014 the original author or authors.
+ * Copyright 2013-2016 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -25,8 +25,10 @@ import org.springframework.data.domain.Sort;
 import org.springframework.data.domain.Sort.Direction;
 import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
 import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
-import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
+import org.springframework.data.mongodb.core.aggregation.Fields.*;
+import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
 import org.springframework.data.mongodb.core.query.Criteria;
+import org.springframework.data.mongodb.core.query.NearQuery;
 import org.springframework.data.mongodb.core.query.SerializationUtils;
 import org.springframework.util.Assert;
@@ -36,10 +38,14 @@ import com.mongodb.DBObject;
 /**
  * An {@code Aggregation} is a representation of a list of aggregation steps to be performed by the MongoDB Aggregation
  * Framework.
  *
  * @author Tobias Trelle
  * @author Thomas Darimont
  * @author Oliver Gierke
+ * @author Mark Paluch
+ * @author Alessio Fachechi
+ * @author Christoph Strobl
+ * @author Nikolay Bogdanov
  * @since 1.3
  */
 public class Aggregation {
@@ -64,7 +70,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
 	 *
 	 * @param operations must not be {@literal null} or empty.
 	 */
 	public static Aggregation newAggregation(List<? extends AggregationOperation> operations) {
@@ -73,7 +79,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
 	 *
 	 * @param operations must not be {@literal null} or empty.
 	 */
 	public static Aggregation newAggregation(AggregationOperation... operations) {
@@ -83,7 +89,7 @@ public class Aggregation {
 	/**
 	 * Returns a copy of this {@link Aggregation} with the given {@link AggregationOptions} set. Note that options are
 	 * supported in MongoDB version 2.6+.
 	 *
 	 * @param options must not be {@literal null}.
 	 * @return
 	 * @since 1.6
@@ -96,7 +102,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
 	 *
 	 * @param type must not be {@literal null}.
 	 * @param operations must not be {@literal null} or empty.
 	 */
@@ -106,7 +112,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
 	 *
 	 * @param type must not be {@literal null}.
 	 * @param operations must not be {@literal null} or empty.
 	 */
@@ -116,7 +122,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
 	 *
 	 * @param aggregationOperations must not be {@literal null} or empty.
 	 */
 	protected Aggregation(AggregationOperation... aggregationOperations) {
@@ -136,7 +142,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
 	 *
 	 * @param aggregationOperations must not be {@literal null} or empty.
 	 */
 	protected Aggregation(List<AggregationOperation> aggregationOperations) {
@@ -145,23 +151,34 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
 	 *
 	 * @param aggregationOperations must not be {@literal null} or empty.
 	 * @param options must not be {@literal null} or empty.
 	 */
 	protected Aggregation(List<AggregationOperation> aggregationOperations, AggregationOptions options) {
 
 		Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
-		Assert.isTrue(aggregationOperations.size() > 0, "At least one AggregationOperation has to be provided");
+		Assert.isTrue(!aggregationOperations.isEmpty(), "At least one AggregationOperation has to be provided");
 		Assert.notNull(options, "AggregationOptions must not be null!");
 
+		// check $out is the last operation if it exists
+		for (AggregationOperation aggregationOperation : aggregationOperations) {
+			if (aggregationOperation instanceof OutOperation && !isLast(aggregationOperation, aggregationOperations)) {
+				throw new IllegalArgumentException("The $out operator must be the last stage in the pipeline.");
+			}
+		}
+
 		this.operations = aggregationOperations;
 		this.options = options;
 	}
 
+	private boolean isLast(AggregationOperation aggregationOperation, List<AggregationOperation> aggregationOperations) {
+		return aggregationOperations.indexOf(aggregationOperation) == aggregationOperations.size() - 1;
+	}
+
 	/**
 	 * A pointer to the previous {@link AggregationOperation}.
 	 *
 	 * @return
 	 */
 	public static String previousOperation() {
@@ -170,7 +187,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link ProjectionOperation} including the given fields.
 	 *
 	 * @param fields must not be {@literal null}.
 	 * @return
 	 */
@@ -180,7 +197,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link ProjectionOperation} includeing the given {@link Fields}.
 	 *
 	 * @param fields must not be {@literal null}.
 	 * @return
 	 */
@@ -190,17 +207,61 @@ public class Aggregation {
 	/**
 	 * Factory method to create a new {@link UnwindOperation} for the field with the given name.
 	 *
-	 * @param fieldName must not be {@literal null} or empty.
+	 * @param field must not be {@literal null} or empty.
 	 * @return
 	 */
 	public static UnwindOperation unwind(String field) {
 		return new UnwindOperation(field(field));
 	}
 
+	/**
+	 * Factory method to create a new {@link UnwindOperation} for the field with the given name and
+	 * {@code preserveNullAndEmptyArrays}. Note that extended unwind is supported in MongoDB version 3.2+.
+	 *
+	 * @param field must not be {@literal null} or empty.
+	 * @param preserveNullAndEmptyArrays {@literal true} to output the document if path is {@literal null}, missing or
+	 *          array is empty.
+	 * @return new {@link UnwindOperation}
+	 * @since 1.10
+	 */
+	public static UnwindOperation unwind(String field, boolean preserveNullAndEmptyArrays) {
+		return new UnwindOperation(field(field), preserveNullAndEmptyArrays);
+	}
+
+	/**
+	 * Factory method to create a new {@link UnwindOperation} for the field with the given name including the name of a
+	 * new field to hold the array index of the element as {@code arrayIndex}. Note that extended unwind is supported in
+	 * MongoDB version 3.2+.
+	 *
+	 * @param field must not be {@literal null} or empty.
+	 * @param arrayIndex must not be {@literal null} or empty.
+	 * @return new {@link UnwindOperation}
+	 * @since 1.10
+	 */
+	public static UnwindOperation unwind(String field, String arrayIndex) {
+		return new UnwindOperation(field(field), field(arrayIndex), false);
+	}
+
+	/**
+	 * Factory method to create a new {@link UnwindOperation} for the field with the given name including the name of a
+	 * new field to hold the array index of the element as {@code arrayIndex} using {@code preserveNullAndEmptyArrays}.
+	 * Note that extended unwind is supported in MongoDB version 3.2+.
+	 *
+	 * @param field must not be {@literal null} or empty.
+	 * @param arrayIndex must not be {@literal null} or empty.
+	 * @param preserveNullAndEmptyArrays {@literal true} to output the document if path is {@literal null}, missing or
+	 *          array is empty.
+	 * @return new {@link UnwindOperation}
+	 * @since 1.10
+	 */
+	public static UnwindOperation unwind(String field, String arrayIndex, boolean preserveNullAndEmptyArrays) {
+		return new UnwindOperation(field(field), field(arrayIndex), preserveNullAndEmptyArrays);
+	}
+
 	/**
 	 * Creates a new {@link GroupOperation} for the given fields.
 	 *
 	 * @param fields must not be {@literal null}.
 	 * @return
 	 */
@@ -210,7 +271,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link GroupOperation} for the given {@link Fields}.
 	 *
 	 * @param fields must not be {@literal null}.
 	 * @return
 	 */
@@ -220,7 +281,7 @@ public class Aggregation {
 	/**
 	 * Factory method to create a new {@link SortOperation} for the given {@link Sort}.
 	 *
 	 * @param sort must not be {@literal null}.
 	 * @return
 	 */
@@ -230,7 +291,7 @@ public class Aggregation {
 	/**
 	 * Factory method to create a new {@link SortOperation} for the given sort {@link Direction} and {@code fields}.
 	 *
 	 * @param direction must not be {@literal null}.
 	 * @param fields must not be {@literal null}.
 	 * @return
@@ -241,7 +302,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link SkipOperation} skipping the given number of elements.
 	 *
 	 * @param elementsToSkip must not be less than zero.
 	 * @return
 	 */
@@ -251,7 +312,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link LimitOperation} limiting the result to the given number of elements.
 	 *
 	 * @param maxElements must not be less than zero.
 	 * @return
 	 */
@@ -261,7 +322,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link MatchOperation} using the given {@link Criteria}.
 	 *
 	 * @param criteria must not be {@literal null}.
 	 * @return
 	 */
@@ -269,12 +330,54 @@ public class Aggregation {
 		return new MatchOperation(criteria);
 	}
 
+	/**
+	 * Creates a new {@link OutOperation} using the given collection name. This operation must be the last operation
+	 * in the pipeline.
+	 *
+	 * @param outCollectionName collection name to export aggregation results. The {@link OutOperation} creates a new
+	 *          collection in the current database if one does not already exist. The collection is not visible until the
+	 *          aggregation completes. If the aggregation fails, MongoDB does not create the collection. Must not be
+	 *          {@literal null}.
+	 * @return
+	 */
+	public static OutOperation out(String outCollectionName) {
+		return new OutOperation(outCollectionName);
+	}
+
+	/**
+	 * Creates a new {@link LookupOperation}.
+	 *
+	 * @param from must not be {@literal null}.
+	 * @param localField must not be {@literal null}.
+	 * @param foreignField must not be {@literal null}.
+	 * @param as must not be {@literal null}.
+	 * @return never {@literal null}.
+	 * @since 1.9
+	 */
+	public static LookupOperation lookup(String from, String localField, String foreignField, String as) {
+		return lookup(field(from), field(localField), field(foreignField), field(as));
+	}
+
+	/**
+	 * Creates a new {@link LookupOperation} for the given {@link Fields}.
+	 *
+	 * @param from must not be {@literal null}.
+	 * @param localField must not be {@literal null}.
+	 * @param foreignField must not be {@literal null}.
+	 * @param as must not be {@literal null}.
+	 * @return never {@literal null}.
+	 * @since 1.9
	 */
+	public static LookupOperation lookup(Field from, Field localField, Field foreignField, Field as) {
+		return new LookupOperation(from, localField, foreignField, as);
+	}
+
 	/**
 	 * Creates a new {@link Fields} instance for the given field names.
 	 *
-	 * @see Fields#fields(String...)
 	 * @param fields must not be {@literal null}.
 	 * @return
+	 * @see Fields#fields(String...)
 	 */
 	public static Fields fields(String... fields) {
 		return Fields.fields(fields);
@@ -282,7 +385,7 @@ public class Aggregation {
 	/**
 	 * Creates a new {@link Fields} instance from the given field name and target reference.
 	 *
 	 * @param name must not be {@literal null} or empty.
 	 * @param target must not be {@literal null} or empty.
 	 * @return
@@ -291,9 +394,22 @@ public class Aggregation {
 		return Fields.from(field(name, target));
 	}
 
+	/**
+	 * Creates a new {@link GeoNearOperation} instance from the given {@link NearQuery} and the {@code distanceField}.
+	 * The {@code distanceField} defines the output field that contains the calculated distance.
+	 *
+	 * @param query must not be {@literal null}.
+	 * @param distanceField must not be {@literal null} or empty.
+	 * @return
+	 * @since 1.7
+	 */
+	public static GeoNearOperation geoNear(NearQuery query, String distanceField) {
+		return new GeoNearOperation(query, distanceField);
+	}
+
 	/**
 	 * Returns a new {@link AggregationOptions.Builder}.
 	 *
 	 * @return
 	 * @since 1.6
 	 */
@@ -303,7 +419,7 @@ public class Aggregation {
 	/**
 	 * Converts this {@link Aggregation} specification to a {@link DBObject}.
 	 *
 	 * @param inputCollectionName the name of the input collection
 	 * @return the {@code DBObject} representing this aggregation
 	 */
@@ -317,8 +433,14 @@ public class Aggregation {
 			operationDocuments.add(operation.toDBObject(context));
 
 			if (operation instanceof FieldsExposingAggregationOperation) {
 
 				FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
-				context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), rootContext);
+
+				if (operation instanceof InheritsFieldsAggregationOperation) {
+					context = new InheritingExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), context);
+				} else {
+					context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), context);
+				}
 			}
 		}
@@ -342,7 +464,7 @@ public class Aggregation {
 	/**
 	 * Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
 	 *
 	 * @author Oliver Gierke
 	 */
 	private static class NoOpAggregationOperationContext implements AggregationOperationContext {
@@ -377,7 +499,7 @@ public class Aggregation {
 	/**
 	 * Describes the system variables available in MongoDB aggregation framework pipeline expressions.
 	 *
 	 * @author Thomas Darimont
 	 * @see http://docs.mongodb.org/manual/reference/aggregation-variables
 	 */
@@ -390,7 +512,7 @@ public class Aggregation {
 	/**
 	 * Return {@literal true} if the given {@code fieldRef} denotes a well-known system variable, {@literal false}
 	 * otherwise.
 	 *
 	 * @param fieldRef may be {@literal null}.
 	 * @return
 	 */

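The extended `unwind(…)` factories added above map onto MongoDB 3.2's long-form `$unwind` stage, `{ $unwind: { path, includeArrayIndex, preserveNullAndEmptyArrays } }`. A simplified, self-contained sketch of that rendering using plain `java.util` maps (an assumption about the shape `UnwindOperation` produces, not the actual implementation):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Simplified sketch: render the extended unwind options into the
// long-form $unwind stage document introduced in MongoDB 3.2.
class UnwindSketch {

	static Map<String, Object> unwind(String field, String arrayIndex, boolean preserveNullAndEmptyArrays) {

		Map<String, Object> options = new LinkedHashMap<String, Object>();
		options.put("path", "$" + field); // field references are $-prefixed
		if (arrayIndex != null) {
			options.put("includeArrayIndex", arrayIndex); // name of the output index field
		}
		options.put("preserveNullAndEmptyArrays", preserveNullAndEmptyArrays);

		Map<String, Object> stage = new LinkedHashMap<String, Object>();
		stage.put("$unwind", options);
		return stage;
	}
}
```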

@@ -1,5 +1,5 @@
 /*
- * Copyright 2013-2014 the original author or authors.
+ * Copyright 2015 the original author or authors.
  *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -13,25 +13,25 @@
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
-package org.springframework.data.mongodb.core.geo;
+package org.springframework.data.mongodb.core.aggregation;
+
+import com.mongodb.DBObject;
 
 /**
- * Value object to create custom {@link Metric}s on the fly.
+ * An {@link AggregationExpression} can be used with field expressions in aggregation pipeline stages like
+ * {@code project} and {@code group}.
  *
- * @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
- *             removed in the next major release.
- * @author Oliver Gierke
 * @author Thomas Darimont
+ * @author Oliver Gierke
 */
-@Deprecated
-public class CustomMetric extends org.springframework.data.geo.CustomMetric implements Metric {
+interface AggregationExpression {
 
	/**
-	 * Creates a custom {@link Metric} using the given multiplier.
+	 * Turns the {@link AggregationExpression} into a {@link DBObject} within the given
+	 * {@link AggregationOperationContext}.
	 *
-	 * @param multiplier
+	 * @param context
+	 * @return
	 */
-	public CustomMetric(double multiplier) {
-		super(multiplier);
-	}
+	DBObject toDbObject(AggregationOperationContext context);
 }

View File

@@ -0,0 +1,106 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* An enum of supported {@link AggregationExpression}s in aggregation pipeline stages.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.10
*/
public enum AggregationFunctionExpressions {
SIZE;
/**
* Returns an {@link AggregationExpression} build from the current {@link Enum} name and the given parameters.
*
* @param parameters must not be {@literal null}
* @return
*/
public AggregationExpression of(Object... parameters) {
Assert.notNull(parameters, "Parameters must not be null!");
return new FunctionExpression(name().toLowerCase(), parameters);
}
/**
* An {@link AggregationExpression} representing a function call.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.10
*/
static class FunctionExpression implements AggregationExpression {
private final String name;
private final List<Object> values;
/**
* Creates a new {@link FunctionExpression} for the given name and values.
*
* @param name must not be {@literal null} or empty.
* @param values must not be {@literal null}.
*/
public FunctionExpression(String name, Object[] values) {
Assert.hasText(name, "Name must not be null!");
Assert.notNull(values, "Values must not be null!");
this.name = name;
this.values = Arrays.asList(values);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Expression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
List<Object> args = new ArrayList<Object>(values.size());
for (Object value : values) {
args.add(unpack(value, context));
}
return new BasicDBObject("$" + name, args);
}
private static Object unpack(Object value, AggregationOperationContext context) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
if (value instanceof Field) {
return context.getReference((Field) value).toString();
}
return value;
}
}
}
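`FunctionExpression.toDbObject(…)` above renders an expression such as `SIZE.of("$tags")` into `{"$size": ["$tags"]}`. A standalone mirror of that rendering using plain `java.util` types instead of the driver's `BasicDBObject` (the `FunctionExpressionSketch` name is invented; field/expression unpacking is omitted for brevity):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Standalone mirror of FunctionExpression.toDbObject(...): the enum name is
// lower-cased and $-prefixed, and the arguments become the operator's array.
class FunctionExpressionSketch {

	static Map<String, Object> render(String name, Object... values) {

		List<Object> args = new ArrayList<Object>(Arrays.asList(values));

		Map<String, Object> result = new LinkedHashMap<String, Object>();
		result.put("$" + name.toLowerCase(), args);
		return result;
	}
}
```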


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013-2014 the original author or authors.
+ * Copyright 2013-2016 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -30,6 +30,7 @@ import org.springframework.util.CompositeIterator;
  *
  * @author Oliver Gierke
  * @author Thomas Darimont
+ * @author Mark Paluch
  * @since 1.3
  */
 public final class ExposedFields implements Iterable<ExposedField> {
@@ -88,7 +89,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
 	}
 
 	/**
-	 * Creates a new {@link ExposedFields} instance for the given fields in either sythetic or non-synthetic way.
+	 * Creates a new {@link ExposedFields} instance for the given fields in either synthetic or non-synthetic way.
 	 *
 	 * @param fields must not be {@literal null}.
 	 * @param synthetic
@@ -107,7 +108,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
 	}
 
 	/**
-	 * Creates a new {@link ExposedFields} with the given orignals and synthetics.
+	 * Creates a new {@link ExposedFields} with the given originals and synthetics.
 	 *
 	 * @param originals must not be {@literal null}.
 	 * @param synthetic must not be {@literal null}.
@@ -203,8 +204,13 @@ public final class ExposedFields implements Iterable<ExposedField> {
 	public Iterator<ExposedField> iterator() {
 
 		CompositeIterator<ExposedField> iterator = new CompositeIterator<ExposedField>();
-		iterator.add(syntheticFields.iterator());
-		iterator.add(originalFields.iterator());
+		if (!syntheticFields.isEmpty()) {
+			iterator.add(syntheticFields.iterator());
+		}
+
+		if (!originalFields.isEmpty()) {
+			iterator.add(originalFields.iterator());
+		}
 
 		return iterator;
 	}
@@ -363,7 +369,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
 	}
 
 	/**
-	 * Returns the referenve value for the given field reference. Will return 1 for a synthetic, unaliased field or the
+	 * Returns the reference value for the given field reference. Will return 1 for a synthetic, unaliased field or the
 	 * raw rendering of the reference otherwise.
 	 *
 	 * @return


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013-2014 the original author or authors.
+ * Copyright 2013-2016 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -24,9 +24,10 @@ import com.mongodb.DBObject;
 /**
  * {@link AggregationOperationContext} that combines the available field references from a given
  * {@code AggregationOperationContext} and an {@link FieldsExposingAggregationOperation}.
  *
  * @author Thomas Darimont
  * @author Oliver Gierke
+ * @author Mark Paluch
  * @since 1.4
  */
 class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
@@ -37,11 +38,12 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
 	/**
 	 * Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
 	 * {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
 	 *
 	 * @param exposedFields must not be {@literal null}.
 	 * @param rootContext must not be {@literal null}.
 	 */
-	public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields, AggregationOperationContext rootContext) {
+	public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields,
+			AggregationOperationContext rootContext) {
 
 		Assert.notNull(exposedFields, "ExposedFields must not be null!");
 		Assert.notNull(rootContext, "RootContext must not be null!");
@@ -79,7 +81,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
 	/**
 	 * Returns a {@link FieldReference} to the given {@link Field} with the given {@code name}.
 	 *
 	 * @param field may be {@literal null}
 	 * @param name must not be {@literal null}
 	 * @return
@@ -88,6 +90,22 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
 		Assert.notNull(name, "Name must not be null!");
 
+		FieldReference exposedField = resolveExposedField(field, name);
+		if (exposedField != null) {
+			return exposedField;
+		}
+
+		throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
+	}
+
+	/**
+	 * Resolves a {@link Field}/{@code name} for a {@link FieldReference} if possible.
+	 *
+	 * @param field may be {@literal null}
+	 * @param name must not be {@literal null}
+	 * @return the resolved reference or {@literal null}
+	 */
+	protected FieldReference resolveExposedField(Field field, String name) {
 
 		ExposedField exposedField = exposedFields.getField(name);
 		if (exposedField != null) {
@@ -111,7 +129,6 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
 				return new FieldReference(new ExposedField(name, true));
 			}
 		}
 
-		throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
+		return null;
 	}
 }
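The refactoring in this hunk splits `getReference(…)` into a throwing entry point and an overridable `resolveExposedField(…)` that returns `null` on a miss, so a child context can chain to its parent before the base class gives up. A minimal stdlib-only sketch of that pattern (names simplified; `String` stands in for `FieldReference`):

```java
import java.util.HashMap;
import java.util.Map;

// Base context: getReference(...) throws, resolveExposedField(...) returns null on miss.
class ContextSketch {

	private final Map<String, String> exposed = new HashMap<String, String>();

	ContextSketch expose(String name, String reference) {
		exposed.put(name, reference);
		return this;
	}

	String getReference(String name) {
		String reference = resolveExposedField(name);
		if (reference != null) {
			return reference;
		}
		throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
	}

	protected String resolveExposedField(String name) {
		return exposed.get(name);
	}
}

// Inheriting context: consults the previous context before the base class throws,
// mirroring InheritingExposedFieldsAggregationOperationContext below.
class InheritingContextSketch extends ContextSketch {

	private final ContextSketch previous;

	InheritingContextSketch(ContextSketch previous) {
		this.previous = previous;
	}

	@Override
	protected String resolveExposedField(String name) {
		String reference = super.resolveExposedField(name);
		return reference != null ? reference : previous.resolveExposedField(name);
	}
}
```

The design choice is the classic template-method split: the public method owns the failure policy, the protected hook owns the lookup, and subclasses only override the hook.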


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013-2014 the original author or authors.
+ * Copyright 2013-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -84,6 +84,15 @@ public final class Fields implements Iterable<Field> {
 		return new AggregationField(name);
 	}
 
+	/**
+	 * Creates a {@link Field} with the given {@code name} and {@code target}.
+	 * <p>
+	 * The {@code target} is the name of the backing document field that will be aliased with {@code name}.
+	 *
+	 * @param name
+	 * @param target must not be {@literal null} or empty
+	 * @return
+	 */
 	public static Field field(String name, String target) {
 		Assert.hasText(target, "Target must not be null or empty!");
 		return new AggregationField(name, target);
@@ -187,15 +196,24 @@ public final class Fields implements Iterable<Field> {
 		private final String target;
 
 		/**
-		 * Creates an aggregation field with the given name. As no target is set explicitly, the name will be used as target
-		 * as well.
+		 * Creates an aggregation field with the given {@code name}.
 		 *
-		 * @param key
+		 * @see AggregationField#AggregationField(String, String).
+		 * @param name must not be {@literal null} or empty
 		 */
-		public AggregationField(String key) {
-			this(key, null);
+		public AggregationField(String name) {
+			this(name, null);
 		}
 
+		/**
+		 * Creates an aggregation field with the given {@code name} and {@code target}.
+		 * <p>
+		 * The {@code name} serves as an alias for the actual backing document field denoted by {@code target}. If no target
+		 * is set explicitly, the name will be used as target.
+		 *
+		 * @param name must not be {@literal null} or empty
+		 * @param target
+		 */
 		public AggregationField(String name, String target) {
 
 			String nameToSet = cleanUp(name);


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013 the original author or authors.
+ * Copyright 2013-2016 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -16,17 +16,28 @@
 package org.springframework.data.mongodb.core.aggregation;
 
 /**
- * {@link AggregationOperation} that exposes new {@link ExposedFields} that can be used for later aggregation pipeline
- * {@code AggregationOperation}s.
+ * {@link AggregationOperation} that exposes {@link ExposedFields} that can be used for later aggregation pipeline
+ * {@code AggregationOperation}s. A {@link FieldsExposingAggregationOperation} implementing the
+ * {@link InheritsFieldsAggregationOperation} will expose fields from its parent operations. Not implementing
+ * {@link InheritsFieldsAggregationOperation} will replace existing exposed fields.
  *
  * @author Thomas Darimont
+ * @author Mark Paluch
  */
 public interface FieldsExposingAggregationOperation extends AggregationOperation {
 
 	/**
 	 * Returns the fields exposed by the {@link AggregationOperation}.
 	 *
 	 * @return will never be {@literal null}.
 	 */
 	ExposedFields getFields();
+
+	/**
+	 * Marker interface for {@link AggregationOperation} that inherits fields from previous operations.
+	 */
+	static interface InheritsFieldsAggregationOperation extends FieldsExposingAggregationOperation {
+	}
 }


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013 the original author or authors.
+ * Copyright 2013-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -22,17 +22,33 @@ import com.mongodb.BasicDBObject;
 import com.mongodb.DBObject;
 
 /**
  * Represents a {@code geoNear} aggregation operation.
+ * <p>
+ * We recommend to use the static factory method {@link Aggregation#geoNear(NearQuery, String)} instead of creating
+ * instances of this class directly.
  *
  * @author Thomas Darimont
  * @since 1.3
  */
 public class GeoNearOperation implements AggregationOperation {
 
 	private final NearQuery nearQuery;
+	private final String distanceField;
 
-	public GeoNearOperation(NearQuery nearQuery) {
-		Assert.notNull(nearQuery);
+	/**
+	 * Creates a new {@link GeoNearOperation} from the given {@link NearQuery} and the given distance field. The
+	 * {@code distanceField} defines output field that contains the calculated distance.
+	 *
+	 * @param query must not be {@literal null}.
+	 * @param distanceField must not be {@literal null}.
+	 */
+	public GeoNearOperation(NearQuery nearQuery, String distanceField) {
+
+		Assert.notNull(nearQuery, "NearQuery must not be null.");
+		Assert.hasLength(distanceField, "Distance field must not be null or empty.");
+
 		this.nearQuery = nearQuery;
+		this.distanceField = distanceField;
 	}
 
 	/*
@@ -41,6 +57,10 @@ public class GeoNearOperation implements AggregationOperation {
 	 */
 	@Override
 	public DBObject toDBObject(AggregationOperationContext context) {
-		return new BasicDBObject("$geoNear", context.getMappedObject(nearQuery.toDBObject()));
+
+		BasicDBObject command = (BasicDBObject) context.getMappedObject(nearQuery.toDBObject());
+		command.put("distanceField", distanceField);
+
+		return new BasicDBObject("$geoNear", command);
 	}
 }
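The change above appends the `distanceField` entry to the mapped near-query document before wrapping it in `$geoNear`. A stdlib-only sketch of the resulting document shape, with `Map` standing in for `BasicDBObject` (field values are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

class GeoNearSketch {

	// Mirrors the updated toDBObject(...): copy the mapped near-query document,
	// add "distanceField", and wrap the result in a "$geoNear" stage.
	static Map<String, Object> toDBObject(Map<String, Object> mappedNearQuery, String distanceField) {
		Map<String, Object> command = new LinkedHashMap<String, Object>(mappedNearQuery);
		command.put("distanceField", distanceField);

		Map<String, Object> stage = new LinkedHashMap<String, Object>();
		stage.put("$geoNear", command);
		return stage;
	}
}
```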


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013-2014 the original author or authors.
+ * Copyright 2013-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -31,6 +31,9 @@ import com.mongodb.DBObject;
 /**
  * Encapsulates the aggregation framework {@code $group}-operation.
+ * <p>
+ * We recommend to use the static factory method {@link Aggregation#group(Fields)} instead of creating instances of this
+ * class directly.
  *
  * @see http://docs.mongodb.org/manual/reference/aggregation/group/#stage._S_group
  * @author Sebastian Herold
@@ -190,6 +193,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
 		return newBuilder(GroupOps.LAST, reference, null);
 	}
 
+	/**
+	 * Generates an {@link GroupOperationBuilder} for an {@code $last}-expression for the given {@link AggregationExpression}.
+	 *
+	 * @param expr
+	 * @return
+	 */
+	public GroupOperationBuilder last(AggregationExpression expr) {
+		return newBuilder(GroupOps.LAST, null, expr);
+	}
+
 	/**
 	 * Generates an {@link GroupOperationBuilder} for a {@code $first}-expression for the given field-reference.
 	 *
@@ -200,6 +213,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
 		return newBuilder(GroupOps.FIRST, reference, null);
 	}
 
+	/**
+	 * Generates an {@link GroupOperationBuilder} for a {@code $first}-expression for the given {@link AggregationExpression}.
+	 *
+	 * @param expr
+	 * @return
+	 */
+	public GroupOperationBuilder first(AggregationExpression expr) {
+		return newBuilder(GroupOps.FIRST, null, expr);
+	}
+
 	/**
 	 * Generates an {@link GroupOperationBuilder} for an {@code $avg}-expression for the given field-reference.
 	 *
@@ -210,6 +233,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
 		return newBuilder(GroupOps.AVG, reference, null);
 	}
 
+	/**
+	 * Generates an {@link GroupOperationBuilder} for an {@code $avg}-expression for the given {@link AggregationExpression}.
+	 *
+	 * @param expr
+	 * @return
+	 */
+	public GroupOperationBuilder avg(AggregationExpression expr) {
+		return newBuilder(GroupOps.AVG, null, expr);
+	}
+
 	/**
 	 * Generates an {@link GroupOperationBuilder} for an {@code $push}-expression for the given field-reference.
 	 *
@@ -244,6 +277,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
 		return newBuilder(GroupOps.MIN, reference, null);
 	}
 
+	/**
+	 * Generates an {@link GroupOperationBuilder} for an {@code $min}-expression that for the given {@link AggregationExpression}.
+	 *
+	 * @param expr
+	 * @return
+	 */
+	public GroupOperationBuilder min(AggregationExpression expr) {
+		return newBuilder(GroupOps.MIN, null, expr);
+	}
+
 	/**
 	 * Generates an {@link GroupOperationBuilder} for an {@code $max}-expression that for the given field-reference.
 	 *
@@ -254,6 +297,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
 		return newBuilder(GroupOps.MAX, reference, null);
 	}
 
+	/**
+	 * Generates an {@link GroupOperationBuilder} for an {@code $max}-expression that for the given {@link AggregationExpression}.
+	 *
+	 * @param expr
+	 * @return
+	 */
+	public GroupOperationBuilder max(AggregationExpression expr) {
+		return newBuilder(GroupOps.MAX, null, expr);
+	}
+
 	private GroupOperationBuilder newBuilder(Keyword keyword, String reference, Object value) {
 		return new GroupOperationBuilder(this, new Operation(keyword, null, reference, value));
 	}
@@ -366,6 +419,11 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
 		public Object getValue(AggregationOperationContext context) {
 
 			if (reference == null) {
+
+				if (value instanceof AggregationExpression) {
+					return ((AggregationExpression) value).toDbObject(context);
+				}
+
 				return value;
 			}
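The `getValue(…)` change in the last hunk is what makes the new `last`/`first`/`avg`/`min`/`max(AggregationExpression)` overloads work: when no field reference is set, an `AggregationExpression` is rendered to its document form instead of being emitted verbatim. A stdlib-only sketch of that dispatch (names simplified; `Object` return stands in for the rendered DBObject):

```java
// Simplified stand-in for Operation.getValue(...): a field reference renders as
// "$reference"; otherwise an expression is rendered, and any other value (e.g. the
// literal 1 used by a count-style $sum) passes through unchanged.
class GroupValueSketch {

	interface ExpressionSketch {
		Object toDbObject();
	}

	static Object getValue(String reference, Object value) {
		if (reference == null) {
			if (value instanceof ExpressionSketch) {
				return ((ExpressionSketch) value).toDbObject();
			}
			return value;
		}
		return "$" + reference;
	}
}
```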


@@ -0,0 +1,65 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;
/**
* {@link ExposedFieldsAggregationOperationContext} that inherits fields from its parent
* {@link AggregationOperationContext}.
*
* @author Mark Paluch
*/
class InheritingExposedFieldsAggregationOperationContext extends ExposedFieldsAggregationOperationContext {
private final AggregationOperationContext previousContext;
/**
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
* {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
*
* @param exposedFields must not be {@literal null}.
* @param previousContext must not be {@literal null}.
*/
public InheritingExposedFieldsAggregationOperationContext(ExposedFields exposedFields,
AggregationOperationContext previousContext) {
super(exposedFields, previousContext);
Assert.notNull(previousContext, "PreviousContext must not be null!");
this.previousContext = previousContext;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFieldsAggregationOperationContext#resolveExposedField(org.springframework.data.mongodb.core.aggregation.Field, java.lang.String)
*/
@Override
protected FieldReference resolveExposedField(Field field, String name) {
FieldReference fieldReference = super.resolveExposedField(field, name);
if (fieldReference != null) {
return fieldReference;
}
if (field != null) {
return previousContext.getReference(field);
}
return previousContext.getReference(name);
}
}


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013 the original author or authors.
+ * Copyright 2013-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -21,14 +21,17 @@ import com.mongodb.BasicDBObject;
 import com.mongodb.DBObject;
 
 /**
- * Encapsulates the {@code $limit}-operation
+ * Encapsulates the {@code $limit}-operation.
+ * <p>
+ * We recommend to use the static factory method {@link Aggregation#limit(long)} instead of creating instances of this
+ * class directly.
  *
  * @see http://docs.mongodb.org/manual/reference/aggregation/limit/
  * @author Thomas Darimont
  * @author Oliver Gierke
  * @since 1.3
  */
-class LimitOperation implements AggregationOperation {
+public class LimitOperation implements AggregationOperation {
 
 	private final long maxElements;


@@ -0,0 +1,196 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $lookup}-operation. We recommend to use the static factory method
* {@link Aggregation#lookup(String, String, String, String)} instead of creating instances of this class directly.
*
* @author Alessio Fachechi
* @author Christoph Strobl
* @author Mark Paluch
* @see http://docs.mongodb.org/manual/reference/aggregation/lookup/#stage._S_lookup
* @since 1.9
*/
public class LookupOperation implements FieldsExposingAggregationOperation, InheritsFieldsAggregationOperation {
private Field from;
private Field localField;
private Field foreignField;
private ExposedField as;
/**
* Creates a new {@link LookupOperation} for the given {@link Field}s.
*
* @param from must not be {@literal null}.
* @param localField must not be {@literal null}.
* @param foreignField must not be {@literal null}.
* @param as must not be {@literal null}.
*/
public LookupOperation(Field from, Field localField, Field foreignField, Field as) {
Assert.notNull(from, "From must not be null!");
Assert.notNull(localField, "LocalField must not be null!");
Assert.notNull(foreignField, "ForeignField must not be null!");
Assert.notNull(as, "As must not be null!");
this.from = from;
this.localField = localField;
this.foreignField = foreignField;
this.as = new ExposedField(as, true);
}
private LookupOperation() {
// used by builder
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return ExposedFields.from(as);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
BasicDBObject lookupObject = new BasicDBObject();
lookupObject.append("from", from.getTarget());
lookupObject.append("localField", localField.getTarget());
lookupObject.append("foreignField", foreignField.getTarget());
lookupObject.append("as", as.getTarget());
return new BasicDBObject("$lookup", lookupObject);
}
/**
* Get a builder that allows creation of {@link LookupOperation}.
*
* @return
*/
public static FromBuilder newLookup() {
return new LookupOperationBuilder();
}
public static interface FromBuilder {
/**
* @param name the collection in the same database to perform the join with, must not be {@literal null} or empty.
* @return
*/
LocalFieldBuilder from(String name);
}
public static interface LocalFieldBuilder {
/**
* @param name the field from the documents input to the {@code $lookup} stage, must not be {@literal null} or
* empty.
* @return
*/
ForeignFieldBuilder localField(String name);
}
public static interface ForeignFieldBuilder {
/**
* @param name the field from the documents in the {@code from} collection, must not be {@literal null} or empty.
* @return
*/
AsBuilder foreignField(String name);
}
public static interface AsBuilder {
/**
* @param name the name of the new array field to add to the input documents, must not be {@literal null} or empty.
* @return
*/
LookupOperation as(String name);
}
/**
* Builder for fluent {@link LookupOperation} creation.
*
* @author Christoph Strobl
* @since 1.9
*/
public static final class LookupOperationBuilder
implements FromBuilder, LocalFieldBuilder, ForeignFieldBuilder, AsBuilder {
private final LookupOperation lookupOperation;
private LookupOperationBuilder() {
this.lookupOperation = new LookupOperation();
}
/**
* Creates new builder for {@link LookupOperation}.
*
* @return never {@literal null}.
*/
public static FromBuilder newBuilder() {
return new LookupOperationBuilder();
}
@Override
public LocalFieldBuilder from(String name) {
Assert.hasText(name, "'From' must not be null or empty!");
lookupOperation.from = Fields.field(name);
return this;
}
@Override
public LookupOperation as(String name) {
Assert.hasText(name, "'As' must not be null or empty!");
lookupOperation.as = new ExposedField(Fields.field(name), true);
return new LookupOperation(lookupOperation.from, lookupOperation.localField, lookupOperation.foreignField,
lookupOperation.as);
}
@Override
public AsBuilder foreignField(String name) {
Assert.hasText(name, "'ForeignField' must not be null or empty!");
lookupOperation.foreignField = Fields.field(name);
return this;
}
@Override
public ForeignFieldBuilder localField(String name) {
Assert.hasText(name, "'LocalField' must not be null or empty!");
lookupOperation.localField = Fields.field(name);
return this;
}
}
}
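The four single-method builder interfaces above form a staged (step) builder: each step only exposes the next legal call, so `from(…).localField(…).foreignField(…).as(…)` cannot be invoked out of order or left half-finished. A stdlib-only sketch of the same pattern, with `Map` standing in for the `$lookup` DBObject (names illustrative, not the library API):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

class LookupBuilderSketch {

	interface FromStep { LocalFieldStep from(String name); }
	interface LocalFieldStep { ForeignFieldStep foreignOf(String name); }
	interface ForeignFieldStep { AsStep foreignField(String name); }
	interface AsStep { Map<String, Object> as(String name); }

	// Each lambda implements the next single-method step, filling the shared body;
	// the final step wraps it in the "$lookup" stage document.
	static FromStep newLookup() {
		Map<String, Object> body = new LinkedHashMap<String, Object>();
		return from -> {
			body.put("from", from);
			return local -> {
				body.put("localField", local);
				return foreign -> {
					body.put("foreignField", foreign);
					return as -> {
						body.put("as", as);
						return Collections.<String, Object> singletonMap("$lookup", body);
					};
				};
			};
		};
	}
}
```

The typed steps are why the real builder can assert each argument eagerly and still return a fully-initialized, immutable `LookupOperation` only at the final `as(…)` call.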


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013 the original author or authors.
+ * Copyright 2013-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -15,14 +15,18 @@
  */
 package org.springframework.data.mongodb.core.aggregation;
 
-import org.springframework.data.mongodb.core.query.Criteria;
+import org.springframework.data.mongodb.core.query.CriteriaDefinition;
 import org.springframework.util.Assert;
 
 import com.mongodb.BasicDBObject;
 import com.mongodb.DBObject;
 
 /**
- * Encapsulates the {@code $match}-operation
+ * Encapsulates the {@code $match}-operation.
+ * <p>
+ * We recommend to use the static factory method
+ * {@link Aggregation#match(org.springframework.data.mongodb.core.query.Criteria)} instead of creating instances of this
+ * class directly.
  *
  * @see http://docs.mongodb.org/manual/reference/aggregation/match/
  * @author Sebastian Herold
@@ -32,17 +36,17 @@ import com.mongodb.DBObject;
  */
 public class MatchOperation implements AggregationOperation {
 
-	private final Criteria criteria;
+	private final CriteriaDefinition criteriaDefinition;
 
 	/**
-	 * Creates a new {@link MatchOperation} for the given {@link Criteria}.
+	 * Creates a new {@link MatchOperation} for the given {@link CriteriaDefinition}.
 	 *
-	 * @param criteria must not be {@literal null}.
+	 * @param criteriaDefinition must not be {@literal null}.
 	 */
-	public MatchOperation(Criteria criteria) {
-		Assert.notNull(criteria, "Criteria must not be null!");
-		this.criteria = criteria;
+	public MatchOperation(CriteriaDefinition criteriaDefinition) {
+		Assert.notNull(criteriaDefinition, "Criteria must not be null!");
+		this.criteriaDefinition = criteriaDefinition;
 	}
 
 	/*
@@ -51,6 +55,6 @@ public class MatchOperation implements AggregationOperation {
 	 */
 	@Override
 	public DBObject toDBObject(AggregationOperationContext context) {
-		return new BasicDBObject("$match", context.getMappedObject(criteria.getCriteriaObject()));
+		return new BasicDBObject("$match", context.getMappedObject(criteriaDefinition.getCriteriaObject()));
 	}
 }


@@ -0,0 +1,51 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.util.Assert;
/**
* Encapsulates the {@code $out}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#out(String)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/out/
* @author Nikolay Bogdanov
*/
public class OutOperation implements AggregationOperation {
private final String collectionName;
/**
* @param outCollectionName Collection name to export the results. Must not be {@literal null}.
*/
public OutOperation(String outCollectionName) {
Assert.notNull(outCollectionName, "Collection name must not be null!");
this.collectionName = outCollectionName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$out", collectionName);
}
}


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013-2014 the original author or authors.
+ * Copyright 2013-2016 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -21,6 +21,7 @@ import java.util.Collections;
 import java.util.List;

 import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
+import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
 import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.FieldProjection;
 import org.springframework.util.Assert;
@@ -28,15 +29,19 @@ import com.mongodb.BasicDBObject;
 import com.mongodb.DBObject;

 /**
- * Encapsulates the aggregation framework {@code $project}-operation. Projection of field to be used in an
- * {@link Aggregation}. A projection is similar to a {@link Field} inclusion/exclusion but more powerful. It can
- * generate new fields, change values of given field etc.
+ * Encapsulates the aggregation framework {@code $project}-operation.
  * <p>
+ * Projection of field to be used in an {@link Aggregation}. A projection is similar to a {@link Field}
+ * inclusion/exclusion but more powerful. It can generate new fields, change values of given field etc.
+ * <p>
+ * We recommend to use the static factory method {@link Aggregation#project(Fields)} instead of creating instances of
+ * this class directly.
  *
  * @see http://docs.mongodb.org/manual/reference/aggregation/project/
  * @author Tobias Trelle
  * @author Thomas Darimont
  * @author Oliver Gierke
+ * @author Christoph Strobl
  * @since 1.3
  */
 public class ProjectionOperation implements FieldsExposingAggregationOperation {
@@ -118,6 +123,10 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
         return new ExpressionProjectionOperationBuilder(expression, this, params);
     }

+    public ProjectionOperationBuilder and(AggregationExpression expression) {
+        return new ProjectionOperationBuilder(expression, this, null);
+    }
+
     /**
      * Excludes the given fields from the projection.
      *
@@ -343,6 +352,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
      *
      * @author Oliver Gierke
      * @author Thomas Darimont
+     * @author Christoph Strobl
      */
     public static class ProjectionOperationBuilder extends AbstractProjectionOperationBuilder {
@@ -417,9 +427,13 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
         if (this.previousProjection != null) {
             return this.operation.andReplaceLastOneWith(this.previousProjection.withAlias(alias));
-        } else {
-            return this.operation.and(new FieldProjection(Fields.field(alias, name), null));
         }
+
+        if (value instanceof AggregationExpression) {
+            return this.operation.and(new ExpressionProjection(Fields.field(alias), (AggregationExpression) value));
+        }
+
+        return this.operation.and(new FieldProjection(Fields.field(alias, name), null));
     }

     /**
@@ -549,6 +563,41 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
         return project("mod", Fields.field(fieldReference));
     }

+    /**
+     * Generates a {@code $size} expression that returns the size of the array held by the given field. <br />
+     *
+     * @return never {@literal null}.
+     * @since 1.7
+     */
+    public ProjectionOperationBuilder size() {
+        return project("size");
+    }
+
+    /**
+     * Generates a {@code $slice} expression that returns a subset of the array held by the given field. <br />
+     * If {@literal n} is positive, $slice returns up to the first n elements in the array. <br />
+     * If {@literal n} is negative, $slice returns up to the last n elements in the array.
+     *
+     * @param count max number of elements.
+     * @return never {@literal null}.
+     * @since 1.10
+     */
+    public ProjectionOperationBuilder slice(int count) {
+        return project("slice", count);
+    }
+
+    /**
+     * Generates a {@code $slice} expression that returns a subset of the array held by the given field. <br />
+     *
+     * @param count max number of elements. Must not be negative.
+     * @param offset the offset within the array to start from.
+     * @return never {@literal null}.
+     * @since 1.10
+     */
+    public ProjectionOperationBuilder slice(int count, int offset) {
+        return project("slice", offset, count);
+    }
+
     /*
      * (non-Javadoc)
      * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
@@ -748,6 +797,20 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
             return field;
         }

+        /*
+         * (non-Javadoc)
+         * @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#getExposedField()
+         */
+        @Override
+        public ExposedField getExposedField() {
+
+            if (!getField().isAliased()) {
+                return super.getExposedField();
+            }
+
+            return new ExposedField(new AggregationField(getField().getName()), true);
+        }
+
         /**
          * Creates a new instance of this {@link OperationProjection} with the given alias.
          *
@@ -937,4 +1000,31 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
          */
         public abstract DBObject toDBObject(AggregationOperationContext context);
     }

+    /**
+     * @author Thomas Darimont
+     */
+    static class ExpressionProjection extends Projection {
+
+        private final AggregationExpression expression;
+        private final Field field;
+
+        /**
+         * Creates a new {@link ExpressionProjection}.
+         *
+         * @param field
+         * @param expression
+         */
+        public ExpressionProjection(Field field, AggregationExpression expression) {
+
+            super(field);
+
+            this.field = field;
+            this.expression = expression;
+        }
+
+        @Override
+        public DBObject toDBObject(AggregationOperationContext context) {
+            return new BasicDBObject(field.getName(), expression.toDbObject(context));
+        }
+    }
 }
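Note that `slice(count, offset)` above deliberately passes `offset` before `count`, matching MongoDB's `$slice: [ <array>, <position>, <n> ]` argument order. A driver-free sketch of the documents these projections render to (field names are illustrative, and the exact rendering is assumed from the operator docs rather than taken from this class):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;

public class SliceProjectionSketch {

    // Assumed rendering of project("items").size(): { "<field>" : { "$size" : "$<field>" } }
    public static Map<String, Object> size(String field) {
        Map<String, Object> expr = new LinkedHashMap<>();
        expr.put("$size", "$" + field);
        Map<String, Object> projection = new LinkedHashMap<>();
        projection.put(field, expr);
        return projection;
    }

    // Assumed rendering of slice(count, offset): { "<field>" : { "$slice" : [ "$<field>", offset, count ] } }
    public static Map<String, Object> slice(String field, int count, int offset) {
        Map<String, Object> expr = new LinkedHashMap<>();
        expr.put("$slice", Arrays.asList("$" + field, offset, count));
        Map<String, Object> projection = new LinkedHashMap<>();
        projection.put(field, expr);
        return projection;
    }

    public static void main(String[] args) {
        System.out.println(size("items"));        // {items={$size=$items}}
        System.out.println(slice("items", 2, 5)); // {items={$slice=[$items, 5, 2]}}
    }
}
```

The argument swap is easy to miss when reading the builder: the fluent API takes the count first for readability, while MongoDB expects the position first.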

View File: SkipOperation.java

@@ -1,5 +1,5 @@
 /*
- * Copyright 2013 the original author or authors.
+ * Copyright 2013-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -22,6 +22,9 @@ import com.mongodb.DBObject;
 /**
  * Encapsulates the aggregation framework {@code $skip}-operation.
+ * <p>
+ * We recommend to use the static factory method {@link Aggregation#skip(int)} instead of creating instances of this
+ * class directly.
  *
  * @see http://docs.mongodb.org/manual/reference/aggregation/skip/
  * @author Thomas Darimont

View File: SortOperation.java

@@ -1,5 +1,5 @@
 /*
- * Copyright 2013 the original author or authors.
+ * Copyright 2013-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -26,6 +26,9 @@ import com.mongodb.DBObject;
 /**
  * Encapsulates the aggregation framework {@code $sort}-operation.
+ * <p>
+ * We recommend to use the static factory method {@link Aggregation#sort(Direction, String...)} instead of creating
+ * instances of this class directly.
  *
  * @see http://docs.mongodb.org/manual/reference/aggregation/sort/#pipe._S_sort
  * @author Thomas Darimont

View File: UnwindOperation.java

@@ -1,5 +1,5 @@
 /*
- * Copyright 2013 the original author or authors.
+ * Copyright 2013-2016 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -23,33 +23,245 @@ import com.mongodb.DBObject;
 /**
  * Encapsulates the aggregation framework {@code $unwind}-operation.
+ * <p>
+ * We recommend to use the static factory method {@link Aggregation#unwind(String)} instead of creating instances of
+ * this class directly.
  *
  * @see http://docs.mongodb.org/manual/reference/aggregation/unwind/#pipe._S_unwind
  * @author Thomas Darimont
  * @author Oliver Gierke
+ * @author Mark Paluch
+ * @author Christoph Strobl
  * @since 1.3
  */
-public class UnwindOperation implements AggregationOperation {
+public class UnwindOperation
+        implements AggregationOperation, FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation {

     private final ExposedField field;
+    private final ExposedField arrayIndex;
+    private final boolean preserveNullAndEmptyArrays;

     /**
      * Creates a new {@link UnwindOperation} for the given {@link Field}.
      *
      * @param field must not be {@literal null}.
      */
     public UnwindOperation(Field field) {
-
-        Assert.notNull(field);
-        this.field = new ExposedField(field, true);
+        this(new ExposedField(field, true), false);
     }

-    /*
-     * (non-Javadoc)
-     * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
-     */
+    /**
+     * Creates a new {@link UnwindOperation} using Mongo 3.2 syntax.
+     *
+     * @param field must not be {@literal null}.
+     * @param preserveNullAndEmptyArrays {@literal true} to output the document if path is {@literal null}, missing or
+     *          array is empty.
+     * @since 1.10
+     */
+    public UnwindOperation(Field field, boolean preserveNullAndEmptyArrays) {
+
+        Assert.notNull(field, "Field must not be null!");
+
+        this.field = new ExposedField(field, true);
+        this.arrayIndex = null;
+        this.preserveNullAndEmptyArrays = preserveNullAndEmptyArrays;
+    }
+
+    /**
+     * Creates a new {@link UnwindOperation} using Mongo 3.2 syntax.
+     *
+     * @param field must not be {@literal null}.
+     * @param arrayIndex optional field name to expose the field array index, must not be {@literal null}.
+     * @param preserveNullAndEmptyArrays {@literal true} to output the document if path is {@literal null}, missing or
+     *          array is empty.
+     * @since 1.10
+     */
+    public UnwindOperation(Field field, Field arrayIndex, boolean preserveNullAndEmptyArrays) {
+
+        Assert.notNull(field, "Field must not be null!");
+        Assert.notNull(arrayIndex, "ArrayIndex must not be null!");
+
+        this.field = new ExposedField(field, true);
+        this.arrayIndex = new ExposedField(arrayIndex, true);
+        this.preserveNullAndEmptyArrays = preserveNullAndEmptyArrays;
+    }
+
+    /*
+     * (non-Javadoc)
+     * @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
+     */
     @Override
     public DBObject toDBObject(AggregationOperationContext context) {
-        return new BasicDBObject("$unwind", context.getReference(field).toString());
+
+        String path = context.getReference(field).toString();
+
+        if (!preserveNullAndEmptyArrays && arrayIndex == null) {
+            return new BasicDBObject("$unwind", path);
+        }
+
+        DBObject unwindArgs = new BasicDBObject();
+        unwindArgs.put("path", path);
+        if (arrayIndex != null) {
+            unwindArgs.put("includeArrayIndex", arrayIndex.getName());
+        }
+        unwindArgs.put("preserveNullAndEmptyArrays", preserveNullAndEmptyArrays);
+
+        return new BasicDBObject("$unwind", unwindArgs);
+    }
+
+    /*
+     * (non-Javadoc)
+     * @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
+     */
+    @Override
+    public ExposedFields getFields() {
+        return arrayIndex != null ? ExposedFields.from(arrayIndex) : ExposedFields.from();
+    }
+
+    /**
+     * Get a builder that allows creation of {@link UnwindOperation}.
+     *
+     * @return
+     * @since 1.10
+     */
+    public static PathBuilder newUnwind() {
+        return UnwindOperationBuilder.newBuilder();
+    }
+
+    /**
+     * @author Mark Paluch
+     * @since 1.10
+     */
+    public static interface PathBuilder {
+
+        /**
+         * @param path the path to unwind, must not be {@literal null} or empty.
+         * @return
+         */
+        IndexBuilder path(String path);
+    }
+
+    /**
+     * @author Mark Paluch
+     * @since 1.10
+     */
+    public static interface IndexBuilder {
+
+        /**
+         * Exposes the array index as {@code field}.
+         *
+         * @param field field name to expose the field array index, must not be {@literal null} or empty.
+         * @return
+         */
+        EmptyArraysBuilder arrayIndex(String field);
+
+        /**
+         * Do not expose the array index.
+         *
+         * @return
+         */
+        EmptyArraysBuilder noArrayIndex();
+    }
+
+    public static interface EmptyArraysBuilder {
+
+        /**
+         * Output documents if the array is null or empty.
+         *
+         * @return
+         */
+        UnwindOperation preserveNullAndEmptyArrays();
+
+        /**
+         * Do not output documents if the array is null or empty.
+         *
+         * @return
+         */
+        UnwindOperation skipNullAndEmptyArrays();
+    }
+
+    /**
+     * Builder for fluent {@link UnwindOperation} creation.
+     *
+     * @author Mark Paluch
+     * @since 1.10
+     */
+    public static final class UnwindOperationBuilder implements PathBuilder, IndexBuilder, EmptyArraysBuilder {
+
+        private Field field;
+        private Field arrayIndex;
+
+        private UnwindOperationBuilder() {}
+
+        /**
+         * Creates new builder for {@link UnwindOperation}.
+         *
+         * @return never {@literal null}.
+         */
+        public static PathBuilder newBuilder() {
+            return new UnwindOperationBuilder();
+        }
+
+        /*
+         * (non-Javadoc)
+         * @see org.springframework.data.mongodb.core.aggregation.UnwindOperation.EmptyArraysBuilder#preserveNullAndEmptyArrays()
+         */
+        @Override
+        public UnwindOperation preserveNullAndEmptyArrays() {
+
+            if (arrayIndex != null) {
+                return new UnwindOperation(field, arrayIndex, true);
+            }
+
+            return new UnwindOperation(field, true);
+        }
+
+        /*
+         * (non-Javadoc)
+         * @see org.springframework.data.mongodb.core.aggregation.UnwindOperation.EmptyArraysBuilder#skipNullAndEmptyArrays()
+         */
+        @Override
+        public UnwindOperation skipNullAndEmptyArrays() {
+
+            if (arrayIndex != null) {
+                return new UnwindOperation(field, arrayIndex, false);
+            }
+
+            return new UnwindOperation(field, false);
+        }
+
+        /*
+         * (non-Javadoc)
+         * @see org.springframework.data.mongodb.core.aggregation.UnwindOperation.IndexBuilder#arrayIndex(java.lang.String)
+         */
+        @Override
+        public EmptyArraysBuilder arrayIndex(String field) {
+
+            Assert.hasText(field, "'ArrayIndex' must not be null or empty!");
+            arrayIndex = Fields.field(field);
+            return this;
+        }
+
+        /*
+         * (non-Javadoc)
+         * @see org.springframework.data.mongodb.core.aggregation.UnwindOperation.IndexBuilder#noArrayIndex()
+         */
+        @Override
+        public EmptyArraysBuilder noArrayIndex() {
+
+            arrayIndex = null;
+            return this;
+        }
+
+        /*
+         * (non-Javadoc)
+         * @see org.springframework.data.mongodb.core.aggregation.UnwindOperation.PathBuilder#path(java.lang.String)
+         */
+        @Override
+        public UnwindOperationBuilder path(String path) {
+
+            Assert.hasText(path, "'Path' must not be null or empty!");
+            field = Fields.field(path);
+            return this;
+        }
     }
 }
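The `toDBObject` logic above emits either the legacy string form or, as soon as an array index or null/empty preservation is requested, the MongoDB 3.2 document form. The same branching, sketched with plain maps instead of `BasicDBObject` so it runs without the driver (`items` and `idx` are made-up names):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class UnwindStageSketch {

    // Legacy form: { "$unwind" : "$<path>" }.
    // 3.2 form:    { "$unwind" : { "path" : ..., "includeArrayIndex"? : ..., "preserveNullAndEmptyArrays" : ... } }
    public static Map<String, Object> unwind(String path, String arrayIndexField,
            boolean preserveNullAndEmptyArrays) {

        Map<String, Object> stage = new LinkedHashMap<>();

        if (!preserveNullAndEmptyArrays && arrayIndexField == null) {
            stage.put("$unwind", "$" + path);
            return stage;
        }

        Map<String, Object> args = new LinkedHashMap<>();
        args.put("path", "$" + path);
        if (arrayIndexField != null) {
            args.put("includeArrayIndex", arrayIndexField);
        }
        args.put("preserveNullAndEmptyArrays", preserveNullAndEmptyArrays);

        stage.put("$unwind", args);
        return stage;
    }

    public static void main(String[] args) {
        System.out.println(unwind("items", null, false));
        // {$unwind=$items}
        System.out.println(unwind("items", "idx", true));
        // {$unwind={path=$items, includeArrayIndex=idx, preserveNullAndEmptyArrays=true}}
    }
}
```

With the real API the second case would be built fluently as `newUnwind().path("items").arrayIndex("idx").preserveNullAndEmptyArrays()`; the legacy string form is kept so pipelines that need no 3.2 options still work against older servers.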

View File: AbstractMongoConverter.java

@@ -1,5 +1,5 @@
 /*
- * Copyright 2011-2013 the original author or authors.
+ * Copyright 2011-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -20,7 +20,7 @@ import java.math.BigInteger;
 import org.bson.types.ObjectId;
 import org.springframework.beans.factory.InitializingBean;
 import org.springframework.core.convert.ConversionService;
-import org.springframework.core.convert.support.ConversionServiceFactory;
+import org.springframework.core.convert.support.DefaultConversionService;
 import org.springframework.core.convert.support.GenericConversionService;
 import org.springframework.data.convert.EntityInstantiators;
 import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToObjectIdConverter;
@@ -46,10 +46,8 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
      *
      * @param conversionService
      */
-    @SuppressWarnings("deprecation")
     public AbstractMongoConverter(GenericConversionService conversionService) {
-        this.conversionService = conversionService == null ? ConversionServiceFactory.createDefaultConversionService()
-                : conversionService;
+        this.conversionService = conversionService == null ? new DefaultConversionService() : conversionService;
     }

     /**
@@ -77,15 +75,13 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
      */
     private void initializeConverters() {

-        if (!conversionService.canConvert(ObjectId.class, String.class)) {
-            conversionService.addConverter(ObjectIdToStringConverter.INSTANCE);
-        }
-        if (!conversionService.canConvert(String.class, ObjectId.class)) {
-            conversionService.addConverter(StringToObjectIdConverter.INSTANCE);
-        }
+        conversionService.addConverter(ObjectIdToStringConverter.INSTANCE);
+        conversionService.addConverter(StringToObjectIdConverter.INSTANCE);
+
         if (!conversionService.canConvert(ObjectId.class, BigInteger.class)) {
             conversionService.addConverter(ObjectIdToBigIntegerConverter.INSTANCE);
         }
         if (!conversionService.canConvert(BigInteger.class, ObjectId.class)) {
             conversionService.addConverter(BigIntegerToObjectIdConverter.INSTANCE);
         }
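The diff above makes the `ObjectId`/`String` converters unconditional while keeping the `canConvert(...)` guard for the `BigInteger` pair. The guard pattern itself, registering a default only when nothing can already handle the conversion, can be sketched with a toy registry (nothing below is Spring API; all names are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class GuardedRegistrationSketch {

    // Toy stand-in for a conversion registry: maps "Source->Target" to a converter marker.
    public static final Map<String, String> REGISTRY = new HashMap<>();

    public static boolean canConvert(Class<?> source, Class<?> target) {
        return REGISTRY.containsKey(source.getSimpleName() + "->" + target.getSimpleName());
    }

    public static void addConverter(Class<?> source, Class<?> target, String converterName) {
        REGISTRY.put(source.getSimpleName() + "->" + target.getSimpleName(), converterName);
    }

    public static void main(String[] args) {
        // A user-provided converter is registered first...
        addConverter(String.class, Integer.class, "userConverter");

        // ...so the guarded default registration backs off and the user converter wins.
        if (!canConvert(String.class, Integer.class)) {
            addConverter(String.class, Integer.class, "defaultConverter");
        }

        System.out.println(REGISTRY.get("String->Integer")); // userConverter
    }
}
```

The guard exists so user-supplied converters registered earlier take precedence over the framework defaults; dropping it, as done here for the `ObjectId`/`String` pair, forces the framework's converters regardless.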

View File: CustomConversions.java

@@ -1,5 +1,5 @@
 /*
- * Copyright 2011-2014 the original author or authors.
+ * Copyright 2011-2016 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -17,14 +17,15 @@ package org.springframework.data.mongodb.core.convert;
 import java.util.ArrayList;
 import java.util.Arrays;
+import java.util.Collection;
 import java.util.Collections;
 import java.util.HashSet;
 import java.util.LinkedHashSet;
 import java.util.List;
 import java.util.Locale;
+import java.util.Map;
 import java.util.Set;
 import java.util.concurrent.ConcurrentHashMap;
-import java.util.concurrent.ConcurrentMap;

 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -36,17 +37,13 @@ import org.springframework.core.convert.converter.GenericConverter;
 import org.springframework.core.convert.converter.GenericConverter.ConvertiblePair;
 import org.springframework.core.convert.support.GenericConversionService;
 import org.springframework.data.convert.JodaTimeConverters;
+import org.springframework.data.convert.Jsr310Converters;
 import org.springframework.data.convert.ReadingConverter;
+import org.springframework.data.convert.ThreeTenBackPortConverters;
 import org.springframework.data.convert.WritingConverter;
 import org.springframework.data.mapping.model.SimpleTypeHolder;
-import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
-import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
-import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToStringConverter;
-import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
-import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
-import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
-import org.springframework.data.mongodb.core.convert.MongoConverters.URLToStringConverter;
 import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
+import org.springframework.data.util.CacheValue;
 import org.springframework.util.Assert;
@@ -58,6 +55,7 @@ import org.springframework.util.Assert;
  *
  * @author Oliver Gierke
  * @author Thomas Darimont
+ * @author Christoph Strobl
  */
 public class CustomConversions {
@@ -69,10 +67,13 @@ public class CustomConversions {
     private final Set<ConvertiblePair> writingPairs;
     private final Set<Class<?>> customSimpleTypes;
     private final SimpleTypeHolder simpleTypeHolder;
-    private final ConcurrentMap<ConvertiblePair, CacheValue> customReadTargetTypes;
     private final List<Object> converters;

+    private final Map<ConvertiblePair, CacheValue<Class<?>>> customReadTargetTypes;
+    private final Map<ConvertiblePair, CacheValue<Class<?>>> customWriteTargetTypes;
+    private final Map<Class<?>, CacheValue<Class<?>>> rawWriteTargetTypes;
+
     /**
      * Creates an empty {@link CustomConversions} object.
      */
@@ -92,23 +93,20 @@ public class CustomConversions {
     this.readingPairs = new LinkedHashSet<ConvertiblePair>();
     this.writingPairs = new LinkedHashSet<ConvertiblePair>();
     this.customSimpleTypes = new HashSet<Class<?>>();
-    this.customReadTargetTypes = new ConcurrentHashMap<GenericConverter.ConvertiblePair, CacheValue>();
+    this.customReadTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue<Class<?>>>();
+    this.customWriteTargetTypes = new ConcurrentHashMap<ConvertiblePair, CacheValue<Class<?>>>();
+    this.rawWriteTargetTypes = new ConcurrentHashMap<Class<?>, CacheValue<Class<?>>>();

     List<Object> toRegister = new ArrayList<Object>();

     // Add user provided converters to make sure they can override the defaults
     toRegister.addAll(converters);
     toRegister.add(CustomToStringConverter.INSTANCE);
-    toRegister.add(BigDecimalToStringConverter.INSTANCE);
-    toRegister.add(StringToBigDecimalConverter.INSTANCE);
-    toRegister.add(BigIntegerToStringConverter.INSTANCE);
-    toRegister.add(StringToBigIntegerConverter.INSTANCE);
-    toRegister.add(URLToStringConverter.INSTANCE);
-    toRegister.add(StringToURLConverter.INSTANCE);
-    toRegister.add(DBObjectToStringConverter.INSTANCE);
+    toRegister.addAll(MongoConverters.getConvertersToRegister());
     toRegister.addAll(JodaTimeConverters.getConvertersToRegister());
     toRegister.addAll(GeoConverters.getConvertersToRegister());
+    toRegister.addAll(Jsr310Converters.getConvertersToRegister());
+    toRegister.addAll(ThreeTenBackPortConverters.getConvertersToRegister());

     for (Object c : toRegister) {
         registerConversion(c);
@@ -168,14 +166,15 @@ public class CustomConversions {
         }

         if (!added) {
-            throw new IllegalArgumentException("Given set contains element that is neither Converter nor ConverterFactory!");
+            throw new IllegalArgumentException(
+                    "Given set contains element that is neither Converter nor ConverterFactory!");
         }
     }
 }

 /**
- * Registers a conversion for the given converter. Inspects either generics or the {@link ConvertiblePair}s returned
- * by a {@link GenericConverter}.
+ * Registers a conversion for the given converter. Inspects either generics of {@link Converter} and
+ * {@link ConverterFactory} or the {@link ConvertiblePair}s returned by a {@link GenericConverter}.
  *
  * @param converter
  */
@@ -190,6 +189,10 @@ public class CustomConversions {
     for (ConvertiblePair pair : genericConverter.getConvertibleTypes()) {
         register(new ConverterRegistration(pair, isReading, isWriting));
     }
+} else if (converter instanceof ConverterFactory) {
+
+    Class<?>[] arguments = GenericTypeResolver.resolveTypeArguments(converter.getClass(), ConverterFactory.class);
+    register(new ConverterRegistration(arguments[0], arguments[1], isReading, isWriting));
 } else if (converter instanceof Converter) {
     Class<?>[] arguments = GenericTypeResolver.resolveTypeArguments(converter.getClass(), Converter.class);
     register(new ConverterRegistration(arguments[0], arguments[1], isReading, isWriting));
@@ -235,70 +238,103 @@ public class CustomConversions {
      * @param sourceType must not be {@literal null}
      * @return
      */
-    public Class<?> getCustomWriteTarget(Class<?> sourceType) {
-        return getCustomWriteTarget(sourceType, null);
+    public Class<?> getCustomWriteTarget(final Class<?> sourceType) {
+
+        return getOrCreateAndCache(sourceType, rawWriteTargetTypes, new Producer() {
+
+            @Override
+            public Class<?> get() {
+                return getCustomTarget(sourceType, null, writingPairs);
+            }
+        });
    }

     /**
      * Returns the target type we can write an object of the given source type to. The returned type might be a
      * subclass of the given expected type though. If {@code expectedTargetType} is {@literal null} we will simply
      * return the first target type matching or {@literal null} if no conversion can be found.
      *
      * @param sourceType must not be {@literal null}
      * @param requestedTargetType
      * @return
      */
-    public Class<?> getCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
-
-        Assert.notNull(sourceType);
-        return getCustomTarget(sourceType, requestedTargetType, writingPairs);
+    public Class<?> getCustomWriteTarget(final Class<?> sourceType, final Class<?> requestedTargetType) {
+
+        if (requestedTargetType == null) {
+            return getCustomWriteTarget(sourceType);
+        }
+
+        return getOrCreateAndCache(new ConvertiblePair(sourceType, requestedTargetType), customWriteTargetTypes,
+                new Producer() {
+
+                    @Override
+                    public Class<?> get() {
+                        return getCustomTarget(sourceType, requestedTargetType, writingPairs);
+                    }
+                });
    }

     /**
      * Returns whether we have a custom conversion registered to write into a Mongo native type. The returned type
      * might be a subclass of the given expected type though.
      *
      * @param sourceType must not be {@literal null}
      * @return
      */
     public boolean hasCustomWriteTarget(Class<?> sourceType) {
-
-        Assert.notNull(sourceType);
        return hasCustomWriteTarget(sourceType, null);
    }

     /**
      * Returns whether we have a custom conversion registered to write an object of the given source type into an
      * object of the given Mongo native target type.
      *
      * @param sourceType must not be {@literal null}.
      * @param requestedTargetType
      * @return
      */
     public boolean hasCustomWriteTarget(Class<?> sourceType, Class<?> requestedTargetType) {
-
-        Assert.notNull(sourceType);
        return getCustomWriteTarget(sourceType, requestedTargetType) != null;
    }

     /**
      * Returns whether we have a custom conversion registered to read the given source into the given target type.
      *
      * @param sourceType must not be {@literal null}
      * @param requestedTargetType must not be {@literal null}
      * @return
      */
     public boolean hasCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) {
-
-        Assert.notNull(sourceType);
-        Assert.notNull(requestedTargetType);
        return getCustomReadTarget(sourceType, requestedTargetType) != null;
    }

     /**
-     * Inspects the given {@link ConvertiblePair} for ones that have a source compatible type as source. Additionally
+     * Returns the actual target type for the given {@code sourceType} and {@code requestedTargetType}. Note that the
+     * returned {@link Class} could be an assignable type to the given {@code requestedTargetType}.
+     *
+     * @param sourceType must not be {@literal null}.
+     * @param requestedTargetType can be {@literal null}.
+     * @return
+     */
+    private Class<?> getCustomReadTarget(final Class<?> sourceType, final Class<?> requestedTargetType) {
+
+        if (requestedTargetType == null) {
+            return null;
+        }
+
+        return getOrCreateAndCache(new ConvertiblePair(sourceType, requestedTargetType), customReadTargetTypes,
+                new Producer() {
+
+                    @Override
+                    public Class<?> get() {
+                        return getCustomTarget(sourceType, requestedTargetType, readingPairs);
+                    }
+                });
+    }
+
+    /**
+     * Inspects the given {@link ConvertiblePair}s for ones that have a source compatible type as source. Additionally
      * checks assignability of the target type if one is given.
      *
      * @param sourceType must not be {@literal null}.
@@ -307,11 +343,15 @@ public class CustomConversions {
      * @return
      */
     private static Class<?> getCustomTarget(Class<?> sourceType, Class<?> requestedTargetType,
-            Iterable<ConvertiblePair> pairs) {
+            Collection<ConvertiblePair> pairs) {

         Assert.notNull(sourceType);
         Assert.notNull(pairs);

+        if (requestedTargetType != null && pairs.contains(new ConvertiblePair(sourceType, requestedTargetType))) {
+            return requestedTargetType;
+        }
+
         for (ConvertiblePair typePair : pairs) {
             if (typePair.getSourceType().isAssignableFrom(sourceType)) {
                 Class<?> targetType = typePair.getTargetType();
@@ -325,32 +365,31 @@ public class CustomConversions {
     }

     /**
-     * Returns the actual target type for the given {@code sourceType} and {@code requestedTargetType}. Note that the
-     * returned {@link Class} could be an assignable type to the given {@code requestedTargetType}.
+     * Will try to find a value for the given key in the given cache or produce one using the given {@link Producer} and
+     * store it in the cache.
      *
-     * @param sourceType must not be {@literal null}.
+     * @param key the key to lookup a potentially existing value, must not be {@literal null}.
* @param requestedTargetType can be {@literal null}. * @param cache the cache to find the value in, must not be {@literal null}.
* @param producer the {@link Producer} to create values to cache, must not be {@literal null}.
* @return * @return
*/ */
private Class<?> getCustomReadTarget(Class<?> sourceType, Class<?> requestedTargetType) { private static <T> Class<?> getOrCreateAndCache(T key, Map<T, CacheValue<Class<?>>> cache, Producer producer) {
Assert.notNull(sourceType); CacheValue<Class<?>> cacheValue = cache.get(key);
if (requestedTargetType == null) { if (cacheValue != null) {
return null; return cacheValue.getValue();
} }
ConvertiblePair lookupKey = new ConvertiblePair(sourceType, requestedTargetType); Class<?> type = producer.get();
CacheValue readTargetTypeValue = customReadTargetTypes.get(lookupKey); cache.put(key, CacheValue.<Class<?>> ofNullable(type));
if (readTargetTypeValue != null) { return type;
return readTargetTypeValue.getType(); }
}
readTargetTypeValue = CacheValue.of(getCustomTarget(sourceType, requestedTargetType, readingPairs)); private interface Producer {
CacheValue cacheValue = customReadTargetTypes.putIfAbsent(lookupKey, readTargetTypeValue);
return cacheValue != null ? cacheValue.getType() : readTargetTypeValue.getType(); Class<?> get();
} }
@WritingConverter @WritingConverter
@@ -358,6 +397,10 @@ public class CustomConversions {
INSTANCE; INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.GenericConverter#getConvertibleTypes()
*/
public Set<ConvertiblePair> getConvertibleTypes() { public Set<ConvertiblePair> getConvertibleTypes() {
ConvertiblePair localeToString = new ConvertiblePair(Locale.class, String.class); ConvertiblePair localeToString = new ConvertiblePair(Locale.class, String.class);
@@ -366,34 +409,12 @@ public class CustomConversions {
return new HashSet<ConvertiblePair>(Arrays.asList(localeToString, booleanToString)); return new HashSet<ConvertiblePair>(Arrays.asList(localeToString, booleanToString));
} }
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.GenericConverter#convert(java.lang.Object, org.springframework.core.convert.TypeDescriptor, org.springframework.core.convert.TypeDescriptor)
*/
public Object convert(Object source, TypeDescriptor sourceType, TypeDescriptor targetType) { public Object convert(Object source, TypeDescriptor sourceType, TypeDescriptor targetType) {
return source.toString(); return source.toString();
} }
} }
/**
* Wrapper to safely store {@literal null} values in the type cache.
*
* @author Patryk Wasik
* @author Oliver Gierke
* @author Thomas Darimont
*/
private static class CacheValue {
private static final CacheValue ABSENT = new CacheValue(null);
private final Class<?> type;
public CacheValue(Class<?> type) {
this.type = type;
}
public Class<?> getType() {
return type;
}
static CacheValue of(Class<?> type) {
return type == null ? ABSENT : new CacheValue(type);
}
}
} }
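The `getOrCreateAndCache(…)` refactoring above caches lookup results in a `CacheValue` wrapper so that a `null` outcome ("no custom target exists") is remembered instead of being recomputed on every call. A minimal, driver-free sketch of that pattern in plain Java follows; the class and member names (`NullSafeCache`, `producerCalls`) are illustrative, not part of the Spring Data API:

```java
import java.util.HashMap;
import java.util.Map;

public class NullSafeCache {

	// Wrapper that may legitimately hold null, so a map entry can record
	// "already computed, result was null".
	static final class CacheValue<V> {
		private final V value;
		private CacheValue(V value) { this.value = value; }
		V getValue() { return value; }
		static <V> CacheValue<V> ofNullable(V value) { return new CacheValue<V>(value); }
	}

	interface Producer<V> { V get(); }

	static int producerCalls = 0;

	static <K, V> V getOrCreateAndCache(K key, Map<K, CacheValue<V>> cache, Producer<V> producer) {
		CacheValue<V> cached = cache.get(key);
		if (cached != null) {
			return cached.getValue(); // cache hit, even when the stored result is null
		}
		V value = producer.get();
		cache.put(key, CacheValue.ofNullable(value));
		return value;
	}

	public static void main(String[] args) {
		Map<String, CacheValue<String>> cache = new HashMap<String, CacheValue<String>>();
		Producer<String> expensive = new Producer<String>() {
			public String get() { producerCalls++; return null; }
		};
		getOrCreateAndCache("unknown", cache, expensive);
		getOrCreateAndCache("unknown", cache, expensive);
		if (producerCalls != 1) throw new AssertionError("producer should run only once");
	}
}
```

Without the wrapper, a plain `Map<K, V>` cannot distinguish an absent entry from a cached `null`, so every negative lookup would re-run the producer.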


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013 the original author or authors.
+ * Copyright 2013-2015 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -34,7 +34,7 @@ import com.mongodb.DBObject;
  */
 class DBObjectAccessor {

-	private final DBObject dbObject;
+	private final BasicDBObject dbObject;

 	/**
 	 * Creates a new {@link DBObjectAccessor} for the given {@link DBObject}.
@@ -46,7 +46,7 @@ class DBObjectAccessor {
 		Assert.notNull(dbObject, "DBObject must not be null!");
 		Assert.isInstanceOf(BasicDBObject.class, dbObject, "Given DBObject must be a BasicDBObject!");

-		this.dbObject = dbObject;
+		this.dbObject = (BasicDBObject) dbObject;
 	}

 	/**
@@ -62,6 +62,11 @@ class DBObjectAccessor {
 		Assert.notNull(prop, "MongoPersistentProperty must not be null!");
 		String fieldName = prop.getFieldName();

+		if (!fieldName.contains(".")) {
+			dbObject.put(fieldName, value);
+			return;
+		}
+
 		Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
 		DBObject dbObject = this.dbObject;
@@ -70,9 +75,7 @@ class DBObjectAccessor {
 			String part = parts.next();

 			if (parts.hasNext()) {
-				BasicDBObject nestedDbObject = new BasicDBObject();
-				dbObject.put(part, nestedDbObject);
-				dbObject = nestedDbObject;
+				dbObject = getOrCreateNestedDbObject(part, dbObject);
 			} else {
 				dbObject.put(part, value);
 			}
@@ -87,12 +90,16 @@ class DBObjectAccessor {
 	 * @param property must not be {@literal null}.
 	 * @return
 	 */
-	@SuppressWarnings("unchecked")
 	public Object get(MongoPersistentProperty property) {

 		String fieldName = property.getFieldName();

+		if (!fieldName.contains(".")) {
+			return this.dbObject.get(fieldName);
+		}
+
 		Iterator<String> parts = Arrays.asList(fieldName.split("\\.")).iterator();
-		Map<Object, Object> source = this.dbObject.toMap();
+		Map<String, Object> source = this.dbObject;
 		Object result = null;

 		while (source != null && parts.hasNext()) {
@@ -107,17 +114,45 @@ class DBObjectAccessor {
 		return result;
 	}

+	/**
+	 * Returns the given source object as map, i.e. {@link BasicDBObject}s and maps as is or {@literal null} otherwise.
+	 *
+	 * @param source can be {@literal null}.
+	 * @return
+	 */
 	@SuppressWarnings("unchecked")
-	private Map<Object, Object> getAsMap(Object source) {
+	private static Map<String, Object> getAsMap(Object source) {

 		if (source instanceof BasicDBObject) {
-			return ((DBObject) source).toMap();
+			return (BasicDBObject) source;
 		}

 		if (source instanceof Map) {
-			return (Map<Object, Object>) source;
+			return (Map<String, Object>) source;
 		}

 		return null;
 	}
+
+	/**
+	 * Returns the {@link DBObject} which either already exists in the given source under the given key, or creates a new
+	 * nested one, registers it with the source and returns it.
+	 *
+	 * @param key must not be {@literal null} or empty.
+	 * @param source must not be {@literal null}.
+	 * @return
+	 */
+	private static DBObject getOrCreateNestedDbObject(String key, DBObject source) {
+
+		Object existing = source.get(key);
+
+		if (existing instanceof BasicDBObject) {
+			return (BasicDBObject) existing;
+		}
+
+		DBObject nested = new BasicDBObject();
+		source.put(key, nested);
+
+		return nested;
+	}
 }
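The `DBObjectAccessor` change above makes dotted field names like `address.city` write into (and read from) nested documents, reusing an existing intermediate document via `getOrCreateNestedDbObject(…)` rather than overwriting it. A self-contained sketch of that idea with plain Java maps standing in for `BasicDBObject` (all names here are illustrative):

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

public class DotPathAccessor {

	@SuppressWarnings("unchecked")
	static void put(Map<String, Object> root, String fieldName, Object value) {
		if (!fieldName.contains(".")) { // fast path for plain keys
			root.put(fieldName, value);
			return;
		}
		String[] parts = fieldName.split("\\.");
		Map<String, Object> current = root;
		for (int i = 0; i < parts.length - 1; i++) {
			Object existing = current.get(parts[i]);
			if (!(existing instanceof Map)) { // reuse an existing nested document, replace anything else
				existing = new LinkedHashMap<String, Object>();
				current.put(parts[i], existing);
			}
			current = (Map<String, Object>) existing;
		}
		current.put(parts[parts.length - 1], value);
	}

	@SuppressWarnings("unchecked")
	static Object get(Map<String, Object> root, String fieldName) {
		String[] parts = fieldName.split("\\.");
		Map<String, Object> current = root;
		for (int i = 0; i < parts.length - 1; i++) {
			Object next = current.get(parts[i]);
			if (!(next instanceof Map)) {
				return null; // path runs through a non-document value
			}
			current = (Map<String, Object>) next;
		}
		return current.get(parts[parts.length - 1]);
	}

	public static void main(String[] args) {
		Map<String, Object> doc = new HashMap<String, Object>();
		put(doc, "address.city", "Vienna");
		put(doc, "address.zip", "1010"); // must reuse the existing "address" document
		if (!"Vienna".equals(get(doc, "address.city"))) throw new AssertionError();
		if (!"1010".equals(get(doc, "address.zip"))) throw new AssertionError();
	}
}
```

Reusing the existing intermediate map is the essential fix: creating a fresh nested document on every `put` would silently drop previously written sibling fields.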


@@ -0,0 +1,28 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBRef;
/**
* @author Oliver Gierke
*/
public interface DbRefProxyHandler {
Object populateId(MongoPersistentProperty property, DBRef source, Object proxy);
}


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013-2014 the original author or authors.
+ * Copyright 2013-2016 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -15,9 +15,13 @@
  */
 package org.springframework.data.mongodb.core.convert;

+import java.util.List;
+
+import org.springframework.dao.InvalidDataAccessApiUsageException;
 import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
 import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;

+import com.mongodb.DBObject;
 import com.mongodb.DBRef;

 /**
@@ -25,6 +29,7 @@ import com.mongodb.DBRef;
  *
  * @author Thomas Darimont
  * @author Oliver Gierke
+ * @author Christoph Strobl
  * @since 1.4
  */
 public interface DbRefResolver {
@@ -39,7 +44,8 @@ public interface DbRefResolver {
 	 * @param callback will never be {@literal null}.
 	 * @return
 	 */
-	Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback);
+	Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
+			DbRefProxyHandler proxyHandler);

 	/**
 	 * Creates a {@link DBRef} instance for the given {@link org.springframework.data.mongodb.core.mapping.DBRef}
@@ -52,4 +58,25 @@ public interface DbRefResolver {
 	 */
 	DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation, MongoPersistentEntity<?> entity,
 			Object id);
+
+	/**
+	 * Actually loads the {@link DBRef} from the datasource.
+	 *
+	 * @param dbRef must not be {@literal null}.
+	 * @return
+	 * @since 1.7
+	 */
+	DBObject fetch(DBRef dbRef);
+
+	/**
+	 * Loads a given {@link List} of {@link DBRef}s from the datasource in one batch. The resulting {@link List} of
+	 * {@link DBObject} will reflect the ordering of the {@link DBRef} passed in.<br />
+	 * The {@link DBRef} elements in the list must not reference different collections.
+	 *
+	 * @param dbRefs must not be {@literal null}.
+	 * @return never {@literal null}.
+	 * @throws InvalidDataAccessApiUsageException in case not all {@link DBRef} target the same collection.
+	 * @since 1.10
+	 */
+	List<DBObject> bulkFetch(List<DBRef> dbRefs);
 }


@@ -0,0 +1,79 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.DefaultSpELExpressionEvaluator;
import org.springframework.data.mapping.model.SpELContext;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* @author Oliver Gierke
*/
class DefaultDbRefProxyHandler implements DbRefProxyHandler {
private final SpELContext spELContext;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final ValueResolver resolver;
/**
* @param spELContext must not be {@literal null}.
* @param conversionService must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
public DefaultDbRefProxyHandler(SpELContext spELContext,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext, ValueResolver resolver) {
this.spELContext = spELContext;
this.mappingContext = mappingContext;
this.resolver = resolver;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefProxyHandler#populateId(com.mongodb.DBRef, java.lang.Object)
*/
@Override
public Object populateId(MongoPersistentProperty property, DBRef source, Object proxy) {
if (source == null) {
return proxy;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(property);
MongoPersistentProperty idProperty = entity.getIdProperty();
if (idProperty.usePropertyAccess()) {
return proxy;
}
SpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(proxy, spELContext);
PersistentPropertyAccessor accessor = entity.getPropertyAccessor(proxy);
DBObject object = new BasicDBObject(idProperty.getFieldName(), source.getId());
ObjectPath objectPath = ObjectPath.ROOT.push(proxy, entity, null);
accessor.setProperty(idProperty, resolver.getValueInternal(idProperty, object, evaluator, objectPath));
return proxy;
}
}


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013-2014 the original author or authors.
+ * Copyright 2013-2016 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -22,6 +22,10 @@ import java.io.ObjectInputStream;
 import java.io.ObjectOutputStream;
 import java.io.Serializable;
 import java.lang.reflect.Method;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.Comparator;
+import java.util.List;

 import org.aopalliance.intercept.MethodInterceptor;
 import org.aopalliance.intercept.MethodInvocation;
@@ -31,6 +35,7 @@ import org.springframework.cglib.proxy.Enhancer;
 import org.springframework.cglib.proxy.Factory;
 import org.springframework.cglib.proxy.MethodProxy;
 import org.springframework.dao.DataAccessException;
+import org.springframework.dao.InvalidDataAccessApiUsageException;
 import org.springframework.dao.support.PersistenceExceptionTranslator;
 import org.springframework.data.mongodb.LazyLoadingException;
 import org.springframework.data.mongodb.MongoDbFactory;
@@ -39,9 +44,11 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
 import org.springframework.objenesis.ObjenesisStd;
 import org.springframework.util.Assert;
 import org.springframework.util.ReflectionUtils;
-import org.springframework.util.StringUtils;

+import com.mongodb.BasicDBObject;
+import com.mongodb.BasicDBObjectBuilder;
 import com.mongodb.DB;
+import com.mongodb.DBObject;
 import com.mongodb.DBRef;

 /**
@@ -50,6 +57,7 @@ import com.mongodb.DBRef;
  *
  * @author Thomas Darimont
  * @author Oliver Gierke
+ * @author Christoph Strobl
  * @since 1.4
  */
 public class DefaultDbRefResolver implements DbRefResolver {
@@ -77,13 +85,14 @@ public class DefaultDbRefResolver implements DbRefResolver {
 	 * @see org.springframework.data.mongodb.core.convert.DbRefResolver#resolveDbRef(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, org.springframework.data.mongodb.core.convert.DbRefResolverCallback)
 	 */
 	@Override
-	public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
+	public Object resolveDbRef(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
+			DbRefProxyHandler handler) {

 		Assert.notNull(property, "Property must not be null!");
 		Assert.notNull(callback, "Callback must not be null!");

 		if (isLazyDbRef(property)) {
-			return createLazyLoadingProxy(property, dbref, callback);
+			return createLazyLoadingProxy(property, dbref, callback, handler);
 		}

 		return callback.resolve(property);
@@ -96,11 +105,50 @@ public class DefaultDbRefResolver implements DbRefResolver {
 	@Override
 	public DBRef createDbRef(org.springframework.data.mongodb.core.mapping.DBRef annotation,
 			MongoPersistentEntity<?> entity, Object id) {
-
-		DB db = mongoDbFactory.getDb();
-		db = annotation != null && StringUtils.hasText(annotation.db()) ? mongoDbFactory.getDb(annotation.db()) : db;
-
-		return new DBRef(db, entity.getCollection(), id);
+		return new DBRef(entity.getCollection(), id);
+	}
+
+	/*
+	 * (non-Javadoc)
+	 * @see org.springframework.data.mongodb.core.convert.DbRefResolver#fetch(com.mongodb.DBRef)
+	 */
+	@Override
+	public DBObject fetch(DBRef dbRef) {
+		return ReflectiveDBRefResolver.fetch(mongoDbFactory, dbRef);
+	}
+
+	/*
+	 * (non-Javadoc)
+	 * @see org.springframework.data.mongodb.core.convert.DbRefResolver#bulkFetch(java.util.List)
+	 */
+	@Override
+	public List<DBObject> bulkFetch(List<DBRef> refs) {
+
+		Assert.notNull(mongoDbFactory, "Factory must not be null!");
+		Assert.notNull(refs, "DBRef to fetch must not be null!");
+
+		if (refs.isEmpty()) {
+			return Collections.emptyList();
+		}
+
+		String collection = refs.iterator().next().getCollectionName();
+		List<Object> ids = new ArrayList<Object>(refs.size());
+
+		for (DBRef ref : refs) {
+
+			if (!collection.equals(ref.getCollectionName())) {
+				throw new InvalidDataAccessApiUsageException(
+						"DBRefs must all target the same collection for bulk fetch operation.");
+			}
+
+			ids.add(ref.getId());
+		}
+
+		DB db = mongoDbFactory.getDb();
+		List<DBObject> result = db.getCollection(collection)
+				.find(new BasicDBObjectBuilder().add("_id", new BasicDBObject("$in", ids)).get()).toArray();
+		Collections.sort(result, new DbRefByReferencePositionComparator(ids));
+
+		return result;
 	}

 	/**
@@ -112,7 +160,8 @@ public class DefaultDbRefResolver implements DbRefResolver {
 	 * @param callback must not be {@literal null}.
 	 * @return
 	 */
-	private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback) {
+	private Object createLazyLoadingProxy(MongoPersistentProperty property, DBRef dbref, DbRefResolverCallback callback,
+			DbRefProxyHandler handler) {

 		Class<?> propertyType = property.getType();
 		LazyLoadingInterceptor interceptor = new LazyLoadingInterceptor(property, dbref, exceptionTranslator, callback);
@@ -122,7 +171,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
 			Factory factory = (Factory) objenesis.newInstance(getEnhancedTypeFor(propertyType));
 			factory.setCallbacks(new Callback[] { interceptor });

-			return factory;
+			return handler.populateId(property, dbref, factory);
 		}

 		ProxyFactory proxyFactory = new ProxyFactory();
@@ -135,7 +184,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
 		proxyFactory.addInterface(propertyType);
 		proxyFactory.addAdvice(interceptor);

-		return proxyFactory.getProxy();
+		return handler.populateId(property, dbref, proxyFactory.getProxy());
 	}

 	/**
@@ -171,11 +220,12 @@ public class DefaultDbRefResolver implements DbRefResolver {
 	 *
 	 * @author Thomas Darimont
 	 * @author Oliver Gierke
+	 * @author Christoph Strobl
 	 */
-	static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
-			Serializable {
+	static class LazyLoadingInterceptor
+			implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor, Serializable {

-		private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD;
+		private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD, FINALIZE_METHOD;

 		private final DbRefResolverCallback callback;
 		private final MongoPersistentProperty property;
@@ -187,8 +237,9 @@ public class DefaultDbRefResolver implements DbRefResolver {
 		static {
 			try {
-				INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("initialize");
+				INITIALIZE_METHOD = LazyLoadingProxy.class.getMethod("getTarget");
 				TO_DBREF_METHOD = LazyLoadingProxy.class.getMethod("toDBRef");
+				FINALIZE_METHOD = Object.class.getDeclaredMethod("finalize");
 			} catch (Exception e) {
 				throw new RuntimeException(e);
 			}
@@ -252,6 +303,11 @@ public class DefaultDbRefResolver implements DbRefResolver {
 				if (ReflectionUtils.isHashCodeMethod(method)) {
 					return proxyHashCode(proxy);
 				}
+
+				// DATAMONGO-1076 - finalize methods should not trigger proxy initialization
+				if (FINALIZE_METHOD.equals(method)) {
+					return null;
+				}
 			}

 			Object target = ensureResolved();
@@ -273,7 +329,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
 			StringBuilder description = new StringBuilder();
 			if (dbref != null) {
-				description.append(dbref.getRef());
+				description.append(dbref.getCollectionName());
 				description.append(":");
 				description.append(dbref.getId());
 			} else {
@@ -373,11 +429,45 @@ public class DefaultDbRefResolver implements DbRefResolver {
 			} catch (RuntimeException ex) {
 				DataAccessException translatedException = this.exceptionTranslator.translateExceptionIfPossible(ex);
-				throw new LazyLoadingException("Unable to lazily resolve DBRef!", translatedException);
+				throw new LazyLoadingException("Unable to lazily resolve DBRef!",
+						translatedException != null ? translatedException : ex);
 			}
 		}

 			return result;
 		}
 	}
+
+	/**
+	 * {@link Comparator} for sorting {@link DBObject} that have been loaded in random order by a predefined list of
+	 * reference identifiers.
+	 *
+	 * @author Christoph Strobl
+	 * @author Oliver Gierke
+	 * @since 1.10
+	 */
+	private static class DbRefByReferencePositionComparator implements Comparator<DBObject> {
+
+		private final List<Object> reference;
+
+		/**
+		 * Creates a new {@link DbRefByReferencePositionComparator} for the given list of reference identifiers.
+		 *
+		 * @param referenceIds must not be {@literal null}.
+		 */
+		public DbRefByReferencePositionComparator(List<Object> referenceIds) {
+
+			Assert.notNull(referenceIds, "Reference identifiers must not be null!");
+			this.reference = new ArrayList<Object>(referenceIds);
+		}
+
+		/*
+		 * (non-Javadoc)
+		 * @see java.util.Comparator#compare(java.lang.Object, java.lang.Object)
+		 */
+		@Override
+		public int compare(DBObject o1, DBObject o2) {
+			return Integer.compare(reference.indexOf(o1.get("_id")), reference.indexOf(o2.get("_id")));
		}
	}
 }
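The `bulkFetch(…)` implementation above issues a single `$in` query, and since MongoDB returns those documents in arbitrary order, re-sorts them with `DbRefByReferencePositionComparator` to match the order of the requested references. A driver-free sketch of that reordering step, with plain strings standing in for `DBObject` ids (names here are illustrative):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

public class ReferenceOrderSort {

	static List<String> sortByReference(List<String> fetched, final List<Object> ids) {
		List<String> result = new ArrayList<String>(fetched);
		Collections.sort(result, new Comparator<String>() {
			public int compare(String left, String right) {
				// indexOf yields the position each id was originally requested at
				return Integer.compare(ids.indexOf(left), ids.indexOf(right));
			}
		});
		return result;
	}

	public static void main(String[] args) {
		List<Object> requested = Arrays.<Object> asList("a", "b", "c");
		List<String> fetched = Arrays.asList("c", "a", "b"); // arbitrary database order
		List<String> sorted = sortByReference(fetched, requested);
		if (!sorted.equals(Arrays.asList("a", "b", "c"))) throw new AssertionError();
	}
}
```

Note the trade-off: `indexOf` inside a comparator makes the sort O(n² log n) over the id list, which is acceptable for the typically small reference lists a single document carries; a position map would be the alternative for large batches.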


@@ -0,0 +1,61 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Default implementation of {@link DbRefResolverCallback}.
*
* @author Oliver Gierke
*/
class DefaultDbRefResolverCallback implements DbRefResolverCallback {
private final DBObject surroundingObject;
private final ObjectPath path;
private final ValueResolver resolver;
private final SpELExpressionEvaluator evaluator;
/**
* Creates a new {@link DefaultDbRefResolverCallback} using the given {@link DBObject}, {@link ObjectPath},
* {@link ValueResolver} and {@link SpELExpressionEvaluator}.
*
* @param surroundingObject must not be {@literal null}.
* @param path must not be {@literal null}.
* @param evaluator must not be {@literal null}.
* @param resolver must not be {@literal null}.
*/
public DefaultDbRefResolverCallback(DBObject surroundingObject, ObjectPath path, SpELExpressionEvaluator evaluator,
ValueResolver resolver) {
this.surroundingObject = surroundingObject;
this.path = path;
this.resolver = resolver;
this.evaluator = evaluator;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefResolverCallback#resolve(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
@Override
public Object resolve(MongoPersistentProperty property) {
return resolver.getValueInternal(property, surroundingObject, evaluator, path);
}
}


@@ -1,5 +1,5 @@
 /*
- * Copyright 2011-2013 the original author or authors.
+ * Copyright 2011-2016 the original author or authors.
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.
@@ -45,9 +45,9 @@ import com.mongodb.DBObject;
 public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implements MongoTypeMapper {

 	public static final String DEFAULT_TYPE_KEY = "_class";

-	@SuppressWarnings("rawtypes")//
+	@SuppressWarnings("rawtypes") //
 	private static final TypeInformation<List> LIST_TYPE_INFO = ClassTypeInformation.from(List.class);
-	@SuppressWarnings("rawtypes")//
+	@SuppressWarnings("rawtypes") //
 	private static final TypeInformation<Map> MAP_TYPE_INFO = ClassTypeInformation.from(Map.class);

 	private final TypeAliasAccessor<DBObject> accessor;
@@ -58,12 +58,12 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
 	}

 	public DefaultMongoTypeMapper(String typeKey) {
-		this(typeKey, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
+		this(typeKey, Arrays.asList(new SimpleTypeInformationMapper()));
 	}

 	public DefaultMongoTypeMapper(String typeKey, MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext) {
-		this(typeKey, new DBObjectTypeAliasAccessor(typeKey), mappingContext, Arrays
-				.asList(SimpleTypeInformationMapper.INSTANCE));
+		this(typeKey, new DBObjectTypeAliasAccessor(typeKey), mappingContext,
+				Arrays.asList(new SimpleTypeInformationMapper()));
 	}

 	public DefaultMongoTypeMapper(String typeKey, List<? extends TypeInformationMapper> mappers) {
@@ -71,7 +71,8 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
 	}

 	private DefaultMongoTypeMapper(String typeKey, TypeAliasAccessor<DBObject> accessor,
-			MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext, List<? extends TypeInformationMapper> mappers) {
+			MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext,
+			List<? extends TypeInformationMapper> mappers) {

 		super(accessor, mappingContext, mappers);


@@ -1,5 +1,5 @@
/* /*
* Copyright 2014 the original author or authors. * Copyright 2014-2016 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -30,9 +30,18 @@ import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point; import org.springframework.data.geo.Point;
import org.springframework.data.geo.Polygon; import org.springframework.data.geo.Polygon;
import org.springframework.data.geo.Shape; import org.springframework.data.geo.Shape;
import org.springframework.data.mongodb.core.geo.GeoJson;
import org.springframework.data.mongodb.core.geo.GeoJsonGeometryCollection;
import org.springframework.data.mongodb.core.geo.GeoJsonLineString;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiLineString;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiPoint;
import org.springframework.data.mongodb.core.geo.GeoJsonMultiPolygon;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.core.geo.GeoJsonPolygon;
import org.springframework.data.mongodb.core.geo.Sphere; import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.data.mongodb.core.query.GeoCommand; import org.springframework.data.mongodb.core.query.GeoCommand;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBList; import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject; import com.mongodb.BasicDBObject;
@@ -43,6 +52,7 @@ import com.mongodb.DBObject;
* *
* @author Thomas Darimont * @author Thomas Darimont
* @author Oliver Gierke * @author Oliver Gierke
* @author Christoph Strobl
* @since 1.5 * @since 1.5
*/ */
abstract class GeoConverters { abstract class GeoConverters {
@@ -63,16 +73,24 @@ abstract class GeoConverters {
BoxToDbObjectConverter.INSTANCE // BoxToDbObjectConverter.INSTANCE //
, PolygonToDbObjectConverter.INSTANCE // , PolygonToDbObjectConverter.INSTANCE //
, CircleToDbObjectConverter.INSTANCE // , CircleToDbObjectConverter.INSTANCE //
, LegacyCircleToDbObjectConverter.INSTANCE //
, SphereToDbObjectConverter.INSTANCE // , SphereToDbObjectConverter.INSTANCE //
, DbObjectToBoxConverter.INSTANCE // , DbObjectToBoxConverter.INSTANCE //
, DbObjectToPolygonConverter.INSTANCE // , DbObjectToPolygonConverter.INSTANCE //
, DbObjectToCircleConverter.INSTANCE // , DbObjectToCircleConverter.INSTANCE //
, DbObjectToLegacyCircleConverter.INSTANCE //
, DbObjectToSphereConverter.INSTANCE // , DbObjectToSphereConverter.INSTANCE //
, DbObjectToPointConverter.INSTANCE // , DbObjectToPointConverter.INSTANCE //
, PointToDbObjectConverter.INSTANCE // , PointToDbObjectConverter.INSTANCE //
, GeoCommandToDbObjectConverter.INSTANCE); , GeoCommandToDbObjectConverter.INSTANCE //
, GeoJsonToDbObjectConverter.INSTANCE //
, GeoJsonPointToDbObjectConverter.INSTANCE //
, GeoJsonPolygonToDbObjectConverter.INSTANCE //
, DbObjectToGeoJsonPointConverter.INSTANCE //
, DbObjectToGeoJsonPolygonConverter.INSTANCE //
, DbObjectToGeoJsonLineStringConverter.INSTANCE //
, DbObjectToGeoJsonMultiLineStringConverter.INSTANCE //
, DbObjectToGeoJsonMultiPointConverter.INSTANCE //
, DbObjectToGeoJsonMultiPolygonConverter.INSTANCE //
, DbObjectToGeoJsonGeometryCollectionConverter.INSTANCE);
} }
/** /**
@@ -82,7 +100,7 @@ abstract class GeoConverters {
* @since 1.5 * @since 1.5
*/ */
@ReadingConverter @ReadingConverter
public static enum DbObjectToPointConverter implements Converter<DBObject, Point> { static enum DbObjectToPointConverter implements Converter<DBObject, Point> {
INSTANCE; INSTANCE;
@@ -91,13 +109,19 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object) * @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/ */
@Override @Override
@SuppressWarnings("deprecation")
public Point convert(DBObject source) { public Point convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(source.keySet().size() == 2, "Source must contain 2 elements"); Assert.isTrue(source.keySet().size() == 2, "Source must contain 2 elements");
return source == null ? null : new org.springframework.data.mongodb.core.geo.Point((Double) source.get("x"), if (source.containsField("type")) {
(Double) source.get("y")); return DbObjectToGeoJsonPointConverter.INSTANCE.convert(source);
}
return new Point((Double) source.get("x"), (Double) source.get("y"));
} }
} }
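The converter above now accepts both the legacy `{ "x": …, "y": … }` document shape and the GeoJSON `{ "type": "Point", "coordinates": [x, y] }` shape, keying the dispatch on the presence of a `type` field. A minimal sketch of that dispatch, using plain `java.util` maps as a stand-in for `DBObject` so it runs without the Mongo driver (the helper names `legacyPoint`/`geoJsonPoint`/`readPoint` are illustrative, not part of the Spring Data API):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class PointShapes {

	// Legacy shape: { "x": 1.0, "y": 2.0 }
	static Map<String, Object> legacyPoint(double x, double y) {
		Map<String, Object> dbo = new LinkedHashMap<String, Object>();
		dbo.put("x", x);
		dbo.put("y", y);
		return dbo;
	}

	// GeoJSON shape: { "type": "Point", "coordinates": [1.0, 2.0] }
	static Map<String, Object> geoJsonPoint(double x, double y) {
		Map<String, Object> dbo = new LinkedHashMap<String, Object>();
		dbo.put("type", "Point");
		dbo.put("coordinates", Arrays.asList(x, y));
		return dbo;
	}

	// Mirrors the dispatch in DbObjectToPointConverter: a "type" field
	// routes the document to the GeoJSON path, otherwise x/y are read.
	@SuppressWarnings("unchecked")
	static double[] readPoint(Map<String, Object> source) {
		if (source.containsKey("type")) {
			List<Number> coords = (List<Number>) source.get("coordinates");
			return new double[] { coords.get(0).doubleValue(), coords.get(1).doubleValue() };
		}
		return new double[] { (Double) source.get("x"), (Double) source.get("y") };
	}

	public static void main(String[] args) {
		double[] a = readPoint(legacyPoint(1.0, 2.0));
		double[] b = readPoint(geoJsonPoint(1.0, 2.0));
		if (!Arrays.equals(a, b)) {
			throw new AssertionError("shapes disagree");
		}
		System.out.println("both shapes read as (" + a[0] + ", " + a[1] + ")");
	}
}
```

Both representations end up as the same `Point`, which is what lets the single reading converter serve legacy and GeoJSON documents alike.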
@@ -107,7 +131,7 @@ abstract class GeoConverters {
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
public static enum PointToDbObjectConverter implements Converter<Point, DBObject> { static enum PointToDbObjectConverter implements Converter<Point, DBObject> {
INSTANCE; INSTANCE;
@@ -128,7 +152,7 @@ abstract class GeoConverters {
* @since 1.5 * @since 1.5
*/ */
@WritingConverter @WritingConverter
public static enum BoxToDbObjectConverter implements Converter<Box, DBObject> { static enum BoxToDbObjectConverter implements Converter<Box, DBObject> {
INSTANCE; INSTANCE;
@@ -151,13 +175,13 @@ abstract class GeoConverters {
} }
/** /**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Box}. * Converts a {@link BasicDBList} into a {@link Box}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
@ReadingConverter @ReadingConverter
public static enum DbObjectToBoxConverter implements Converter<DBObject, Box> { static enum DbObjectToBoxConverter implements Converter<DBObject, Box> {
INSTANCE; INSTANCE;
@@ -166,7 +190,6 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object) * @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/ */
@Override @Override
@SuppressWarnings("deprecation")
public Box convert(DBObject source) { public Box convert(DBObject source) {
if (source == null) { if (source == null) {
@@ -176,7 +199,7 @@ abstract class GeoConverters {
Point first = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("first")); Point first = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("first"));
Point second = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("second")); Point second = DbObjectToPointConverter.INSTANCE.convert((DBObject) source.get("second"));
return new org.springframework.data.mongodb.core.geo.Box(first, second); return new Box(first, second);
} }
} }
@@ -186,7 +209,7 @@ abstract class GeoConverters {
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
public static enum CircleToDbObjectConverter implements Converter<Circle, DBObject> { static enum CircleToDbObjectConverter implements Converter<Circle, DBObject> {
INSTANCE; INSTANCE;
@@ -210,13 +233,13 @@ abstract class GeoConverters {
} }
/** /**
* Converts a {@link DBObject} into a {@link org.springframework.data.mongodb.core.geo.Circle}. * Converts a {@link DBObject} into a {@link Circle}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
@ReadingConverter @ReadingConverter
public static enum DbObjectToCircleConverter implements Converter<DBObject, Circle> { static enum DbObjectToCircleConverter implements Converter<DBObject, Circle> {
INSTANCE; INSTANCE;
@@ -251,78 +274,13 @@ abstract class GeoConverters {
} }
} }
/**
* Converts a {@link Circle} into a {@link BasicDBList}.
*
* @author Thomas Darimont
* @since 1.5
*/
@SuppressWarnings("deprecation")
public static enum LegacyCircleToDbObjectConverter implements
Converter<org.springframework.data.mongodb.core.geo.Circle, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(org.springframework.data.mongodb.core.geo.Circle source) {
if (source == null) {
return null;
}
DBObject result = new BasicDBObject();
result.put("center", PointToDbObjectConverter.INSTANCE.convert(source.getCenter()));
result.put("radius", source.getRadius());
return result;
}
}
/**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Circle}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
@SuppressWarnings("deprecation")
public static enum DbObjectToLegacyCircleConverter implements
Converter<DBObject, org.springframework.data.mongodb.core.geo.Circle> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public org.springframework.data.mongodb.core.geo.Circle convert(DBObject source) {
if (source == null) {
return null;
}
DBObject centerSource = (DBObject) source.get("center");
Double radius = (Double) source.get("radius");
Assert.notNull(centerSource, "Center must not be null!");
Assert.notNull(radius, "Radius must not be null!");
Point center = DbObjectToPointConverter.INSTANCE.convert(centerSource);
return new org.springframework.data.mongodb.core.geo.Circle(center, radius);
}
}
/** /**
* Converts a {@link Sphere} into a {@link BasicDBList}. * Converts a {@link Sphere} into a {@link BasicDBList}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
public static enum SphereToDbObjectConverter implements Converter<Sphere, DBObject> { static enum SphereToDbObjectConverter implements Converter<Sphere, DBObject> {
INSTANCE; INSTANCE;
@@ -352,7 +310,7 @@ abstract class GeoConverters {
* @since 1.5 * @since 1.5
*/ */
@ReadingConverter @ReadingConverter
public static enum DbObjectToSphereConverter implements Converter<DBObject, Sphere> { static enum DbObjectToSphereConverter implements Converter<DBObject, Sphere> {
INSTANCE; INSTANCE;
@@ -393,7 +351,7 @@ abstract class GeoConverters {
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
public static enum PolygonToDbObjectConverter implements Converter<Polygon, DBObject> { static enum PolygonToDbObjectConverter implements Converter<Polygon, DBObject> {
INSTANCE; INSTANCE;
@@ -422,13 +380,13 @@ abstract class GeoConverters {
} }
/** /**
* Converts a {@link BasicDBList} into a {@link org.springframework.data.mongodb.core.geo.Polygon}. * Converts a {@link BasicDBList} into a {@link Polygon}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
@ReadingConverter @ReadingConverter
public static enum DbObjectToPolygonConverter implements Converter<DBObject, Polygon> { static enum DbObjectToPolygonConverter implements Converter<DBObject, Polygon> {
INSTANCE; INSTANCE;
@@ -437,7 +395,7 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object) * @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/ */
@Override @Override
@SuppressWarnings({ "deprecation", "unchecked" }) @SuppressWarnings({ "unchecked" })
public Polygon convert(DBObject source) { public Polygon convert(DBObject source) {
if (source == null) { if (source == null) {
@@ -453,7 +411,7 @@ abstract class GeoConverters {
newPoints.add(DbObjectToPointConverter.INSTANCE.convert(element)); newPoints.add(DbObjectToPointConverter.INSTANCE.convert(element));
} }
return new org.springframework.data.mongodb.core.geo.Polygon(newPoints); return new Polygon(newPoints);
} }
} }
@@ -463,7 +421,7 @@ abstract class GeoConverters {
* @author Thomas Darimont * @author Thomas Darimont
* @since 1.5 * @since 1.5
*/ */
public static enum GeoCommandToDbObjectConverter implements Converter<GeoCommand, DBObject> { static enum GeoCommandToDbObjectConverter implements Converter<GeoCommand, DBObject> {
INSTANCE; INSTANCE;
@@ -472,7 +430,7 @@ abstract class GeoConverters {
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object) * @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/ */
@Override @Override
@SuppressWarnings("deprecation") @SuppressWarnings("rawtypes")
public DBObject convert(GeoCommand source) { public DBObject convert(GeoCommand source) {
if (source == null) { if (source == null) {
@@ -483,6 +441,10 @@ abstract class GeoConverters {
Shape shape = source.getShape(); Shape shape = source.getShape();
if (shape instanceof GeoJson) {
return GeoJsonToDbObjectConverter.INSTANCE.convert((GeoJson) shape);
}
if (shape instanceof Box) { if (shape instanceof Box) {
argument.add(toList(((Box) shape).getFirst())); argument.add(toList(((Box) shape).getFirst()));
@@ -493,10 +455,10 @@ abstract class GeoConverters {
argument.add(toList(((Circle) shape).getCenter())); argument.add(toList(((Circle) shape).getCenter()));
argument.add(((Circle) shape).getRadius().getNormalizedValue()); argument.add(((Circle) shape).getRadius().getNormalizedValue());
} else if (shape instanceof org.springframework.data.mongodb.core.geo.Circle) { } else if (shape instanceof Circle) {
argument.add(toList(((org.springframework.data.mongodb.core.geo.Circle) shape).getCenter())); argument.add(toList(((Circle) shape).getCenter()));
argument.add(((org.springframework.data.mongodb.core.geo.Circle) shape).getRadius()); argument.add(((Circle) shape).getRadius());
} else if (shape instanceof Polygon) { } else if (shape instanceof Polygon) {
@@ -514,7 +476,377 @@ abstract class GeoConverters {
} }
} }
/**
* @author Christoph Strobl
* @since 1.7
*/
@SuppressWarnings("rawtypes")
static enum GeoJsonToDbObjectConverter implements Converter<GeoJson, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(GeoJson source) {
if (source == null) {
return null;
}
DBObject dbo = new BasicDBObject("type", source.getType());
if (source instanceof GeoJsonGeometryCollection) {
BasicDBList dbl = new BasicDBList();
for (GeoJson geometry : ((GeoJsonGeometryCollection) source).getCoordinates()) {
dbl.add(convert(geometry));
}
dbo.put("geometries", dbl);
} else {
dbo.put("coordinates", convertIfNecessarry(source.getCoordinates()));
}
return dbo;
}
private Object convertIfNecessarry(Object candidate) {
if (candidate instanceof GeoJson) {
return convertIfNecessarry(((GeoJson) candidate).getCoordinates());
}
if (candidate instanceof Iterable) {
BasicDBList dbl = new BasicDBList();
for (Object element : (Iterable) candidate) {
dbl.add(convertIfNecessarry(element));
}
return dbl;
}
if (candidate instanceof Point) {
return toList((Point) candidate);
}
return candidate;
}
}
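The interesting part of the writer above is the recursion in `convertIfNecessarry(…)`: points collapse to `[x, y]` pairs and iterables to nested lists, so a polygon's rings come out as the `[ [ [x, y], … ] ]` nesting GeoJSON expects. A self-contained sketch of that recursion, with plain lists standing in for `BasicDBList` and a hypothetical `Point` holder standing in for the Spring Data type:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class CoordinateFlattening {

	// Hypothetical stand-in for org.springframework.data.geo.Point.
	static final class Point {
		final double x, y;
		Point(double x, double y) { this.x = x; this.y = y; }
	}

	// Mirrors convertIfNecessarry(...): recurse into iterables,
	// turn points into [x, y] pairs, pass anything else through.
	static Object flatten(Object candidate) {
		if (candidate instanceof Iterable) {
			List<Object> list = new ArrayList<Object>();
			for (Object element : (Iterable<?>) candidate) {
				list.add(flatten(element));
			}
			return list;
		}
		if (candidate instanceof Point) {
			Point p = (Point) candidate;
			return Arrays.asList(p.x, p.y);
		}
		return candidate;
	}

	public static void main(String[] args) {
		// One ring of a polygon: a closed list of points.
		List<Point> ring = Arrays.asList(new Point(0, 0), new Point(1, 0), new Point(0, 0));
		// A polygon is a list of rings, hence the extra nesting level.
		System.out.println(flatten(Collections.singletonList(ring)));
	}
}
```

This also explains why `GeoJsonGeometryCollection` is special-cased in the converter: its members are whole geometries written under `geometries`, not raw coordinates, so they go back through `convert(…)` rather than through the flattening.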
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum GeoJsonPointToDbObjectConverter implements Converter<GeoJsonPoint, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(GeoJsonPoint source) {
return GeoJsonToDbObjectConverter.INSTANCE.convert(source);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum GeoJsonPolygonToDbObjectConverter implements Converter<GeoJsonPolygon, DBObject> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public DBObject convert(GeoJsonPolygon source) {
return GeoJsonToDbObjectConverter.INSTANCE.convert(source);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonPointConverter implements Converter<DBObject, GeoJsonPoint> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
@SuppressWarnings("unchecked")
public GeoJsonPoint convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "Point"),
String.format("Cannot convert type '%s' to Point.", source.get("type")));
List<Number> dbl = (List<Number>) source.get("coordinates");
return new GeoJsonPoint(dbl.get(0).doubleValue(), dbl.get(1).doubleValue());
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonPolygonConverter implements Converter<DBObject, GeoJsonPolygon> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonPolygon convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "Polygon"),
String.format("Cannot convert type '%s' to Polygon.", source.get("type")));
return toGeoJsonPolygon((BasicDBList) source.get("coordinates"));
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonMultiPolygonConverter implements Converter<DBObject, GeoJsonMultiPolygon> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonMultiPolygon convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "MultiPolygon"),
String.format("Cannot convert type '%s' to MultiPolygon.", source.get("type")));
BasicDBList dbl = (BasicDBList) source.get("coordinates");
List<GeoJsonPolygon> polygones = new ArrayList<GeoJsonPolygon>();
for (Object polygon : dbl) {
polygones.add(toGeoJsonPolygon((BasicDBList) polygon));
}
return new GeoJsonMultiPolygon(polygones);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonLineStringConverter implements Converter<DBObject, GeoJsonLineString> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonLineString convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "LineString"),
String.format("Cannot convert type '%s' to LineString.", source.get("type")));
BasicDBList cords = (BasicDBList) source.get("coordinates");
return new GeoJsonLineString(toListOfPoint(cords));
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonMultiPointConverter implements Converter<DBObject, GeoJsonMultiPoint> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonMultiPoint convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "MultiPoint"),
String.format("Cannot convert type '%s' to MultiPoint.", source.get("type")));
BasicDBList cords = (BasicDBList) source.get("coordinates");
return new GeoJsonMultiPoint(toListOfPoint(cords));
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonMultiLineStringConverter implements Converter<DBObject, GeoJsonMultiLineString> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public GeoJsonMultiLineString convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "MultiLineString"),
String.format("Cannot convert type '%s' to MultiLineString.", source.get("type")));
List<GeoJsonLineString> lines = new ArrayList<GeoJsonLineString>();
BasicDBList cords = (BasicDBList) source.get("coordinates");
for (Object line : cords) {
lines.add(new GeoJsonLineString(toListOfPoint((BasicDBList) line)));
}
return new GeoJsonMultiLineString(lines);
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
static enum DbObjectToGeoJsonGeometryCollectionConverter implements Converter<DBObject, GeoJsonGeometryCollection> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@SuppressWarnings("rawtypes")
@Override
public GeoJsonGeometryCollection convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "GeometryCollection"),
String.format("Cannot convert type '%s' to GeometryCollection.", source.get("type")));
List<GeoJson<?>> geometries = new ArrayList<GeoJson<?>>();
for (Object o : (List) source.get("geometries")) {
geometries.add(convertGeometries((DBObject) o));
}
return new GeoJsonGeometryCollection(geometries);
}
private static GeoJson<?> convertGeometries(DBObject source) {
Object type = source.get("type");
if (ObjectUtils.nullSafeEquals(type, "Point")) {
return DbObjectToGeoJsonPointConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiPoint")) {
return DbObjectToGeoJsonMultiPointConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "LineString")) {
return DbObjectToGeoJsonLineStringConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiLineString")) {
return DbObjectToGeoJsonMultiLineStringConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "Polygon")) {
return DbObjectToGeoJsonPolygonConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiPolygon")) {
return DbObjectToGeoJsonMultiPolygonConverter.INSTANCE.convert(source);
}
throw new IllegalArgumentException(String.format("Cannot convert unknown GeoJson type %s", type));
}
}
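The `GeometryCollection` reader above routes each member document purely on its GeoJSON `type` string and rejects anything it does not know. A minimal sketch of that dispatch, again with plain maps standing in for `DBObject`; the returned converter names mirror the enums above but the `route` helper itself is illustrative only:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

public class GeoJsonDispatch {

	// The geometry types the reading converters above can handle.
	static final Set<String> KNOWN_TYPES = new HashSet<String>(Arrays.asList(
			"Point", "MultiPoint", "LineString", "MultiLineString", "Polygon", "MultiPolygon"));

	// Mirrors convertGeometries(...): pick a reader by the declared
	// "type", fail loudly on an unknown one.
	static String route(Map<String, Object> geometry) {
		Object type = geometry.get("type");
		if (!KNOWN_TYPES.contains(type)) {
			throw new IllegalArgumentException(
					String.format("Cannot convert unknown GeoJson type %s", type));
		}
		return "DbObjectToGeoJson" + type + "Converter";
	}

	public static void main(String[] args) {
		Map<String, Object> point = new LinkedHashMap<String, Object>();
		point.put("type", "Point");
		point.put("coordinates", Arrays.asList(10d, 20d));
		System.out.println(route(point));
	}
}
```

Note that `GeometryCollection` itself is deliberately absent from the member dispatch: nested collections are not routed here, so a collection containing another collection would be rejected as an unknown type.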
static List<Double> toList(Point point) { static List<Double> toList(Point point) {
return Arrays.asList(point.getX(), point.getY()); return Arrays.asList(point.getX(), point.getY());
} }
/**
* Converts coordinate pairs nested in a {@link BasicDBList} into {@link GeoJsonPoint}s.
*
* @param listOfCoordinatePairs
* @return
* @since 1.7
*/
@SuppressWarnings("unchecked")
static List<Point> toListOfPoint(BasicDBList listOfCoordinatePairs) {
List<Point> points = new ArrayList<Point>();
for (Object point : listOfCoordinatePairs) {
Assert.isInstanceOf(List.class, point);
List<Number> coordinatesList = (List<Number>) point;
points.add(new GeoJsonPoint(coordinatesList.get(0).doubleValue(), coordinatesList.get(1).doubleValue()));
}
return points;
}
/**
* Converts coordinate pairs nested in a {@link BasicDBList} into a {@link GeoJsonPolygon}.
*
* @param dbList
* @return
* @since 1.7
*/
static GeoJsonPolygon toGeoJsonPolygon(BasicDBList dbList) {
return new GeoJsonPolygon(toListOfPoint((BasicDBList) dbList.get(0)));
}
} }


@@ -23,6 +23,7 @@ import com.mongodb.DBRef;
* Allows direct interaction with the underlying {@link LazyLoadingInterceptor}. * Allows direct interaction with the underlying {@link LazyLoadingInterceptor}.
* *
* @author Thomas Darimont * @author Thomas Darimont
* @author Christoph Strobl
* @since 1.5 * @since 1.5
*/ */
public interface LazyLoadingProxy { public interface LazyLoadingProxy {
@@ -33,7 +34,7 @@ public interface LazyLoadingProxy {
* @return * @return
* @since 1.5 * @since 1.5
*/ */
Object initialize(); Object getTarget();
/** /**
* Returns the {@link DBRef} represented by this {@link LazyLoadingProxy}, may be null. * Returns the {@link DBRef} represented by this {@link LazyLoadingProxy}, may be null.


@@ -1,5 +1,5 @@
/* /*
* Copyright 2011-2014 by the original author(s). * Copyright 2011-2016 by the original author(s).
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
@@ -20,27 +20,30 @@ import java.util.Arrays;
import java.util.Collection; import java.util.Collection;
import java.util.Collections; import java.util.Collections;
import java.util.HashSet; import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List; import java.util.List;
import java.util.Map; import java.util.Map;
import java.util.Map.Entry; import java.util.Map.Entry;
import java.util.Set;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException; import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext; import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware; import org.springframework.context.ApplicationContextAware;
import org.springframework.core.CollectionFactory;
import org.springframework.core.convert.ConversionException; import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService; import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory; import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.convert.CollectionFactory;
import org.springframework.data.convert.EntityInstantiator; import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.TypeMapper; import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association; import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler; import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.PreferredConstructor.Parameter; import org.springframework.data.mapping.PreferredConstructor.Parameter;
import org.springframework.data.mapping.PropertyHandler; import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.context.MappingContext; import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper; import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mapping.model.DefaultSpELExpressionEvaluator; import org.springframework.data.mapping.model.DefaultSpELExpressionEvaluator;
import org.springframework.data.mapping.model.MappingException; import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.ParameterValueProvider; import org.springframework.data.mapping.model.ParameterValueProvider;
@@ -52,10 +55,14 @@ import org.springframework.data.mapping.model.SpELExpressionParameterValueProvid
import org.springframework.data.mongodb.MongoDbFactory; import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity; import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty; import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.util.ClassTypeInformation; import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation; import org.springframework.data.util.TypeInformation;
import org.springframework.expression.spel.standard.SpelExpressionParser; import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils; import org.springframework.util.CollectionUtils;
import com.mongodb.BasicDBList; import com.mongodb.BasicDBList;
@@ -72,8 +79,11 @@ import com.mongodb.DBRef;
* @author Patrik Wasik * @author Patrik Wasik
* @author Thomas Darimont * @author Thomas Darimont
* @author Christoph Strobl * @author Christoph Strobl
* @author Jordi Llach
*/ */
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware { public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware, ValueResolver {
private static final String INCOMPATIBLE_TYPES = "Cannot convert %1$s of type %2$s into an instance of %3$s! Implement a custom Converter<%2$s, %3$s> and register it with the CustomConversions. Parent object was: %4$s";
protected static final Logger LOGGER = LoggerFactory.getLogger(MappingMongoConverter.class); protected static final Logger LOGGER = LoggerFactory.getLogger(MappingMongoConverter.class);
@@ -81,6 +91,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected final SpelExpressionParser spelExpressionParser = new SpelExpressionParser(); protected final SpelExpressionParser spelExpressionParser = new SpelExpressionParser();
protected final QueryMapper idMapper; protected final QueryMapper idMapper;
protected final DbRefResolver dbRefResolver; protected final DbRefResolver dbRefResolver;
protected ApplicationContext applicationContext; protected ApplicationContext applicationContext;
protected MongoTypeMapper typeMapper; protected MongoTypeMapper typeMapper;
protected String mapKeyDotReplacement = null; protected String mapKeyDotReplacement = null;
@@ -93,11 +104,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param mongoDbFactory must not be {@literal null}. * @param mongoDbFactory must not be {@literal null}.
* @param mappingContext must not be {@literal null}. * @param mappingContext must not be {@literal null}.
*/ */
@SuppressWarnings("deprecation")
public MappingMongoConverter(DbRefResolver dbRefResolver, public MappingMongoConverter(DbRefResolver dbRefResolver,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) { MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
super(ConversionServiceFactory.createDefaultConversionService()); super(new DefaultConversionService());
Assert.notNull(dbRefResolver, "DbRefResolver must not be null!"); Assert.notNull(dbRefResolver, "DbRefResolver must not be null!");
Assert.notNull(mappingContext, "MappingContext must not be null!"); Assert.notNull(mappingContext, "MappingContext must not be null!");
@@ -132,8 +142,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param typeMapper the typeMapper to set * @param typeMapper the typeMapper to set
*/ */
public void setTypeMapper(MongoTypeMapper typeMapper) { public void setTypeMapper(MongoTypeMapper typeMapper) {
this.typeMapper = typeMapper == null ? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, this.typeMapper = typeMapper == null
mappingContext) : typeMapper; ? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext) : typeMapper;
} }
/* /*
@@ -184,11 +194,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
} }
protected <S extends Object> S read(TypeInformation<S> type, DBObject dbo) { protected <S extends Object> S read(TypeInformation<S> type, DBObject dbo) {
return read(type, dbo, null); return read(type, dbo, ObjectPath.ROOT);
} }
@SuppressWarnings("unchecked") @SuppressWarnings("unchecked")
protected <S extends Object> S read(TypeInformation<S> type, DBObject dbo, Object parent) { private <S extends Object> S read(TypeInformation<S> type, DBObject dbo, ObjectPath path) {
if (null == dbo) { if (null == dbo) {
return null; return null;
@@ -206,11 +216,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		}

 		if (typeToUse.isCollectionLike() && dbo instanceof BasicDBList) {
-			return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo, parent);
+			return (S) readCollectionOrArray(typeToUse, (BasicDBList) dbo, path);
 		}

 		if (typeToUse.isMap()) {
-			return (S) readMap(typeToUse, dbo, parent);
+			return (S) readMap(typeToUse, dbo, path);
+		}
+
+		if (dbo instanceof BasicDBList) {
+			throw new MappingException(String.format(INCOMPATIBLE_TYPES, dbo, BasicDBList.class, typeToUse.getType(), path));
 		}

 		// Retrieve persistent entity info
@@ -220,41 +234,59 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 			throw new MappingException("No mapping metadata found for " + rawType.getName());
 		}

-		return read(persistentEntity, dbo, parent);
+		return read(persistentEntity, dbo, path);
 	}

 	private ParameterValueProvider<MongoPersistentProperty> getParameterProvider(MongoPersistentEntity<?> entity,
-			DBObject source, DefaultSpELExpressionEvaluator evaluator, Object parent) {
+			DBObject source, DefaultSpELExpressionEvaluator evaluator, ObjectPath path) {

-		MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, parent);
+		MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(source, evaluator, path);
 		PersistentEntityParameterValueProvider<MongoPersistentProperty> parameterProvider = new PersistentEntityParameterValueProvider<MongoPersistentProperty>(
-				entity, provider, parent);
+				entity, provider, path.getCurrentObject());

 		return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider,
-				parent);
+				path);
 	}

-	private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final Object parent) {
+	private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final ObjectPath path) {

 		final DefaultSpELExpressionEvaluator evaluator = new DefaultSpELExpressionEvaluator(dbo, spELContext);

-		ParameterValueProvider<MongoPersistentProperty> provider = getParameterProvider(entity, dbo, evaluator, parent);
+		ParameterValueProvider<MongoPersistentProperty> provider = getParameterProvider(entity, dbo, evaluator, path);
 		EntityInstantiator instantiator = instantiators.getInstantiatorFor(entity);
 		S instance = instantiator.createInstance(entity, provider);

-		final BeanWrapper<S> wrapper = BeanWrapper.create(instance, conversionService);
-		final S result = wrapper.getBean();
+		final PersistentPropertyAccessor accessor = new ConvertingPropertyAccessor(entity.getPropertyAccessor(instance),
+				conversionService);
+
+		final MongoPersistentProperty idProperty = entity.getIdProperty();
+		final S result = instance;
+
+		// make sure id property is set before all other properties
+		Object idValue = null;
+
+		if (idProperty != null) {
+			idValue = getValueInternal(idProperty, dbo, evaluator, path);
+			accessor.setProperty(idProperty, idValue);
+		}
+
+		final ObjectPath currentPath = path.push(result, entity,
+				idValue != null ? dbo.get(idProperty.getFieldName()) : null);

 		// Set properties not already set in the constructor
 		entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
 			public void doWithPersistentProperty(MongoPersistentProperty prop) {

+				// we skip the id property since it was already set
+				if (idProperty != null && idProperty.equals(prop)) {
+					return;
+				}
+
 				if (!dbo.containsField(prop.getFieldName()) || entity.isConstructorArgument(prop)) {
 					return;
 				}

-				Object obj = getValueInternal(prop, dbo, evaluator, result);
-				wrapper.setProperty(prop, obj);
+				accessor.setProperty(prop, getValueInternal(prop, dbo, evaluator, currentPath));
 			}
 		});
@@ -262,19 +294,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
 			public void doWithAssociation(Association<MongoPersistentProperty> association) {

-				MongoPersistentProperty property = association.getInverse();
+				final MongoPersistentProperty property = association.getInverse();
+
+				Object value = dbo.get(property.getFieldName());
+
+				if (value == null || entity.isConstructorArgument(property)) {
+					return;
+				}

-				Object value = dbo.get(property.getName());
 				DBRef dbref = value instanceof DBRef ? (DBRef) value : null;

-				Object obj = dbRefResolver.resolveDbRef(property, dbref, new DbRefResolverCallback() {
-					@Override
-					public Object resolve(MongoPersistentProperty property) {
-						return getValueInternal(property, dbo, evaluator, parent);
-					}
-				});
+				DbRefProxyHandler handler = new DefaultDbRefProxyHandler(spELContext, mappingContext,
+						MappingMongoConverter.this);
+				DbRefResolverCallback callback = new DefaultDbRefResolverCallback(dbo, currentPath, evaluator,
+						MappingMongoConverter.this);

-				wrapper.setProperty(property, obj);
+				accessor.setProperty(property, dbRefResolver.resolveDbRef(property, dbref, callback, handler));
 			}
 		});
@@ -314,14 +348,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 			return;
 		}

-		boolean handledByCustomConverter = conversions.getCustomWriteTarget(obj.getClass(), DBObject.class) != null;
-		TypeInformation<? extends Object> type = ClassTypeInformation.from(obj.getClass());
+		Class<?> entityType = obj.getClass();
+		boolean handledByCustomConverter = conversions.getCustomWriteTarget(entityType, DBObject.class) != null;
+		TypeInformation<? extends Object> type = ClassTypeInformation.from(entityType);

 		if (!handledByCustomConverter && !(dbo instanceof BasicDBList)) {
 			typeMapper.writeType(type, dbo);
 		}

-		writeInternal(obj, dbo, type);
+		Object target = obj instanceof LazyLoadingProxy ? ((LazyLoadingProxy) obj).getTarget() : obj;
+
+		writeInternal(target, dbo, type);
 	}

 	/**
@@ -337,7 +374,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 			return;
 		}

-		Class<?> customTarget = conversions.getCustomWriteTarget(obj.getClass(), DBObject.class);
+		Class<?> entityType = obj.getClass();
+		Class<?> customTarget = conversions.getCustomWriteTarget(entityType, DBObject.class);

 		if (customTarget != null) {
 			DBObject result = conversionService.convert(obj, DBObject.class);
@@ -345,17 +383,17 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 			return;
 		}

-		if (Map.class.isAssignableFrom(obj.getClass())) {
+		if (Map.class.isAssignableFrom(entityType)) {
 			writeMapInternal((Map<Object, Object>) obj, dbo, ClassTypeInformation.MAP);
 			return;
 		}

-		if (Collection.class.isAssignableFrom(obj.getClass())) {
+		if (Collection.class.isAssignableFrom(entityType)) {
 			writeCollectionInternal((Collection<?>) obj, ClassTypeInformation.LIST, (BasicDBList) dbo);
 			return;
 		}

-		MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(obj.getClass());
+		MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityType);
 		writeInternal(obj, dbo, entity);
 		addCustomTypeKeyIfNecessary(typeHint, obj, dbo);
 	}
@@ -370,13 +408,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 			throw new MappingException("No mapping metadata found for entity of type " + obj.getClass().getName());
 		}

-		final BeanWrapper<Object> wrapper = BeanWrapper.create(obj, conversionService);
+		final PersistentPropertyAccessor accessor = entity.getPropertyAccessor(obj);
 		final MongoPersistentProperty idProperty = entity.getIdProperty();

 		if (!dbo.containsField("_id") && null != idProperty) {
 			try {
-				Object id = wrapper.getProperty(idProperty, Object.class);
+				Object id = accessor.getProperty(idProperty);
 				dbo.put("_id", idMapper.convertId(id));
 			} catch (ConversionException ignored) {}
 		}
@@ -389,7 +427,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 				return;
 			}

-			Object propertyObj = wrapper.getProperty(prop);
+			Object propertyObj = accessor.getProperty(prop);

 			if (null != propertyObj) {
@@ -403,10 +441,12 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		});

 		entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
 			public void doWithAssociation(Association<MongoPersistentProperty> association) {
+
 				MongoPersistentProperty inverseProp = association.getInverse();
-				Class<?> type = inverseProp.getType();
-				Object propertyObj = wrapper.getProperty(inverseProp, type);
+				Object propertyObj = accessor.getProperty(inverseProp);
+
 				if (null != propertyObj) {
 					writePropertyInternal(propertyObj, dbo, inverseProp);
 				}
@@ -462,7 +502,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		 * If we have a LazyLoadingProxy we make sure it is initialized first.
 		 */
 		if (obj instanceof LazyLoadingProxy) {
-			obj = ((LazyLoadingProxy) obj).initialize();
+			obj = ((LazyLoadingProxy) obj).getTarget();
 		}

 		// Lookup potential custom target type
@@ -476,10 +516,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 			Object existingValue = accessor.get(prop);
 			BasicDBObject propDbObj = existingValue instanceof BasicDBObject ? (BasicDBObject) existingValue
 					: new BasicDBObject();
-			addCustomTypeKeyIfNecessary(type, obj, propDbObj);
+			addCustomTypeKeyIfNecessary(ClassTypeInformation.from(prop.getRawType()), obj, propDbObj);

-			MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass()) ? mappingContext
-					.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);
+			MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass())
+					? mappingContext.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);

 			writeInternal(obj, propDbObj, entity);
 			accessor.put(prop, propDbObj);
@@ -559,7 +599,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 			if (conversions.isSimpleType(key.getClass())) {
-				String simpleKey = potentiallyEscapeMapKey(key.toString());
+				String simpleKey = prepareMapKey(key.toString());
 				dbObject.put(simpleKey, value != null ? createDBRef(value, property) : null);
 			} else {
@@ -611,12 +651,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 	protected DBObject writeMapInternal(Map<Object, Object> obj, DBObject dbo, TypeInformation<?> propertyType) {

 		for (Map.Entry<Object, Object> entry : obj.entrySet()) {
 			Object key = entry.getKey();
 			Object val = entry.getValue();

 			if (conversions.isSimpleType(key.getClass())) {
-				// Don't use conversion service here as removal of ObjectToString converter results in some primitive types not
-				// being convertable
-				String simpleKey = potentiallyEscapeMapKey(key.toString());
+				String simpleKey = prepareMapKey(key);
 				if (val == null || conversions.isSimpleType(val.getClass())) {
 					writeSimpleInternal(val, dbo, simpleKey);
 				} else if (val instanceof Collection || val.getClass().isArray()) {
@@ -637,6 +678,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		return dbo;
 	}
+	/**
+	 * Prepares the given {@link Map} key to be converted into a {@link String}. Will invoke potentially registered custom
+	 * conversions and escape dots from the result as they're not supported as {@link Map} key in MongoDB.
+	 *
+	 * @param key must not be {@literal null}.
+	 * @return
+	 */
+	private String prepareMapKey(Object key) {
+
+		Assert.notNull(key, "Map key must not be null!");
+
+		String convertedKey = potentiallyConvertMapKey(key);
+		return potentiallyEscapeMapKey(convertedKey);
+	}
 	/**
 	 * Potentially replaces dots in the given map key with the configured map key replacement if configured or aborts
 	 * conversion if none is configured.
@@ -652,13 +708,31 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		}

 		if (mapKeyDotReplacement == null) {
-			throw new MappingException(String.format("Map key %s contains dots but no replacement was configured! Make "
-					+ "sure map keys don't contain dots in the first place or configure an appropriate replacement!", source));
+			throw new MappingException(String.format(
+					"Map key %s contains dots but no replacement was configured! Make "
+							+ "sure map keys don't contain dots in the first place or configure an appropriate replacement!",
+					source));
 		}

 		return source.replaceAll("\\.", mapKeyDotReplacement);
 	}
+	/**
+	 * Returns a {@link String} representation of the given {@link Map} key
+	 *
+	 * @param key
+	 * @return
+	 */
+	private String potentiallyConvertMapKey(Object key) {
+
+		if (key instanceof String) {
+			return (String) key;
+		}
+
+		return conversions.hasCustomWriteTarget(key.getClass(), String.class)
+				? (String) getPotentiallyConvertedSimpleWrite(key) : key.toString();
+	}
/** /**
* Translates the map key replacements in the given key just read with a dot in case a map key replacement has been * Translates the map key replacements in the given key just read with a dot in case a map key replacement has been
* configured. * configured.
@@ -673,7 +747,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/** /**
* Adds custom type information to the given {@link DBObject} if necessary. That is if the value is not the same as * Adds custom type information to the given {@link DBObject} if necessary. That is if the value is not the same as
* the one given. This is usually the case if you store a subtype of the actual declared type of the property. * the one given. This is usually the case if you store a subtype of the actual declared type of the property.
* *
* @param type * @param type
* @param value must not be {@literal null}. * @param value must not be {@literal null}.
* @param dbObject must not be {@literal null}. * @param dbObject must not be {@literal null}.
@@ -682,10 +756,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
TypeInformation<?> actualType = type != null ? type.getActualType() : null; TypeInformation<?> actualType = type != null ? type.getActualType() : null;
Class<?> reference = actualType == null ? Object.class : actualType.getType(); Class<?> reference = actualType == null ? Object.class : actualType.getType();
Class<?> valueType = ClassUtils.getUserClass(value.getClass());
boolean notTheSameClass = !value.getClass().equals(reference); boolean notTheSameClass = !valueType.equals(reference);
if (notTheSameClass) { if (notTheSameClass) {
typeMapper.writeType(value.getClass(), dbObject); typeMapper.writeType(valueType, dbObject);
} }
} }
@@ -738,7 +813,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 	@SuppressWarnings({ "rawtypes", "unchecked" })
 	private Object getPotentiallyConvertedSimpleRead(Object value, Class<?> target) {

-		if (value == null || target == null) {
+		if (value == null || target == null || target.isAssignableFrom(value.getClass())) {
 			return value;
 		}
@@ -750,7 +825,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 			return Enum.valueOf((Class<Enum>) target, value.toString());
 		}

-		return target.isAssignableFrom(value.getClass()) ? value : conversionService.convert(value, target);
+		return conversionService.convert(value, target);
 	}

 	protected DBRef createDBRef(Object target, MongoPersistentProperty property) {
@@ -779,8 +854,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		if (target.getClass().equals(idProperty.getType())) {
 			id = target;
 		} else {
-			BeanWrapper<Object> wrapper = BeanWrapper.create(target, conversionService);
-			id = wrapper.getProperty(idProperty, Object.class);
+			PersistentPropertyAccessor accessor = targetEntity.getPropertyAccessor(target);
+			id = accessor.getProperty(idProperty);
 		}

 		if (null == id) {
@@ -791,11 +866,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 				idMapper.convertId(id));
 	}

-	protected Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator eval,
-			Object parent) {
-
-		MongoDbPropertyValueProvider provider = new MongoDbPropertyValueProvider(dbo, spELContext, parent);
-		return provider.getPropertyValue(prop);
+	/*
+	 * (non-Javadoc)
+	 * @see org.springframework.data.mongodb.core.convert.ValueResolver#getValueInternal(org.springframework.data.mongodb.core.mapping.MongoPersistentProperty, com.mongodb.DBObject, org.springframework.data.mapping.model.SpELExpressionEvaluator, java.lang.Object)
+	 */
+	@Override
+	public Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator evaluator,
+			ObjectPath path) {
+		return new MongoDbPropertyValueProvider(dbo, evaluator, path).getPropertyValue(prop);
 	}
 	/**
@@ -803,11 +881,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 	 *
 	 * @param targetType must not be {@literal null}.
 	 * @param sourceValue must not be {@literal null}.
+	 * @param path must not be {@literal null}.
 	 * @return the converted {@link Collection} or array, will never be {@literal null}.
 	 */
-	private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, Object parent) {
+	@SuppressWarnings({ "rawtypes", "unchecked" })
+	private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, ObjectPath path) {

-		Assert.notNull(targetType);
+		Assert.notNull(targetType, "Target type must not be null!");
+		Assert.notNull(path, "Object path must not be null!");

 		Class<?> collectionType = targetType.getType();
@@ -819,18 +900,21 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		Class<?> rawComponentType = componentType == null ? null : componentType.getType();

 		collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
-		Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
-				.createCollection(collectionType, rawComponentType, sourceValue.size());
+		Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>()
+				: CollectionFactory.createCollection(collectionType, rawComponentType, sourceValue.size());

-		for (int i = 0; i < sourceValue.size(); i++) {
+		if (!DBRef.class.equals(rawComponentType) && isCollectionOfDbRefWhereBulkFetchIsPossible(sourceValue)) {
+			return bulkReadAndConvertDBRefs((List<DBRef>) (List) (sourceValue), componentType, path, rawComponentType);
+		}

-			Object dbObjItem = sourceValue.get(i);
+		for (Object dbObjItem : sourceValue) {

 			if (dbObjItem instanceof DBRef) {
-				items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem),
-						parent));
+				items.add(DBRef.class.equals(rawComponentType) ? dbObjItem
+						: readAndConvertDBRef((DBRef) dbObjItem, componentType, path, rawComponentType));
 			} else if (dbObjItem instanceof DBObject) {
-				items.add(read(componentType, (DBObject) dbObjItem, parent));
+				items.add(read(componentType, (DBObject) dbObjItem, path));
 			} else {
 				items.add(getPotentiallyConvertedSimpleRead(dbObjItem, rawComponentType));
 			}
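The bulk-fetch branch introduced above (DATAMONGO-1194) only fires when every element is a `DBRef` pointing at the same collection, so all referenced documents can be loaded with a single `$in` query instead of one roundtrip per reference. A driver-free sketch of that precondition — `BulkFetchCheck` and `Ref` are hypothetical stand-ins, not the actual `isCollectionOfDbRefWhereBulkFetchIsPossible(…)` implementation:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the bulk-fetch precondition: a list qualifies only if every
// element is a reference and all references target the same collection.
public class BulkFetchCheck {

	// minimal stand-in for com.mongodb.DBRef
	static class Ref {
		final String collection;
		final Object id;

		Ref(String collection, Object id) {
			this.collection = collection;
			this.id = id;
		}
	}

	static boolean bulkFetchPossible(List<?> source) {

		String collection = null;

		for (Object element : source) {

			if (!(element instanceof Ref)) {
				return false; // mixed content: fall back to element-wise resolution
			}

			Ref ref = (Ref) element;
			if (collection != null && !collection.equals(ref.collection)) {
				return false; // spans collections: one $in query cannot cover it
			}
			collection = ref.collection;
		}

		return collection != null;
	}

	// the ids that would go into a single { _id: { $in: [...] } } query
	static List<Object> idsForInQuery(List<Ref> refs) {
		List<Object> ids = new ArrayList<Object>();
		for (Ref ref : refs) {
			ids.add(ref.id);
		}
		return ids;
	}
}
```

When the check fails, resolution falls back to the per-element loop above, preserving the old one-query-per-reference behavior.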
@@ -843,13 +927,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 	 * Reads the given {@link DBObject} into a {@link Map}. will recursively resolve nested {@link Map}s as well.
 	 *
 	 * @param type the {@link Map} {@link TypeInformation} to be used to unmarshall this {@link DBObject}.
-	 * @param dbObject
+	 * @param dbObject must not be {@literal null}
+	 * @param path must not be {@literal null}
 	 * @return
 	 */
 	@SuppressWarnings("unchecked")
-	protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, Object parent) {
+	protected Map<Object, Object> readMap(TypeInformation<?> type, DBObject dbObject, ObjectPath path) {

-		Assert.notNull(dbObject);
+		Assert.notNull(dbObject, "DBObject must not be null!");
+		Assert.notNull(path, "Object path must not be null!");

 		Class<?> mapType = typeMapper.readType(dbObject, type).getType();
@@ -862,6 +948,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		Map<Object, Object> map = CollectionFactory.createMap(mapType, rawKeyType, dbObject.keySet().size());
 		Map<String, Object> sourceMap = dbObject.toMap();

+		if (!DBRef.class.equals(rawValueType) && isCollectionOfDbRefWhereBulkFetchIsPossible(sourceMap.values())) {
+			bulkReadAndConvertDBRefMapIntoTarget(valueType, rawValueType, sourceMap, map);
+			return map;
+		}
+
 		for (Entry<String, Object> entry : sourceMap.entrySet()) {
 			if (typeMapper.isTypeKey(entry.getKey())) {
 				continue;
@@ -876,9 +967,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 			Object value = entry.getValue();

 			if (value instanceof DBObject) {
-				map.put(key, read(valueType, (DBObject) value, parent));
+				map.put(key, read(valueType, (DBObject) value, path));
 			} else if (value instanceof DBRef) {
-				map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, readRef((DBRef) value)));
+				map.put(key, DBRef.class.equals(rawValueType) ? value
+						: readAndConvertDBRef((DBRef) value, valueType, ObjectPath.ROOT, rawValueType));
 			} else {
 				Class<?> valueClass = valueType == null ? null : valueType.getType();
 				map.put(key, getPotentiallyConvertedSimpleRead(value, valueClass));
@@ -888,21 +980,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		return map;
 	}

-	protected <T> List<?> unwrapList(BasicDBList dbList, TypeInformation<T> targetType) {
-		List<Object> rootList = new ArrayList<Object>();
-		for (int i = 0; i < dbList.size(); i++) {
-			Object obj = dbList.get(i);
-			if (obj instanceof BasicDBList) {
-				rootList.add(unwrapList((BasicDBList) obj, targetType.getComponentType()));
-			} else if (obj instanceof DBObject) {
-				rootList.add(read(targetType, (DBObject) obj));
-			} else {
-				rootList.add(obj);
-			}
-		}
-		return rootList;
-	}
-
 	/*
 	 * (non-Javadoc)
 	 * @see org.springframework.data.mongodb.core.convert.MongoWriter#convertToMongoType(java.lang.Object, org.springframework.data.util.TypeInformation)
@@ -924,27 +1001,38 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 			return getPotentiallyConvertedSimpleWrite(obj);
 		}

-		TypeInformation<?> typeHint = typeInformation == null ? ClassTypeInformation.OBJECT : typeInformation;
+		TypeInformation<?> typeHint = typeInformation;

 		if (obj instanceof BasicDBList) {
 			return maybeConvertList((BasicDBList) obj, typeHint);
 		}

 		if (obj instanceof DBObject) {
 			DBObject newValueDbo = new BasicDBObject();
 			for (String vk : ((DBObject) obj).keySet()) {
 				Object o = ((DBObject) obj).get(vk);
 				newValueDbo.put(vk, convertToMongoType(o, typeHint));
 			}
 			return newValueDbo;
 		}

 		if (obj instanceof Map) {
-			DBObject result = new BasicDBObject();
-			for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
-				result.put(entry.getKey().toString(), convertToMongoType(entry.getValue(), typeHint));
+
+			Map<Object, Object> converted = new LinkedHashMap<Object, Object>();
+
+			for (Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
+
+				TypeInformation<? extends Object> valueTypeHint = typeHint != null && typeHint.getMapValueType() != null
+						? typeHint.getMapValueType() : typeHint;
+				converted.put(convertToMongoType(entry.getKey()), convertToMongoType(entry.getValue(), valueTypeHint));
 			}
-			return result;
+
+			return new BasicDBObject(converted);
 		}

 		if (obj.getClass().isArray()) {
@@ -959,10 +1047,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		this.write(obj, newDbo);

 		if (typeInformation == null) {
-			return removeTypeInfoRecursively(newDbo);
+			return removeTypeInfo(newDbo, true);
 		}

-		return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfoRecursively(newDbo);
+		if (typeInformation.getType().equals(NestedDocument.class)) {
+			return removeTypeInfo(newDbo, false);
+		}
+
+		return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfo(newDbo, true);
 	}

 	public BasicDBList maybeConvertList(Iterable<?> source, TypeInformation<?> typeInformation) {
@@ -976,12 +1068,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 	}

 	/**
-	 * Removes the type information from the conversion result.
+	 * Removes the type information from the entire conversion result.
 	 *
 	 * @param object
+	 * @param recursively whether to apply the removal recursively
 	 * @return
 	 */
-	private Object removeTypeInfoRecursively(Object object) {
+	private Object removeTypeInfo(Object object, boolean recursively) {

 		if (!(object instanceof DBObject)) {
 			return object;
@@ -989,19 +1082,29 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		DBObject dbObject = (DBObject) object;
 		String keyToRemove = null;

 		for (String key : dbObject.keySet()) {

-			if (typeMapper.isTypeKey(key)) {
-				keyToRemove = key;
-			}
+			if (recursively) {

-			Object value = dbObject.get(key);
-			if (value instanceof BasicDBList) {
-				for (Object element : (BasicDBList) value) {
-					removeTypeInfoRecursively(element);
+				Object value = dbObject.get(key);
+
+				if (value instanceof BasicDBList) {
+					for (Object element : (BasicDBList) value) {
+						removeTypeInfo(element, recursively);
+					}
+				} else {
+					removeTypeInfo(value, recursively);
 				}
-			} else {
-				removeTypeInfoRecursively(value);
 			}
+
+			if (typeMapper.isTypeKey(key)) {
+
+				keyToRemove = key;
+
+				if (!recursively) {
+					break;
+				}
+			}
 		}
@@ -1012,24 +1115,34 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 		return dbObject;
 	}
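The reworked method above takes a `recursively` flag: type keys are always stripped from the top-level document, but nested documents are only visited when the flag is set, and the non-recursive mode can stop after the first hit. A sketch of the same control flow on plain `Map`s instead of `DBObject`s — `TypeInfoRemoval` is a hypothetical stand-in, not the converter's code:

```java
import java.util.Iterator;
import java.util.List;
import java.util.Map;

// Sketch of removeTypeInfo(Object, boolean): strip the type key ("_class" is
// the default) from the top level, and from nested documents only when
// `recursively` is set.
public class TypeInfoRemoval {

	static final String TYPE_KEY = "_class";

	@SuppressWarnings("unchecked")
	static Object removeTypeInfo(Object object, boolean recursively) {

		if (!(object instanceof Map)) {
			return object;
		}

		Map<Object, Object> map = (Map<Object, Object>) object;

		for (Iterator<Map.Entry<Object, Object>> it = map.entrySet().iterator(); it.hasNext();) {

			Map.Entry<Object, Object> entry = it.next();

			if (recursively) {
				Object value = entry.getValue();
				if (value instanceof List) {
					for (Object element : (List<?>) value) {
						removeTypeInfo(element, recursively);
					}
				} else {
					removeTypeInfo(value, recursively);
				}
			}

			if (TYPE_KEY.equals(entry.getKey())) {
				it.remove();
				if (!recursively) {
					break; // non-recursive mode stops after the top-level key
				}
			}
		}

		return object;
	}
}
```

The non-recursive mode matches the new `NestedDocument` case above, where only the outermost type key should be dropped.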
+	/**
+	 * {@link PropertyValueProvider} to evaluate a SpEL expression if present on the property or simply accesses the field
+	 * of the configured source {@link DBObject}.
+	 *
+	 * @author Oliver Gierke
+	 */
 	private class MongoDbPropertyValueProvider implements PropertyValueProvider<MongoPersistentProperty> {

 		private final DBObjectAccessor source;
 		private final SpELExpressionEvaluator evaluator;
-		private final Object parent;
+		private final ObjectPath path;

-		public MongoDbPropertyValueProvider(DBObject source, SpELContext factory, Object parent) {
-			this(source, new DefaultSpELExpressionEvaluator(source, factory), parent);
-		}
-
-		public MongoDbPropertyValueProvider(DBObject source, DefaultSpELExpressionEvaluator evaluator, Object parent) {
+		/**
+		 * Creates a new {@link MongoDbPropertyValueProvider} for the given source, {@link SpELExpressionEvaluator} and
+		 * {@link ObjectPath}.
+		 *
+		 * @param source must not be {@literal null}.
+		 * @param evaluator must not be {@literal null}.
+		 * @param path can be {@literal null}.
+		 */
+		public MongoDbPropertyValueProvider(DBObject source, SpELExpressionEvaluator evaluator, ObjectPath path) {

 			Assert.notNull(source);
 			Assert.notNull(evaluator);

 			this.source = new DBObjectAccessor(source);
 			this.evaluator = evaluator;
-			this.parent = parent;
+			this.path = path;
 		}
/*
@@ -1045,7 +1158,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
			return null;
		}

		return readValue(value, property.getTypeInformation(), path);
	}
}
@@ -1055,10 +1168,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 *
 * @author Oliver Gierke
 */
private class ConverterAwareSpELExpressionParameterValueProvider
		extends SpELExpressionParameterValueProvider<MongoPersistentProperty> {

	private final ObjectPath path;

	/**
	 * Creates a new {@link ConverterAwareSpELExpressionParameterValueProvider}.
@@ -1068,10 +1181,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
	 * @param delegate must not be {@literal null}.
	 */
	public ConverterAwareSpELExpressionParameterValueProvider(SpELExpressionEvaluator evaluator,
			ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate,
			ObjectPath path) {

		super(evaluator, conversionService, delegate);
		this.path = path;
	}

	/*
@@ -1080,28 +1194,102 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
	 */
	@Override
	protected <T> T potentiallyConvertSpelValue(Object object, Parameter<T, MongoPersistentProperty> parameter) {
		return readValue(object, parameter.getType(), path);
	}
}
@SuppressWarnings("unchecked")
private <T> T readValue(Object value, TypeInformation<?> type, ObjectPath path) {

	Class<?> rawType = type.getType();

	if (conversions.hasCustomReadTarget(value.getClass(), rawType)) {
		return (T) conversionService.convert(value, rawType);
	} else if (value instanceof DBRef) {
		return potentiallyReadOrResolveDbRef((DBRef) value, type, path, rawType);
	} else if (value instanceof BasicDBList) {
		return (T) readCollectionOrArray(type, (BasicDBList) value, path);
	} else if (value instanceof DBObject) {
		return (T) read(type, (DBObject) value, path);
	} else {
		return (T) getPotentiallyConvertedSimpleRead(value, rawType);
	}
}
@SuppressWarnings("unchecked")
private <T> T potentiallyReadOrResolveDbRef(DBRef dbref, TypeInformation<?> type, ObjectPath path, Class<?> rawType) {
if (rawType.equals(DBRef.class)) {
return (T) dbref;
}
Object object = dbref == null ? null : path.getPathItem(dbref.getId(), dbref.getCollectionName());
return (T) (object != null ? object : readAndConvertDBRef(dbref, type, path, rawType));
}
private <T> T readAndConvertDBRef(DBRef dbref, TypeInformation<?> type, ObjectPath path, final Class<?> rawType) {
List<T> result = bulkReadAndConvertDBRefs(Collections.singletonList(dbref), type, path, rawType);
return CollectionUtils.isEmpty(result) ? null : result.iterator().next();
}
@SuppressWarnings({ "unchecked", "rawtypes" })
private void bulkReadAndConvertDBRefMapIntoTarget(TypeInformation<?> valueType, Class<?> rawValueType,
Map<String, Object> sourceMap, Map<Object, Object> targetMap) {
LinkedHashMap<String, Object> referenceMap = new LinkedHashMap<String, Object>(sourceMap);
List<Object> convertedObjects = bulkReadAndConvertDBRefs((List<DBRef>) new ArrayList(referenceMap.values()),
valueType, ObjectPath.ROOT, rawValueType);
int index = 0;
for (String key : referenceMap.keySet()) {
targetMap.put(key, convertedObjects.get(index));
index++;
}
}
@SuppressWarnings("unchecked")
private <T> List<T> bulkReadAndConvertDBRefs(List<DBRef> dbrefs, TypeInformation<?> type, ObjectPath path,
final Class<?> rawType) {
if (CollectionUtils.isEmpty(dbrefs)) {
return Collections.emptyList();
}
List<DBObject> referencedRawDocuments = dbrefs.size() == 1
? Collections.singletonList(readRef(dbrefs.iterator().next())) : bulkReadRefs(dbrefs);
String collectionName = dbrefs.iterator().next().getCollectionName();
List<T> targetList = new ArrayList<T>(dbrefs.size());

for (DBObject document : referencedRawDocuments) {

	if (document != null) {
		maybeEmitEvent(new AfterLoadEvent<T>(document, (Class<T>) rawType, collectionName));
	}

	final T target = (T) read(type, document, path);
	targetList.add(target);

	if (target != null) {
		maybeEmitEvent(new AfterConvertEvent<T>(document, target, collectionName));
	}
}

return targetList;
}
private void maybeEmitEvent(MongoMappingEvent<?> event) {
if (canPublishEvent()) {
this.applicationContext.publishEvent(event);
}
}
private boolean canPublishEvent() {
return this.applicationContext != null;
}
/**
 * Performs the fetch operation for the given {@link DBRef}.
 *
@@ -1109,6 +1297,56 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
 * @return
 */
DBObject readRef(DBRef ref) {
	return dbRefResolver.fetch(ref);
}
/**
* Performs a bulk fetch operation for the given {@link DBRef}s.
*
* @param references must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
List<DBObject> bulkReadRefs(List<DBRef> references) {
return dbRefResolver.bulkFetch(references);
}
/**
* Returns whether the given {@link Iterable} contains {@link DBRef} instances all pointing to the same collection.
*
* @param source must not be {@literal null}.
* @return
*/
private static boolean isCollectionOfDbRefWhereBulkFetchIsPossible(Iterable<Object> source) {
Assert.notNull(source, "Iterable of DBRefs must not be null!");
Set<String> collectionsFound = new HashSet<String>();
for (Object dbObjItem : source) {
if (!(dbObjItem instanceof DBRef)) {
return false;
}
collectionsFound.add(((DBRef) dbObjItem).getCollectionName());
if (collectionsFound.size() > 1) {
return false;
}
}
return true;
}
/**
 * Marker class used to indicate a non-root document object that might be used within an update. For such documents we
 * need to preserve type hints for potentially nested elements but remove the hint at the top level.
*
* @author Christoph Strobl
* @since 1.8
*/
static class NestedDocument {
	}
}
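The bulk-fetch optimization above only kicks in when every element of a collection is a `DBRef` pointing to the same collection. That guard can be sketched standalone; `SimpleRef` below is a hypothetical stand-in for `com.mongodb.DBRef`, not the driver class:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class BulkFetchGuard {

	// Hypothetical stand-in for com.mongodb.DBRef: just an id and a collection name.
	static final class SimpleRef {
		final Object id;
		final String collectionName;

		SimpleRef(Object id, String collectionName) {
			this.id = id;
			this.collectionName = collectionName;
		}
	}

	// Mirrors isCollectionOfDbRefWhereBulkFetchIsPossible(...): every element must be a
	// reference, and all references must target one collection, otherwise bail out.
	static boolean bulkFetchPossible(Iterable<Object> source) {
		Set<String> collectionsFound = new HashSet<String>();
		for (Object item : source) {
			if (!(item instanceof SimpleRef)) {
				return false;
			}
			collectionsFound.add(((SimpleRef) item).collectionName);
			if (collectionsFound.size() > 1) {
				return false;
			}
		}
		return true;
	}

	public static void main(String[] args) {
		List<Object> sameCollection = Arrays.<Object>asList(
				new SimpleRef(1, "persons"), new SimpleRef(2, "persons"));
		List<Object> mixed = Arrays.<Object>asList(
				new SimpleRef(1, "persons"), new SimpleRef(2, "coworkers"));

		System.out.println(bulkFetchPossible(sameCollection)); // true
		System.out.println(bulkFetchPossible(mixed));          // false
	}
}
```

Bailing out as soon as a second collection name shows up keeps the check O(n) with a tiny set, which matters since it runs for every candidate collection during conversion.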

@@ -1,5 +1,5 @@
/*
 * Copyright 2011-2016 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -19,14 +19,30 @@ import java.math.BigDecimal;
import java.math.BigInteger;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Currency;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;

import org.bson.types.Code;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.ConditionalConverter;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.ConverterFactory;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import org.springframework.util.StringUtils;

import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DBObject;
/**
@@ -34,6 +50,7 @@ import com.mongodb.DBObject;
 *
 * @author Oliver Gierke
 * @author Thomas Darimont
 * @author Christoph Strobl
 */
abstract class MongoConverters {
@@ -42,6 +59,36 @@ abstract class MongoConverters {
 */
private MongoConverters() {}
/**
* Returns the converters to be registered.
*
* @return
* @since 1.9
*/
public static Collection<Object> getConvertersToRegister() {
List<Object> converters = new ArrayList<Object>();
converters.add(BigDecimalToStringConverter.INSTANCE);
converters.add(StringToBigDecimalConverter.INSTANCE);
converters.add(BigIntegerToStringConverter.INSTANCE);
converters.add(StringToBigIntegerConverter.INSTANCE);
converters.add(URLToStringConverter.INSTANCE);
converters.add(StringToURLConverter.INSTANCE);
converters.add(DBObjectToStringConverter.INSTANCE);
converters.add(TermToStringConverter.INSTANCE);
converters.add(NamedMongoScriptToDBObjectConverter.INSTANCE);
converters.add(DBObjectToNamedMongoScriptCoverter.INSTANCE);
converters.add(CurrencyToStringConverter.INSTANCE);
converters.add(StringToCurrencyConverter.INSTANCE);
converters.add(AtomicIntegerToIntegerConverter.INSTANCE);
converters.add(AtomicLongToLongConverter.INSTANCE);
converters.add(LongToAtomicLongConverter.INSTANCE);
converters.add(IntegerToAtomicIntegerConverter.INSTANCE);
return converters;
}
/**
 * Simple singleton to convert {@link ObjectId}s to their {@link String} representation.
 *
@@ -160,4 +207,238 @@ abstract class MongoConverters {
		return source == null ? null : source.toString();
	}
}
/**
* @author Christoph Strobl
* @since 1.6
*/
@WritingConverter
public static enum TermToStringConverter implements Converter<Term, String> {
INSTANCE;
@Override
public String convert(Term source) {
return source == null ? null : source.getFormatted();
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
public static enum DBObjectToNamedMongoScriptCoverter implements Converter<DBObject, NamedMongoScript> {
INSTANCE;
@Override
public NamedMongoScript convert(DBObject source) {
if (source == null) {
return null;
}
String id = source.get("_id").toString();
Object rawValue = source.get("value");
return new NamedMongoScript(id, ((Code) rawValue).getCode());
}
}
/**
* @author Christoph Strobl
* @since 1.7
*/
public static enum NamedMongoScriptToDBObjectConverter implements Converter<NamedMongoScript, DBObject> {
INSTANCE;
@Override
public DBObject convert(NamedMongoScript source) {
if (source == null) {
return new BasicDBObject();
}
BasicDBObjectBuilder builder = new BasicDBObjectBuilder();
builder.append("_id", source.getName());
builder.append("value", new Code(source.getCode()));
return builder.get();
}
}
/**
* {@link Converter} implementation converting {@link Currency} into its ISO 4217 {@link String} representation.
*
* @author Christoph Strobl
* @since 1.9
*/
@WritingConverter
public static enum CurrencyToStringConverter implements Converter<Currency, String> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(Currency source) {
return source == null ? null : source.getCurrencyCode();
}
}
/**
* {@link Converter} implementation converting ISO 4217 {@link String} into {@link Currency}.
*
* @author Christoph Strobl
* @since 1.9
*/
@ReadingConverter
public static enum StringToCurrencyConverter implements Converter<String, Currency> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public Currency convert(String source) {
return StringUtils.hasText(source) ? Currency.getInstance(source) : null;
}
}
/**
* {@link ConverterFactory} implementation using {@link NumberUtils} for number conversion and parsing. Additionally
* deals with {@link AtomicInteger} and {@link AtomicLong} by calling {@code get()} before performing the actual
* conversion.
*
* @author Christoph Strobl
* @since 1.9
*/
@WritingConverter
public static enum NumberToNumberConverterFactory implements ConverterFactory<Number, Number>, ConditionalConverter {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.ConverterFactory#getConverter(java.lang.Class)
*/
@Override
public <T extends Number> Converter<Number, T> getConverter(Class<T> targetType) {
return new NumberToNumberConverter<T>(targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.ConditionalConverter#matches(org.springframework.core.convert.TypeDescriptor, org.springframework.core.convert.TypeDescriptor)
*/
@Override
public boolean matches(TypeDescriptor sourceType, TypeDescriptor targetType) {
return !sourceType.equals(targetType);
}
private final static class NumberToNumberConverter<T extends Number> implements Converter<Number, T> {
private final Class<T> targetType;
/**
* Creates a new {@link NumberToNumberConverter} for the given target type.
*
* @param targetType must not be {@literal null}.
*/
public NumberToNumberConverter(Class<T> targetType) {
Assert.notNull(targetType, "Target type must not be null!");
this.targetType = targetType;
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public T convert(Number source) {
if (source instanceof AtomicInteger) {
return NumberUtils.convertNumberToTargetClass(((AtomicInteger) source).get(), this.targetType);
}
if (source instanceof AtomicLong) {
return NumberUtils.convertNumberToTargetClass(((AtomicLong) source).get(), this.targetType);
}
return NumberUtils.convertNumberToTargetClass(source, this.targetType);
}
}
}
/**
 * {@link Converter} implementation converting {@link AtomicLong} into {@link Long}.
*
* @author Christoph Strobl
* @since 1.10
*/
@WritingConverter
public static enum AtomicLongToLongConverter implements Converter<AtomicLong, Long> {
INSTANCE;
@Override
public Long convert(AtomicLong source) {
return NumberUtils.convertNumberToTargetClass(source, Long.class);
}
}
/**
 * {@link Converter} implementation converting {@link AtomicInteger} into {@link Integer}.
*
* @author Christoph Strobl
* @since 1.10
*/
@WritingConverter
public static enum AtomicIntegerToIntegerConverter implements Converter<AtomicInteger, Integer> {
INSTANCE;
@Override
public Integer convert(AtomicInteger source) {
return NumberUtils.convertNumberToTargetClass(source, Integer.class);
}
}
/**
 * {@link Converter} implementation converting {@link Long} into {@link AtomicLong}.
*
* @author Christoph Strobl
* @since 1.10
*/
@ReadingConverter
public static enum LongToAtomicLongConverter implements Converter<Long, AtomicLong> {
INSTANCE;
@Override
public AtomicLong convert(Long source) {
return source != null ? new AtomicLong(source) : null;
}
}
/**
 * {@link Converter} implementation converting {@link Integer} into {@link AtomicInteger}.
*
* @author Christoph Strobl
* @since 1.10
*/
@ReadingConverter
public static enum IntegerToAtomicIntegerConverter implements Converter<Integer, AtomicInteger> {
INSTANCE;
@Override
public AtomicInteger convert(Integer source) {
return source != null ? new AtomicInteger(source) : null;
}
}
}
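The new atomic-number converters simply unwrap `AtomicInteger`/`AtomicLong` when writing and re-wrap the stored value when reading. A dependency-free sketch of that round trip (plain JDK, no Spring types) shows the shape of the conversion:

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;

public class AtomicNumberRoundTrip {

	// Writing side: unwrap the atomic holder to its plain wrapper type.
	static Long atomicLongToLong(AtomicLong source) {
		return source == null ? null : source.get();
	}

	static Integer atomicIntegerToInteger(AtomicInteger source) {
		return source == null ? null : source.get();
	}

	// Reading side: re-wrap the stored number into a fresh atomic holder.
	static AtomicLong longToAtomicLong(Long source) {
		return source == null ? null : new AtomicLong(source);
	}

	static AtomicInteger integerToAtomicInteger(Integer source) {
		return source == null ? null : new AtomicInteger(source);
	}

	public static void main(String[] args) {
		Long stored = atomicLongToLong(new AtomicLong(42L));
		AtomicLong restored = longToAtomicLong(stored);
		System.out.println(stored + " -> " + restored.get());
	}
}
```

A fresh holder has to be created on the reading side because atomic types are mutable and must not be shared between materialized entities.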

@@ -0,0 +1,275 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;
import java.util.Stack;
import java.util.regex.Pattern;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.ExampleMatcher.NullHandler;
import org.springframework.data.domain.ExampleMatcher.PropertyValueTransformer;
import org.springframework.data.domain.ExampleMatcher.StringMatcher;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.MongoRegexCreator;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.data.repository.core.support.ExampleMatcherAccessor;
import org.springframework.data.repository.query.parser.Part.Type;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.8
*/
public class MongoExampleMapper {
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final MongoConverter converter;
private final Map<StringMatcher, Type> stringMatcherPartMapping = new HashMap<StringMatcher, Type>();
public MongoExampleMapper(MongoConverter converter) {
this.converter = converter;
this.mappingContext = converter.getMappingContext();
stringMatcherPartMapping.put(StringMatcher.EXACT, Type.SIMPLE_PROPERTY);
stringMatcherPartMapping.put(StringMatcher.CONTAINING, Type.CONTAINING);
stringMatcherPartMapping.put(StringMatcher.STARTING, Type.STARTING_WITH);
stringMatcherPartMapping.put(StringMatcher.ENDING, Type.ENDING_WITH);
stringMatcherPartMapping.put(StringMatcher.REGEX, Type.REGEX);
}
/**
* Returns the given {@link Example} as {@link DBObject} holding matching values extracted from
* {@link Example#getProbe()}.
*
* @param example must not be {@literal null}.
* @return
*/
public DBObject getMappedExample(Example<?> example) {
Assert.notNull(example, "Example must not be null!");
return getMappedExample(example, mappingContext.getPersistentEntity(example.getProbeType()));
}
/**
* Returns the given {@link Example} as {@link DBObject} holding matching values extracted from
* {@link Example#getProbe()}.
*
* @param example must not be {@literal null}.
* @param entity must not be {@literal null}.
* @return
*/
public DBObject getMappedExample(Example<?> example, MongoPersistentEntity<?> entity) {
Assert.notNull(example, "Example must not be null!");
Assert.notNull(entity, "MongoPersistentEntity must not be null!");
DBObject reference = (DBObject) converter.convertToMongoType(example.getProbe());
if (entity.hasIdProperty() && entity.getIdentifierAccessor(example.getProbe()).getIdentifier() == null) {
reference.removeField(entity.getIdProperty().getFieldName());
}
ExampleMatcherAccessor matcherAccessor = new ExampleMatcherAccessor(example.getMatcher());
applyPropertySpecs("", reference, example.getProbeType(), matcherAccessor);
DBObject flattened = ObjectUtils.nullSafeEquals(NullHandler.INCLUDE, matcherAccessor.getNullHandler()) ? reference
: new BasicDBObject(SerializationUtils.flattenMap(reference));
DBObject result = example.getMatcher().isAllMatching() ? flattened : orConcatenate(flattened);
this.converter.getTypeMapper().writeTypeRestrictions(result, getTypesToMatch(example));
return result;
}
private static DBObject orConcatenate(DBObject source) {
List<DBObject> criteria = new ArrayList<DBObject>(source.keySet().size());
for (String key : source.keySet()) {
criteria.add(new BasicDBObject(key, source.get(key)));
}
return new BasicDBObject("$or", criteria);
}
private Set<Class<?>> getTypesToMatch(Example<?> example) {
Set<Class<?>> types = new HashSet<Class<?>>();
for (TypeInformation<?> reference : mappingContext.getManagedTypes()) {
if (example.getProbeType().isAssignableFrom(reference.getType())) {
types.add(reference.getType());
}
}
return types;
}
private String getMappedPropertyPath(String path, Class<?> probeType) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(probeType);
Iterator<String> parts = Arrays.asList(path.split("\\.")).iterator();
final Stack<MongoPersistentProperty> stack = new Stack<MongoPersistentProperty>();
List<String> resultParts = new ArrayList<String>();
while (parts.hasNext()) {
final String part = parts.next();
MongoPersistentProperty prop = entity.getPersistentProperty(part);
if (prop == null) {
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty property) {
if (property.getFieldName().equals(part)) {
stack.push(property);
}
}
});
if (stack.isEmpty()) {
return "";
}
prop = stack.pop();
}
resultParts.add(prop.getName());
if (prop.isEntity() && mappingContext.hasPersistentEntityFor(prop.getActualType())) {
entity = mappingContext.getPersistentEntity(prop.getActualType());
} else {
break;
}
}
return StringUtils.collectionToDelimitedString(resultParts, ".");
}
private void applyPropertySpecs(String path, DBObject source, Class<?> probeType,
ExampleMatcherAccessor exampleSpecAccessor) {
if (!(source instanceof BasicDBObject)) {
return;
}
Iterator<Map.Entry<String, Object>> iter = ((BasicDBObject) source).entrySet().iterator();
while (iter.hasNext()) {
Map.Entry<String, Object> entry = iter.next();
String propertyPath = StringUtils.hasText(path) ? path + "." + entry.getKey() : entry.getKey();
String mappedPropertyPath = getMappedPropertyPath(propertyPath, probeType);
if (isEmptyIdProperty(entry)) {
iter.remove();
continue;
}
if (exampleSpecAccessor.isIgnoredPath(propertyPath) || exampleSpecAccessor.isIgnoredPath(mappedPropertyPath)) {
iter.remove();
continue;
}
StringMatcher stringMatcher = exampleSpecAccessor.getDefaultStringMatcher();
Object value = entry.getValue();
boolean ignoreCase = exampleSpecAccessor.isIgnoreCaseEnabled();
if (exampleSpecAccessor.hasPropertySpecifiers()) {
mappedPropertyPath = exampleSpecAccessor.hasPropertySpecifier(propertyPath) ? propertyPath
: getMappedPropertyPath(propertyPath, probeType);
stringMatcher = exampleSpecAccessor.getStringMatcherForPath(mappedPropertyPath);
ignoreCase = exampleSpecAccessor.isIgnoreCaseForPath(mappedPropertyPath);
}
// TODO: should a PropertySpecifier overrule the string matching applied later on?
if (exampleSpecAccessor.hasPropertySpecifier(mappedPropertyPath)) {
PropertyValueTransformer valueTransformer = exampleSpecAccessor.getValueTransformerForPath(mappedPropertyPath);
value = valueTransformer.convert(value);
if (value == null) {
iter.remove();
continue;
}
entry.setValue(value);
}
if (entry.getValue() instanceof String) {
applyStringMatcher(entry, stringMatcher, ignoreCase);
} else if (entry.getValue() instanceof BasicDBObject) {
applyPropertySpecs(propertyPath, (BasicDBObject) entry.getValue(), probeType, exampleSpecAccessor);
}
}
}
private boolean isEmptyIdProperty(Entry<String, Object> entry) {
return entry.getKey().equals("_id") && entry.getValue() == null;
}
private void applyStringMatcher(Map.Entry<String, Object> entry, StringMatcher stringMatcher, boolean ignoreCase) {
BasicDBObject dbo = new BasicDBObject();
if (ObjectUtils.nullSafeEquals(StringMatcher.DEFAULT, stringMatcher)) {
if (ignoreCase) {
dbo.put("$regex", Pattern.quote((String) entry.getValue()));
entry.setValue(dbo);
}
} else {
Type type = stringMatcherPartMapping.get(stringMatcher);
String expression = MongoRegexCreator.INSTANCE.toRegularExpression((String) entry.getValue(), type);
dbo.put("$regex", expression);
entry.setValue(dbo);
}
if (ignoreCase) {
dbo.put("$options", "i");
}
}
}
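`applyStringMatcher` above rewrites plain string values into `$regex` documents whose pattern shape depends on the configured matcher. The pattern shapes can be sketched standalone; `regexFor` is a hypothetical helper approximating what `MongoRegexCreator.toRegularExpression(…)` produces, not the actual class:

```java
import java.util.regex.Pattern;

public class StringMatcherRegexSketch {

	enum Matcher { EXACT, STARTING, ENDING, CONTAINING }

	// Hypothetical approximation of MongoRegexCreator.toRegularExpression(...):
	// quote the literal value, then anchor it according to the matcher type.
	static String regexFor(String value, Matcher matcher) {
		String quoted = Pattern.quote(value); // emits \Q...\E so regex metacharacters stay literal
		switch (matcher) {
			case STARTING:   return "^" + quoted;
			case ENDING:     return quoted + "$";
			case CONTAINING: return ".*" + quoted + ".*";
			default:         return "^" + quoted + "$"; // EXACT
		}
	}

	public static void main(String[] args) {
		System.out.println(regexFor("fire", Matcher.STARTING));   // ^\Qfire\E
		System.out.println(regexFor("fire", Matcher.CONTAINING)); // .*\Qfire\E.*
	}
}
```

Case-insensitive matching is then a separate concern: the mapper adds `$options: "i"` to the same document rather than baking the flag into the pattern.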

@@ -0,0 +1,182 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
/**
 * A path of objects nested into each other. The type gives access to all parent objects currently being created, even
 * while resolving more deeply nested objects. This makes it possible to avoid re-resolving object instances that are
 * logically equivalent to already resolved ones.
* <p>
* An immutable ordered set of target objects for {@link DBObject} to {@link Object} conversions. Object paths can be
* constructed by the {@link #toObjectPath(Object)} method and extended via {@link #push(Object)}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.6
*/
class ObjectPath {
public static final ObjectPath ROOT = new ObjectPath();
private final List<ObjectPathItem> items;
private ObjectPath() {
this.items = Collections.emptyList();
}
/**
* Creates a new {@link ObjectPath} from the given parent {@link ObjectPath} by adding the provided
* {@link ObjectPathItem} to it.
*
 * @param parent must not be {@literal null}.
* @param item
*/
private ObjectPath(ObjectPath parent, ObjectPath.ObjectPathItem item) {
List<ObjectPath.ObjectPathItem> items = new ArrayList<ObjectPath.ObjectPathItem>(parent.items);
items.add(item);
this.items = Collections.unmodifiableList(items);
}
/**
* Returns a copy of the {@link ObjectPath} with the given {@link Object} as current object.
*
* @param object must not be {@literal null}.
* @param entity must not be {@literal null}.
* @param id must not be {@literal null}.
* @return
*/
public ObjectPath push(Object object, MongoPersistentEntity<?> entity, Object id) {
Assert.notNull(object, "Object must not be null!");
Assert.notNull(entity, "MongoPersistentEntity must not be null!");
ObjectPathItem item = new ObjectPathItem(object, id, entity.getCollection());
return new ObjectPath(this, item);
}
/**
 * Returns the object with the given id and stored in the given collection, if it is contained in this
 * {@link ObjectPath}.
*
* @param id must not be {@literal null}.
* @param collection must not be {@literal null} or empty.
* @return
*/
public Object getPathItem(Object id, String collection) {
Assert.notNull(id, "Id must not be null!");
Assert.hasText(collection, "Collection name must not be null or empty!");
for (ObjectPathItem item : items) {
Object object = item.getObject();
if (object == null) {
continue;
}
if (item.getIdValue() == null) {
continue;
}
if (collection.equals(item.getCollection()) && id.equals(item.getIdValue())) {
return object;
}
}
return null;
}
/**
* Returns the current object of the {@link ObjectPath} or {@literal null} if the path is empty.
*
* @return
*/
public Object getCurrentObject() {
return items.isEmpty() ? null : items.get(items.size() - 1).getObject();
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
if (items.isEmpty()) {
return "[empty]";
}
List<String> strings = new ArrayList<String>(items.size());
for (ObjectPathItem item : items) {
strings.add(item.object.toString());
}
return StringUtils.collectionToDelimitedString(strings, " -> ");
}
/**
* An item in an {@link ObjectPath}.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static class ObjectPathItem {
private final Object object;
private final Object idValue;
private final String collection;
/**
* Creates a new {@link ObjectPathItem}.
*
* @param object
* @param idValue
* @param collection
*/
ObjectPathItem(Object object, Object idValue, String collection) {
this.object = object;
this.idValue = idValue;
this.collection = collection;
}
public Object getObject() {
return object;
}
public Object getIdValue() {
return idValue;
}
public String getCollection() {
return collection;
}
}
}
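The `(id, collection)` lookup above is what lets the converter hand back an already-materialized parent instead of re-resolving a back-reference (and looping forever on cyclic `DBRef`s). A minimal standalone sketch of that mechanism — `MiniObjectPath` is a simplified stand-in, not the class above:

```java
import java.util.ArrayList;
import java.util.List;

public class MiniObjectPath {

	static final class Item {
		final Object object;
		final Object id;
		final String collection;

		Item(Object object, Object id, String collection) {
			this.object = object;
			this.id = id;
			this.collection = collection;
		}
	}

	private final List<Item> items = new ArrayList<Item>();

	// Register an object that is currently being materialized.
	void push(Object object, Object id, String collection) {
		items.add(new Item(object, id, collection));
	}

	// Return the already-resolved object for (id, collection), or null if unknown.
	Object getPathItem(Object id, String collection) {
		for (Item item : items) {
			if (item.object != null && id.equals(item.id) && collection.equals(item.collection)) {
				return item.object;
			}
		}
		return null;
	}

	public static void main(String[] args) {
		MiniObjectPath path = new MiniObjectPath();
		Object person = new Object();
		path.push(person, 1, "persons");
		System.out.println(path.getPathItem(1, "persons") == person); // true
		System.out.println(path.getPathItem(2, "persons"));           // null
	}
}
```

The real `ObjectPath` is immutable (`push` returns a new instance) so that sibling branches of the object graph each see only their own ancestors; the mutable list here is just for brevity.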

@@ -1,5 +1,5 @@
/*
 * Copyright 2011-2016 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -27,17 +27,22 @@ import org.bson.types.ObjectId;
 import org.springframework.core.convert.ConversionException;
 import org.springframework.core.convert.ConversionService;
 import org.springframework.core.convert.converter.Converter;
+import org.springframework.data.domain.Example;
 import org.springframework.data.mapping.Association;
 import org.springframework.data.mapping.PersistentEntity;
 import org.springframework.data.mapping.PropertyPath;
 import org.springframework.data.mapping.PropertyReferenceException;
+import org.springframework.data.mapping.context.InvalidPersistentPropertyPath;
 import org.springframework.data.mapping.context.MappingContext;
 import org.springframework.data.mapping.context.PersistentPropertyPath;
 import org.springframework.data.mapping.model.MappingException;
+import org.springframework.data.mongodb.core.convert.MappingMongoConverter.NestedDocument;
 import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
 import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
 import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
 import org.springframework.data.mongodb.core.query.Query;
+import org.springframework.data.util.ClassTypeInformation;
+import org.springframework.data.util.TypeInformation;
 import org.springframework.util.Assert;
 import com.mongodb.BasicDBList;
@@ -57,10 +62,17 @@ import com.mongodb.DBRef;
 public class QueryMapper {
 private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
+private static final DBObject META_TEXT_SCORE = new BasicDBObject("$meta", "textScore");
+static final ClassTypeInformation<?> NESTED_DOCUMENT = ClassTypeInformation.from(NestedDocument.class);
+private enum MetaMapping {
+FORCE, WHEN_PRESENT, IGNORE;
+}
 private final ConversionService conversionService;
 private final MongoConverter converter;
 private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
+private final MongoExampleMapper exampleMapper;
 /**
 * Creates a new {@link QueryMapper} with the given {@link MongoConverter}.
@@ -74,6 +86,7 @@ public class QueryMapper {
 this.conversionService = converter.getConversionService();
 this.converter = converter;
 this.mappingContext = converter.getMappingContext();
+this.exampleMapper = new MongoExampleMapper(converter);
 }
 /**
@@ -110,15 +123,80 @@
 continue;
 }
-Field field = createPropertyField(entity, key, mappingContext);
-Entry<String, Object> entry = getMappedObjectForField(field, query.get(key));
-result.put(entry.getKey(), entry.getValue());
+try {
+Field field = createPropertyField(entity, key, mappingContext);
+Entry<String, Object> entry = getMappedObjectForField(field, query.get(key));
+result.put(entry.getKey(), entry.getValue());
+} catch (InvalidPersistentPropertyPath invalidPathException) {
+// in case the object has not already been mapped
+if (!(query.get(key) instanceof DBObject)) {
+throw invalidPathException;
+}
+result.put(key, query.get(key));
+}
 }
 return result;
 }
+/**
+* Maps fields used for sorting to the {@link MongoPersistentEntity}s properties. <br />
+* Also converts properties to their {@code $meta} representation if present.
+*
+* @param sortObject
+* @param entity
+* @return
+* @since 1.6
+*/
+public DBObject getMappedSort(DBObject sortObject, MongoPersistentEntity<?> entity) {
+if (sortObject == null) {
+return null;
+}
+DBObject mappedSort = getMappedObject(sortObject, entity);
+mapMetaAttributes(mappedSort, entity, MetaMapping.WHEN_PRESENT);
+return mappedSort;
+}
+/**
+* Maps fields to retrieve to the {@link MongoPersistentEntity}s properties. <br />
+* Also converts and potentially adds missing property {@code $meta} representation.
+*
+* @param fieldsObject
+* @param entity
+* @return
+* @since 1.6
+*/
+public DBObject getMappedFields(DBObject fieldsObject, MongoPersistentEntity<?> entity) {
+DBObject mappedFields = fieldsObject != null ? getMappedObject(fieldsObject, entity) : new BasicDBObject();
+mapMetaAttributes(mappedFields, entity, MetaMapping.FORCE);
+return mappedFields.keySet().isEmpty() ? null : mappedFields;
+}
+private void mapMetaAttributes(DBObject source, MongoPersistentEntity<?> entity, MetaMapping metaMapping) {
+if (entity == null || source == null) {
+return;
+}
+if (entity.hasTextScoreProperty() && !MetaMapping.IGNORE.equals(metaMapping)) {
+MongoPersistentProperty textScoreProperty = entity.getTextScoreProperty();
+if (MetaMapping.FORCE.equals(metaMapping)
+|| (MetaMapping.WHEN_PRESENT.equals(metaMapping) && source.containsField(textScoreProperty.getFieldName()))) {
+source.putAll(getMappedTextScoreField(textScoreProperty));
+}
+}
+}
+private DBObject getMappedTextScoreField(MongoPersistentProperty property) {
+return new BasicDBObject(property.getFieldName(), META_TEXT_SCORE);
+}
 /**
 * Extracts the mapped object value for given field out of rawValue taking nested {@link Keyword}s into account
 *
@@ -162,7 +240,7 @@
 protected DBObject getMappedKeyword(Keyword keyword, MongoPersistentEntity<?> entity) {
 // $or/$nor
-if (keyword.isOrOrNor() || keyword.hasIterableValue()) {
+if (keyword.isOrOrNor() || (keyword.hasIterableValue() && !keyword.isGeometry())) {
 Iterable<?> conditions = keyword.getValue();
 BasicDBList newConditions = new BasicDBList();
@@ -175,6 +253,10 @@
 return new BasicDBObject(keyword.getKey(), newConditions);
 }
+if (keyword.isSample()) {
+return exampleMapper.getMappedExample(keyword.<Example<?>> getValue(), entity);
+}
 return new BasicDBObject(keyword.getKey(), convertSimpleOrDBObject(keyword.getValue(), entity));
 }
@@ -190,8 +272,8 @@
 boolean needsAssociationConversion = property.isAssociation() && !keyword.isExists();
 Object value = keyword.getValue();
-Object convertedValue = needsAssociationConversion ? convertAssociation(value, property) : getMappedValue(
-property.with(keyword.getKey()), value);
+Object convertedValue = needsAssociationConversion ? convertAssociation(value, property)
+: getMappedValue(property.with(keyword.getKey()), value);
 return new BasicDBObject(keyword.key, convertedValue);
 }
@@ -274,7 +356,8 @@
 }
 MongoPersistentEntity<?> entity = documentField.getPropertyEntity();
-return entity.hasIdProperty() && entity.getIdProperty().getActualType().isAssignableFrom(type);
+return entity.hasIdProperty()
+&& (type.equals(DBRef.class) || entity.getIdProperty().getActualType().isAssignableFrom(type));
 }
 /**
@@ -322,10 +405,16 @@
 */
 protected Object convertAssociation(Object source, MongoPersistentProperty property) {
-if (property == null || source == null || source instanceof DBRef || source instanceof DBObject) {
+if (property == null || source == null || source instanceof DBObject) {
 return source;
 }
+if (source instanceof DBRef) {
+DBRef ref = (DBRef) source;
+return new DBRef(ref.getCollectionName(), convertId(ref.getId()));
+}
 if (source instanceof Iterable) {
 BasicDBList result = new BasicDBList();
 for (Object element : (Iterable<?>) source) {
@@ -397,13 +486,20 @@
 */
 public Object convertId(Object id) {
-try {
-return conversionService.convert(id, ObjectId.class);
-} catch (ConversionException e) {
-// Ignore
-}
-return delegateConvertToMongoType(id, null);
+if (id == null) {
+return null;
+}
+if (id instanceof String) {
+return ObjectId.isValid(id.toString()) ? conversionService.convert(id, ObjectId.class) : id;
+}
+try {
+return conversionService.canConvert(id.getClass(), ObjectId.class) ? conversionService.convert(id, ObjectId.class)
+: delegateConvertToMongoType(id, null);
+} catch (ConversionException o_O) {
+return delegateConvertToMongoType(id, null);
+}
 }
 /**
@@ -478,6 +574,26 @@
 return key.matches(N_OR_PATTERN);
 }
+/**
+* Returns whether the current keyword is the {@code $geometry} keyword.
+*
+* @return
+* @since 1.8
+*/
+public boolean isGeometry() {
+return "$geometry".equalsIgnoreCase(key);
+}
+/**
+* Returns whether the current keyword indicates a sample object.
+*
+* @return
+* @since 1.8
+*/
+public boolean isSample() {
+return "$sample".equalsIgnoreCase(key);
+}
 public boolean hasIterableValue() {
 return value instanceof Iterable;
 }
@@ -583,6 +699,10 @@
 public Association<MongoPersistentProperty> getAssociation() {
 return null;
 }
+public TypeInformation<?> getTypeHint() {
+return ClassTypeInformation.OBJECT;
+}
 }
 /**
@@ -725,7 +845,7 @@
 */
 @Override
 public String getMappedKey() {
-return path == null ? name : path.toDotPath(getPropertyConverter());
+return path == null ? name : path.toDotPath(isAssociation() ? getAssociationConverter() : getPropertyConverter());
 }
 protected PersistentPropertyPath<MongoPersistentProperty> getPath() {
@@ -742,7 +862,7 @@
 try {
-PropertyPath path = PropertyPath.from(pathExpression, entity.getTypeInformation());
+PropertyPath path = PropertyPath.from(pathExpression.replaceAll("\\.\\d", ""), entity.getTypeInformation());
 PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(path);
 Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
@@ -775,7 +895,156 @@
 * @return
 */
 protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
-return PropertyToFieldNameConverter.INSTANCE;
+return new PositionParameterRetainingPropertyKeyConverter(name);
 }
+/**
+* Return the {@link Converter} to use for creating the mapped key of an association. Default implementation is
+* {@link AssociationConverter}.
+*
+* @return
+* @since 1.7
+*/
+protected Converter<MongoPersistentProperty, String> getAssociationConverter() {
+return new AssociationConverter(getAssociation());
+}
+/**
+* @author Christoph Strobl
+* @since 1.8
+*/
+static class PositionParameterRetainingPropertyKeyConverter implements Converter<MongoPersistentProperty, String> {
+private final KeyMapper keyMapper;
+public PositionParameterRetainingPropertyKeyConverter(String rawKey) {
+this.keyMapper = new KeyMapper(rawKey);
+}
+/*
+* (non-Javadoc)
+* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
+*/
+@Override
+public String convert(MongoPersistentProperty source) {
+return keyMapper.mapPropertyName(source);
+}
+}
+/*
+* (non-Javadoc)
+* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getTypeHint()
+*/
+@Override
+public TypeInformation<?> getTypeHint() {
+MongoPersistentProperty property = getProperty();
+if (property == null) {
+return super.getTypeHint();
+}
+if (property.getActualType().isInterface()
+|| java.lang.reflect.Modifier.isAbstract(property.getActualType().getModifiers())) {
+return ClassTypeInformation.OBJECT;
+}
+return NESTED_DOCUMENT;
+}
+/**
+* @author Christoph Strobl
+* @since 1.8
+*/
+static class KeyMapper {
+private final Iterator<String> iterator;
+public KeyMapper(String key) {
+this.iterator = Arrays.asList(key.split("\\.")).iterator();
+this.iterator.next();
+}
+/**
+* Maps the property name while retaining potential positional operator {@literal $}.
+*
+* @param property
+* @return
+*/
+protected String mapPropertyName(MongoPersistentProperty property) {
+StringBuilder mappedName = new StringBuilder(PropertyToFieldNameConverter.INSTANCE.convert(property));
+boolean inspect = iterator.hasNext();
+while (inspect) {
+String partial = iterator.next();
+boolean isPositional = (isPositionalParameter(partial) && (property.isMap() || property.isCollectionLike()));
+if (isPositional) {
+mappedName.append(".").append(partial);
+}
+inspect = isPositional && iterator.hasNext();
+}
+return mappedName.toString();
+}
+private static boolean isPositionalParameter(String partial) {
+if ("$".equals(partial)) {
+return true;
+}
+try {
+Long.valueOf(partial);
+return true;
+} catch (NumberFormatException e) {
+return false;
+}
+}
+}
+}
+/**
+* Converter to skip all properties after an association property was rendered.
+*
+* @author Oliver Gierke
+*/
+protected static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
+private final MongoPersistentProperty property;
+private boolean associationFound;
+/**
+* Creates a new {@link AssociationConverter} for the given {@link Association}.
+*
+* @param association must not be {@literal null}.
+*/
+public AssociationConverter(Association<MongoPersistentProperty> association) {
+Assert.notNull(association, "Association must not be null!");
+this.property = association.getInverse();
+}
+/*
+* (non-Javadoc)
+* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
+*/
+@Override
+public String convert(MongoPersistentProperty source) {
+if (associationFound) {
+return null;
+}
+if (property.equals(source)) {
+associationFound = true;
+}
+return source.getFieldName();
+}
 }
 }
 }
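The reworked convertId(...) above only converts String ids that are actually valid ObjectIds, instead of converting blindly and swallowing the failure. The following is a hypothetical, driver-free sketch of that guard: looksLikeObjectId(...) stands in for org.bson.types.ObjectId.isValid(...), and the "ObjectId(...)" marker string merely shows where the real code would invoke conversionService.convert(id, ObjectId.class).

```java
// Sketch of the new convertId(...) guard: only ids that can really be
// ObjectIds are converted; everything else passes through unchanged.
public class IdConversionSketch {

    // Stand-in for ObjectId.isValid(...): exactly 24 hexadecimal characters.
    static boolean looksLikeObjectId(String s) {
        return s != null && s.matches("[0-9a-fA-F]{24}");
    }

    static Object convertId(Object id) {
        if (id == null) {
            return null;
        }
        if (id instanceof String) {
            String s = (String) id;
            // The real code calls conversionService.convert(id, ObjectId.class) here.
            return looksLikeObjectId(s) ? "ObjectId(" + s + ")" : s;
        }
        return id; // non-String ids fall through to the converter/driver
    }

    public static void main(String[] args) {
        System.out.println(convertId("4711"));                     // stays a plain String
        System.out.println(convertId("000000000000000000000001")); // converted
    }
}
```

Ids such as "4711" previously triggered a ConversionException that was silently ignored; with the guard they are simply kept as strings.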


@@ -0,0 +1,66 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* {@link ReflectiveDBRefResolver} provides reflective access to {@link DBRef} API that is not consistently available
* for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveDBRefResolver {
private static final Method FETCH_METHOD;
static {
FETCH_METHOD = findMethod(DBRef.class, "fetch");
}
/**
* Fetches the object referenced from the database either be directly calling {@link DBRef#fetch()} or
* {@link DBCollection#findOne(Object)}.
*
* @param factory can be {@literal null} when using MongoDB Java driver in version 2.x.
* @param ref must not be {@literal null}.
* @return the document that this references.
*/
public static DBObject fetch(MongoDbFactory factory, DBRef ref) {
Assert.notNull(ref, "DBRef to fetch must not be null!");
if (isMongo3Driver()) {
Assert.notNull(factory, "DbFactory to fetch DB from must not be null!");
return factory.getDb().getCollection(ref.getCollectionName()).findOne(ref.getId());
}
return (DBObject) invokeMethod(FETCH_METHOD, ref);
}
}


@@ -1,5 +1,5 @@
 /*
- * Copyright 2013-2014 the original author or authors.
+ * Copyright 2013-2015 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
@@ -15,8 +15,6 @@
 */
 package org.springframework.data.mongodb.core.convert;
-import java.util.Arrays;
-import java.util.Iterator;
 import java.util.Map.Entry;
 import org.springframework.core.convert.converter.Converter;
@@ -24,12 +22,11 @@ import org.springframework.data.mapping.Association;
 import org.springframework.data.mapping.context.MappingContext;
 import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
 import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
-import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
 import org.springframework.data.mongodb.core.query.Query;
 import org.springframework.data.mongodb.core.query.Update.Modifier;
 import org.springframework.data.mongodb.core.query.Update.Modifiers;
 import org.springframework.data.util.ClassTypeInformation;
-import org.springframework.util.Assert;
+import org.springframework.data.util.TypeInformation;
 import com.mongodb.BasicDBObject;
 import com.mongodb.DBObject;
@@ -65,8 +62,8 @@ public class UpdateMapper extends QueryMapper {
 */
 @Override
 protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
-return entity == null ? super.delegateConvertToMongoType(source, null) : converter.convertToMongoType(source,
-entity.getTypeInformation());
+return converter.convertToMongoType(source,
+entity == null ? ClassTypeInformation.OBJECT : getTypeHintForEntity(source, entity));
 }
 /*
@@ -89,7 +86,7 @@ public class UpdateMapper extends QueryMapper {
return getMappedUpdateModifier(field, rawValue); return getMappedUpdateModifier(field, rawValue);
} }
return super.getMappedObjectForField(field, getMappedValue(field, rawValue)); return super.getMappedObjectForField(field, rawValue);
} }
private Entry<String, Object> getMappedUpdateModifier(Field field, Object rawValue) { private Entry<String, Object> getMappedUpdateModifier(Field field, Object rawValue) {
@@ -97,14 +94,14 @@
 if (rawValue instanceof Modifier) {
-value = getMappedValue((Modifier) rawValue);
+value = getMappedValue(field, (Modifier) rawValue);
 } else if (rawValue instanceof Modifiers) {
 DBObject modificationOperations = new BasicDBObject();
 for (Modifier modifier : ((Modifiers) rawValue).getModifiers()) {
-modificationOperations.putAll(getMappedValue(modifier).toMap());
+modificationOperations.putAll(getMappedValue(field, modifier).toMap());
 }
 value = modificationOperations;
@@ -132,12 +129,30 @@
 return value instanceof Query;
 }
-private DBObject getMappedValue(Modifier modifier) {
-Object value = converter.convertToMongoType(modifier.getValue(), ClassTypeInformation.OBJECT);
+private DBObject getMappedValue(Field field, Modifier modifier) {
+TypeInformation<?> typeHint = field == null ? ClassTypeInformation.OBJECT : field.getTypeHint();
+Object value = converter.convertToMongoType(modifier.getValue(), typeHint);
 return new BasicDBObject(modifier.getKey(), value);
 }
+private TypeInformation<?> getTypeHintForEntity(Object source, MongoPersistentEntity<?> entity) {
+TypeInformation<?> info = entity.getTypeInformation();
+Class<?> type = info.getActualType().getType();
+if (source == null || type.isInterface() || java.lang.reflect.Modifier.isAbstract(type.getModifiers())) {
+return info;
+}
+if (!type.equals(source.getClass())) {
+return info;
+}
+return NESTED_DOCUMENT;
+}
 /*
 * (non-Javadoc)
 * @see org.springframework.data.mongodb.core.convert.QueryMapper#createPropertyField(org.springframework.data.mongodb.core.mapping.MongoPersistentEntity, java.lang.String, org.springframework.data.mapping.context.MappingContext)
@@ -146,8 +161,8 @@ public class UpdateMapper extends QueryMapper {
 protected Field createPropertyField(MongoPersistentEntity<?> entity, String key,
 MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
-return entity == null ? super.createPropertyField(entity, key, mappingContext) : //
-new MetadataBackedUpdateField(entity, key, mappingContext);
+return entity == null ? super.createPropertyField(entity, key, mappingContext)
+: new MetadataBackedUpdateField(entity, key, mappingContext);
 }
 /**
@@ -194,28 +209,36 @@
 */
 @Override
 protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
-return isAssociation() ? new AssociationConverter(getAssociation()) : new UpdatePropertyConverter(key);
+return new PositionParameterRetainingPropertyKeyConverter(key);
 }
+/*
+* (non-Javadoc)
+* @see org.springframework.data.mongodb.core.convert.QueryMapper.MetadataBackedField#getAssociationConverter()
+*/
+@Override
+protected Converter<MongoPersistentProperty, String> getAssociationConverter() {
+return new UpdateAssociationConverter(getAssociation(), key);
+}
 /**
-* Converter to skip all properties after an association property was rendered.
+* {@link Converter} retaining positional parameter {@literal $} for {@link Association}s.
 *
-* @author Oliver Gierke
+* @author Christoph Strobl
 */
-private static class AssociationConverter implements Converter<MongoPersistentProperty, String> {
-private final MongoPersistentProperty property;
-private boolean associationFound;
+protected static class UpdateAssociationConverter extends AssociationConverter {
+private final KeyMapper mapper;
 /**
 * Creates a new {@link AssociationConverter} for the given {@link Association}.
 *
 * @param association must not be {@literal null}.
 */
-public AssociationConverter(Association<MongoPersistentProperty> association) {
-Assert.notNull(association, "Association must not be null!");
-this.property = association.getInverse();
+public UpdateAssociationConverter(Association<MongoPersistentProperty> association, String key) {
+super(association);
+this.mapper = new KeyMapper(key);
 }
 /*
@@ -224,51 +247,7 @@
 */
 @Override
 public String convert(MongoPersistentProperty source) {
-if (associationFound) {
-return null;
-}
-if (property.equals(source)) {
-associationFound = true;
-}
-return source.getFieldName();
-}
-}
-/**
-* Special {@link Converter} for {@link MongoPersistentProperty} instances that will concatenate the {@literal $}
-* contained in the source update key.
-*
-* @author Oliver Gierke
-*/
-private static class UpdatePropertyConverter implements Converter<MongoPersistentProperty, String> {
-private final Iterator<String> iterator;
-/**
-* Creates a new {@link UpdatePropertyConverter} with the given update key.
-*
-* @param updateKey must not be {@literal null} or empty.
-*/
-public UpdatePropertyConverter(String updateKey) {
-Assert.hasText(updateKey, "Update key must not be null or empty!");
-this.iterator = Arrays.asList(updateKey.split("\\.")).iterator();
-this.iterator.next();
-}
-/*
-* (non-Javadoc)
-* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
-*/
-@Override
-public String convert(MongoPersistentProperty property) {
-String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
-return iterator.hasNext() && iterator.next().equals("$") ? String.format("%s.$", mappedName) : mappedName;
+return super.convert(source) == null ? null : mapper.mapPropertyName(source);
 }
 }
 }
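The positional-parameter handling that QueryMapper.KeyMapper and the new UpdateAssociationConverter share can be illustrated in isolation. The sketch below is a hypothetical, driver-free reduction of KeyMapper.mapPropertyName(...): it assumes the leading property segment has already been translated to its mapped field name, and it omits the property.isMap()/isCollectionLike() guard the real code applies before keeping a positional segment.

```java
import java.util.Arrays;
import java.util.Iterator;

// Reduced sketch of KeyMapper: after the property segment, path segments are
// kept only while they are positional markers ("$" or a numeric index).
public class KeyMapperSketch {

    static boolean isPositional(String partial) {
        if ("$".equals(partial)) {
            return true;
        }
        try {
            Long.valueOf(partial);
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    static String mapKey(String rawKey, String mappedFieldName) {
        Iterator<String> it = Arrays.asList(rawKey.split("\\.")).iterator();
        it.next(); // skip the property segment itself
        StringBuilder mapped = new StringBuilder(mappedFieldName);
        boolean inspect = it.hasNext();
        while (inspect) {
            String partial = it.next();
            boolean positional = isPositional(partial);
            if (positional) {
                mapped.append(".").append(partial);
            }
            inspect = positional && it.hasNext();
        }
        return mapped.toString();
    }

    public static void main(String[] args) {
        System.out.println(mapKey("values.$.name", "vals")); // vals.$
        System.out.println(mapKey("values.2", "vals"));      // vals.2
        System.out.println(mapKey("owner.name", "owner"));   // owner
    }
}
```

This is why an update key such as values.$.name keeps its $ marker after mapping, while ordinary nested segments are handled per property by the surrounding path machinery.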


@@ -0,0 +1,42 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Internal API to trigger the resolution of properties.
*
* @author Oliver Gierke
*/
interface ValueResolver {
/**
* Resolves the value for the given {@link MongoPersistentProperty} within the given {@link DBObject} using the given
* {@link SpELExpressionEvaluator} and {@link ObjectPath}.
*
* @param prop
* @param dbo
* @param evaluator
* @param parent
* @return
*/
Object getValueInternal(MongoPersistentProperty prop, DBObject dbo, SpELExpressionEvaluator evaluator,
ObjectPath parent);
}


@@ -1,75 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* Represents a geospatial box value.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Box}. This class is scheduled to be
* removed in the next major release.
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Box extends org.springframework.data.geo.Box implements Shape {
public static final String COMMAND = "$box";
public Box(Point lowerLeft, Point upperRight) {
super(lowerLeft, upperRight);
}
public Box(double[] lowerLeft, double[] upperRight) {
super(lowerLeft, upperRight);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<? extends Object> asList() {
List<List<Double>> list = new ArrayList<List<Double>>();
list.add(Arrays.asList(getFirst().getX(), getFirst().getY()));
list.add(Arrays.asList(getSecond().getX(), getSecond().getY()));
return list;
}
public org.springframework.data.mongodb.core.geo.Point getLowerLeft() {
return new org.springframework.data.mongodb.core.geo.Point(getFirst());
}
public org.springframework.data.mongodb.core.geo.Point getUpperRight() {
return new org.springframework.data.mongodb.core.geo.Point(getSecond());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
}


@@ -1,154 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
/**
* Represents a geospatial circle value.
* <p>
* Note: We deliberately do not extend org.springframework.data.geo.Circle because introducing it's distance concept
* would break the clients that use the old Circle API.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Circle}. This class is scheduled to be
* removed in the next major release.
*/
@Deprecated
public class Circle implements Shape {
public static final String COMMAND = "$center";
private final Point center;
private final double radius;
/**
* Creates a new {@link Circle} from the given {@link Point} and radius.
*
* @param center must not be {@literal null}.
* @param radius must be greater or equal to zero.
*/
@PersistenceConstructor
public Circle(Point center, double radius) {
Assert.notNull(center, "Center point must not be null!");
Assert.isTrue(radius >= 0, "Radius must not be negative!");
this.center = center;
this.radius = radius;
}
/**
* Creates a new {@link Circle} from the given coordinates and radius as {@link Distance} with a
* {@link Metrics#NEUTRAL}.
*
* @param centerX
* @param centerY
* @param radius must be greater or equal to zero.
*/
public Circle(double centerX, double centerY, double radius) {
this(new Point(centerX, centerY), radius);
}
/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public Point getCenter() {
return center;
}
/**
* Returns the radius of the {@link Circle}.
*
* @return the radius of this {@link Circle}.
*/
public double getRadius() {
return radius;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<Object> asList() {
List<Object> result = new ArrayList<Object>();
result.add(Arrays.asList(getCenter().getX(), getCenter().getY()));
result.add(getRadius());
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return COMMAND;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("Circle [center=%s, radius=%f]", center, radius);
}
/* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Circle that = (Circle) obj;
return this.center.equals(that.center) && this.radius == that.radius;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * center.hashCode();
result += 31 * radius;
return result;
}
}
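For reference, the shape serialization above reduces to the two-element list consumed by MongoDB's `$center` operator: `[[centerX, centerY], radius]`. A dependency-free sketch (class and method names here are illustrative, not part of the library):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class CenterCommandSketch {

	// Mirrors Circle.asList(): the $center command takes [[centerX, centerY], radius].
	static List<Object> asList(double centerX, double centerY, double radius) {
		List<Object> result = new ArrayList<Object>();
		result.add(Arrays.asList(centerX, centerY));
		result.add(radius);
		return result;
	}

	public static void main(String[] args) {
		System.out.println(asList(1.0, 2.0, 5.0)); // [[1.0, 2.0], 5.0]
	}
}
```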


@@ -1,44 +0,0 @@
/*
* Copyright 2010-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
/**
* Value object to represent distances in a given metric.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Distance}. This class is scheduled to
* be removed in the next major release.
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public class Distance extends org.springframework.data.geo.Distance {
/**
* Creates a new {@link Distance}.
*
* @param value
*/
public Distance(double value) {
this(value, Metrics.NEUTRAL);
}
public Distance(double value, Metric metric) {
super(value, metric);
}
}


@@ -0,0 +1,42 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
/**
* Interface definition for structures defined in GeoJSON (<a href="http://geojson.org/">http://geojson.org/</a>) format.
*
* @author Christoph Strobl
* @since 1.7
*/
public interface GeoJson<T extends Iterable<?>> {
/**
* String value representing the type of the {@link GeoJson} object.
*
* @return will never be {@literal null}.
* @see http://geojson.org/geojson-spec.html#geojson-objects
*/
String getType();
/**
* The value of the coordinates member is always an {@link Iterable}. The structure for the elements within is
* determined by {@link #getType()} of geometry.
*
* @return will never be {@literal null}.
* @see http://geojson.org/geojson-spec.html#geometry-objects
*/
T getCoordinates();
}
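A minimal sketch of an implementation of this contract, with the interface reproduced locally so the snippet stands alone (`PointSketch` is illustrative, not a library type):

```java
import java.util.Arrays;
import java.util.List;

// Local copy of the GeoJson contract, for illustration only.
interface GeoJson<T extends Iterable<?>> {
	String getType();
	T getCoordinates();
}

// Illustrative implementation: a GeoJSON Point carries its coordinates as [x, y].
class PointSketch implements GeoJson<List<Double>> {

	private final double x;
	private final double y;

	PointSketch(double x, double y) {
		this.x = x;
		this.y = y;
	}

	public String getType() {
		return "Point";
	}

	public List<Double> getCoordinates() {
		return Arrays.asList(x, y);
	}
}
```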


@@ -0,0 +1,96 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Defines a {@link GeoJsonGeometryCollection} that consists of a {@link List} of {@link GeoJson} objects.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#geometry-collection
*/
public class GeoJsonGeometryCollection implements GeoJson<Iterable<GeoJson<?>>> {
private static final String TYPE = "GeometryCollection";
private final List<GeoJson<?>> geometries = new ArrayList<GeoJson<?>>();
/**
* Creates a new {@link GeoJsonGeometryCollection} for the given {@link GeoJson} instances.
*
* @param geometries must not be {@literal null}.
*/
public GeoJsonGeometryCollection(List<GeoJson<?>> geometries) {
Assert.notNull(geometries, "Geometries must not be null!");
this.geometries.addAll(geometries);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public Iterable<GeoJson<?>> getCoordinates() {
return Collections.unmodifiableList(this.geometries);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.geometries);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonGeometryCollection)) {
return false;
}
GeoJsonGeometryCollection other = (GeoJsonGeometryCollection) obj;
return ObjectUtils.nullSafeEquals(this.geometries, other.geometries);
}
}


@@ -0,0 +1,61 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* {@link GeoJsonLineString} is defined as a list of at least 2 {@link Point}s.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#linestring
*/
public class GeoJsonLineString extends GeoJsonMultiPoint {
private static final String TYPE = "LineString";
/**
* Creates a new {@link GeoJsonLineString} for the given {@link Point}s.
*
* @param points must not be {@literal null} and have at least 2 entries.
*/
public GeoJsonLineString(List<Point> points) {
super(points);
}
/**
* Creates a new {@link GeoJsonLineString} for the given {@link Point}s.
*
* @param first must not be {@literal null}
* @param second must not be {@literal null}
* @param others can be {@literal null}
*/
public GeoJsonLineString(Point first, Point second, Point... others) {
super(first, second, others);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonMultiPoint#getType()
*/
@Override
public String getType() {
return TYPE;
}
}


@@ -0,0 +1,341 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.data.geo.Point;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.Module;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.databind.node.ArrayNode;
/**
* A Jackson {@link Module} to register custom {@link JsonSerializer} and {@link JsonDeserializer}s for GeoJSON types.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class GeoJsonModule extends SimpleModule {
private static final long serialVersionUID = -8723016728655643720L;
public GeoJsonModule() {
addDeserializer(GeoJsonPoint.class, new GeoJsonPointDeserializer());
addDeserializer(GeoJsonMultiPoint.class, new GeoJsonMultiPointDeserializer());
addDeserializer(GeoJsonLineString.class, new GeoJsonLineStringDeserializer());
addDeserializer(GeoJsonMultiLineString.class, new GeoJsonMultiLineStringDeserializer());
addDeserializer(GeoJsonPolygon.class, new GeoJsonPolygonDeserializer());
addDeserializer(GeoJsonMultiPolygon.class, new GeoJsonMultiPolygonDeserializer());
}
/**
* @author Christoph Strobl
* @since 1.7
*/
private static abstract class GeoJsonDeserializer<T extends GeoJson<?>> extends JsonDeserializer<T> {
/*
* (non-Javadoc)
* @see com.fasterxml.jackson.databind.JsonDeserializer#deserialize(com.fasterxml.jackson.core.JsonParser, com.fasterxml.jackson.databind.DeserializationContext)
*/
@Override
public T deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException, JsonProcessingException {
JsonNode node = jp.readValueAsTree();
JsonNode coordinates = node.get("coordinates");
if (coordinates != null && coordinates.isArray()) {
return doDeserialize((ArrayNode) coordinates);
}
return null;
}
/**
* Perform the actual deserialization given the {@literal coordinates} as {@link ArrayNode}.
*
* @param coordinates
* @return
*/
protected abstract T doDeserialize(ArrayNode coordinates);
/**
* Get the {@link GeoJsonPoint} representation of the given {@link ArrayNode} assuming {@code node.[0]} represents the
* {@literal x} coordinate and {@code node.[1]} the {@literal y} coordinate.
*
* @param node can be {@literal null}.
* @return {@literal null} when given a {@code null} value.
*/
protected GeoJsonPoint toGeoJsonPoint(ArrayNode node) {
if (node == null) {
return null;
}
return new GeoJsonPoint(node.get(0).asDouble(), node.get(1).asDouble());
}
/**
* Get the {@link Point} representation of the given {@link ArrayNode} assuming {@code node.[0]} represents the
* {@literal x} coordinate and {@code node.[1]} the {@literal y} coordinate.
*
* @param node can be {@literal null}.
* @return {@literal null} when given a {@code null} value.
*/
protected Point toPoint(ArrayNode node) {
if (node == null) {
return null;
}
return new Point(node.get(0).asDouble(), node.get(1).asDouble());
}
/**
* Get the points nested within given {@link ArrayNode}.
*
* @param node can be {@literal null}.
* @return {@literal empty list} when given a {@code null} value.
*/
protected List<Point> toPoints(ArrayNode node) {
if (node == null) {
return Collections.emptyList();
}
List<Point> points = new ArrayList<Point>(node.size());
for (JsonNode coordinatePair : node) {
if (coordinatePair.isArray()) {
points.add(toPoint((ArrayNode) coordinatePair));
}
}
return points;
}
protected GeoJsonLineString toLineString(ArrayNode node) {
return new GeoJsonLineString(toPoints(node));
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal Point}.
*
* <pre>
* <code>
* { "type": "Point", "coordinates": [10.0, 20.0] }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonPointDeserializer extends GeoJsonDeserializer<GeoJsonPoint> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonPoint doDeserialize(ArrayNode coordinates) {
return toGeoJsonPoint(coordinates);
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal LineString}.
*
* <pre>
* <code>
* {
* "type": "LineString",
* "coordinates": [
* [10.0, 20.0], [30.0, 40.0], [50.0, 60.0]
* ]
* }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonLineStringDeserializer extends GeoJsonDeserializer<GeoJsonLineString> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonLineString doDeserialize(ArrayNode coordinates) {
return new GeoJsonLineString(toPoints(coordinates));
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal MultiPoint}.
*
* <pre>
* <code>
* {
* "type": "MultiPoint",
* "coordinates": [
* [10.0, 20.0], [30.0, 40.0], [50.0, 60.0]
* ]
* }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonMultiPointDeserializer extends GeoJsonDeserializer<GeoJsonMultiPoint> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonMultiPoint doDeserialize(ArrayNode coordinates) {
return new GeoJsonMultiPoint(toPoints(coordinates));
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal MultiLineString}.
*
* <pre>
* <code>
* {
* "type": "MultiLineString",
* "coordinates": [
* [ [10.0, 20.0], [30.0, 40.0] ],
* [ [50.0, 60.0] , [70.0, 80.0] ]
* ]
* }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonMultiLineStringDeserializer extends GeoJsonDeserializer<GeoJsonMultiLineString> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonMultiLineString doDeserialize(ArrayNode coordinates) {
List<GeoJsonLineString> lines = new ArrayList<GeoJsonLineString>(coordinates.size());
for (JsonNode lineString : coordinates) {
if (lineString.isArray()) {
lines.add(toLineString((ArrayNode) lineString));
}
}
return new GeoJsonMultiLineString(lines);
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal Polygon}.
*
* <pre>
* <code>
* {
* "type": "Polygon",
* "coordinates": [
* [ [100.0, 0.0], [101.0, 0.0], [101.0, 1.0], [100.0, 1.0], [100.0, 0.0] ]
* ]
* }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonPolygonDeserializer extends GeoJsonDeserializer<GeoJsonPolygon> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonPolygon doDeserialize(ArrayNode coordinates) {
for (JsonNode ring : coordinates) {
// currently we do not support holes in polygons.
return new GeoJsonPolygon(toPoints((ArrayNode) ring));
}
return null;
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal MultiPolygon}.
*
* <pre>
* <code>
* {
* "type": "MultiPolygon",
* "coordinates": [
* [[[102.0, 2.0], [103.0, 2.0], [103.0, 3.0], [102.0, 3.0], [102.0, 2.0]]],
* [[[100.0, 0.0], [101.0, 0.0], [101.0, 1.0], [100.0, 1.0], [100.0, 0.0]],
* [[100.2, 0.2], [100.8, 0.2], [100.8, 0.8], [100.2, 0.8], [100.2, 0.2]]]
* ]
* }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonMultiPolygonDeserializer extends GeoJsonDeserializer<GeoJsonMultiPolygon> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonMultiPolygon doDeserialize(ArrayNode coordinates) {
List<GeoJsonPolygon> polygons = new ArrayList<GeoJsonPolygon>(coordinates.size());
for (JsonNode polygon : coordinates) {
for (JsonNode ring : (ArrayNode) polygon) {
polygons.add(new GeoJsonPolygon(toPoints((ArrayNode) ring)));
}
}
return new GeoJsonMultiPolygon(polygons);
}
}
}
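All deserializers in this module funnel into the same primitive: walking a nested coordinates array and turning each [x, y] pair into a point. Without Jackson on the classpath, that recursion can be sketched with plain lists standing in for {@code ArrayNode} (names here are illustrative, not part of the library):

```java
import java.util.ArrayList;
import java.util.List;

public class CoordinatesWalkSketch {

	// Mirrors GeoJsonDeserializer.toPoints(): each inner [x, y] pair becomes one point.
	static List<double[]> toPoints(List<List<Double>> coordinates) {
		List<double[]> points = new ArrayList<double[]>();
		for (List<Double> pair : coordinates) {
			points.add(new double[] { pair.get(0), pair.get(1) });
		}
		return points;
	}

	// Mirrors GeoJsonMultiLineStringDeserializer.doDeserialize(): one point list per line.
	static List<List<double[]>> toLines(List<List<List<Double>>> coordinates) {
		List<List<double[]>> lines = new ArrayList<List<double[]>>();
		for (List<List<Double>> line : coordinates) {
			lines.add(toPoints(line));
		}
		return lines;
	}
}
```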


@@ -0,0 +1,109 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link GeoJsonMultiLineString} is defined as list of {@link GeoJsonLineString}s.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#multilinestring
*/
public class GeoJsonMultiLineString implements GeoJson<Iterable<GeoJsonLineString>> {
private static final String TYPE = "MultiLineString";
private List<GeoJsonLineString> coordinates = new ArrayList<GeoJsonLineString>();
/**
* Creates a new {@link GeoJsonMultiLineString} for the given {@link Point}s.
*
* @param lines must not be {@literal null}.
*/
public GeoJsonMultiLineString(List<Point>... lines) {
Assert.notEmpty(lines, "Points for MultiLineString must not be null or empty!");
for (List<Point> line : lines) {
this.coordinates.add(new GeoJsonLineString(line));
}
}
/**
* Creates a new {@link GeoJsonMultiLineString} for the given {@link GeoJsonLineString}s.
*
* @param lines must not be {@literal null}.
*/
public GeoJsonMultiLineString(List<GeoJsonLineString> lines) {
Assert.notNull(lines, "Lines for MultiLineString must not be null!");
this.coordinates.addAll(lines);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public Iterable<GeoJsonLineString> getCoordinates() {
return Collections.unmodifiableList(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonMultiLineString)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.coordinates, ((GeoJsonMultiLineString) obj).coordinates);
}
}


@@ -0,0 +1,116 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.geo.Point;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link GeoJsonMultiPoint} is defined as a list of {@link Point}s.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#multipoint
*/
public class GeoJsonMultiPoint implements GeoJson<Iterable<Point>> {
private static final String TYPE = "MultiPoint";
private final List<Point> points;
/**
* Creates a new {@link GeoJsonMultiPoint} for the given {@link Point}s.
*
* @param points must not be {@literal null} and have at least 2 entries.
*/
public GeoJsonMultiPoint(List<Point> points) {
Assert.notNull(points, "Points must not be null.");
Assert.isTrue(points.size() >= 2, "Minimum of 2 Points required.");
this.points = new ArrayList<Point>(points);
}
/**
* Creates a new {@link GeoJsonMultiPoint} for the given {@link Point}s.
*
* @param first must not be {@literal null}.
* @param second must not be {@literal null}.
* @param others must not be {@literal null}.
*/
public GeoJsonMultiPoint(Point first, Point second, Point... others) {
Assert.notNull(first, "First point must not be null!");
Assert.notNull(second, "Second point must not be null!");
Assert.notNull(others, "Additional points must not be null!");
this.points = new ArrayList<Point>();
this.points.add(first);
this.points.add(second);
this.points.addAll(Arrays.asList(others));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public List<Point> getCoordinates() {
return Collections.unmodifiableList(this.points);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.points);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonMultiPoint)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.points, ((GeoJsonMultiPoint) obj).points);
}
}
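The constructor checks above encode the class's invariant that a multi-point needs a non-null list with at least two positions; a tiny dependency-free restatement (class and method names illustrative):

```java
import java.util.List;

public class MultiPointInvariantSketch {

	// Mirrors the GeoJsonMultiPoint(List) constructor checks: non-null, minimum of two points.
	static boolean isValidMultiPoint(List<double[]> points) {
		return points != null && points.size() >= 2;
	}
}
```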


@@ -0,0 +1,93 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link GeoJsonMultiPolygon} is defined as a list of {@link GeoJsonPolygon}s.
*
* @author Christoph Strobl
* @since 1.7
*/
public class GeoJsonMultiPolygon implements GeoJson<Iterable<GeoJsonPolygon>> {
private static final String TYPE = "MultiPolygon";
private List<GeoJsonPolygon> coordinates = new ArrayList<GeoJsonPolygon>();
/**
* Creates a new {@link GeoJsonMultiPolygon} for the given {@link GeoJsonPolygon}s.
*
* @param polygons must not be {@literal null}.
*/
public GeoJsonMultiPolygon(List<GeoJsonPolygon> polygons) {
Assert.notNull(polygons, "Polygons for MultiPolygon must not be null!");
this.coordinates.addAll(polygons);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public List<GeoJsonPolygon> getCoordinates() {
return Collections.unmodifiableList(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(this.coordinates);
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof GeoJsonMultiPolygon)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.coordinates, ((GeoJsonMultiPolygon) obj).coordinates);
}
}


@@ -0,0 +1,72 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.geo.Point;
/**
* {@link GeoJson} representation of {@link Point}.
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#point
*/
public class GeoJsonPoint extends Point implements GeoJson<List<Double>> {
private static final long serialVersionUID = -8026303425147474002L;
private static final String TYPE = "Point";
/**
* Creates {@link GeoJsonPoint} for given coordinates.
*
* @param x
* @param y
*/
public GeoJsonPoint(double x, double y) {
super(x, y);
}
/**
* Creates {@link GeoJsonPoint} for given {@link Point}.
*
* @param point must not be {@literal null}.
*/
public GeoJsonPoint(Point point) {
super(point);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getType()
*/
@Override
public String getType() {
return TYPE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJson#getCoordinates()
*/
@Override
public List<Double> getCoordinates() {
return Arrays.asList(Double.valueOf(getX()), Double.valueOf(getY()));
}
}

Some files were not shown because too many files have changed in this diff.