Compare commits

...

84 Commits

Author SHA1 Message Date
Christoph Strobl
8318e956a0 DATAMONGO-1794 - Release version 2.1 M1 (Lovelace). 2018-02-06 09:47:10 +01:00
Christoph Strobl
d755928b54 DATAMONGO-1794 - Prepare 2.1 M1 (Lovelace). 2018-02-06 09:45:50 +01:00
Christoph Strobl
f57be57a81 DATAMONGO-1794 - Updated changelog. 2018-02-06 09:45:41 +01:00
Christoph Strobl
b834af9779 DATAMONGO-1864 - Upgrade to MongoDB Java Driver 3.6.2 2018-02-06 07:58:15 +01:00
Mark Paluch
8745518131 DATAMONGO-1803 - Polishing.
Allow reuse of builders instead of resetting state after MessagePropertiesBuilder.build(). Use Java streams where possible. Slightly reorder fields to match constructor argument order. Add generics to request builders and introduce typed builder(…) methods to retain builder generics. Add builder for TailableCursorRequest.

Introduce factory method on MessageListenerContainer for container creation. Change Subscription.await() to use CountDownLatch instead of polling to integrate better with ManagedBlocker.
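A minimal sketch of the polling-to-latch idea (illustrative names only, not the actual framework types):

import java.time.Duration;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

class StartupSignal {

  private final CountDownLatch started = new CountDownLatch(1);

  // called exactly once by the running task when it is ready
  void markStarted() {
    started.countDown();
  }

  // blocking on a latch cooperates with ForkJoinPool.ManagedBlocker,
  // unlike a busy-wait polling loop
  boolean await(Duration timeout) throws InterruptedException {
    return started.await(timeout.toMillis(), TimeUnit.MILLISECONDS);
  }
}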

Add protected constructors to options and builder classes. Add assertions where appropriate. Move task classes into top-level types. Extract methods. Typo fixes in reference docs.

Original pull request: #528.
2018-02-05 11:36:04 +01:00
Christoph Strobl
162a936736 DATAMONGO-1803 - Add support for ChangeStreams.
As of MongoDB 3.6, Change Streams allow applications to get notified about changes without having to tail the oplog.

NOTE: Change Stream support is only available with replica sets or a sharded cluster.

Change Streams can be subscribed to with both the imperative and the reactive MongoDB Java driver. It is highly recommended to use the reactive variant, as it is less resource-intensive. However, if you do not feel comfortable using the reactive API for whatever reason, you can still obtain the change events via a Messaging concept already common in the Spring ecosystem.

== Change Streams - Sync ==

Listening to a Change Stream using a sync driver is a long-running, blocking task that needs to be delegated to a separate component.
In this case we need to create a MessageListenerContainer first, which will be the main entry point for running the specific SubscriptionRequests.
Spring Data MongoDB already ships with a default implementation that operates upon MongoTemplate and is capable of creating and executing Tasks for a ChangeStreamRequest.

MessageListenerContainer container = MessageListenerContainer.create(template);
container.start();
MessageListener<ChangeStreamDocument<Document>, User> listener = System.out::println;
ChangeStreamRequestOptions options = new ChangeStreamRequestOptions("user", ChangeStreamOptions.empty());

Subscription subscription = container.register(new ChangeStreamRequest<>(listener, options), User.class);

== Change Streams - Reactive ==

Subscribing to Change Streams via the reactive API is clearly more straightforward. Still, the building blocks like ChangeStreamOptions remain the same.

Aggregation filter = newAggregation(User.class, match(where("age").gte(38)));
Flux<ChangeStreamEvent<User>> flux = reactiveTemplate.changeStream(filter, User.class, ChangeStreamOptions.empty());

== Tailable Cursors - Sync ==

This commit also adds support for tailable cursors using the synchronous driver to be used with capped collections:

MessageListenerContainer container = MessageListenerContainer.create(template);
container.start();
TailableCursorRequestOptions options = TailableCursorRequestOptions.builder()
  .collection("user")
  .filter(query(where("age").is(7)))
  .build();

container.register(new TailableCursorRequest<>(messageListener, options, User.class));

Original pull request: #528.
2018-02-05 11:35:51 +01:00
Mark Paluch
af757295d9 DATAMONGO-1830 - Updated changelog. 2018-01-24 13:41:23 +01:00
Mark Paluch
8ba4b476a2 DATAMONGO-1858 - Fix line endings to LF. 2018-01-24 12:56:52 +01:00
Mark Paluch
2a6d512800 DATAMONGO-1829 - Updated changelog. 2018-01-24 12:22:07 +01:00
Mark Paluch
047eb18600 DATAMONGO-1843 - Polishing.
Convert anonymous classes to lambdas. Typo fixes. Migrate test to AssertJ.

Original pull request: #526.
2018-01-23 10:50:30 +01:00
Christoph Strobl
d3a54e7fe0 DATAMONGO-1843 - Fix parameter shadowing in ArrayOperators reduce.
Original pull request: #526.
2018-01-23 10:33:49 +01:00
Christoph Strobl
95def1f0aa DATAMONGO-1850 - Polishing.
Remove blank line, add tests and migrate to AssertJ.

Original Pull Request: #527
2018-01-22 16:16:37 +01:00
Mark Paluch
d180f4bd65 DATAMONGO-1850 - Guard GridFsResource.getContentType() against absent file metadata.
We now fall back to GridFS.getContentType() if GridFS metadata is absent to prevent a null dereference.
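A minimal sketch of the guarded lookup (the metadata key and accessor names are assumptions for illustration, not necessarily the exact implementation):

static final String CONTENT_TYPE_FIELD = "_contentType"; // assumed metadata key

String getContentType(GridFSFile file) {

  Document metadata = file.getMetadata();
  if (metadata != null && metadata.containsKey(CONTENT_TYPE_FIELD)) {
    return metadata.get(CONTENT_TYPE_FIELD, String.class);
  }

  return file.getContentType(); // fall back instead of dereferencing absent metadata
}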

Original Pull Request: #527
2018-01-22 15:11:05 +01:00
Mark Paluch
c3dbce4d9e DATAMONGO-1322 - Polishing.
Convert line endings from CRLF to LF. Reduce validator implementations' visibility to package scope. Extend Javadoc. Slight rewording in reference documentation.

Original pull request: #525.
Related pull request: #511.
Related ticket: DATACMNS-1835.
2018-01-16 12:17:52 +01:00
Christoph Strobl
36614ef0ab DATAMONGO-1322 - Polishing.
Remove ValidationAction/Level and ValidationOptions duplicates. Rename ValidatorDefinition to Validator and Validator to ValidationOptions.
Update and rename some of the tests. Also make sure to run the created validator document through the QueryMapper for value conversion and potential field mappings. Streamline Validator usage by providing plain Document and JsonSchema validator options next to the Criteria based one.
Finally update the reference documentation.
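A short usage sketch with the renamed types (Validator.criteria(…) and CollectionOptions.validator(…) are assumed to look as described above):

CollectionOptions options = CollectionOptions.empty()
  .validator(Validator.criteria(where("age").gte(0))) // Criteria-based validator
  .failOnValidationError();                           // reject invalid documents

template.createCollection(Person.class, options);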

Original pull request: #525.
Related pull request: #511.
Related ticket: DATACMNS-1835.
2018-01-16 12:17:52 +01:00
Andreas Zink
2024b30059 DATAMONGO-1322 - Add support for Criteria-based validator for collection creation.
Extended the CollectionOptions with a ValidationOptions property which
corresponds to the MongoDB createCollection() parameters. A validator
object can be defined using the Criteria API, or by writing a custom
provider.

Original pull request: #511.
Related pull request: #525.
Related ticket: DATACMNS-1835.
2018-01-16 12:17:19 +01:00
Mark Paluch
f2bb46724c DATAMONGO-1835 - Polishing.
Add JSON schema to collection creation using reactive API. Refactor MongoJsonSchema to interface with default implementations for JsonSchemaObject and Document-based schemas. Make fields final and methods static where possible.

Add minItems/maxItems/additionalItems properties to ArrayJsonSchemaProperty. Add missing overrides to NullJsonSchemaProperty.
Slightly rename methods for item/property counts. Add generics, Javadoc, minor tweaks.

Original pull request: #524.
2018-01-12 11:39:17 +01:00
Mark Paluch
365430ce44 DATAMONGO-1835 - Documentation.
Original pull request: #524.
2018-01-12 11:39:17 +01:00
Christoph Strobl
14ccb5152a DATAMONGO-1835 - Add support for JSON Schema.
We can now create a $jsonSchema that can be used as a validator when creating collections and as a predicate for queries. Required fields and properties get mapped according to the @Field annotation on domain objects.

MongoJsonSchema schema = MongoJsonSchema.builder().required("firstname", "lastname")
  .properties(string("firstname").possibleValues("luke", "han"),
              object("address").properties(string("postCode").minLength(4).maxLength(5)))
  .build();

resulting in the following schema:

{
  "type": "object",
  "required": [ "firstname", "lastname" ],
  "properties": {
    "firstname": {
      "type": "string", "enum": [ "luke", "han" ]
    },
    "address": {
      "type": "object",
      "properties": {
        "postCode": { "type": "string", "minLength": 4, "maxLength": 5 }
      }
    }
  }
}

Query usage:

MongoJsonSchema schema = MongoJsonSchema.builder()
  .required("address")
  .property(object("address").properties(string("street").matching("^Apple.*"))).build();

List<Person> person = template.find(query(matchingDocumentStructure(schema)), Person.class);

Collection validation:

MongoJsonSchema schema = MongoJsonSchema.builder().required("firstname", "lastname")
  .properties(string("firstname").possibleValues("luke", "han"),
              object("address").properties(string("postCode").minLength(4).maxLength(5)))
  .build();

template.createCollection(Person.class, CollectionOptions.empty()
  .schema(schema)
  .failOnValidationError());

Original pull request: #524.
2018-01-12 11:39:17 +01:00
Christoph Strobl
ddc6e4a219 DATAMONGO-1846 - Upgrade to MongoDB Java Driver 3.6.
Original pull request: #524.
2018-01-12 11:39:17 +01:00
Mark Paluch
0df97ff9f0 DATAMONGO-1844 - Update copyright years to 2018.
Also remove trailing whitespace and align outdated license header format.
2018-01-08 16:34:43 +01:00
Mark Paluch
9a13a3fce4 DATAMONGO-1553 - Polishing.
Convert spaces to tabs. Reorder methods. Add tests, nullability annotations, author tags and slightly rearrange documentation. Migrate tests to AssertJ. Extend year range in license headers.

Original pull request: #519.
2018-01-08 15:35:56 +01:00
Jérome GUYON
7123f844cb DATAMONGO-1553 - Add $sortByCount aggregation stage.
Original pull request: #519.
2018-01-08 15:34:42 +01:00
Mark Paluch
5cca849ecb DATAMONGO-1761 - Polishing.
Refactor MongoConverter.mapValueToTargetType to an imperative method instead of returning a function, for easier consumption. Adapt return types in Javadoc to the actual return type. Remove undocumented type parameters. Add an overload using reified entity and return type generics. Slight documentation tweaks.

Original pull request: #494.
Related pull request: #514.
2018-01-08 14:24:54 +01:00
Christoph Strobl
45fa1e50f9 DATAMONGO-1761 - Add domain type mapping, reactive/fluent interface operations and Kotlin extensions.
Add rename and add methods to MongoOperations.
Map the query, and use the execute method to run errors through the exception translator. Add Javadoc and missing issue references.
Move to AssertJ.

Also add fluent and reactive API counterparts and fix BsonValue to plain Java Object / domain type conversion.
Provide Kotlin extensions for the operations added.

Original pull request: #494.
Related pull request: #514.
2018-01-08 14:24:51 +01:00
eric
c9f3a52dc0 DATAMONGO-1761 - Add distinct operation to MongoTemplate.
We now support distinct queries via MongoTemplate to select distinct field values based on a collection and query.

List<String> distinctNames = template.findDistinct("name", Person.class, String.class);

List<String> distinctNames = template.query(Person.class).distinct("name").as(String.class).all();

Original pull request: #494.
Related pull request: #514.
2018-01-08 14:24:41 +01:00
Christoph Strobl
7401e5e01b DATAMONGO-1831 - Fix array type conversion for empty source.
We now make sure that we convert empty sources to the corresponding target type. This prevents entity instantiation from failing due to incorrect argument types when invoking the constructor.

Original pull request: #520.
2017-12-02 12:01:00 -08:00
Mark Paluch
ad995fd527 DATAMONGO-1816 - Updated changelog. 2017-11-27 16:43:40 +01:00
Mark Paluch
7b55120368 DATAMONGO-1799 - Updated changelog. 2017-11-27 15:58:44 +01:00
Christoph Strobl
d2519eb059 DATAMONGO-1818 - Polishing.
Move overlapping/duplicate documentation into one place.

Original Pull Request: #512
2017-11-27 07:51:47 +01:00
Mark Paluch
d7a8509198 DATAMONGO-1818 - Reword tailable cursors documentation.
Fix reference to @Tailable annotation. Slightly reword documentation.

Original Pull Request: #512
2017-11-27 07:50:59 +01:00
Mark Paluch
f13a4bba25 DATAMONGO-1823 - Polishing.
Replace constructor with Lombok's RequiredArgsConstructor. Add Nullable annotation. Tiny reformatting. Align license header. Migrate test to AssertJ.

Original pull request: #517.
2017-11-22 14:33:05 +01:00
Christoph Strobl
e411831e76 DATAMONGO-1823 - Emit ApplicationEvents using projecting find methods.
We now again emit application events when using finder methods that apply projection.

Original pull request: #517.
2017-11-22 14:31:14 +01:00
Oliver Gierke
eb51828d4d DATAMONGO-1737 - BasicMongoPersistentEntity now correctly initializes comparator.
In BasicMongoPersistentEntity.verify() we now call the super method to make sure the comparators that honor @Field's order value are initialized properly.
2017-11-17 15:02:32 +01:00
Mark Paluch
e0237174ad DATAMONGO-1819 - Polishing.
Use native field names for NamedMongoScript query instead of relying on metadata-based mapping as NamedMongoScript is considered a simple top-level type. Migrate tests to AssertJ.

Related pull request: #513.
2017-11-17 13:54:36 +01:00
Mark Paluch
dc6bbbfd13 DATAMONGO-1824 - Polishing.
Introduce Aggregation.toPipeline(…) method to render the aggregation pipeline directly. Adapt code using aggregation pipelines. Consider allowDiskUse and batchSize cursor options. Move introduction versions to 2.1. Mention migration to cursor-based aggregation in reference docs.
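For example, disk use and the cursor batch size can now be passed along with the pipeline (a sketch assuming the builder options named above and the usual static imports):

Aggregation aggregation = newAggregation(
    match(where("status").is("ACTIVE")),
    group("region").count().as("total"))
  .withOptions(AggregationOptions.builder()
    .allowDiskUse(true)   // spill large intermediate results to disk
    .cursorBatchSize(100) // assumed builder method for the cursor batch size
    .build());

AggregationResults<Document> results = template.aggregate(aggregation, "orders", Document.class);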

Original pull request: #515.
2017-11-17 13:40:18 +01:00
Christoph Strobl
0119af1679 DATAMONGO-1824 - Use Rule to verify server version in tests.
Original pull request: #515.
2017-11-17 13:40:18 +01:00
Christoph Strobl
4f78aacbf7 DATAMONGO-1824 - Replace executeCommand for aggregations with MongoCollection.aggregate(…) to support MongoDB 3.6.
We now use the driver's native option for aggregate instead of a plain command execution. This is necessary because, as of MongoDB 3.6, the cursor option is required for aggregations, which changes the return and execution model.

We still maintain the raw values of AggregationResult as we did before. However, relying on executeCommand to change aggregation behavior in custom code will no longer work.

Tested against MongoDB: 3.6.RC3, 3.4.9 and 3.2.6

Along the way we opened up Aggregation itself to expose the AggregationOptions in use and deprecated the $pushAll option on Update, since it has been removed in MongoDB 3.6.

Original pull request: #515.
2017-11-17 13:39:50 +01:00
Mark Paluch
4d207e9d54 DATAMONGO-1822 - Adapt readme to changed configuration support. 2017-11-09 14:52:43 +01:00
Mark Paluch
3d248d1aa6 DATAMONGO-1821 - Fix method ambiguity in tests when compiling against MongoDB 3.6 2017-11-07 12:47:44 +01:00
Mark Paluch
e193f2e2b7 DATAMONGO-1820 - Set Mongo's Feature Compatibility flag for TravisCI build to 3.4.
Apply setFeatureCompatibilityVersion to upgrade MongoDB to 3.4 features.
2017-11-06 10:28:00 +01:00
Mark Paluch
6b3ec31605 DATAMONGO-1817 - Polishing.
Remove blank line.

Original pull request: #510.
2017-11-06 10:02:01 +01:00
Sola
f79dc398ae DATAMONGO-1817 - Align nullability in Kotlin MongoOperationsExtensions with Java API.
Return types in MongoOperationsExtensions are now aligned to the nullability of MongoOperations.

Original pull request: #510.
2017-11-06 09:55:31 +01:00
Oliver Gierke
35abef4ba3 DATAMONGO-1793 - Updated changelog. 2017-10-27 16:36:46 +02:00
Oliver Gierke
9032509e50 DATAMONGO-1815 - Adapt API changes in Property in test cases. 2017-10-27 11:05:25 +02:00
Mark Paluch
ccaea8db8e DATAMONGO-1814 - Update reference documentation for faceted classification.
Original pull request: #426.
Original ticket: DATAMONGO-1552.
2017-10-26 09:41:24 +02:00
Christoph Strobl
37df6efe5f DATAMONGO-1811 - Update documentation of MongoOperations.executeCommand.
Update Javadoc and reference documentation.
2017-10-24 14:58:30 +02:00
Christoph Strobl
4ea6dd481d DATAMONGO-1805 - Update GridFsOperations documentation.
Fix return type in reference documentation and update Javadoc.
2017-10-24 14:58:15 +02:00
Christoph Strobl
24cd25fdb0 DATAMONGO-1806 - Polishing.
Remove unused import, trailing whitespaces and update Javadoc.

Original Pull Request: #506
2017-10-24 14:57:52 +02:00
hartmut
2be3781550 DATAMONGO-1806 - Fix Javadoc for GridFsResource.
Original Pull Request: #506
2017-10-24 14:56:34 +02:00
Mark Paluch
436b6994e1 DATAMONGO-1809 - Introduce AssertJ assertions for Document.
Original pull request: #508.
2017-10-24 14:44:48 +02:00
Christoph Strobl
296a8102d0 DATAMONGO-1809 - Polishing.
Move tests to AssertJ.

Original pull request: #508.
2017-10-24 14:44:36 +02:00
Christoph Strobl
0ce067ccf3 DATAMONGO-1809 - Fix positional parameter detection for PropertyPaths.
We now make sure to capture all digits for positional parameters.

Original pull request: #508.
2017-10-24 14:44:21 +02:00
Oliver Gierke
2974cce2b8 DATAMONGO-1812 - Add milestone repository to plugin repositories to resolve AspectJ milestones. 2017-10-24 14:10:47 +02:00
Mark Paluch
c89a343794 DATAMONGO-1696 - Mention appropriate EnableMongoAuditing annotation in reference documentation. 2017-10-20 08:44:53 +02:00
Mark Paluch
745e2650e5 DATAMONGO-1802 - Polishing.
Reduce converter visibility to MongoConverters' package scope. Tiny alignment in Javadoc wording. Update copyright years, and create an empty byte array with an element count instead of an initializer.

Original pull request: #505.
2017-10-17 14:51:12 +02:00
Christoph Strobl
75733141c2 DATAMONGO-1802 - Add Binary to byte array converter.
We now provide and register a Binary to byte[] converter to convert binary data to a byte array. MongoDB deserializes binary data through the document API to its Binary type. With this converter, we reinstate the previous capability to use byte arrays for binary data within domain types.
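A minimal reading-converter sketch of this idea (the class name is illustrative, not necessarily the converter registered by MongoConverters):

import org.bson.types.Binary;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;

@ReadingConverter
enum BinaryToByteArrayConverter implements Converter<Binary, byte[]> {

  INSTANCE;

  @Override
  public byte[] convert(Binary source) {
    return source.getData(); // unwrap the driver's Binary into a plain byte[]
  }
}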

Original pull request: #505.
2017-10-17 14:36:20 +02:00
Oliver Gierke
0f7460f8e8 DATAMONGO-1775 - Updated changelog. 2017-10-11 19:01:37 +02:00
Oliver Gierke
dc1b59f464 DATAMONGO-1795 - Removed obsolete Kotlin build setup. 2017-10-04 11:05:18 +02:00
Mark Paluch
6fbdd303cb DATAMONGO-1776 - After release cleanups. 2017-10-02 11:37:22 +02:00
Mark Paluch
f0255ea3de DATAMONGO-1776 - Prepare next development iteration. 2017-10-02 11:37:21 +02:00
Mark Paluch
e5e8fa45c2 DATAMONGO-1776 - Release version 2.0 GA (Kay). 2017-10-02 11:10:22 +02:00
Mark Paluch
f5ad4e42f9 DATAMONGO-1776 - Prepare 2.0 GA (Kay). 2017-10-02 11:09:17 +02:00
Mark Paluch
e6b7d2ffd0 DATAMONGO-1776 - Updated changelog. 2017-10-02 11:09:09 +02:00
Mark Paluch
5b24d3fd0b DATAMONGO-1778 - Polishing.
Javadoc, modifiers.

Original pull request: #503.
2017-10-02 10:38:20 +02:00
Christoph Strobl
10f13c8f37 DATAMONGO-1778 - Polishing.
Migrate UpdateTests to AssertJ and adjust constructor visibility.

Original pull request: #503.
2017-10-02 10:38:20 +02:00
Christoph Strobl
c05f8f056c DATAMONGO-1778 - Fix equals() and hashCode() for Update.
We now include the entire update document with its modifiers and the isolation flag when computing the hash code and comparing for object equality.

Original pull request: #503.
2017-10-02 10:37:34 +02:00
Mark Paluch
dbd38a8e82 DATAMONGO-1791 - Polishing.
Replace RxJava 1 repositories with RxJava 2 repositories. Fix broken links. Fix duplicate section ids.
2017-09-27 12:19:11 +02:00
Mark Paluch
77b1f3cb37 DATAMONGO-1791 - Adapt to changed Spring Framework 5 documentation structure.
Update links in the reference docs to their new locations.
2017-09-27 12:13:51 +02:00
Mark Paluch
5444ac39b5 DATAMONGO-1785 - Downgrade to CDI 1.0.
We now build against CDI 1.0 again while using CDI 2.0 for testing.
2017-09-21 13:50:39 +02:00
Mark Paluch
cf476b9bc8 DATAMONGO-1787 - Added explicit automatic module name for JDK 9. 2017-09-21 13:50:39 +02:00
Christoph Strobl
f28d47b01b DATAMONGO-1777 - Polishing. 2017-09-21 11:32:52 +02:00
Christoph Strobl
5bf03cfa70 DATAMONGO-1777 - Pretty print Modifiers when calling Update.toString().
We now make sure to pretty-print Update modifiers by safely rendering them to a JSON representation, including an optional $isolated operator if applicable.
Along the way we also fix a flaw in PushOperationBuilder that ignored e.g. $position when pushing single values.
2017-09-21 11:32:50 +02:00
Oliver Gierke
98e893636b DATAMONGO-1779 - Polishing.
Fixed imports.
2017-09-21 11:31:45 +02:00
Oliver Gierke
4b552b051e DATAMONGO-1779 - Fixed handling of empty queries in MongoTemplate.find(…).
Calls to MongoTemplate.find(…) were routed to ….findAll(…) in case no criteria definition or sort was defined on the query. This, however, neglected that cursor preparation aspects (limits, skips) are defined on the query as well, which caused them not to be applied correctly. Removed the over-optimistic re-routing so that normal query execution now always gets applied.
2017-09-21 11:31:44 +02:00
Mark Paluch
1c295b62c6 DATAMONGO-1786 - Adapt tests to nullability validation in Spring Data Commons.
Related issue: DATACMNS-1157.
2017-09-21 11:27:41 +02:00
Christoph Strobl
0a8458a045 DATAMONGO-1784 - Polishing.
Update JavaDoc, enforce nullability constraints and add tests.

Original Pull Request: #501
2017-09-20 12:47:29 +02:00
Sergey Shcherbakov
a3b9fb33ea DATAMONGO-1784 - Add expression support to GroupOperation#sum().
We now allow passing an AggregationExpression to GroupOperation.sum, which allows constructing more complex expressions.
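For example (field names are illustrative; assumes the usual static imports for the aggregation DSL):

Aggregation aggregation = newAggregation(
  group("customerId")
    .sum(ArithmeticOperators.valueOf("net").add("tax")).as("grossTotal"));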

Original Pull Request: #501
2017-09-20 12:46:14 +02:00
Christoph Strobl
3d651b72ad DATAMONGO-1782 - Polishing.
toCyclePath now returns an empty String when the Path does not cycle.
Also split up and add tests, and move code to Java 8.

Original Pull Request: #500
2017-09-19 11:07:04 +02:00
Mark Paluch
187c25bcc0 DATAMONGO-1782 - Detect type cycles using PersistentProperty paths.
We now rely on PersistentProperty paths to detect cycles between types. Cycles are detected while building up the path object, and PersistentProperty traversal stops after the cycle has been hit for the second time, so that indexes are generated for at least one hierarchy level.

Previously, we used String-based property dot paths and checked whether the path to a particular property had already been found via a substring search, which caused false positives if a property was reachable via multiple paths.

Original Pull Request: #500
2017-09-19 10:03:40 +02:00
Mark Paluch
087482d82e DATAMONGO-1785 - Upgrade to OpenWebBeans 2.0.1
Also upgrade to EqualsVerifier 1.7.8 to resolve an ASM version conflict.
2017-09-18 15:21:57 +02:00
Mark Paluch
e80d1df571 DATAMONGO-1781 - Update what's new in reference documentation. 2017-09-14 14:09:08 +02:00
Oliver Gierke
a9b1b640c0 DATAMONGO-1754 - After release cleanups. 2017-09-11 17:40:21 +02:00
Oliver Gierke
b888864407 DATAMONGO-1754 - Prepare next development iteration. 2017-09-11 17:40:18 +02:00
740 changed files with 18781 additions and 4508 deletions

View File

@@ -31,6 +31,8 @@ cache:
directories:
- $HOME/.m2
install: true
install:
- |-
mongo admin --eval "db.adminCommand({setFeatureCompatibilityVersion: '3.4'});"
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

View File

@@ -83,7 +83,7 @@ You can have Spring automatically create a proxy for the interface by using the
class ApplicationConfig extends AbstractMongoConfiguration {
@Override
public Mongo mongo() throws Exception {
public MongoClient mongoClient() throws Exception {
return new MongoClient();
}

15
pom.xml
View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RC3</version>
<version>2.1.0.M1</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.0.0.RC3</version>
<version>2.1.0.M1</version>
</parent>
<modules>
@@ -27,9 +27,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.0.0.RC3</springdata.commons>
<mongo>3.5.0</mongo>
<mongo.reactivestreams>1.6.0</mongo.reactivestreams>
<springdata.commons>2.1.0.M1</springdata.commons>
<mongo>3.6.2</mongo>
<mongo.reactivestreams>1.7.0</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -193,6 +193,11 @@
<id>spring-plugins-release</id>
<url>https://repo.spring.io/plugins-release</url>
</pluginRepository>
<pluginRepository>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
</pluginRepository>
</pluginRepositories>
</project>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RC3</version>
<version>2.1.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RC3</version>
<version>2.1.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -16,6 +16,7 @@
<properties>
<jpa>2.1.1</jpa>
<hibernate>5.2.1.Final</hibernate>
<java-module-name>spring.data.mongodb.cross.store</java-module-name>
</properties>
<dependencies>
@@ -48,7 +49,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.0.0.RC3</version>
<version>2.1.0.M1</version>
</dependency>
<!-- reactive -->

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,214 +1,214 @@
/*
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.bson.Document;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;
import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
import com.mongodb.client.result.DeleteResult;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
* @deprecated will be removed without replacement.
*/
@Deprecated
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
*/
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final Document dbk = new Document();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for {}", dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {
for (Document dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: {}", key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: {}", key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
return entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: {}", cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final Document dbQuery = new Document();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
final Document dbId = mongoTemplate.execute(collName, new CollectionCallback<Document>() {
public Document doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
Document id = collection.find(dbQuery).first();
return id;
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: {}", dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
DeleteResult dr = collection.deleteMany(dbQuery);
return null;
}
});
} else {
final Document dbDoc = new Document();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: {}", dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
if (dbId != null) {
collection.replaceOne(Filters.eq("_id", dbId.get("_id")), dbDoc);
} else {
if (dbDoc.containsKey("_id") && dbDoc.get("_id") == null) {
dbDoc.remove("_id");
}
collection.insertOne(dbDoc);
}
return null;
}
});
}
}
}
return 0L;
}
/**
* Returns the collection the given entity type shall be persisted to.
*
* @param entityClass must not be {@literal null}.
* @return
*/
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return mongoTemplate.getCollectionName(entityClass);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -39,7 +39,7 @@ import org.springframework.transaction.support.TransactionTemplate;
/**
* Integration tests for MongoDB cross-store persistence (mainly {@link MongoChangeSetPersister}).
*
*
* @author Thomas Risberg
* @author Oliver Gierke
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RC3</version>
<version>2.1.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -11,13 +11,15 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.0.0.RC3</version>
<version>2.1.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
<equalsverifier>1.7.8</equalsverifier>
<java-module-name>spring.data.mongodb</java-module-name>
<multithreadedtc>1.01</multithreadedtc>
</properties>
<dependencies>
@@ -137,6 +139,21 @@
</dependency>
<!-- CDI -->
<!-- Dependency order required to build against CDI 1.0 and test with CDI 2.0 -->
<dependency>
<groupId>org.apache.geronimo.specs</groupId>
<artifactId>geronimo-jcdi_2.0_spec</artifactId>
<version>1.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.interceptor</groupId>
<artifactId>javax.interceptor-api</artifactId>
<version>1.2.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.enterprise</groupId>
<artifactId>cdi-api</artifactId>
@@ -146,26 +163,19 @@
</dependency>
<dependency>
<groupId>javax.el</groupId>
<artifactId>el-api</artifactId>
<version>${cdi}</version>
<groupId>javax.annotation</groupId>
<artifactId>javax.annotation-api</artifactId>
<version>${javax-annotation-api}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans.test</groupId>
<artifactId>cditest-owb</artifactId>
<groupId>org.apache.openwebbeans</groupId>
<artifactId>openwebbeans-se</artifactId>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
<scope>test</scope>
</dependency>
<!-- JSR 303 Validation -->
<dependency>
<groupId>javax.validation</groupId>
@@ -236,6 +246,13 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>edu.umd.cs.mtc</groupId>
<artifactId>multithreadedtc</artifactId>
<version>${multithreadedtc}</version>
<scope>test</scope>
</dependency>
<!-- Kotlin extension -->
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
@@ -279,74 +296,9 @@
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>kotlin-maven-plugin</artifactId>
<groupId>org.jetbrains.kotlin</groupId>
<version>${kotlin}</version>
<configuration>
<jvmTarget>${source.level}</jvmTarget>
</configuration>
<executions>
<execution>
<id>compile</id>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
<configuration>
<sourceDirs>
<sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
<sourceDir>${project.basedir}/src/main/java</sourceDir>
</sourceDirs>
</configuration>
</execution>
<execution>
<id>test-compile</id>
<phase>test-compile</phase>
<goals>
<goal>test-compile</goal>
</goals>
<configuration>
<sourceDirs>
<sourceDir>${project.basedir}/src/test/kotlin</sourceDir>
<sourceDir>${project.basedir}/src/test/java</sourceDir>
</sourceDirs>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<executions>
<execution>
<id>default-compile</id>
<phase>none</phase>
</execution>
<execution>
<id>default-testCompile</id>
<phase>none</phase>
</execution>
<execution>
<id>java-compile</id>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>java-test-compile</id>
<phase>test-compile</phase>
<goals>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>com.mysema.maven</groupId>
<artifactId>apt-maven-plugin</artifactId>
@@ -375,7 +327,6 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12</version>
<configuration>
<useFile>false</useFile>
<includes>
@@ -397,6 +348,8 @@
</properties>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,7 +25,7 @@ import com.mongodb.BulkWriteResult;
/**
* Is thrown when errors occur during bulk operations.
*
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
@@ -39,7 +39,7 @@ public class BulkOperationException extends DataAccessException {
/**
* Creates a new {@link BulkOperationException} with the given message and source {@link BulkWriteException}.
*
*
* @param message must not be {@literal null}.
* @param source must not be {@literal null}.
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -0,0 +1,74 @@
/*
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.util.Optional;
import org.bson.codecs.Codec;
import org.bson.codecs.configuration.CodecConfigurationException;
import org.bson.codecs.configuration.CodecRegistry;
import org.springframework.util.Assert;
/**
* Provider interface to obtain {@link CodecRegistry} from the underlying MongoDB Java driver.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
@FunctionalInterface
public interface CodecRegistryProvider {
/**
* Get the underlying {@link CodecRegistry} used by the MongoDB Java driver.
*
* @return never {@literal null}.
* @throws IllegalStateException if {@link CodecRegistry} cannot be obtained.
*/
CodecRegistry getCodecRegistry();
/**
* Checks if a {@link Codec} is registered for a given type.
*
* @param type must not be {@literal null}.
* @return true if {@link #getCodecRegistry()} holds a {@link Codec} for given type.
* @throws IllegalStateException if {@link CodecRegistry} cannot be obtained.
*/
default boolean hasCodecFor(Class<?> type) {
return getCodecFor(type).isPresent();
}
/**
* Get the {@link Codec} registered for the given {@literal type} or an {@link Optional#empty() empty Optional}
* instead.
*
* @param type must not be {@literal null}.
* @param <T>
* @return never {@literal null}.
* @throws IllegalArgumentException if {@literal type} is {@literal null}.
*/
default <T> Optional<Codec<T>> getCodecFor(Class<T> type) {
Assert.notNull(type, "Type must not be null!");
try {
return Optional.of(getCodecRegistry().get(type));
} catch (CodecConfigurationException e) {
// ignore
}
return Optional.empty();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,7 +23,7 @@ import org.springframework.util.StringUtils;
* <p/>
* <p/>
* Mainly intended for internal use within the framework.
*
*
* @author Thomas Risberg
* @since 1.0
*/
@@ -38,7 +38,7 @@ public abstract class MongoCollectionUtils {
/**
* Obtains the collection name to use for the provided class
*
*
* @param entityClass The class to determine the preferred collection name for
* @return The preferred collection name
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb;
import org.bson.codecs.configuration.CodecRegistry;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
@@ -24,15 +25,16 @@ import com.mongodb.client.MongoDatabase;
/**
* Interface for factories creating {@link DB} instances.
*
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Christoph Strobl
*/
public interface MongoDbFactory {
public interface MongoDbFactory extends CodecRegistryProvider {
/**
* Creates a default {@link DB} instance.
*
*
* @return
* @throws DataAccessException
*/
@@ -40,7 +42,7 @@ public interface MongoDbFactory {
/**
* Creates a {@link DB} instance to access the database with the given name.
*
*
* @param dbName must not be {@literal null} or empty.
* @return
* @throws DataAccessException
@@ -49,10 +51,20 @@ public interface MongoDbFactory {
/**
* Exposes a shared {@link MongoExceptionTranslator}.
*
*
* @return will never be {@literal null}.
*/
PersistenceExceptionTranslator getExceptionTranslator();
DB getLegacyDb();
/**
* Get the underlying {@link CodecRegistry} used by the MongoDB Java driver.
*
* @return never {@literal null}.
*/
@Override
default CodecRegistry getCodecRegistry() {
return getDb().getCodecRegistry();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb;
import org.bson.codecs.configuration.CodecRegistry;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
@@ -26,9 +27,10 @@ import com.mongodb.reactivestreams.client.MongoDatabase;
* Interface for factories creating reactive {@link MongoDatabase} instances.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
*/
public interface ReactiveMongoDatabaseFactory {
public interface ReactiveMongoDatabaseFactory extends CodecRegistryProvider {
/**
* Creates a default {@link MongoDatabase} instance.
@@ -53,4 +55,14 @@ public interface ReactiveMongoDatabaseFactory {
* @return will never be {@literal null}.
*/
PersistenceExceptionTranslator getExceptionTranslator();
/**
* Get the underlying {@link CodecRegistry} used by the reactive MongoDB Java driver.
*
* @return never {@literal null}.
*/
@Override
default CodecRegistry getCodecRegistry() {
return getMongoDatabase().getCodecRegistry();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,7 +17,7 @@ package org.springframework.data.mongodb.config;
/**
* Constants to declare bean names used by the namespace configuration.
*
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Martin Baumgartner

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,7 +28,7 @@ import org.springframework.data.domain.AuditorAware;
/**
* Annotation to enable auditing in MongoDB via annotation configuration.
*
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
@@ -41,21 +41,21 @@ public @interface EnableMongoAuditing {
/**
* Configures the {@link AuditorAware} bean to be used to lookup the current principal.
*
*
* @return
*/
String auditorAwareRef() default "";
/**
* Configures whether the creation and modification dates are set. Defaults to {@literal true}.
*
*
* @return
*/
boolean setDates() default true;
/**
* Configures whether the entity shall be marked as modified on creation. Defaults to {@literal true}.
*
*
* @return
*/
boolean modifyOnCreate() default true;
@@ -63,7 +63,7 @@ public @interface EnableMongoAuditing {
/**
* Configures a {@link DateTimeProvider} bean name that allows customizing the {@link org.joda.time.DateTime} to be
* used for setting creation and modification dates.
*
*
* @return
*/
String dateTimeProviderRef() default "";
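
For orientation, a hedged configuration sketch using the attributes above; the bean name "auditorProvider" is an assumption, not part of this change:

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.EnableMongoAuditing;

@Configuration
@EnableMongoAuditing(auditorAwareRef = "auditorProvider", modifyOnCreate = false)
class MongoAuditingConfig {
    // "auditorProvider" refers to an AuditorAware bean defined elsewhere (assumed).
}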

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,7 +21,7 @@ import org.springframework.data.web.config.SpringDataJacksonModules;
/**
* Configuration class to expose {@link GeoJsonModule} as a Spring bean.
*
*
* @author Oliver Gierke
* @author Jens Schauder
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,7 +29,7 @@ import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to parse {@code gridFsTemplate} elements into {@link BeanDefinition}s.
*
*
* @author Martin Baumgartner
*/
class GridFsTemplateParser extends AbstractBeanDefinitionParser {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2014 the original author or authors.
* Copyright 2012-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -33,12 +33,12 @@ import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to register a {@link AuditingEventListener} to transparently set auditing information on
* an entity.
*
*
* @author Oliver Gierke
*/
public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinitionParser {
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#getBeanClass(org.w3c.dom.Element)
*/
@@ -47,7 +47,7 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
return AuditingEventListener.class;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#shouldGenerateId()
*/
@@ -56,7 +56,7 @@ public class MongoAuditingBeanDefinitionParser extends AbstractSingleBeanDefinit
return true;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser#doParse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext, org.springframework.beans.factory.support.BeanDefinitionBuilder)
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2016 the original author or authors.
* Copyright 2013-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -37,13 +37,13 @@ import org.springframework.util.Assert;
/**
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableMongoAuditing} annotation.
*
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/*
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@@ -52,7 +52,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
return EnableMongoAuditing.class;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@@ -61,7 +61,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
return "mongoAuditingHandler";
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerBeanDefinitions(org.springframework.core.type.AnnotationMetadata, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@@ -74,7 +74,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
super.registerBeanDefinitions(annotationMetadata, registry);
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@@ -92,7 +92,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@@ -125,14 +125,14 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/**
* Creates a new {@link MongoMappingContextLookup} for the given {@link MappingMongoConverter}.
*
*
* @param converter must not be {@literal null}.
*/
public MongoMappingContextLookup(MappingMongoConverter converter) {
this.converter = converter;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@@ -141,7 +141,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
return converter.getMappingContext();
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@@ -150,7 +150,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
return MappingContext.class;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,13 +29,13 @@ import org.w3c.dom.Element;
/**
* Parser for {@code mongo-client} definitions.
*
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,7 +19,7 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
/**
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB configuration.
*
*
* @author Oliver Gierke
* @author Martin Baumgartner
* @author Christoph Strobl

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,7 +30,7 @@ import org.w3c.dom.Element;
/**
* Utility methods for {@link BeanDefinitionParser} implementations for MongoDB.
*
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
@@ -43,7 +43,7 @@ abstract class MongoParsingUtils {
/**
* Parses the mongo replica-set element.
*
*
* @param parserContext the parser context
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
@@ -56,7 +56,7 @@ abstract class MongoParsingUtils {
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
@@ -102,7 +102,7 @@ abstract class MongoParsingUtils {
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
*
*
* @return
*/
static BeanDefinitionBuilder getWriteConcernPropertyEditorBuilder() {
@@ -135,7 +135,7 @@ abstract class MongoParsingUtils {
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
*
* @return
* @since 1.7
*/
@@ -153,7 +153,7 @@ abstract class MongoParsingUtils {
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
*
* @return
* @since 1.7
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -33,13 +33,13 @@ import org.w3c.dom.Element;
/**
* {@link BeanDefinitionParser} to parse {@code template} elements into {@link BeanDefinition}s.
*
*
* @author Martin Baumgartner
* @author Oliver Gierke
*/
class MongoTemplateParser extends AbstractBeanDefinitionParser {
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,7 +23,7 @@ import com.mongodb.ReadPreference;
/**
* Parse a {@link String} to a {@link ReadPreference}.
*
*
* @author Christoph Strobl
* @since 1.7
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012 the original author or authors.
* Copyright 2012-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,12 +21,12 @@ import com.mongodb.WriteConcern;
/**
* Converter to create {@link WriteConcern} instances from String representations.
*
*
* @author Oliver Gierke
*/
public class StringToWriteConcernConverter implements Converter<String, WriteConcern> {
/*
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,7 +27,7 @@ import com.mongodb.WriteConcern;
* {@link WriteConcern#valueOf(String)}, use the well known {@link WriteConcern} value, otherwise pass the string as is
* to the constructor of the write concern. There is no support for other constructor signatures when parsing from a
* string value.
*
*
* @author Mark Pollack
* @author Christoph Strobl
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -0,0 +1,117 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.util.concurrent.atomic.AtomicReference;
import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.messaging.Message;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
/**
* {@link Message} implementation specific to MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change
* Streams</a>.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
@EqualsAndHashCode
public class ChangeStreamEvent<T> {
private final @Nullable ChangeStreamDocument<Document> raw;
private final Class<T> targetType;
private final MongoConverter converter;
private final AtomicReference<T> converted = new AtomicReference<>();
/**
* @param raw can be {@literal null}.
* @param targetType must not be {@literal null}.
* @param converter must not be {@literal null}.
*/
public ChangeStreamEvent(@Nullable ChangeStreamDocument<Document> raw, Class<T> targetType,
MongoConverter converter) {
this.raw = raw;
this.targetType = targetType;
this.converter = converter;
}
/**
* Get the raw {@link ChangeStreamDocument} as emitted by the driver.
*
* @return can be {@literal null}.
*/
@Nullable
public ChangeStreamDocument<Document> getRaw() {
return raw;
}
/**
* Get the potentially converted {@link ChangeStreamDocument#getFullDocument()}.
*
* @return {@literal null} when {@link #getRaw()} or {@link ChangeStreamDocument#getFullDocument()} is
* {@literal null}.
*/
@Nullable
public T getBody() {
if (raw == null) {
return null;
}
if (raw.getFullDocument() == null) {
return targetType.cast(raw.getFullDocument());
}
return getConverted();
}
private T getConverted() {
T result = converted.get();
if (result != null) {
return result;
}
if (ClassUtils.isAssignable(Document.class, raw.getFullDocument().getClass())) {
result = converter.read(targetType, raw.getFullDocument());
return converted.compareAndSet(null, result) ? result : converted.get();
}
if (converter.getConversionService().canConvert(raw.getFullDocument().getClass(), targetType)) {
result = converter.getConversionService().convert(raw.getFullDocument(), targetType);
return converted.compareAndSet(null, result) ? result : converted.get();
}
throw new IllegalArgumentException(String.format("No converter found capable of converting %s to %s",
raw.getFullDocument().getClass(), targetType));
}
@Override
public String toString() {
return "ChangeStreamEvent {" + "raw=" + raw + ", targetType=" + targetType + '}';
}
}
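
To show how the lazily converted body is meant to be consumed, a small sketch; the listener method and the User domain type are assumptions:

void onEvent(ChangeStreamEvent<User> event) {

    ChangeStreamDocument<Document> raw = event.getRaw(); // driver-level change document, may be null
    User user = event.getBody();                         // fullDocument converted to the target type, may be null

    if (user != null) {
        System.out.println("change received for: " + user);
    }
}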

View File

@@ -0,0 +1,218 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.util.Arrays;
import java.util.Optional;
import org.bson.BsonValue;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.FullDocument;
/**
* Options applicable to MongoDB <a href="https://docs.mongodb.com/manual/changeStreams/">Change Streams</a>. Intended
* to be used along with {@link org.springframework.data.mongodb.core.messaging.ChangeStreamRequest} in a sync world as
* well as with {@link ReactiveMongoOperations} if you prefer it that way.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
@EqualsAndHashCode
public class ChangeStreamOptions {
private @Nullable Object filter;
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable Collation collation;
protected ChangeStreamOptions() {}
/**
* @return {@link Optional#empty()} if not set.
*/
public Optional<Object> getFilter() {
return Optional.ofNullable(filter);
}
/**
* @return {@link Optional#empty()} if not set.
*/
public Optional<BsonValue> getResumeToken() {
return Optional.ofNullable(resumeToken);
}
/**
* @return {@link Optional#empty()} if not set.
*/
public Optional<FullDocument> getFullDocumentLookup() {
return Optional.ofNullable(fullDocumentLookup);
}
/**
* @return {@link Optional#empty()} if not set.
*/
public Optional<Collation> getCollation() {
return Optional.ofNullable(collation);
}
/**
* @return empty {@link ChangeStreamOptions}.
*/
public static ChangeStreamOptions empty() {
return ChangeStreamOptions.builder().build();
}
/**
* Obtain a shiny new {@link ChangeStreamOptionsBuilder} and start defining options in this fancy fluent way. Just
* don't forget to call {@link ChangeStreamOptionsBuilder#build() build()} when you're done.
*
* @return new instance of {@link ChangeStreamOptionsBuilder}.
*/
public static ChangeStreamOptionsBuilder builder() {
return new ChangeStreamOptionsBuilder();
}
/**
* Builder for creating {@link ChangeStreamOptions}.
*
* @author Christoph Strobl
* @since 2.1
*/
public static class ChangeStreamOptionsBuilder {
private @Nullable Object filter;
private @Nullable BsonValue resumeToken;
private @Nullable FullDocument fullDocumentLookup;
private @Nullable Collation collation;
private ChangeStreamOptionsBuilder() {}
/**
* Set the collation to use.
*
* @param collation must not be {@literal null} nor {@literal empty}.
* @return this.
*/
public ChangeStreamOptionsBuilder collation(Collation collation) {
Assert.notNull(collation, "Collation must not be null nor empty!");
this.collation = collation;
return this;
}
/**
* Set the filter to apply.
* <p/>
* Fields on aggregation expression root level are prefixed to map to fields contained in
* {@link ChangeStreamDocument#getFullDocument() fullDocument}. However {@literal operationType}, {@literal ns},
* {@literal documentKey} and {@literal fullDocument} are reserved words that will be omitted, and therefore taken
* as given, during the mapping procedure. You may want to have a look at the
* <a href="https://docs.mongodb.com/manual/reference/change-events/">structure of Change Events</a>.
* <p/>
* Use {@link org.springframework.data.mongodb.core.aggregation.TypedAggregation} to ensure filter expressions are
* mapped to domain type fields.
*
* @param filter the {@link Aggregation Aggregation pipeline} to apply for filtering events. Must not be
* {@literal null}.
* @return this.
*/
public ChangeStreamOptionsBuilder filter(Aggregation filter) {
Assert.notNull(filter, "Filter must not be null!");
this.filter = filter;
return this;
}
/**
* Set the plain filter chain to apply.
*
* @param filter must not be {@literal null} nor contain {@literal null} values.
* @return this.
*/
public ChangeStreamOptionsBuilder filter(Document... filter) {
Assert.noNullElements(filter, "Filter must not contain null values");
this.filter = Arrays.asList(filter);
return this;
}
/**
* Set the resume token (typically a {@link org.bson.BsonDocument} containing a {@link org.bson.BsonBinary binary
* token}) after which to start with listening.
*
* @param resumeToken must not be {@literal null}.
* @return this.
*/
public ChangeStreamOptionsBuilder resumeToken(BsonValue resumeToken) {
Assert.notNull(resumeToken, "ResumeToken must not be null!");
this.resumeToken = resumeToken;
return this;
}
/**
* Set the {@link FullDocument} lookup to {@link FullDocument#UPDATE_LOOKUP}.
*
* @return this.
* @see #fullDocumentLookup(FullDocument)
*/
public ChangeStreamOptionsBuilder returnFullDocumentOnUpdate() {
return fullDocumentLookup(FullDocument.UPDATE_LOOKUP);
}
/**
* Set the {@link FullDocument} lookup to use.
*
* @param lookup must not be {@literal null}.
* @return this.
*/
public ChangeStreamOptionsBuilder fullDocumentLookup(FullDocument lookup) {
Assert.notNull(lookup, "Lookup must not be null!");
this.fullDocumentLookup = lookup;
return this;
}
/**
* @return the built {@link ChangeStreamOptions}
*/
public ChangeStreamOptions build() {
ChangeStreamOptions options = new ChangeStreamOptions();
options.filter = filter;
options.resumeToken = resumeToken;
options.fullDocumentLookup = fullDocumentLookup;
options.collation = collation;
return options;
}
}
}
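
A hedged builder sketch; newAggregation, match and where are the usual Spring Data MongoDB static helpers and not part of this class, and the User type is an assumption:

ChangeStreamOptions options = ChangeStreamOptions.builder()
    .filter(newAggregation(User.class, match(where("age").gte(21)))) // TypedAggregation, mapped against User fields
    .returnFullDocumentOnUpdate()                                    // FullDocument.UPDATE_LOOKUP
    .build();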

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,18 +15,27 @@
*/
package org.springframework.data.mongodb.core;
import lombok.RequiredArgsConstructor;
import java.util.Optional;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.util.Optionals;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.client.model.ValidationAction;
import com.mongodb.client.model.ValidationLevel;
/**
* Provides a simple wrapper to encapsulate the variety of settings you can use when creating a collection.
*
* @author Thomas Risberg
* @author Christoph Strobl
* @author Mark Paluch
* @author Andreas Zink
*/
public class CollectionOptions {
@@ -34,6 +43,7 @@ public class CollectionOptions {
private @Nullable Long size;
private @Nullable Boolean capped;
private @Nullable Collation collation;
private ValidationOptions validationOptions;
/**
* Constructs a new <code>CollectionOptions</code> instance.
@@ -46,16 +56,17 @@ public class CollectionOptions {
*/
@Deprecated
public CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped) {
this(size, maxDocuments, capped, null);
this(size, maxDocuments, capped, null, ValidationOptions.none());
}
private CollectionOptions(@Nullable Long size, @Nullable Long maxDocuments, @Nullable Boolean capped,
@Nullable Collation collation) {
@Nullable Collation collation, ValidationOptions validationOptions) {
this.maxDocuments = maxDocuments;
this.size = size;
this.capped = capped;
this.collation = collation;
this.validationOptions = validationOptions;
}
/**
@@ -69,7 +80,7 @@ public class CollectionOptions {
Assert.notNull(collation, "Collation must not be null!");
return new CollectionOptions(null, null, null, collation);
return new CollectionOptions(null, null, null, collation, ValidationOptions.none());
}
/**
@@ -79,7 +90,7 @@ public class CollectionOptions {
* @since 2.0
*/
public static CollectionOptions empty() {
return new CollectionOptions(null, null, null, null);
return new CollectionOptions(null, null, null, null, ValidationOptions.none());
}
/**
@@ -90,7 +101,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions capped() {
return new CollectionOptions(size, maxDocuments, true, collation);
return new CollectionOptions(size, maxDocuments, true, collation, validationOptions);
}
/**
@@ -101,7 +112,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions maxDocuments(long maxDocuments) {
return new CollectionOptions(size, maxDocuments, capped, collation);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
}
/**
@@ -112,7 +123,7 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions size(long size) {
return new CollectionOptions(size, maxDocuments, capped, collation);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
}
/**
@@ -123,7 +134,127 @@ public class CollectionOptions {
* @since 2.0
*/
public CollectionOptions collation(@Nullable Collation collation) {
return new CollectionOptions(size, maxDocuments, capped, collation);
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationOptions} set to given
* {@link MongoJsonSchema}.
*
* @param schema can be {@literal null}.
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions schema(@Nullable MongoJsonSchema schema) {
return validator(Validator.schema(schema));
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationOptions} set to given
* {@link Validator}.
*
* @param validator can be {@literal null}.
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions validator(@Nullable Validator validator) {
return validation(validationOptions.validator(validator));
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationLevel} set to
* {@link ValidationLevel#OFF}.
*
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions disableValidation() {
return schemaValidationLevel(ValidationLevel.OFF);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationLevel} set to
* {@link ValidationLevel#STRICT}.
*
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions strictValidation() {
return schemaValidationLevel(ValidationLevel.STRICT);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationLevel} set to
* {@link ValidationLevel#MODERATE}.
*
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions moderateValidation() {
return schemaValidationLevel(ValidationLevel.MODERATE);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationAction} set to
* {@link ValidationAction#WARN}.
*
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions warnOnValidationError() {
return schemaValidationAction(ValidationAction.WARN);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationAction} set to
* {@link ValidationAction#ERROR}.
*
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions failOnValidationError() {
return schemaValidationAction(ValidationAction.ERROR);
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationLevel} set to the given
* {@link ValidationLevel}.
*
* @param validationLevel must not be {@literal null}.
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions schemaValidationLevel(ValidationLevel validationLevel) {
Assert.notNull(validationLevel, "ValidationLevel must not be null!");
return validation(validationOptions.validationLevel(validationLevel));
}
/**
* Create new {@link CollectionOptions} with already given settings and {@code validationAction} set to the given
* {@link ValidationAction}.
*
* @param validationAction must not be {@literal null}.
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions schemaValidationAction(ValidationAction validationAction) {
Assert.notNull(validationAction, "ValidationAction must not be null!");
return validation(validationOptions.validationAction(validationAction));
}
/**
* Create new {@link CollectionOptions} with the given {@link ValidationOptions}.
*
* @param validationOptions must not be {@literal null}. Use {@link ValidationOptions#none()} to remove validation.
* @return new {@link CollectionOptions}.
* @since 2.1
*/
public CollectionOptions validation(ValidationOptions validationOptions) {
Assert.notNull(validationOptions, "ValidationOptions must not be null!");
return new CollectionOptions(size, maxDocuments, capped, collation, validationOptions);
}
/**
@@ -163,4 +294,104 @@ public class CollectionOptions {
public Optional<Collation> getCollation() {
return Optional.ofNullable(collation);
}
/**
* Get the {@link ValidationOptions} for the collection.
*
* @return {@link Optional#empty()} if not set.
* @since 2.1
*/
public Optional<ValidationOptions> getValidationOptions() {
return validationOptions.isEmpty() ? Optional.empty() : Optional.of(validationOptions);
}
/**
* Encapsulation of validation options.
*
* @author Christoph Strobl
* @author Andreas Zink
* @since 2.1
*/
@RequiredArgsConstructor
public static class ValidationOptions {
private static final ValidationOptions NONE = new ValidationOptions(null, null, null);
private final @Nullable Validator validator;
private final @Nullable ValidationLevel validationLevel;
private final @Nullable ValidationAction validationAction;
/**
* Create an empty {@link ValidationOptions}.
*
* @return never {@literal null}.
*/
public static ValidationOptions none() {
return NONE;
}
/**
* Define the {@link Validator} to be used for document validation.
*
* @param validator can be {@literal null}.
* @return new instance of {@link ValidationOptions}.
*/
public ValidationOptions validator(@Nullable Validator validator) {
return new ValidationOptions(validator, validationLevel, validationAction);
}
/**
* Define the validation level to apply.
*
* @param validationLevel can be {@literal null}.
* @return new instance of {@link ValidationOptions}.
*/
public ValidationOptions validationLevel(ValidationLevel validationLevel) {
return new ValidationOptions(validator, validationLevel, validationAction);
}
/**
* Define the validation action to take.
*
* @param validationAction can be {@literal null}.
* @return new instance of {@link ValidationOptions}.
*/
public ValidationOptions validationAction(ValidationAction validationAction) {
return new ValidationOptions(validator, validationLevel, validationAction);
}
/**
* Get the {@link Validator} to use.
*
* @return never {@literal null}.
*/
public Optional<Validator> getValidator() {
return Optional.ofNullable(validator);
}
/**
* Get the {@code validationLevel} to apply.
*
* @return {@link Optional#empty()} if not set.
*/
public Optional<ValidationLevel> getValidationLevel() {
return Optional.ofNullable(validationLevel);
}
/**
* Get the {@code validationAction} to perform.
*
* @return {@link Optional#empty()} if not set.
*/
public Optional<ValidationAction> getValidationAction() {
return Optional.ofNullable(validationAction);
}
/**
* @return {@literal true} if no arguments set.
*/
boolean isEmpty() {
return !Optionals.isAnyPresent(getValidator(), getValidationAction(), getValidationLevel());
}
}
}
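
A sketch of combining the new validation options; the MongoJsonSchema builder call, the template variable and the User type are assumptions:

MongoJsonSchema schema = MongoJsonSchema.builder()
    .required("firstname", "lastname")
    .build();

CollectionOptions options = CollectionOptions.empty()
    .schema(schema)           // validator backed by $jsonSchema
    .strictValidation()       // validationLevel: strict
    .failOnValidationError(); // validationAction: error

template.createCollection(User.class, options);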

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2016 the original author or authors.
* Copyright 2002-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,7 +21,7 @@ import com.mongodb.client.FindIterable;
/**
* Simple callback interface to allow customization of a {@link FindIterable}.
*
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@@ -29,7 +29,7 @@ interface CursorPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
*
*
* @param cursor
*/
FindIterable<Document> prepare(FindIterable<Document> cursor);
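
Since CursorPreparer exposes a single method, a lambda is enough to sketch the contract (only usable from within the same package, as the interface is package-private; the limit and timeout values are arbitrary):

CursorPreparer preparer = cursor -> cursor.limit(10).maxTime(500, TimeUnit.MILLISECONDS); // java.util.concurrent.TimeUnit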

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014-2017 the original author or authors.
* Copyright 2014-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -43,9 +43,10 @@ import com.mongodb.client.MongoDatabase;
/**
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
*
*
* @author Christoph Strobl
* @author Oliver Gierke
* @author Mark Paluch
* @since 1.7
*/
class DefaultScriptOperations implements ScriptOperations {
@@ -141,7 +142,7 @@ class DefaultScriptOperations implements ScriptOperations {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.exists(query(where("name").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
return mongoOperations.exists(query(where("_id").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
@@ -190,7 +191,7 @@ class DefaultScriptOperations implements ScriptOperations {
* Generate a valid name for the {@literal JavaScript}. MongoDB requires an id of type String for scripts. Calling
* scripts having {@link ObjectId} as id fails. Therefore we create a random UUID without {@code -} (as this won't
* work) and prefix the result with {@link #SCRIPT_NAME_PREFIX}.
*
*
* @return
*/
private static String generateScriptName() {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2016 the original author or authors.
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,7 +26,7 @@ import com.mongodb.MongoException;
* about exception handling. {@link MongoException}s will be caught and translated by the calling MongoTemplate. A
* DocumentCallbackHandler is typically stateful: it keeps the result state within the object, to be available for
* later inspection.
*
*
* @author Mark Pollack
* @author Graeme Rocher
* @author Oliver Gierke

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,11 +19,14 @@ import java.util.List;
import java.util.Optional;
import java.util.stream.Stream;
import org.springframework.dao.DataAccessException;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import com.mongodb.client.MongoCollection;
/**
* {@link ExecutableFindOperation} allows creation and execution of MongoDB find operations in a fluent API style.
* <br />
@@ -202,7 +205,7 @@ public interface ExecutableFindOperation {
* @author Christoph Strobl
* @since 2.0
*/
interface FindWithProjection<T> extends FindWithQuery<T> {
interface FindWithProjection<T> extends FindWithQuery<T>, FindDistinct {
/**
* Define the target type fields should be mapped to. <br />
@@ -214,6 +217,101 @@ public interface ExecutableFindOperation {
* @throws IllegalArgumentException if resultType is {@literal null}.
*/
<R> FindWithQuery<R> as(Class<R> resultType);
}
/**
* Distinct Find support.
*
* @author Christoph Strobl
* @since 2.1
*/
interface FindDistinct {
/**
* Finds the distinct values for a specified {@literal field} across a single {@link MongoCollection} or view.
*
* @param field name of the field. Must not be {@literal null}.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if field is {@literal null}.
*/
TerminatingDistinct<Object> distinct(String field);
}
/**
* Result type override. Optional.
*
* @author Christoph Strobl
* @since 2.1
*/
interface DistinctWithProjection {
/**
* Define the target type the result should be mapped to. <br />
* Skip this step if you are anyway fine with the default conversion.
* <dl>
* <dt>{@link Object} (the default)</dt>
* <dd>Result is mapped according to the {@link org.bson.BsonType}, converting e.g. {@link org.bson.BsonString} into
* plain {@link String}, {@link org.bson.BsonInt64} to {@link Long}, etc., always picking the most concrete type with
* respect to the domain type's property.<br />
* Any {@link org.bson.BsonType#DOCUMENT} is run through the {@link org.springframework.data.convert.EntityReader}
* to obtain the domain type. <br />
* Using {@link Object} also works for non-strictly typed fields, e.g. a mixture of different types like fields using
* {@link String} in one {@link org.bson.Document} while {@link Long} in another.</dd>
* <dt>Any Simple type like {@link String} or {@link Long}.</dt>
* <dd>The result is mapped directly by the MongoDB Java driver and the {@link org.bson.codecs.Codec Codecs} in
* place. This works only for results where all documents considered for the operation use the very same type for
* the field.</dd>
* <dt>Any Domain type</dt>
* <dd>Domain types can only be mapped if the result of the actual {@code distinct()} operation returns
* {@link org.bson.BsonType#DOCUMENT}.</dd>
* <dt>{@link org.bson.BsonValue}</dt>
* <dd>Using {@link org.bson.BsonValue} allows retrieval of the raw driver specific format, which returns eg.
* {@link org.bson.BsonString}.</dd>
* </dl>
*
* @param resultType must not be {@literal null}.
* @param <R> result type.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if resultType is {@literal null}.
*/
<R> TerminatingDistinct<R> as(Class<R> resultType);
}
/**
* Result restrictions. Optional.
*
* @author Christoph Strobl
* @since 2.1
*/
interface DistinctWithQuery<T> extends DistinctWithProjection {
/**
* Set the filter query to be used.
*
* @param query must not be {@literal null}.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if query is {@literal null}.
*/
TerminatingDistinct<T> matching(Query query);
}
/**
* Terminating distinct find operations.
*
* @author Christoph Strobl
* @since 2.1
*/
interface TerminatingDistinct<T> extends DistinctWithQuery<T> {
/**
* Get all matching distinct field values.
*
* @return empty {@link List} if no match is found. Never {@literal null}.
* @throws DataAccessException if e.g. the result cannot be converted correctly, which may happen if the document
* contains a {@link String} whereas the result type is specified as {@link Long}.
*/
List<T> all();
}
/**
@@ -222,5 +320,5 @@ public interface ExecutableFindOperation {
* @author Christoph Strobl
* @since 2.0
*/
interface ExecutableFind<T> extends FindWithCollection<T>, FindWithProjection<T> {}
interface ExecutableFind<T> extends FindWithCollection<T>, FindWithProjection<T>, FindDistinct {}
}
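
Putting the new distinct contract together, a usage sketch; the template variable, the User type and the criteria static imports are assumptions:

List<String> firstnames = template.query(User.class)  // ExecutableFind, now also a FindDistinct
    .distinct("firstname")                            // TerminatingDistinct<Object>
    .matching(query(where("active").is(true)))        // optional filter
    .as(String.class)                                 // optional result type override
    .all();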

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -205,6 +205,19 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return template.exists(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.FindDistinct#distinct(java.lang.String)
*/
@SuppressWarnings("unchecked")
@Override
public TerminatingDistinct<Object> distinct(String field) {
Assert.notNull(field, "Field must not be null!");
return new DistinctOperationSupport(this, field);
}
private List<T> doFind(@Nullable CursorPreparer preparer) {
Document queryObject = query.getQueryObject();
@@ -214,6 +227,12 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
getCursorPreparer(query, preparer));
}
private List<T> doFindDistinct(String field) {
return template.findDistinct(query, field, getCollectionName(), domainType,
returnType == domainType ? (Class<T>) Object.class : returnType);
}
private CloseableIterator<T> doStream() {
return template.doStream(query, domainType, getCollectionName(), returnType);
}
@@ -261,4 +280,54 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
return this;
}
}
/**
* @author Christoph Strobl
* @since 2.1
*/
static class DistinctOperationSupport<T> implements TerminatingDistinct<T> {
private final String field;
private final ExecutableFindSupport<T> delegate;
public DistinctOperationSupport(ExecutableFindSupport<T> delegate, String field) {
this.delegate = delegate;
this.field = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithProjection#as(java.lang.Class)
*/
@Override
@SuppressWarnings("unchecked")
public <R> TerminatingDistinct<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
return new DistinctOperationSupport<>((ExecutableFindSupport) delegate.as(resultType), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.DistinctWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingDistinct<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
return new DistinctOperationSupport<>((ExecutableFindSupport<T>) delegate.matching(query), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingDistinct#all()
*/
@Override
public List<T> all() {
return delegate.doFindDistinct(field);
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,7 +20,7 @@ import org.springframework.util.Assert;
/**
* Value object to mitigate different representations of geo command execution results in MongoDB.
*
*
* @author Oliver Gierke
* @author Christoph Strobl
* @soundtrack Fruitcake - Jeff Coffin (The Inside of the Outside)
@@ -34,7 +34,7 @@ class GeoCommandStatistics {
/**
* Creates a new {@link GeoCommandStatistics} instance with the given source document.
*
*
* @param source must not be {@literal null}.
*/
private GeoCommandStatistics(Document source) {
@@ -45,7 +45,7 @@ class GeoCommandStatistics {
/**
* Creates a new {@link GeoCommandStatistics} from the given command result extracting the statistics.
*
*
* @param commandResult must not be {@literal null}.
* @return
*/
@@ -60,7 +60,7 @@ class GeoCommandStatistics {
/**
* Returns the average distance reported by the command result. Mitigates the removal of the field (introduced in
* MongoDB 3.2 RC1) in case the command didn't return any result.
*
*
* @return
* @see <a href="https://jira.mongodb.org/browse/SERVER-21024">MongoDB Jira SERVER-21024</a>
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,7 +29,7 @@ import com.mongodb.WriteConcern;
* <li>REMOVE has null document</li>
* <li>INSERT_LIST has null entityType, document, and query</li>
* </ul>
*
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
@@ -46,7 +46,7 @@ public class MongoAction {
/**
* Create an instance of a {@link MongoAction}.
*
*
* @param defaultWriteConcern the default write concern. Can be {@literal null}.
* @param mongoActionOperation action being taken against the collection. Must not be {@literal null}.
* @param collectionName the collection name, must not be {@literal null} or empty.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,7 +18,7 @@ package org.springframework.data.mongodb.core;
/**
* Enumeration for operations on a collection. Used with {@link MongoAction} to help determine the WriteConcern to use
* for a given mutating operation
*
*
* @author Mark Pollack
* @author Oliver Gierke
* @see MongoAction

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,7 +25,7 @@ import com.mongodb.client.MongoDatabase;
/**
* Mongo server administration exposed via JMX annotations
*
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Mark Paluch

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2017 the original author or authors.
* Copyright 2013-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,7 +23,7 @@ import com.mongodb.WriteResult;
/**
* Mongo-specific {@link DataIntegrityViolationException}.
*
*
* @author Oliver Gierke
*/
public class MongoDataIntegrityViolationException extends DataIntegrityViolationException {
@@ -35,7 +35,7 @@ public class MongoDataIntegrityViolationException extends DataIntegrityViolation
/**
* Creates a new {@link MongoDataIntegrityViolationException} using the given message and {@link WriteResult}.
*
*
* @param message the exception message
* @param writeResult the {@link WriteResult} that causes the exception, must not be {@literal null}.
* @param actionOperation the {@link MongoActionOperation} that caused the exception, must not be {@literal null}.
@@ -54,7 +54,7 @@ public class MongoDataIntegrityViolationException extends DataIntegrityViolation
/**
* Returns the {@link WriteResult} that caused the exception.
*
*
* @return the writeResult
*/
public WriteResult getWriteResult() {
@@ -63,7 +63,7 @@ public class MongoDataIntegrityViolationException extends DataIntegrityViolation
/**
* Returns the {@link MongoActionOperation} in which the current exception occurred.
*
*
* @return the actionOperation
*/
public MongoActionOperation getActionOperation() {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -44,7 +44,7 @@ import com.mongodb.bulk.BulkWriteError;
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
* exception from the {@code org.springframework.dao} hierarchy. Return {@literal null} if no translation is
* appropriate: any other exception may have resulted from user code, and should not be translated.
*
*
* @author Oliver Gierke
* @author Michal Vich
* @author Christoph Strobl

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -59,6 +59,7 @@ import com.mongodb.client.result.UpdateResult;
* @author Christoph Strobl
* @author Thomas Darimont
* @author Maninder Singh
* @author Mark Paluch
*/
public interface MongoOperations extends FluentMongoOperations {
@@ -71,11 +72,11 @@ public interface MongoOperations extends FluentMongoOperations {
String getCollectionName(Class<?> entityClass);
/**
* Execute a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* MongoDB driver to convert the JSON string to a Document. Any errors that result from executing this command will be
* Execute a MongoDB command expressed as a JSON string. Parsing is delegated to {@link Document#parse(String)} to
* obtain the {@link Document} holding the actual command. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy.
*
* @param jsonCommand a MongoDB command expressed as a JSON string.
* @param jsonCommand a MongoDB command expressed as a JSON string. Must not be {@literal null}.
* @return a result object returned by the action.
*/
Document executeCommand(String jsonCommand);
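
For illustration, the JSON string is now parsed via Document.parse rather than the driver's JSON utility; the command shown is arbitrary:

Document buildInfo = template.executeCommand("{ \"buildInfo\" : 1 }");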
@@ -708,6 +709,65 @@ public interface MongoOperations extends FluentMongoOperations {
@Nullable
<T> T findById(Object id, Class<T> entityClass, String collectionName);
/**
* Finds the distinct values for a specified {@literal field} across a single {@link MongoCollection} or view and
* returns the results in a {@link List}.
*
* @param field the name of the field to inspect for distinct values. Must not be {@literal null}.
* @param entityClass the domain type used for determining the actual {@link MongoCollection}. Must not be
* {@literal null}.
* @param resultClass the result type. Must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
default <T> List<T> findDistinct(String field, Class<?> entityClass, Class<T> resultClass) {
return findDistinct(new Query(), field, entityClass, resultClass);
}
/**
* Finds the distinct values for a specified {@literal field} across a single {@link MongoCollection} or view and
* returns the results in a {@link List}.
*
* @param query filter {@link Query} to restrict search. Must not be {@literal null}.
* @param field the name of the field to inspect for distinct values. Must not be {@literal null}.
* @param entityClass the domain type used for determining the actual {@link MongoCollection} and mapping the
* {@link Query} to the domain type fields. Must not be {@literal null}.
* @param resultClass the result type. Must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
<T> List<T> findDistinct(Query query, String field, Class<?> entityClass, Class<T> resultClass);
/**
* Finds the distinct values for a specified {@literal field} across a single {@link MongoCollection} or view and
* returns the results in a {@link List}.
*
* @param query filter {@link Query} to restrict search. Must not be {@literal null}.
* @param field the name of the field to inspect for distinct values. Must not be {@literal null}.
* @param collectionName the explicit name of the actual {@link MongoCollection}. Must not be {@literal null}.
* @param entityClass the domain type used for mapping the {@link Query} to the domain type fields.
* @param resultClass the result type. Must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
<T> List<T> findDistinct(Query query, String field, String collectionName, Class<?> entityClass,
Class<T> resultClass);
/**
* Finds the distinct values for a specified {@literal field} across a single {@link MongoCollection} or view and
* returns the results in a {@link List}.
*
* @param query filter {@link Query} to restrict search. Must not be {@literal null}.
* @param field the name of the field to inspect for distinct values. Must not be {@literal null}.
* @param collection the explicit name of the actual {@link MongoCollection}. Must not be {@literal null}.
* @param resultClass the result type. Must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
default <T> List<T> findDistinct(Query query, String field, String collection, Class<T> resultClass) {
return findDistinct(query, field, collection, Object.class, resultClass);
}
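A usage sketch for the new distinct API, assuming a hypothetical User domain type with lastname and active fields and the usual static imports for query(...) and where(...):

// all distinct lastname values in the collection mapped to User, converted to String
List<String> lastnames = template.findDistinct(new Query(), "lastname", User.class, String.class);

// same, but restricted by a filter query
List<String> activeLastnames = template.findDistinct(query(where("active").is(true)), "lastname", User.class, String.class);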
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify</a>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
@@ -850,8 +910,8 @@ public interface MongoOperations extends FluentMongoOperations {
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" >
* Spring's Type Conversion"</a> for more details.
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion</a> for more details.
* <p/>
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
@@ -907,8 +967,8 @@ public interface MongoOperations extends FluentMongoOperations {
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" >
* Spring's Type Conversion"</a> for more details.
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation" > Spring's Type
* Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
*/
@@ -924,8 +984,8 @@ public interface MongoOperations extends FluentMongoOperations {
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">Spring's
* Type Conversion"</a> for more details.
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#validation">Spring's Type
* Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection. Must not be {@literal null}.
* @param collectionName name of the collection to store the object in. Must not be {@literal null}.
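As a sketch of the id-population behaviour described above (the User type with a String id property and its constructor are assumptions made for illustration):

User user = new User("Walter");
template.insert(user, "user");
// user.getId() now holds the generated ObjectId rendered as a String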

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,8 +27,11 @@ import java.io.IOException;
import java.util.*;
import java.util.Map.Entry;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
import org.bson.BsonValue;
import org.bson.Document;
import org.bson.codecs.Codec;
import org.bson.conversions.Bson;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -54,6 +57,8 @@ import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Metric;
import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mongodb.MongoDbFactory;
@@ -68,9 +73,11 @@ import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOpe
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.JsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.MongoJsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
@@ -100,6 +107,7 @@ import org.springframework.data.mongodb.core.query.Meta;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.validation.Validator;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.data.projection.ProjectionInformation;
import org.springframework.data.projection.SpelAwareProxyProjectionFactory;
@@ -116,7 +124,6 @@ import org.springframework.util.ObjectUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Cursor;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
@@ -125,19 +132,14 @@ import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.client.AggregateIterable;
import com.mongodb.client.DistinctIterable;
import com.mongodb.client.FindIterable;
import com.mongodb.client.MapReduceIterable;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoCursor;
import com.mongodb.client.MongoDatabase;
import com.mongodb.client.model.CountOptions;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.FindOneAndDeleteOptions;
import com.mongodb.client.model.FindOneAndUpdateOptions;
import com.mongodb.client.model.ReturnDocument;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.MongoIterable;
import com.mongodb.client.model.*;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
import com.mongodb.util.JSONParseException;
@@ -162,6 +164,8 @@ import com.mongodb.util.JSONParseException;
* @author Laszlo Csontos
* @author Maninder Singh
* @author Borislav Rangelov
* @author duozhilin
* @author Andreas Zink
*/
@SuppressWarnings("deprecation")
public class MongoTemplate implements MongoOperations, ApplicationContextAware, IndexOperationsProvider {
@@ -187,6 +191,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private final PersistenceExceptionTranslator exceptionTranslator;
private final QueryMapper queryMapper;
private final UpdateMapper updateMapper;
private final JsonSchemaMapper schemaMapper;
private final SpelAwareProxyProjectionFactory projectionFactory;
private @Nullable WriteConcern writeConcern;
@@ -231,6 +236,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
this.mongoConverter = mongoConverter == null ? getDefaultMongoConverter(mongoDbFactory) : mongoConverter;
this.queryMapper = new QueryMapper(this.mongoConverter);
this.updateMapper = new UpdateMapper(this.mongoConverter);
this.schemaMapper = new MongoJsonSchemaMapper(this.mongoConverter);
this.projectionFactory = new SpelAwareProxyProjectionFactory();
// We always have a mapping context in the converter, whether it's a simple one or not
@@ -541,7 +547,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
*/
public <T> MongoCollection<Document> createCollection(Class<T> entityClass,
@Nullable CollectionOptions collectionOptions) {
return createCollection(determineCollectionName(entityClass), collectionOptions);
Assert.notNull(entityClass, "EntityClass must not be null!");
return doCreateCollection(determineCollectionName(entityClass), convertToDocument(collectionOptions, entityClass));
}
/*
@@ -568,7 +576,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation#getCollection(java.lang.String)
* @see org.springframework.data.mongodb.core.MongoOperations#getCollection(java.lang.String)
*/
public MongoCollection<Document> getCollection(final String collectionName) {
@@ -761,16 +769,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @see org.springframework.data.mongodb.core.MongoOperations#findOne(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/
@Override
public <T> List<T> find(final Query query, Class<T> entityClass, String collectionName) {
public <T> List<T> find(Query query, Class<T> entityClass, String collectionName) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(collectionName, "CollectionName must not be null!");
Assert.notNull(entityClass, "EntityClass must not be null!");
if (query.getQueryObject().isEmpty() && query.getSortObject().isEmpty()) {
return findAll(entityClass, collectionName);
}
return doFind(collectionName, query.getQueryObject(), query.getFieldsObject(), entityClass,
new QueryCursorPreparer(query, entityClass));
}
@@ -801,6 +805,88 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return doFindOne(collectionName, new Document(idKey, id), new Document(), entityClass);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#findDistinct(org.springframework.data.mongodb.core.query.Query, java.lang.String, java.lang.Class, java.lang.Class)
*/
@Override
public <T> List<T> findDistinct(Query query, String field, Class<?> entityClass, Class<T> resultClass) {
return findDistinct(query, field, determineCollectionName(entityClass), entityClass, resultClass);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#findDistinct(org.springframework.data.mongodb.core.query.Query, java.lang.String, java.lang.String, java.lang.Class, java.lang.Class)
*/
@Override
@SuppressWarnings("unchecked")
public <T> List<T> findDistinct(Query query, String field, String collectionName, Class<?> entityClass,
Class<T> resultClass) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(field, "Field must not be null!");
Assert.notNull(collectionName, "CollectionName must not be null!");
Assert.notNull(entityClass, "EntityClass must not be null!");
Assert.notNull(resultClass, "ResultClass must not be null!");
MongoPersistentEntity<?> entity = entityClass != Object.class ? getPersistentEntity(entityClass) : null;
Document mappedQuery = queryMapper.getMappedObject(query.getQueryObject(), entity);
String mappedFieldName = queryMapper.getMappedFields(new Document(field, 1), entity).keySet().iterator().next();
Class<T> mongoDriverCompatibleType = getMongoDbFactory().getCodecFor(resultClass).map(Codec::getEncoderClass)
.orElse((Class) BsonValue.class);
MongoIterable<?> result = execute((db) -> {
DistinctIterable<T> iterable = db.getCollection(collectionName).distinct(mappedFieldName, mappedQuery,
mongoDriverCompatibleType);
return query.getCollation().map(Collation::toMongoCollation).map(iterable::collation).orElse(iterable);
});
if (resultClass == Object.class || mongoDriverCompatibleType != resultClass) {
MongoConverter converter = getConverter();
DefaultDbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory);
result = result.map((source) -> converter.mapValueToTargetType(source,
getMostSpecificConversionTargetType(resultClass, entityClass, field), dbRefResolver));
}
try {
return (List<T>) result.into(new ArrayList<>());
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
}
}
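The requested result type drives the conversion above: asking for a driver-native type such as BsonValue short-circuits the domain conversion and returns the raw values, e.g. (collection and field names are illustrative):

List<BsonValue> raw = template.findDistinct(new Query(), "lastname", "user", User.class, BsonValue.class);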
/**
* @param userType must not be {@literal null}.
* @param domainType must not be {@literal null}.
* @param field must not be {@literal null}.
* @return the most specific conversion target type depending on user preference and domain type property.
* @since 2.1
*/
private static Class<?> getMostSpecificConversionTargetType(Class<?> userType, Class<?> domainType, String field) {
Class<?> conversionTargetType = userType;
try {
Class<?> propertyType = PropertyPath.from(field, domainType).getLeafProperty().getLeafType();
// use the more specific type but favor UserType over property one
if (ClassUtils.isAssignable(userType, propertyType)) {
conversionTargetType = propertyType;
}
} catch (PropertyReferenceException e) {
// just don't care about it as we default to Object.class anyway.
}
return conversionTargetType;
}
@Override
public <T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass) {
return geoNear(near, entityClass, determineCollectionName(entityClass));
@@ -1927,87 +2013,93 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(outputType, "Output type must not be null!");
AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context;
Document command = aggregation.toDocument(collectionName, rootContext);
return doAggregate(aggregation, collectionName, outputType, rootContext);
}
@SuppressWarnings("ConstantConditions")
protected <O> AggregationResults<O> doAggregate(Aggregation aggregation, String collectionName, Class<O> outputType,
AggregationOperationContext context) {
DocumentCallback<O> callback = new UnwrapAndReadDocumentCallback<>(mongoConverter, outputType, collectionName);
AggregationOptions options = aggregation.getOptions();
if (options.isExplain()) {
Document command = aggregation.toDocument(collectionName, context);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
}
Document commandResult = executeCommand(command);
return new AggregationResults<>(commandResult.get("results", new ArrayList<Document>(0)).stream()
.map(callback::doWith).collect(Collectors.toList()), commandResult);
}
List<Document> pipeline = aggregation.toPipeline(context);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing aggregation: {}", serializeToJsonSafely(command));
LOGGER.debug("Executing aggregation: {} in collection {}", serializeToJsonSafely(pipeline), collectionName);
}
Document commandResult = executeCommand(command, this.readPreference);
return execute(collectionName, collection -> {
return new AggregationResults<O>(returnPotentiallyMappedResults(outputType, commandResult, collectionName),
commandResult);
}
/**
* Returns the potentially mapped results of the given {@code commandResult}.
*
* @param outputType
* @param commandResult
* @return
*/
private <O> List<O> returnPotentiallyMappedResults(Class<O> outputType, Document commandResult,
String collectionName) {
@SuppressWarnings("unchecked")
Iterable<Document> resultSet = (Iterable<Document>) commandResult.get("result");
if (resultSet == null) {
return Collections.emptyList();
}
DocumentCallback<O> callback = new UnwrapAndReadDocumentCallback<O>(mongoConverter, outputType, collectionName);
List<O> mappedResults = new ArrayList<O>();
for (Document document : resultSet) {
mappedResults.add(callback.doWith(document));
}
return mappedResults;
List<Document> rawResult = new ArrayList<>();
AggregateIterable<Document> aggregateIterable = collection.aggregate(pipeline, Document.class) //
.collation(options.getCollation().map(Collation::toMongoCollation).orElse(null)) //
.allowDiskUse(options.isAllowDiskUse());
if (options.getCursorBatchSize() != null) {
aggregateIterable = aggregateIterable.batchSize(options.getCursorBatchSize());
}
MongoIterable<O> iterable = aggregateIterable.map(val -> {
rawResult.add(val);
return callback.doWith(val);
});
return new AggregationResults<>(iterable.into(new ArrayList<>()),
new Document("results", rawResult).append("ok", 1.0D));
});
}
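Callers reach this method through the public aggregate(...) overloads; a minimal usage sketch, assuming a user collection with age and city fields and static imports for newAggregation(...), match(...), group(...) and where(...):

Aggregation agg = newAggregation(
    match(where("age").gte(21)),
    group("city").count().as("total"));

AggregationResults<Document> results = template.aggregate(agg, "user", Document.class);
results.forEach(System.out::println); // each result document carries the city as _id plus total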
@SuppressWarnings("ConstantConditions")
protected <O> CloseableIterator<O> aggregateStream(Aggregation aggregation, String collectionName,
Class<O> outputType, @Nullable AggregationOperationContext context) {
Assert.hasText(collectionName, "Collection name must not be null or empty!");
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
Assert.notNull(outputType, "Output type must not be null!");
Assert.isTrue(!aggregation.getOptions().isExplain(), "Can't use explain option with streaming!");
AggregationOperationContext rootContext = context == null ? Aggregation.DEFAULT_CONTEXT : context;
Document command = aggregation.toDocument(collectionName, rootContext);
assertNotExplain(command);
AggregationOptions options = aggregation.getOptions();
List<Document> pipeline = aggregation.toPipeline(rootContext);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Streaming aggregation: {}", serializeToJsonSafely(command));
LOGGER.debug("Streaming aggregation: {} in collection {}", serializeToJsonSafely(pipeline), collectionName);
}
ReadDocumentCallback<O> readCallback = new ReadDocumentCallback<O>(mongoConverter, outputType, collectionName);
ReadDocumentCallback<O> readCallback = new ReadDocumentCallback<>(mongoConverter, outputType, collectionName);
return execute(collectionName, new CollectionCallback<CloseableIterator<O>>() {
return execute(collectionName, (CollectionCallback<CloseableIterator<O>>) collection -> {
@Override
public CloseableIterator<O> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
AggregateIterable<Document> cursor = collection.aggregate(pipeline) //
.allowDiskUse(options.isAllowDiskUse()) //
.useCursor(true);
List<Document> pipeline = (List<Document>) command.get("pipeline");
AggregationOptions options = AggregationOptions.fromDocument(command);
AggregateIterable<Document> cursor = collection.aggregate(pipeline).allowDiskUse(options.isAllowDiskUse())
.useCursor(true);
Integer cursorBatchSize = options.getCursorBatchSize();
if (cursorBatchSize != null) {
cursor = cursor.batchSize(cursorBatchSize);
}
if (options.getCollation().isPresent()) {
cursor = cursor.collation(options.getCollation().map(Collation::toMongoCollation).get());
}
return new CloseableIterableCursorAdapter<O>(cursor.iterator(), exceptionTranslator, readCallback);
if (options.getCursorBatchSize() != null) {
cursor = cursor.batchSize(options.getCursorBatchSize());
}
if (options.getCollation().isPresent()) {
cursor = cursor.collation(options.getCollation().map(Collation::toMongoCollation).get());
}
return new CloseableIterableCursorAdapter<>(cursor.iterator(), exceptionTranslator, readCallback);
});
}
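The streaming variant hands back a CloseableIterator over the aggregation cursor (explain mode is rejected above); a sketch under the same assumptions as before:

try (CloseableIterator<Document> cursor = template.aggregateStream(
    newAggregation(match(where("age").gte(21))), "user", Document.class)) {
  cursor.forEachRemaining(System.out::println);
}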
@@ -2056,20 +2148,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return new ExecutableInsertOperationSupport(this).insert(domainType);
}
/**
* Assert that the {@link Document} does not enable Aggregation explain mode.
*
* @param command the command {@link Document}.
*/
private void assertNotExplain(Document command) {
Boolean explain = command.get("explain", Boolean.class);
if (explain != null && explain) {
throw new IllegalArgumentException("Can't use explain option with streaming!");
}
}
protected String replaceWithResourceIfNecessary(String function) {
String func = function;
@@ -2153,6 +2231,21 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
co.collation(IndexConverters.fromDocument(collectionOptions.get("collation", Document.class)));
}
if (collectionOptions.containsKey("validator")) {
com.mongodb.client.model.ValidationOptions options = new com.mongodb.client.model.ValidationOptions();
if (collectionOptions.containsKey("validationLevel")) {
options.validationLevel(ValidationLevel.fromString(collectionOptions.getString("validationLevel")));
}
if (collectionOptions.containsKey("validationAction")) {
options.validationAction(ValidationAction.fromString(collectionOptions.getString("validationAction")));
}
options.validator(collectionOptions.get("validator", Document.class));
co.validationOptions(options);
}
db.createCollection(collectionName, co);
MongoCollection<Document> coll = db.getCollection(collectionName, Document.class);
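For reference, the options Document consumed by the validation block above would carry keys like the following; the values are illustrative, with "strict"/"moderate"/"off" and "error"/"warn" being what ValidationLevel.fromString and ValidationAction.fromString accept:

Document collectionOptions = new Document("validator", new Document("age", new Document("$gte", 0)))
    .append("validationLevel", "strict")
    .append("validationAction", "error");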
@@ -2265,19 +2358,70 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
new ProjectingReadCallback<>(mongoConverter, sourceClass, targetClass, collectionName), collectionName);
}
/**
* Convert given {@link CollectionOptions} to a document and take the domain type information into account when
* creating a mapped schema for validation. <br />
* This method calls {@link #convertToDocument(CollectionOptions)} for backwards compatibility and potentially
* overwrites the validator with the mapped validator document. In the long run
* {@link #convertToDocument(CollectionOptions)} will be removed so that this one becomes the only source of truth.
*
* @param collectionOptions can be {@literal null}.
* @param targetType must not be {@literal null}. Use {@link Object} if no more specific target type is available.
* @return never {@literal null}.
* @since 2.1
*/
protected Document convertToDocument(@Nullable CollectionOptions collectionOptions, Class<?> targetType) {
Document doc = convertToDocument(collectionOptions);
if (collectionOptions != null) {
collectionOptions.getValidationOptions().ifPresent(it -> it.getValidator() //
.ifPresent(val -> doc.put("validator", getMappedValidator(val, targetType))));
}
return doc;
}
/**
* @param collectionOptions can be {@literal null}.
* @return never {@literal null}.
* @deprecated since 2.1 in favor of {@link #convertToDocument(CollectionOptions, Class)}.
*/
@Deprecated
protected Document convertToDocument(@Nullable CollectionOptions collectionOptions) {
Document document = new Document();
if (collectionOptions != null) {
collectionOptions.getCapped().ifPresent(val -> document.put("capped", val));
collectionOptions.getSize().ifPresent(val -> document.put("size", val));
collectionOptions.getMaxDocuments().ifPresent(val -> document.put("max", val));
collectionOptions.getCollation().ifPresent(val -> document.append("collation", val.toDocument()));
collectionOptions.getValidationOptions().ifPresent(it -> {
it.getValidationLevel().ifPresent(val -> document.append("validationLevel", val.getValue()));
it.getValidationAction().ifPresent(val -> document.append("validationAction", val.getValue()));
it.getValidator().ifPresent(val -> document.append("validator", getMappedValidator(val, Object.class)));
});
}
return document;
}
Document getMappedValidator(Validator validator, Class<?> domainType) {
Document validationRules = validator.toDocument();
if (validationRules.containsKey("$jsonSchema")) {
return schemaMapper.mapSchema(validationRules, domainType);
}
return queryMapper.getMappedObject(validationRules, mappingContext.getPersistentEntity(domainType));
}
/**
* Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter.
* The first document that matches the query is returned and also removed from the collection in the database.
@@ -2764,31 +2908,26 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @author Oliver Gierke
* @author Christoph Strobl
*/
@RequiredArgsConstructor
private class ReadDocumentCallback<T> implements DocumentCallback<T> {
private final EntityReader<? super T, Bson> reader;
private final Class<T> type;
private final @NonNull EntityReader<? super T, Bson> reader;
private final @NonNull Class<T> type;
private final String collectionName;
public ReadDocumentCallback(EntityReader<? super T, Bson> reader, Class<T> type, String collectionName) {
Assert.notNull(reader, "EntityReader must not be null!");
Assert.notNull(type, "Entity type must not be null!");
this.reader = reader;
this.type = type;
this.collectionName = collectionName;
}
@Nullable
public T doWith(Document object) {
public T doWith(@Nullable Document object) {
if (null != object) {
maybeEmitEvent(new AfterLoadEvent<T>(object, type, collectionName));
}
T source = reader.read(type, object);
if (null != source) {
maybeEmitEvent(new AfterConvertEvent<T>(object, source, collectionName));
}
return source;
}
}
@@ -2823,10 +2962,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Class<?> typeToRead = targetType.isInterface() || targetType.isAssignableFrom(entityType) ? entityType
: targetType;
if (null != object) {
maybeEmitEvent(new AfterLoadEvent<T>(object, targetType, collectionName));
}
Object source = reader.read(typeToRead, object);
Object result = targetType.isInterface() ? projectionFactory.createProjection(targetType, source) : source;
if (result == null) {
if (null != result) {
maybeEmitEvent(new AfterConvertEvent<>(object, result, collectionName));
}
@@ -2979,7 +3123,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
T doWith = delegate.doWith(content);
return new GeoResult<T>(doWith, new Distance(distance, metric));
return new GeoResult<>(doWith, new Distance(distance, metric));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -156,7 +156,7 @@ public interface ReactiveFindOperation {
/**
* Result type override (optional).
*/
interface FindWithProjection<T> extends FindWithQuery<T> {
interface FindWithProjection<T> extends FindWithQuery<T>, FindDistinct {
/**
* Define the target type fields should be mapped to. <br />
@@ -170,8 +170,101 @@ public interface ReactiveFindOperation {
<R> FindWithQuery<R> as(Class<R> resultType);
}
/**
* Distinct Find support.
*
* @author Christoph Strobl
* @since 2.1
*/
interface FindDistinct {
/**
* Finds the distinct values for a specified {@literal field} across a single
* {@link com.mongodb.reactivestreams.client.MongoCollection} or view.
*
* @param field name of the field. Must not be {@literal null}.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if field is {@literal null}.
*/
TerminatingDistinct<Object> distinct(String field);
}
/**
* Result type override. Optional.
*
* @author Christoph Strobl
* @since 2.1
*/
interface DistinctWithProjection {
/**
* Define the target type the result should be mapped to. <br />
* Skip this step if you are fine with the default conversion anyway.
* <dl>
* <dt>{@link Object} (the default)</dt>
* <dd>Result is mapped according to the {@link org.bson.BsonType} converting e.g. {@link org.bson.BsonString} into
* plain {@link String}, {@link org.bson.BsonInt64} to {@link Long}, etc., always picking the most concrete type with
* respect to the domain type's property.<br />
* Any {@link org.bson.BsonType#DOCUMENT} is run through the {@link org.springframework.data.convert.EntityReader}
* to obtain the domain type. <br />
* Using {@link Object} also works for non-strictly typed fields, e.g. a mixture of different types like fields using
* {@link String} in one {@link org.bson.Document} while {@link Long} in another.</dd>
* <dt>Any Simple type like {@link String}, {@link Long}, ...</dt>
* <dd>The result is mapped directly by the MongoDB Java driver and the {@link org.bson.codecs.Codec Codecs} in
* place. This works only for results where all documents considered for the operation use the very same type for
* the field.</dd>
* <dt>Any Domain type</dt>
* <dd>Domain types can only be mapped if the result of the actual {@code distinct()} operation returns
* {@link org.bson.BsonType#DOCUMENT}.</dd>
* <dt>{@link org.bson.BsonValue}</dt>
* <dd>Using {@link org.bson.BsonValue} allows retrieval of the raw driver-specific format, which returns e.g.
* {@link org.bson.BsonString}.</dd>
* </dl>
*
* @param resultType must not be {@literal null}.
* @param <R> result type.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if resultType is {@literal null}.
*/
<R> TerminatingDistinct<R> as(Class<R> resultType);
}
/**
* Result restrictions. Optional.
*
* @author Christoph Strobl
* @since 2.1
*/
interface DistinctWithQuery<T> extends DistinctWithProjection {
/**
* Set the filter query to be used.
*
* @param query must not be {@literal null}.
* @return new instance of {@link TerminatingDistinct}.
* @throws IllegalArgumentException if query is {@literal null}.
*/
TerminatingDistinct<T> matching(Query query);
}
/**
* Terminating distinct find operations.
*
* @author Christoph Strobl
* @since 2.1
*/
interface TerminatingDistinct<T> extends DistinctWithQuery<T> {
/**
* Get all matching distinct field values.
*
* @return empty {@link Flux} if no match was found. Never {@literal null}.
*/
Flux<T> all();
}
/**
* {@link ReactiveFind} provides methods for constructing lookup operations in a fluent way.
*/
interface ReactiveFind<T> extends FindWithCollection<T>, FindWithProjection<T> {}
interface ReactiveFind<T> extends FindWithCollection<T>, FindWithProjection<T>, FindDistinct {}
}
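Putting the new reactive distinct API together, a usage sketch against a ReactiveMongoTemplate (the User type, its lastname and active fields, and the static imports for query(...) and where(...) are assumptions):

Flux<String> lastnames = template.query(User.class)
    .distinct("lastname")
    .as(String.class)
    .matching(query(where("active").is(true)))
    .all();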

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,7 +19,6 @@ import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import org.springframework.lang.Nullable;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
@@ -28,6 +27,7 @@ import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -196,6 +196,18 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
return template.exists(query, domainType, getCollectionName());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.FindDistinct#distinct(java.lang.String)
*/
@Override
public TerminatingDistinct<Object> distinct(String field) {
Assert.notNull(field, "Field must not be null!");
return new DistinctOperationSupport<>(this, field);
}
private Flux<T> doFind(@Nullable FindPublisherPreparer preparer) {
Document queryObject = query.getQueryObject();
@@ -205,6 +217,13 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
preparer != null ? preparer : getCursorPreparer(query));
}
@SuppressWarnings("unchecked")
private Flux<T> doFindDistinct(String field) {
return template.findDistinct(query, field, getCollectionName(), domainType,
returnType == domainType ? (Class<T>) Object.class : returnType);
}
private FindPublisherPreparer getCursorPreparer(Query query) {
return template.new QueryFindPublisherPreparer(query, domainType);
}
@@ -216,5 +235,55 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
private String asString() {
return SerializationUtils.serializeToJsonSafely(query);
}
/**
* @author Christoph Strobl
* @since 2.1
*/
static class DistinctOperationSupport<T> implements TerminatingDistinct<T> {
private final String field;
private final ReactiveFindSupport delegate;
public DistinctOperationSupport(ReactiveFindSupport delegate, String field) {
this.delegate = delegate;
this.field = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.DistinctWithProjection#as(java.lang.Class)
*/
@Override
public <R> TerminatingDistinct<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
return new DistinctOperationSupport<>((ReactiveFindSupport) delegate.as(resultType), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.DistinctWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
@SuppressWarnings("unchecked")
public TerminatingDistinct<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
return new DistinctOperationSupport<>((ReactiveFindSupport<T>) delegate.matching(query), field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation.TerminatingDistinct#all()
*/
@Override
public Flux<T> all() {
return delegate.doFindDistinct(field);
}
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2017 the original author or authors.
* Copyright 2017-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.

Some files were not shown because too many files have changed in this diff.