Compare commits


245 Commits

Author SHA1 Message Date
Oliver Gierke
bf61c5782e DATAMONGO-1535 - Release version 2.0 M2 (Kay). 2017-04-04 21:12:35 +02:00
Oliver Gierke
431e71f4a0 DATAMONGO-1535 - Prepare 2.0 M2 (Kay). 2017-04-04 21:12:02 +02:00
Oliver Gierke
3192d7dd78 DATAMONGO-1535 - Updated changelog. 2017-04-04 21:11:53 +02:00
Christoph Strobl
d7a6594933 DATAMONGO-1655 - Remove obsolete build profiles.
And add snapshot profile for 3.5.0.
2017-04-03 18:37:19 +02:00
Christoph Strobl
609d6d5a19 DATAMONGO-1656 - Upgrade to MongoDB Driver 3.4.2 and Reactive Streams Driver 1.3.0.
Prepare issue branch.
2017-04-03 18:37:06 +02:00
Mark Paluch
19c8788376 DATAMONGO-1643 - Polishing.
Fix documentation for namespace types. Remove unused ReflectiveDBCollectionInvoker. Simplify tests, align bean name to mongoClient. Remove injection of Mongo in favor of injecting MongoTemplate.

Remove throws declaration from AbstractMongoConfiguration.mongoClient(). MongoClient creation does not throw checked exceptions so throwing an Exception is no longer required. Replace Mongo by MongoClient in reference documentation.

Original pull request: #451.
2017-04-03 15:36:58 +02:00
Christoph Strobl
db9934c7d8 DATAMONGO-1643 - Replace references to Mongo by MongoClient.
Remove and replace usage of "mongo" by "mongoClient". This involves xsd schema, bean names, constructor and parameter types. This required some API changes as some server commands are no longer directly available through the api, but have to be invoked via runCommand.

Also remove references to outdated API using Credentials and an authentication DB instead of MongoCredentials for authentication.

Updated and removed (unused) tests; Altered documentation.

Original pull request: #451.
2017-04-03 15:35:43 +02:00
Christoph Strobl
b5679744e7 DATAMONGO-1643 - Add and use 2.0 namespace xsd.
Original pull request: #451.
2017-04-03 15:35:27 +02:00
Mark Paluch
b1acda4188 DATAMONGO-1610 - Support RxJava 2 repositories.
Add RxJava 2 dependency. Add test to verify RxJava 2 interoperability.

Original Pull Request: #440
2017-04-03 14:57:11 +02:00
Christoph Strobl
2ab466eb35 DATAMONGO-1447 - Add support for isolations on Update.
We now allow usage of the $isolated update operator via Update.isolated().
If isolated is set, the query used in MongoOperations.updateMulti is enhanced with '$isolated' : 1, unless the isolation level has already been set explicitly, e.g. via new BasicQuery("{'$isolated' : 0}").
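
For illustration, a minimal sketch of the new operator in use (the Order domain type and field names are hypothetical):

Query query = query(where("status").is("PENDING"));
// isolated() causes '$isolated' : 1 to be added to the query sent to MongoDB
Update update = new Update().set("status", "PROCESSED").isolated();

mongoOperations.updateMulti(query, update, Order.class);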

Original pull request: #371.
2017-04-03 13:53:00 +02:00
Mark Paluch
e9da449667 DATAMONGO-1559 - Drop collections used in ReactiveMongoTemplateExecuteTests before tests.
We now drop the collections used in ReactiveMongoTemplateExecuteTests on test start to create a clean state for the tests. Previously, collections were dropped after the tests only so existing data in the collections could interfere with the tests themselves.
2017-03-27 08:40:10 +02:00
John Blum
6bc72a654f DATAMONGO-1648 - Rename getRepositoryFactoryClassName to getRepositoryFactoryBeanClassName. 2017-03-24 13:36:11 -07:00
Mark Paluch
4fd9edf585 DATAMONGO-1559 - Polishing.
Migrate off deprecated Cancellation API to Disposable.
2017-03-24 17:46:41 +01:00
Mark Paluch
955597bb54 DATAMONGO-1559 - Migrate reactive tests from TestSubscriber to StepVerifier. 2017-03-24 17:42:31 +01:00
Oliver Gierke
f59bbd351d DATAMONGO-1647 - Polishing. 2017-03-24 12:25:21 +01:00
Oliver Gierke
43ab3cad13 DATAMONGO-1647 - Switched to use IdentifierAccessor.getRequiredIdentifier() in MongoTemplate.doSaveVersioned(…). 2017-03-24 12:24:57 +01:00
Oliver Gierke
5ba46dadb8 DATAMONGO-1609 - Switched to Mockito from parent POM.
Moved off deprecated runner declarations.
2017-03-24 08:54:29 +01:00
Oliver Gierke
a245c0f280 DATAMONGO-1609 - Adapt to API changes in Spring Data Commons. 2017-03-24 08:54:29 +01:00
Christoph Strobl
288f244c34 DATAMONGO-1609 - Fix failing tests.
Fix issues pointed out by failing tests. The main focus was to restore functionality, not a Java 8 code cleanup, so this one still needs some love and polishing.
2017-03-24 08:54:29 +01:00
Christoph Strobl
90bb6262f9 DATAMONGO-1609 - Fix compile errors.
Still a way to go:
  - Failures: 113, Errors: 836, Skipped: 16
2017-03-24 08:54:29 +01:00
Oliver Gierke
826d00afa7 DATAMONGO-1609 - Hacking. 2017-03-24 08:54:29 +01:00
Mark Paluch
ac3f7dbf99 DATAMONGO-1645 - Polishing.
Clean up appender and log level after test run. Suppress log output during tests.

Original pull request: #450.
2017-03-21 10:52:14 +01:00
Christoph Strobl
f70e1fa291 DATAMONGO-1645 - Safely serialize JSON output for log message in LoggingEventListener.
We now make sure to safely serialize JSON output for mapped documents. This prevents the logger from rendering false exception messages to the log appender.

Original pull request: #450.
2017-03-21 10:45:13 +01:00
Mark Paluch
a84c4b064d DATAMONGO-1637 - Polishing.
Move aggregation options conversion to AggregationOptions.getMongoAggregationOptions(). Allow cursor options to control cursor batch size. Add command logging to stream execution. Rearrange method order. Close cursor in tests. Change author name from user name to full name.

Original pull request: #447.
2017-03-21 09:57:25 +01:00
Mainder Singh
1a65828365 DATAMONGO-1637 - Add support for aggregation result streaming.
We now support aggregation result streaming backed by a MongoDB cursor. Result streaming fetches aggregation results in batches from MongoDB and converts results as they are retrieved through the iterator.

Aggregation aggregation = …

CloseableIterator<TagCount> results = mongoOperations.aggregateStream(aggregation, "inputCollection", TagCount.class);

List<TagCount> tagCount = new ArrayList<TagCount>();
while (results.hasNext()) {
	tagCount.add(results.next());
}

results.close();

Original pull request: #447.
2017-03-21 09:56:53 +01:00
Mark Paluch
f4f5e02e66 DATAMONGO-1620 - Polishing.
Fix test method name. Add JavaDoc.

Original pull request: #449.
2017-03-13 16:28:54 +01:00
Christoph Strobl
4b8d35262f DATAMONGO-1620 - Add server-selection-timeout to XML MongoClientOptions config.
We now allow server-selection-timeout attribute on MongoClientOptions XML configuration for a MongoDB 3.x client.

Original pull request: #449.
2017-03-13 16:28:49 +01:00
Mark Paluch
b486fec048 DATAMONGO-1641 - Remove formatting config from the repository.
See https://github.com/spring-projects/spring-data-build/tree/master/etc/ide for most recent IDE settings.
2017-03-08 11:45:42 +01:00
Mark Paluch
50fcbd18f2 DATAMONGO-1421 - Polishing.
Remove trailing whitespaces. Construct exception message with String.format(…).

Original pull request: #448.
2017-03-08 08:56:03 +01:00
Christoph Strobl
c37dfd9688 DATAMONGO-1421 - Fix serialization in error message causing error itself.
We now make sure to safely serialize the criteria object used for creating the error message when raising an `InvalidMongoDbApiUsageException` in cases where `addCriteria` is used to add multiple entries for the same property.

Original pull request: #448.
2017-03-08 08:55:43 +01:00
Oliver Gierke
98d655f4e2 DATAMONGO-1639 - Make sure BeforeConvertEvent sees new version for updates.
The changes for DATAMONGO-1617 subtly changed the behavior of entity updates in terms of the version value they see for entities using optimistic locking. Previously the updates already saw the new version value, whereas after the fix for DATAMONGO-1617 they saw the old one. That prevented BeforeConvertEvent listeners from distinguishing between an original insert and the first update.

That change has now been rolled back, and we introduced a test case that encodes this expectation explicitly.
2017-03-06 16:32:19 +01:00
Oliver Gierke
98a9b66e6b DATAMONGO-1617 - Reinstate version property initialization before BeforeConvertEvent publication.
Related pull request: #443.
2017-03-03 08:46:28 +01:00
Oliver Gierke
827b6204a9 DATAMONGO-1617 - Polishing.
Some cleanups in MongoTemplateTests. Removed manual ID assignment in the general id handling test to make sure we use the id generation. Removed unnecessary code from the domain type in favor of Lombok.

Original pull request: #443.
2017-03-03 08:36:21 +01:00
Laszlo Csontos
781717faa8 DATAMONGO-1617 - BeforeConvertEvent is now emitted before updatable identifier assertion.
We now make sure the BeforeConvertEvent is published before we check for identifier types that can potentially be auto-generated. That allows the event listeners to populate identifiers. Previously the identifier check kicked in before that and thus prevented the listener from populating the property.

Original pull request: #443.
2017-03-03 08:36:05 +01:00
Oliver Gierke
084a167e20 DATAMONGO-1598 - Updated changelog. 2017-03-02 11:11:03 +01:00
Mark Paluch
456ae2459f DATAMONGO-1605 - Polishing.
Remove additional quoting around JSON serialization because JSON serialization adds quotes to a string. Reformat code.
2017-03-01 16:14:03 +01:00
Christoph Strobl
d079edb92c DATAMONGO-1605 - Retain type of SpEL expression result when used in @Query.
Fix issue where any result of a SpEL expression was treated as a quoted String within the resulting MongoDB query.
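
For illustration, a sketch of a repository method using a SpEL parameter expression in @Query; with this fix a numeric expression result is no longer rendered as a quoted String (PersonRepository and its fields are hypothetical):

public interface PersonRepository extends Repository<Person, String> {

  // resolves to { "age" : 30 } rather than { "age" : "30" }
  @Query("{ 'age' : ?#{[0]} }")
  List<Person> findByAge(int age);
}
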
2017-03-01 16:14:00 +01:00
Mark Paluch
eb97944fd4 DATAMONGO-1600 - Make GraphLookupOperationBuilder public.
Make GraphLookupOperationBuilder public so it can be used in types outside the aggregation package.

Original Pull Request: #437
2017-03-01 12:55:39 +01:00
Mark Paluch
e2d6f187c2 DATAMONGO-1603 - Polishing.
Remove code that became unused. Reformat code. Extend years in copyright header.

Original pull request: #441.
2017-03-01 08:54:32 +01:00
Christoph Strobl
8068e36679 DATAMONGO-1603 - Fix Placeholder not replaced correctly in @Query.
Fix issues when placeholders are followed by other characters, e.g. '?0xyz', or are reused multiple times within the query. Add tests and fixes for complex quoted replacements, e.g. in regex queries. Rely on the placeholder's quotation indication instead of the binding's, which might be misleading when a placeholder is used more than once.
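
A hypothetical example of a query that reuses the same placeholder and is now bound correctly:

@Query("{ $or : [ { 'firstname' : ?0 }, { 'lastname' : ?0 } ] }")
List<Person> findByFirstnameOrLastname(String name);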

Original pull request: #441.
2017-03-01 08:52:18 +01:00
Christoph Strobl
c62d13154f DATAMONGO-1608 - Polishing.
Throw an IllegalArgumentException when trying to create a query using 'null' as an argument for queries resulting in a $regex query operator.
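
A sketch of the kind of derived query affected (hypothetical repository method); invoking it with a null argument now raises an IllegalArgumentException instead of failing with a NullPointerException:

List<Person> findByLastnameIgnoreCase(String lastname);

// now rejected with an IllegalArgumentException
repository.findByLastnameIgnoreCase(null);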

Original Pull Request: #439
2017-02-13 08:33:12 +01:00
Edward Prentice
7d53f21d58 DATAMONGO-1608 - Add guard against NPE in MongoQueryCreator when using IgnoreCase.
Original Pull Request: #439
2017-02-13 08:28:26 +01:00
Christoph Strobl
0000a8fd11 DATAMONGO-1607 - Polishing.
Move coordinate conversion to a dedicated method. Additionally fix an issue with assertions applied too late in the chain and add some tests.

Original Pull Request: #438
2017-02-10 14:24:37 +01:00
Thiago Diniz da Silveira
fad9756a93 DATAMONGO-1607 - Fix ClassCastException in Circle, Point and Sphere when coordinates are not Double.
Original Pull Request: #438
2017-02-10 14:19:06 +01:00
Mark Paluch
54acc4934c DATAMONGO-1602 - Remove references to Assert single-arg methods.
Replace references to Assert single-arg methods with references to methods accepting the test object and message.
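
The replacement pattern looks roughly like this (argument names are made up for illustration):

// before: Assert.notNull(entityClass);
Assert.notNull(entityClass, "EntityClass must not be null!");

// before: Assert.hasText(collectionName);
Assert.hasText(collectionName, "CollectionName must not be null or empty!");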

Related ticket: SPR-15196.
2017-02-01 11:08:11 +01:00
Oliver Gierke
2e593bb9b2 DATAMONGO-1573 - Updated changelog. 2017-01-26 12:12:37 +01:00
Oliver Gierke
2bf32a25be DATAMONGO-1574 - Updated changelog. 2017-01-26 12:12:07 +01:00
Christoph Strobl
148c4f9e24 DATAMONGO-1517 - Polishing.
Remove ReflectiveSimpleTypes in favor of MongoSimpleTypes.
Add integration test.
2017-01-25 17:12:31 +01:00
Mark Paluch
ae6171802e DATAMONGO-1517 - Add support for Decimal128 BSON type.
Support Decimal128 as Mongo simple type if present. Decimal128 is stored as NumberDecimal.

class Person {

  String id;
  Decimal128 decimal128;

  Person(String id, Decimal128 decimal128) {
    this.id = id;
    this.decimal128 = decimal128;
  }
}

mongoTemplate.save(new Person("foo", new Decimal128(new BigDecimal("123.456"))));

is represented as:

{ "_id" : "foo", "decimal128" : NumberDecimal("123.456") }
2017-01-25 17:12:31 +01:00
Mark Paluch
1b97d1d1d0 DATAMONGO-1596 - Fix typo in JavaDoc.
Use correct @RelatedDocument annotation in MongoDB cross store reference documentation.
2017-01-25 16:53:41 +01:00
Mark Paluch
91495825a5 DATAMONGO-1575 - Polishing.
Extend year range in license headers. Use MongoDB JSON serializer for String escaping. Move unquoting/quote checking to inner QuotedString utility class. Reformat code.
2017-01-25 11:44:12 +01:00
Christoph Strobl
18e6b9cfa7 DATAMONGO-1575 - Escape Strings correctly.
Use regex groups and parameter index values for replacement in string based queries.
2017-01-25 11:44:09 +01:00
Christoph Strobl
0e0b8d5f79 DATAMONGO-1594 - Update "what’s new" section in reference documentation. 2017-01-23 08:23:51 +01:00
Oliver Gierke
11882724fa DATAMONGO-1592 - Adapt AuditingEventListenerUnitTests to changes in core auditing.
The core auditing implementation now skips the invocation of auditing in case the candidate aggregate doesn't need any auditing in the first place. We needed to adapt the sample class we use to actually carry the necessary auditing annotations.

Related ticket: DATACMNS-957.
2017-01-20 16:36:45 +01:00
Oliver Gierke
eb2a58cdbe DATAMONGO-1590 - Polishing.
Removed some compiler warnings. Hide newly introduced class in package scope and made use of Lombok annotations to avoid boilerplate code.

Original pull request: #436.
2017-01-18 19:42:46 +01:00
Christoph Strobl
8c9bbf7f91 DATAMONGO-1590 - EntityInformation selection now correctly considers Persistable.
We now wrap the MappingMongoEntityInformation into one that delegates the methods implemented by Persistable to the actual entity in case it implements said interface.
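
A rough sketch of an entity implementing Persistable whose isNew()/getId() answers are now delegated to (the User type and its fields are hypothetical):

@Document
class User implements Persistable<String> {

	@Id String id;
	@Transient boolean isNew = true;

	@Override
	public String getId() {
		return id;
	}

	@Override
	public boolean isNew() {
		return isNew;
	}
}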

Original pull request: #436.
2017-01-18 19:42:42 +01:00
Mark Paluch
63fc047160 DATAMONGO-1581 - Expose ReactiveMongoRepositoryConfigurationExtension as public type.
Expose ReactiveMongoRepositoryConfigurationExtension so configuration extensions such as Spring Boot's ReactiveMongoRepositoriesAutoConfigureRegistrar can pick it up and reuse the repository configuration extension.
2017-01-17 14:14:08 +01:00
Mark Paluch
44e872c7df DATAMONGO-1588 - Polishing.
Remove unused fields. Fix typo in method name. Reformat inner class to align formatting.

Original pull request: #435.
2017-01-16 09:29:10 +01:00
Christoph Strobl
1135e90be0 DATAMONGO-1588 - Fix derived finder not accepting subclass of parameter type.
We now allow using sub types as arguments for derived queries. This makes it possible to use eg. a GeoJsonPoint for querying while the declared property type in the domain object remains a regular (legacy) Point.
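
A sketch of the now-supported combination (the Store domain type, property and coordinates are made up):

// declared property type on the domain object: org.springframework.data.geo.Point
List<Store> findByLocationNear(Point location);

// can now be invoked with the GeoJSON subtype
repository.findByLocationNear(new GeoJsonPoint(-73.99171, 40.738868));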

Original pull request: #435.
2017-01-16 09:29:06 +01:00
Mark Paluch
65da90afd8 DATAMONGO-1589 - Update project documentation with the CLA tool integration. 2017-01-13 11:46:58 +01:00
Mark Paluch
4856c9b4f5 DATAMONGO-1587 - Polishing.
Convert @see http://… links to valid format using @see <a href=…>…</a>
2017-01-12 17:24:03 +01:00
Mark Paluch
644c1a2c89 DATAMONGO-1587 - Migrate ticket references in test code to Spring Framework style. 2017-01-12 16:51:57 +01:00
Mark Paluch
8df9d30d2e DATAMONGO-1586 - Consider field name in TypeBasedAggregationOperationContext.getReferenceFor(…).
We now consider the provided field name (alias) with which a mapped field is exposed. The field name applies to the exposed field after property path resolution in TypeBasedAggregationOperationContext. Previously, the field reference used the property name, which caused fields to be considered non-aliased, so aggregation projection operations dropped the alias and exposed the field with its leaf property name.

Original Pull Request: #434
2017-01-12 10:34:02 +01:00
Christoph Strobl
90ae6d1805 DATAMONGO-1585 - Polishing.
Update documentation for better readability in html and pdf format.

Original Pull Request: #433
2017-01-12 10:19:55 +01:00
Mark Paluch
1fe79f1194 DATAMONGO-1585 - Expose synthetic fields in $project aggregation stage.
Field projections now expose their fields as synthetic simple fields. The projection aggregation stage entirely redefines the field set available to later aggregation stages, so projected fields are considered synthetic. A simple synthetic field has no target field, which causes later aggregation stages to render the exposed field name instead of the underlying target when rendering aggregation operations to Mongo documents.

The change is motivated by a bug where an aggregation consisting of a projection of an aliased field and a sort previously caused the sort stage to render with the original field name instead of the alias. The sort did not apply any sorting since the projection redefines the available field set entirely and the original field is no longer accessible.
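
A minimal sketch of the previously failing combination (field and alias names are hypothetical); the sort stage now renders against the exposed alias:

newAggregation(
	project().and("value").as("alias"),
	sort(ASC, "alias"));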

Original Pull Request: #433
2017-01-12 10:19:10 +01:00
Christoph Strobl
2c6bd6ecea DATAMONGO-1576 - Update lifecycle event documentation.
Add note on lifecycle event handling for property types.
2017-01-11 13:10:25 +01:00
Mark Paluch
fc5bb3f1d3 DATAMONGO-1578 - Polishing.
Add ticket references to test methods. Extend license years in copyright header.

Original pull request: #398.
2017-01-02 11:46:57 +01:00
Martin Macko
d4d9c7673a DATAMONGO-1578 - Add missing @Test annotation to ProjectionOperationUnitTests.
Original pull request: #398.
2017-01-02 11:45:04 +01:00
Mark Paluch
20e2cfb273 DATAMONGO-1508 - Improve reference documentation.
Replace Spring Data Document with Spring Data MongoDB. Extend copyright year range. Replace static Spring version leftover with variable. Fix typos.
2017-01-02 11:19:59 +01:00
Lukasz Kryger
634f618eb1 DATAMONGO-1577 - Fix wording repetition in MongoRepository JavaDoc.
Original pull request: #407.
2017-01-02 11:19:59 +01:00
Ken Dombeck
c6e2662151 DATAMONGO-1577 - Fix Reference and JavaDoc spelling issues.
Replaced invalid class name MongoMappingConverter with actual class name of MappingMongoConverter. Fix typos.

Original pull request: #425.
2017-01-02 11:19:56 +01:00
Mark Paluch
51ae618585 DATAMONGO-1508 - Polishing.
Highlight attribute name. Replace tabs with spaces.

Original pull request: #399.
2017-01-02 11:14:25 +01:00
John Lilley
b584f04a41 DATAMONGO-1508 - Document authentication-dbname attribute in db-factory.
Original pull request: #399.
2017-01-02 11:14:25 +01:00
Oliver Gierke
6e1e9967af DATAMONGO-1522 - Updated changelog. 2016-12-21 19:35:37 +01:00
Oliver Gierke
af49230093 DATAMONGO-1469 - Updated changelog. 2016-12-21 18:43:02 +01:00
Oliver Gierke
3355a436c8 DATAMONGO-1467 - Polishing.
Original pull request: #431.
2016-12-20 14:26:40 +01:00
Christoph Strobl
9a0241992e DATAMONGO-1467 - Add support for MongoDB 3.2 partialFilterExpression for index creation.
We now support partial filter expressions on indexes via Index.partial(…). This allows creating partial indexes that only index the documents in a collection that meet a specified filter expression.

new Index().named("idx").on("k3y", ASC).partial(filter(where("age").gte(10)))

The filter expression can be set via a plain DBObject or a CriteriaDefinition and is mapped against the associated domain type.

Original pull request: #431.
2016-12-20 14:19:28 +01:00
Oliver Gierke
779145645d DATAMONGO-1565 - Polishing.
Formatting in ExpressionEvaluatingParameterBinder and StringBasedMongoQueryUnitTests. Turned Placeholder into value object.
2016-12-20 11:37:28 +01:00
Mark Paluch
c0f3255e26 DATAMONGO-1565 - Polishing.
Consider quoted/unquoted parameter use with the same parameter reference. Extend date range in license headers.
2016-12-20 11:37:05 +01:00
Christoph Strobl
b1de52f05a DATAMONGO-1565 - Ignore placeholder pattern in replacement values for annotated queries.
We now make sure to quote single and double ticks in the replacement values before actually appending them to the query. We also replace single ticks around parameters in the actual raw annotated query by double quotes to make sure they are treated as a single string parameter.
2016-12-20 11:30:02 +01:00
Mark Paluch
646b525d86 DATAMONGO-1564 - Polishing.
Fix JavaDoc references and a minor import formatting issue.

Original pull request: #429.
2016-12-16 14:10:15 +01:00
Christoph Strobl
2c29f204c3 DATAMONGO-1564 - Split up AggregationExpressions.
Refactored to multiple smaller Aggregation Operator classes reflecting the grouping (array operators, string operators,…) predefined by MongoDB.

Original pull request: #429.
2016-12-16 14:00:22 +01:00
Mark Paluch
9cd44faeb7 DATAMONGO-1567 - Use newer Java 8 on Travis CI. 2016-12-16 10:31:22 +01:00
Mark Paluch
362de45664 DATAMONGO-1533 - Polishing.
Extend JavaDoc. Minor reformatting.

Original pull request: #428.
2016-12-16 09:24:42 +01:00
Christoph Strobl
b7317892a2 DATAMONGO-1533 - Add AggregationExpression derived from SpEL AST.
We added an AggregationExpression that renders a MongoDB Aggregation Framework expression from the AST of a SpEL expression. This allows usage with various stages (eg. $project, $group) throughout the aggregation support.

  // { $and: [ { $gt: [ "$qty", 100 ] }, { $lt: [ "$qty", 250 ] } ] }
  expressionOf("qty > 100 && qty < 250);

  // { $cond : { if : { $gte : [ "$a", 42 ]}, then : "answer", else : "no-answer" } }
  expressionOf("cond(a >= 42, 'answer', 'no-answer')");

Original pull request: #428.
2016-12-16 09:21:27 +01:00
Mark Paluch
35f43e9ab8 DATAMONGO-1566 - Adapt API in ReactiveMongoRepositoryFactoryBean.
Related tickets: DATACMNS-891.
2016-12-16 09:21:27 +01:00
Oliver Gierke
a77c5b6e1d DATAMONGO-1566 - Adapt API in MongoRepositoryFactoryBean.
Related tickets: DATACMNS-891.
2016-12-15 16:17:54 +01:00
Christoph Strobl
5cf8ec3e55 DATAMONGO-1552 - Polishing.
Updated doc, removed whitespaces, minor method wording changes.

Original Pull Request: #426
2016-12-14 13:09:06 +01:00
Mark Paluch
450549150d DATAMONGO-1552 - Add $facet aggregation stage.
Original Pull Request: #426
2016-12-14 12:00:30 +01:00
Mark Paluch
b7a0b1d523 DATAMONGO-1552 - Add $bucketAuto aggregation stage.
Original Pull Request: #426
2016-12-14 11:46:54 +01:00
Mark Paluch
5c5c616be9 DATAMONGO-1552 - Add $bucket aggregation stage.
Original Pull Request: #426
2016-12-14 11:32:50 +01:00
Mark Paluch
0bd98d67c8 DATAMONGO-442 - Polishing.
Reformat code according to Spring Data style. Add test for authenticated use. Add JavaDoc to newly introduced methods. Allow configuration of an authentication database. Update reference documentation.

Original pull request: #419.
2016-12-13 16:51:02 +01:00
Ricardo Espirito Santo
8302727b23 DATAMONGO-442 - Support username/password authentication with MongoLog4jAppender.
Added optional username and password for authentication support on Log4jAppender.

Original pull request: #419.
2016-12-13 16:48:02 +01:00
Christoph Strobl
86dbd95220 DATAMONGO-1551 - Polishing.
Add startWith overload allowing mixing of expressions, removed whitespace, updated doc.

Original Pull Request: #424
2016-12-13 15:15:00 +01:00
Mark Paluch
3d0750afc5 DATAMONGO-1551 - Add $graphLookup aggregation stage.
We now support the $graphLookup aggregation pipeline stage via Aggregation to perform recursive lookups, adding the lookup result as an array to documents entering $graphLookup.

TypedAggregation<Employee> agg = Aggregation.newAggregation(Employee.class,
		graphLookup("employee")
			.startWith("reportsTo")
			.connectFrom("reportsTo")
			.connectTo("name")
			.depthField("depth")
			.maxDepth(5)
			.as("reportingHierarchy"));

Original Pull Request: #424
2016-12-13 14:44:53 +01:00
Christoph Strobl
1bf8eb09ca DATAMONGO-1550 - Polishing $replaceRoot (aggregation stage).
Original Pull Request: #422.
2016-12-13 08:18:40 +01:00
Mark Paluch
ae4cfaa58c DATAMONGO-1550 - Add $replaceRoot aggregation stage.
We now support the $replaceRoot stage in aggregation pipelines. $replaceRoot can reference either a field, an aggregation expression or it can be used to compose a replacement document.

newAggregation(
	replaceRoot().withDocument()
		.andValue("value").as("field")
		.and(MULTIPLY.of(field("total"), field("discounted")))
);

newAggregation(
	replaceRoot("item")));

Original Pull Request: #422
2016-12-13 08:18:40 +01:00
Christoph Strobl
1dea009e32 DATAMONGO-1549 - Polishing $count (aggregation stage).
Original Pull Request: #422
2016-12-13 08:18:27 +01:00
Mark Paluch
4c07235107 DATAMONGO-1549 - Add $count aggregation stage.
We now support the $count stage in aggregation pipelines.

newAggregation(
	match(where("hotelCode").is("0360")),
	count().as("documents"));

Original Pull Request: #422
2016-12-13 08:18:10 +01:00
Christoph Strobl
5ebbf93cf9 DATAMONGO-1558 - Upgrade MongoDB server version and add build profile for MongoDB 3.4.
Added MongoDB 3.4 profile to pom.xml and upgraded to MongoDB 3.4 on travis-ci.
Deleted an assertion checking a property that was removed in MongoDB 3.4 (see: https://jira.mongodb.org/browse/SERVER-24928).
2016-12-13 08:18:10 +01:00
Mark Paluch
794756d055 DATAMONGO-1548 - Polishing.
Enhance JavaDoc. Minor formatting. Fix typos.

Original pull request: #423.
2016-12-12 12:58:36 +01:00
Christoph Strobl
4fcc09c6c1 DATAMONGO-1548 - Add support for MongoDB 3.4 aggregation operators.
We now support the following MongoDB 3.4 aggregation operators:

$indexOfBytes, $indexOfCP, $split, $strLenBytes, $strLenCP, $substrCP, $indexOfArray, $range, $reverseArray, $reduce, $zip, $in, $isoDayOfWeek, $isoWeek, $isoWeekYear, $switch and $type.

Original pull request: #423.
2016-12-12 12:58:32 +01:00
Mark Paluch
d297f5a253 DATAMONGO-1538 - Polishing.
Use InheritingExposedFieldsAggregationOperationContext instead of anonymous context class for condition mapping. Drop aggregation input collections before tests. Minor reformatting.

Original pull request: #417.
2016-12-07 10:01:20 +01:00
Christoph Strobl
696e53ff60 DATAMONGO-1538 - Add support for $let to aggregation.
We now support $let in aggregation $project stage.

ExpressionVariable total = newExpressionVariable("total").forExpression(ADD.of(field("price"), field("tax")));
ExpressionVariable discounted = newExpressionVariable("discounted").forExpression(Cond.when("applyDiscount").then(0.9D).otherwise(1.0D));

newAggregation(Sales.class,
	project()
		.and(define(total, discounted)
			.andApply(MULTIPLY.of(field("total"), field("discounted"))))
		.as("finalTotal"));

Original pull request: #417.
2016-12-07 10:01:17 +01:00
Christoph Strobl
f512d8cb16 DATAMONGO-1542 - Polishing.
Added some static entry points for better readability.

Original Pull Request: #421
2016-12-06 10:08:05 +01:00
Mark Paluch
b98bc0e2bf DATAMONGO-1542 - Refactor CondOperator and IfNullOperator to children of AggregationExpressions.
Renamed CondOperator to Cond and IfNullOperator to IfNull. Both conditional operations are now available from ConditionalOperators.when and ConditionalOperators.ifNull and accept AggregationExpressions for conditions and values.
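
For illustration, the new entry points might be used roughly like this (a sketch based on the description above; field names and values are made up, and the exact builder method names may differ slightly):

// $cond
ConditionalOperators.when(Criteria.where("qty").gte(250)).then(30).otherwise(20);

// $ifNull
ConditionalOperators.ifNull("description").then("Unspecified");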

Original Pull Request: #421
2016-12-06 10:06:58 +01:00
Christoph Strobl
f64d205522 DATAMONGO-1520 - Add overload for aggregation $match accepting CriteriaDefinition.
We now also accept CriteriaDefinition in addition to Criteria for Aggregation.match. The existing match(Criteria) method remains to preserve binary compatibility.
2016-12-06 09:05:39 +01:00
Mark Paluch
853b2b2d5c DATAMONGO-1540 - Polishing.
Reduce Map aggregation expression builder entrypoint. Fix JavaDoc.

Original pull request: #420.
2016-12-05 16:46:50 +01:00
Christoph Strobl
c3f9af01e6 DATAMONGO-1540 - Add support for $map (aggregation).
We now support $map operator in aggregation.

Original pull request: #420.
2016-12-05 16:46:41 +01:00
Oliver Gierke
655a1e0351 DATAMONGO-1547 - Register MongoRepositoryFactory in spring.factories.
This is required for the switch in support for multi-store detection.

Related ticket: DATACMNS-952.
2016-12-05 14:28:26 +01:00
Oliver Gierke
26395f1b78 DATAMONGO-1546 - Register GeoJsonConfiguration via spring.factories.
Related tickets: DATACMNS-952.
2016-12-05 14:24:18 +01:00
Oliver Gierke
ecd8dae876 DATAMONGO-1141 - Polishing.
Aligned assertion messages for consistency. Fixed imports in UpdateMapperUnitTests.

Original pull request: #405.
2016-12-02 18:33:08 +01:00
Mark Paluch
4b59736f82 DATAMONGO-1141 - Polishing.
Add property to field name mapping for Sort orders by moving Sort mapping to UpdateMapper. Fix typo. Add JavaDoc. Reformat code. Remove trailing whitespaces.

Original pull request: #405.
2016-12-02 18:32:50 +01:00
Pavel Vodrážka
31d4434562 DATAMONGO-1141 - Add support for $push $sort in Update.
Sorting update modifier added. Supports sorting arrays by document fields and element values.
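
A sketch of the new modifier in use (hypothetical field and property names; quiz1 and quiz2 stand for previously created quiz documents):

// sort the "quizzes" array by its elements' "score" field while pushing new values
new Update().push("quizzes")
	.sort(new Sort(Direction.DESC, "score"))
	.each(quiz1, quiz2);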

Original pull request: #405.
2016-12-02 18:26:09 +01:00
Oliver Gierke
f5a339bfe4 DATAMONGO-1454 - Polishing.
Formatting in test case.

Original pull request: #381.
2016-12-02 16:46:06 +01:00
Mark Paluch
5e60867750 DATAMONGO-1454 - Add support for exists projection in repository query methods.
We now support exists projections for repository query methods, both for derived and string queries.

public PersonRepository extends Repository<Person, String> {

  boolean existsByFirstname(String firstname);

  @ExistsQuery(value = "{ 'lastname' : ?0 }")
  boolean someExistQuery(String lastname);

  @Query(value = "{ 'lastname' : ?0 }", exists = true)
  boolean anotherExistQuery(String lastname);
}

Original pull request: #381.
2016-12-02 16:45:58 +01:00
Mark Paluch
8a5da0e737 DATAMONGO-1536 - Polishing.
Add JavaDoc. Change visibility of AbstractAggregationExpression.getMongoMethod() to protected.

Original pull request: #418.
2016-12-02 15:43:11 +01:00
Christoph Strobl
e01ebcf605 DATAMONGO-1536 - Add aggregation operators for array, arithmetic, date and set operations.
We now support the following aggregation framework operators:

- setEquals, setIntersection, setUnion, setDifference, setIsSubset, anyElementTrue, allElementsTrue
- stdDevPop, stdDevSamp
- abs, ceil, exp, floor, ln, log, log10, pow, sqrt, trunc
- arrayElementAt, concatArrays, isArray
- literal
- dayOfYear, dayOfMonth, dayOfWeek, year, month, week, hour, minute, second, millisecond, dateToString

Original pull request: #418.
2016-12-02 15:43:05 +01:00
Oliver Gierke
b01a34b2b1 DATAMONGO-1539 - Polishing.
Renamed @Count and @Delete to @CountQuery and @DeleteQuery. Minor polishing in test cases and test repository methods. JavaDoc, formatting.

Original pull request: #416.
2016-12-02 11:59:10 +01:00
Fırat KÜÇÜK
5650e35eb6 DATAMONGO-1539 - Introduce @CountQuery and @DeleteQuery.
Introducing dedicated annotations for manually defined count and delete queries to avoid misconfiguration and to generally simplify the declaration.
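
For illustration, a hypothetical repository using the new annotations:

interface PersonRepository extends Repository<Person, String> {

  @CountQuery("{ 'lastname' : ?0 }")
  long countByLastname(String lastname);

  @DeleteQuery("{ 'lastname' : ?0 }")
  long removeByLastname(String lastname);
}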

Original pull request: #416.
2016-12-02 11:59:10 +01:00
Christoph Strobl
e5a41ad7f9 DATAMONGO-1534 - Fix bulk operations missing to write type info.
We now correctly convert entities into their MongoDB representation including type information via _class property.

Original pull request: #415.
2016-11-28 09:13:57 +01:00
Christoph Strobl
dd7d25cdb3 DATAMONGO-1530 - Polishing.
Add missing transformations for ConstructorReference, OperatorNot, OpNE, OpEQ, OpGT, OpGE, OpLT, OpLE, OperatorPower, OpOr and OpAnd. This allows usage of the logical operators &&, || and ! as part of the expression, while ConstructorReference allows instantiating e.g. arrays via an expression like `new int[]{4,5,6}`. This can be useful e.g. for comparing arrays using $setEquals.

More complex aggregation operators like $filter can be created by defining the variable references as strings inside the expression, like filter(a, 'num', '$$num' > 10).
Commands like $let require usage of InlineMap to pass in required arguments, e.g. let({low:1, high:'$$low'}, gt('$$low', '$$high')).

Original Pull Request: #410
2016-11-25 17:14:17 +01:00
Sebastien Gerard
0ddd7e3afd DATAMONGO-1530 - Add support for missing MongoDB 3.2 aggregation pipeline operators.
Original Pull Request: #410
2016-11-25 15:46:49 +01:00
Mark Paluch
81c501dea3 DATAMONGO-784 - Polishing.
Add JavaDoc for compareValue.

Original pull request: #414.
2016-11-24 13:49:12 +01:00
Christoph Strobl
b1cd7cfa53 DATAMONGO-784 - Add support for comparison aggregation operators to group & project.
We now directly support comparison aggregation operators ($cmp, $eq, $gt, $gte, $lt, $lte and $ne) on both group and project stages.

Original pull request: #414.
2016-11-24 13:49:09 +01:00
Mark Paluch
2ffac0a74e DATAMONGO-1491 - Polishing.
Remove variable before returning value. Add generics for list creation.

Original pull request: #412.
2016-11-24 12:49:30 +01:00
Christoph Strobl
ded99a74a3 DATAMONGO-1491 - Add support for $filter (aggregation).
We now support $filter in the aggregation pipeline.

Aggregation.newAggregation(Sales.class,
	Aggregation.project()
		.and(filter("items").as("item").by(GTE.of(field("item.price"), 100)))
		.as("items"))

Original pull request: #412.
2016-11-24 12:45:56 +01:00
Christoph Strobl
f275aaad7f DATAMONGO-1327 - Polishing.
Just added overloads for stdDevSamp and stdDevPop taking an AggregationExpression and updated the doc.
Also replaced String-based MongoDB operation building by using operators directly.

Original Pull Request: #360
2016-11-24 07:57:17 +01:00
gustavodegeus
5d2faef072 DATAMONGO-1327 - Added support for $stdDevSamp and $stdDevPop to aggregation $group stage.
Original Pull Request: #360
CLA: 171720160409030719 (Gustavo de Geus)
2016-11-24 07:56:33 +01:00
Oliver Gierke
2c3cbb3613 DATAMONGO-1527 - After release cleanups. 2016-11-23 10:59:11 +01:00
Oliver Gierke
2b635fa151 DATAMONGO-1527 - Prepare next development iteration. 2016-11-23 10:59:08 +01:00
Oliver Gierke
909fda2076 DATAMONGO-1527 - Release version 2.0 M1 (Kay). 2016-11-23 10:35:48 +01:00
Oliver Gierke
3b5d231529 DATAMONGO-1527 - Prepare 2.0 M1 (Kay). 2016-11-23 10:35:08 +01:00
Oliver Gierke
e90fe70ae1 DATAMONGO-1527 - Updated changelog. 2016-11-23 10:35:01 +01:00
Oliver Gierke
27d379ba71 DATAMONGO-1527 - Removed explicit reference to Spring Framework 5 snapshots. 2016-11-19 15:30:23 +01:00
Mark Paluch
e987a853ac DATAMONGO-1509 - Polishing.
Adopt type hint assertion for existing _class field checks. Simplify test code to use Collections.singletonList instead of Arrays.asList. Replace BasicDBList with List in JavaDoc. Use type inference for DocumentTestUtils.getAsDBList to avoid casts in test code. Extend documentation.

Original pull request: #411.
2016-11-17 15:54:10 +01:00
Christoph Strobl
3c16b4db7f DATAMONGO-1509 - Write type hint as last element of a Document.
Always add the type hint as the last property of a Document.
This is necessary to ensure document equality within MongoDB in cases where the query contains full document comparisons. Unfortunately this also might break existing code since the order of properties within a Document changes with this commit.

Original pull request: #411.
2016-11-17 15:12:36 +01:00
Mark Paluch
070d784be3 DATAMONGO-1176 - Exclude null id fields in bulk insert.
We no longer map null _id fields into the Document, to enforce MongoDB ObjectId generation on batch inserts like insertAll.
2016-11-17 13:56:45 +01:00
Mark Paluch
af4f0e0913 DATAMONGO-1444 - Accept Collection and subtypes in ReactiveMongoOperations.insertAll(…). 2016-11-14 18:13:35 +01:00
Oliver Gierke
474af92075 DATAMONGO-1444 - Moved to new base class for reactive repository factories. 2016-11-14 18:13:34 +01:00
Oliver Gierke
99838c02fd DATAMONGO-1444 - Adapt to new changes in reactive repository configuration.
We now use the newly introduced ….useRepositoryConfiguration(…) in the module specific RepositoryConfigurationExtension implementations to distinguish between reactive and non-reactive repositories.

Removed the RepositoryType class as it was only used by the previous repository type detection.
2016-11-14 18:13:34 +01:00
Oliver Gierke
8060ebae6a DATAMONGO-1444 - Polishing.
Removed unused references to ConversionService from repository query implementations.
2016-11-14 18:13:34 +01:00
Mark Paluch
9e9495ee54 DATAMONGO-1444 - Make Project Reactor dependency no longer required for blocking repository usage.
Use ReactiveWrapperConverters for reactive wrapper type conversion to not require Project Reactor for blocking repository usage.
2016-11-14 18:13:34 +01:00
Christoph Strobl
098aae41b7 DATAMONGO-1444 - Update bulk insert operations.
Accept Mono<Collection<T>> instead of Publisher<T> to get rid of hidden buffer call inside of MongoTemplate.
2016-11-14 18:13:28 +01:00
Christoph Strobl
df859d0f3a DATAMONGO-1444 - Adopt changes in Spring Data Commons.
- Adopt RxJava to RxJava1 repository interface renaming.
- Remove ReactiveChunk, Slice and Page.
- Update documentation.
- Prevent sliced/paged query execution.
2016-11-14 18:13:25 +01:00
Christoph Strobl
e0f371f648 DATAMONGO-1444 - Polishing.
- Update Javadoc comments and reference documentation.
- Introduce Adapter for blocking IndexOperations.
- Remove transaction synchronization.
- Remove unused types.
- Remove dropDups options from indexes.
- Prevent usage of Querydsl along with reactive repository.
- Use ReactiveQueryMethod in ReactiveMongoQuery.
2016-11-14 18:13:25 +01:00
Mark Paluch
2145e212ca DATAMONGO-1444 - Add support for RxJava wrapper types and slice queries.
Reactive MongoDB repositories can now be composed from Project Reactor and RxJava types for method arguments and return types. Query methods and methods from the base/implementation classes can be invoked with conversion of input/output types.
2016-11-14 18:13:25 +01:00
Mark Paluch
c814073441 DATAMONGO-1176 - Test code cleanup.
Replace DbObject in documentation with Document. Fix typos. Change dbObject variable names to document.
2016-11-14 18:13:24 +01:00
Mark Paluch
59573b10e6 DATAMONGO-1176 - Cleanup.
Replace DbObject in documentation with Document. Fix typos. Change dbObject variable names to document. Improve error messages. Remove unused code.
2016-11-14 18:13:24 +01:00
Christoph Strobl
054274392e DATAMONGO-1176 - Cleanup.
- Update licenses headers.
- Rename variables and methods from dbo -> document.
- Remove deprecations.
- Remove unused code blocks.
- Upgrade to MongoDB Java Driver 3.3
2016-11-14 18:13:24 +01:00
Mark Paluch
2d3efdc0b4 DATAMONGO-1176 - Polishing.
- Remove dropDups assertion as the MongoDB 3 driver no longer provides dropDups, even when running against MongoDB v2.6.7.
- Remove mongo-next build profile as we're based on the Mongo 3 driver now.
- Map update object and merge set operations.
2016-11-14 18:13:23 +01:00
Christoph Strobl
2461575c52 DATAMONGO-1176 - Switch to Document API.
We use the Document API when interacting with the MongoDB Java Driver. This allows us to make use of new features, enables us to use the Codec API, and prepares the project for future enhancements concerning the driver's reactive API.
2016-11-14 18:13:23 +01:00
Mark Paluch
4371760272 DATAMONGO-1461 - Upgrade Hibernate/Validator/JPA dependencies to match Spring 5 baseline.
Upgrade:
* Hibernate Validator to 5.2.4.Final
* JPA API to 2.1.1
* Hibernate Core to 5.2.1.Final
2016-11-14 17:49:48 +01:00
Mark Paluch
da5289fc18 DATAMONGO-1448 - Prepare 2.0 development.
Upgraded to Spring Data Build parent 2.0 snapshots and Spring Data Commons 2.0 snapshots. Removed obsolete distribution key property. Removed obsolete template.mf.
2016-11-14 17:49:48 +01:00
Oliver Gierke
ea9b402547 DATAMONGO-1525 - Improved creation of empty collections, esp. EnumSet.
We now use more type information to create a better empty collection in the first place. The previous algorithm always used an empty HashSet plus a subsequent conversion using the raw collection type. Especially the latter caused problems for EnumSets as the conversion into one requires the presence of component type information.

We now use Spring's collection factory and more available type information to create a proper collection in the first place and only rely on a subsequent conversion for arrays.
2016-11-13 18:33:45 +01:00
Oliver Gierke
255d32513c DATAMONGO-1502 - Updated changelog. 2016-11-03 18:56:45 +01:00
Oliver Gierke
6a9823fd24 DATAMONGO-1521 - Added Aggregation.skip(…) overload to support longs.
Deprecated the one taking an int.
2016-11-03 15:00:20 +01:00
Christoph Strobl
2ae75a4ff9 DATAMONGO-1500 - Fix JSON serialization error in derived queries with field spec.
We now make sure not to eagerly attempt to convert given query parameters into a Mongo-specific format by calling toString() on the query object, but rather delegate this to a later step in the chain.

Original pull request: #404.
2016-11-03 09:36:42 +01:00
Christoph Strobl
9c20da3e8f DATAMONGO-1504 - Assert compatibility with MongoDB 3.4.
We now make sure to comply with the API requirements of mongo-java-driver 3.4 (currently beta1) by using empty DBObjects instead of null, ignoring inappropriate replication settings and cleaning up tests after execution.

Original pull request: #394.
2016-11-03 09:32:56 +01:00
Oliver Gierke
f782338581 DATAMONGO-1513 - Fixed identifier population for event-listener-generated, non-ObjectId identifiers on batch inserts.
The methods in MongoTemplate inserting a batch of documents previously only returned database-generated identifiers, more specifically ObjectId ones. This caused non-ObjectId identifiers potentially generated by other parties (i.e. an event listener reacting to a BeforeSaveEvent) not to be considered for source object identifier population.

This commit adds a workaround augmenting the list of database generated identifiers with the ones actually present in the documents to be inserted. A follow-up ticket DATAMONGO-1519 was created to track the removal of the workaround in favor of a proper fix unfortunately requiring a change in public API (so a 2.0 candidate only).

Related tickets: DATAMONGO-1519.
2016-11-02 09:52:47 +01:00
Oliver Gierke
cb90bfc6a6 DATAMONGO-1514 - Polishing.
Extended license years in copyright header.

Original pull request: #401.
2016-10-27 14:32:37 +02:00
Martin Macko
189d4dd1b7 DATAMONGO-1514 - SpringDataMongodbQuery needs to be public.
SpringDataMongodbQuery is exposed publicly in QuerydslRepositorySupport, so we have to make it public to make sure calls to the exposed methods from outside the package actually compile.

Original pull request: #401.
2016-10-27 14:31:46 +02:00
Christoph Strobl
10208001f8 DATAMONGO-1480 - Polishing.
Opened up Meta attributes to now allow usage of more than one cursor option via a dedicated enum.

new Query().noCursorTimeout();

and

interface PersonRepository extends CrudRepository<Person, String> {

    @Meta(flags = {CursorOptions.NO_TIMEOUT})
    Iterable<Person> findBy();
}

Original Pull Request: #390
2016-10-24 18:44:35 +02:00
Mark Paluch
98dca5a65e DATAMONGO-1480 - Add support for noCursorTimeout in Query.
We now allow setting noCursorTimeout for queries using `Query` and `@Meta`.

Query query = new Query().noCursorTimeout();

and

interface PersonRepository extends CrudRepository<Person, String> {

    @Meta(noCursorTimeout = true)
    Iterable<Person> findBy();

    @Meta(noCursorTimeout = true)
    Stream<Person> streamBy();
}

Original Pull Request: #390
2016-10-24 18:43:40 +02:00
Christoph Strobl
b6bc0ea316 DATAMONGO-1490 - Polishing.
Restore 1.8.xsd and create new 1.10 one reflecting the changes made.

Original Pull Request: #389
2016-10-24 13:26:51 +02:00
Mark Paluch
b67c551e19 DATAMONGO-1490 - Replace space indentation by tabs.
Original Pull Request: #389
2016-10-24 13:26:51 +02:00
Mark Paluch
e0bc1e0f20 DATAMONGO-1490 - Change the XML data type of boolean flags to String.
We now accept String data types for boolean flags in XML configuration. Boolean data types in the XSD don't allow the use of property placeholders even if the resolved value could be converted to a boolean. The fields affected by this change are:

* `<mongo:repositories create-query-indexes=… />`
* `<mongo:options ssl=…/>`
* `<mongo:client-options ssl=… />`

Original Pull Request: #389
2016-10-24 13:26:40 +02:00
Oliver Gierke
fcb436dd30 DATAMONGO-1495 - Updated changelog. 2016-09-29 14:42:09 +02:00
Oliver Gierke
0e57dd473c DATAMONGO-1499 - Updated changelog. 2016-09-29 11:42:09 +02:00
Oliver Gierke
8d36e42b1b DATAMONGO-1498 - Removed defaulting of MongoMappingContext for repositories and auditing.
Previously we created a default bean definition for MongoMappingContext if none was present in the application context. That lookup for an existing one unfortunately comes too early, especially with Spring Boot in place. This then caused the MappingContext not to be aware of the custom conversions and simple types registered by Boot.

We now removed the defaulting, relying on a MappingMongoConverter being present in the application context (which usually is the case when using AbstractMongoConfiguration or the XML <mongo:mapping-converter /> alternative). We use that bean to look up the MappingContext.
2016-09-28 16:22:20 +02:00
Oliver Gierke
b66bfae105 DATAMONGO-1497 - MappingMongoConverter now consistently uses DbObjectAccessor.
We now use DbObjectAccessor also for preliminary inspections of the source DBObject (e.g. whether a value is present at all). Previously we operated on the DBObject directly which caused issues with properties mapped to nested fields as the keys weren't exploded correctly and thus the check always failed.
2016-09-22 17:55:50 +02:00
Oliver Gierke
eb3d55e0bd DATAMONGO-1494 - Updated changelog. 2016-09-21 08:04:20 +02:00
Oliver Gierke
84dbfdfd5e DATAMONGO-1450 - Updated changelog. 2016-09-21 08:04:14 +02:00
Mark Paluch
1813b1aea0 DATAMONGO-1493 - Fix minor typo in reference documentation.
Related pull request: #391.
2016-09-19 17:03:29 +02:00
Jordan Jennings
69241737b7 DATAMONGO-1493 - Fix minor typo in docs.
Original pull request: #391.
2016-09-19 16:21:12 +02:00
Christoph Strobl
f053bed447 DATAMONGO-1492 - Make o.s.d.m.core.aggregation.AggregationExpression public.
By turning `AggregationExpression` public we allow adding custom expressions without workarounds. It is now possible to create eg. `ProjectionOperation` like:

ProjectionOperation agg = Aggregation.project()
      .and(new AggregationExpression() {

        @Override
        public DBObject toDbObject(AggregationOperationContext context) {

          DBObject filterExpression = new BasicDBObject();
          filterExpression.put("input", "$x");
          filterExpression.put("as", "y");
          filterExpression.put("cond", new BasicDBObject("$eq", Arrays.<Object> asList("$$y.z", 2)));

          return new BasicDBObject("$filter", filterExpression);
        }
      }).as("profile");

Original pull request: #392.
2016-09-19 16:02:44 +02:00
Christoph Strobl
1a1cd9ef14 DATAMONGO-1485 - Consider potential custom conversion for enums in Querydsl paths.
We now take potential registered converters for enums into account when serializing path expressions via SpringDataMongodbSerializer.

Original pull request: #388.
2016-09-19 07:00:21 +02:00
Oliver Gierke
7effc0e10f DATAMONGO-1468 - Polishing.
Slightly touched test case.

Original pull request: #387
2016-09-08 10:29:23 +02:00
Christoph Strobl
395bb1faa4 DATAMONGO-1486 - Fix ClassCastException when mapping non-String Map key for updates.
We now make sure to convert Map keys into Strings when mapping update values for Map properties.

Original pull request: #387.
2016-09-08 10:29:22 +02:00
Christoph Strobl
eb1392cc1a DATAMONGO-861 - Polishing.
Favor usage of List over BasicDBList.
Rename ProjectionOperation.transform to applyCondition.
Add missing author and since tags, remove trailing white spaces and fix reference documentation headline clash.

Original Pull Request: #385
2016-08-30 14:40:24 +02:00
Mark Paluch
ace01e4e6d DATAMONGO-861 - Add support for $cond and $ifNull operators in aggregation operations.
We now support the $cond and $ifNull operators for projection and grouping operations. ConditionalOperator and IfNullOperator are AggregationExpressions that can be applied to transform or generate values during aggregation.

TypedAggregation<InventoryItem> agg = newAggregation(InventoryItem.class,
  project().and("discount")
    .transform(ConditionalOperator.newBuilder().when(Criteria.where("qty").gte(250))
      .then(30)
      .otherwise(20))
    .and(ifNull("description", "Unspecified")).as("description")
);

corresponds to

{ "$project": { "discount": { "$cond": { "if": { "$gte": [ "$qty", 250 ] },
        "then": 30, "else": 20 } },
    "description": { "$ifNull": [ "$description", "Unspecified"] }
  }
}

Original Pull Request: #385
2016-08-30 14:39:43 +02:00
Mark Paluch
116dda63c2 DATAMONGO-1406 - Propagate PersistentEntity when mapping query criteria for nested keywords.
We now propagate the PersistentEntity when mapping nested keywords so that the criteria mapping chain for nested keywords and properties has access to the PersistentEntity and can use configured field names.

Previously the plain property names were used as field names and potential customizations via @Field were ignored.

Original Pull Request: #384
2016-08-25 13:09:04 +02:00
Mark Paluch
4649872394 DATAMONGO-1465 - Polishing.
Replace the boolean flag in convertAndJoinScriptArgs with a literal. Joined args are rendered to JavaScript and always require string quotation.

Original pull request: #383.
2016-08-23 14:59:43 +02:00
Christoph Strobl
ecc6f3fc4e DATAMONGO-1465 - Fix String quotation in DefaultScriptOperations.execute().
This change prevents Strings from being quoted prior to sending them as args of a script.

Original pull request: #383.
2016-08-23 14:57:10 +02:00
Mark Paluch
512f68a611 DATAMONGO-1476 - Polishing.
Extend year range in license header. Use the specified collection name in doRemove.

Original pull request: #382.
2016-08-23 10:06:53 +02:00
Niko Schmuck
fbcd4ba367 DATAMONGO-1476 - Consistently use specified collection name in MongoTemplate.stream().
We now use the specified argument name instead of DBCollection.getName() when invoking ReadDbObjectCallback.

Original pull request: #382.
2016-08-23 10:01:59 +02:00
Oliver Gierke
760d7d6a32 DATAMONGO-1471 - Converter only applies identifier values if actually available.
Setting the value for the identifier property is an explicit step in MappingMongoConverter and always executed if the type to be created has an identifier property. If the source document doesn't contain an _id field (e.g. because it has been excluded explicitly) that previously caused null to be set on the identifier. This caused an exception if the identifier property is a primitive type.

We now explicitly check whether the field backing the identifier property is actually present in the source document and only explicitly set the value if so.
2016-08-17 16:54:10 +02:00
Oliver Gierke
29a6688e8c DATAMONGO-1470 - AbstractMongoConfiguration now supports multiple base packages for @Document scanning.
Introduced AbstractMongoConfiguration.getMappingBasePackages() to return multiple base packages in addition to the previously existing ….getMappingBasePackage(). The former is now used by the code triggering the scanning, using what the latter returns by default.
2016-07-28 13:10:48 +02:00
Oliver Gierke
b6e7683202 DATAMONGO-1409 - After release cleanups. 2016-07-27 14:32:37 +02:00
Oliver Gierke
286a977575 DATAMONGO-1409 - Prepare next development iteration. 2016-07-27 14:32:35 +02:00
Oliver Gierke
d7f70e219b DATAMONGO-1409 - Release version 1.10 M1 (Ingalls). 2016-07-27 13:52:12 +02:00
Oliver Gierke
728cc390f6 DATAMONGO-1409 - Prepare 1.10 M1 (Ingalls). 2016-07-27 13:51:38 +02:00
Oliver Gierke
ddcc3914ff DATAMONGO-1409 - Updated changelog. 2016-07-27 13:51:32 +02:00
Oliver Gierke
9a385599af DATAMONGO-1394 - Polishing.
Some internal refactorings to avoid deeply nested if-clauses.

Original pull request: #373.
2016-07-27 10:54:21 +02:00
Christoph Strobl
c14c42fb0c DATAMONGO-1394 - Support identifier references on Querydsl expressions for DBRefs.
We now allow direct usage of path.eq(…) on id properties of DBRef-referenced objects. This allows writing the query as person.coworker.id.eq(coworker.getId()) instead of person.coworker.eq(coworker). This helps build the query using just the plain id without having to create a new object wrapping it.

Original pull request: #373.
2016-07-27 10:54:21 +02:00
Oliver Gierke
5a8e4f3dae DATAMONGO-1194 - Polishing.
Some missing JavaDoc and slight code polish.

Original pull request: #377.
2016-07-26 15:30:31 +02:00
Christoph Strobl
5d50155d81 DATAMONGO-1194 - Improve DBRef resolution for maps.
We bulk load maps of referenced objects as long as they are stored in the same collection. This reduces database roundtrips and network traffic.

Original pull request: #377.
2016-07-26 15:30:31 +02:00
Christoph Strobl
babab54ffd DATAMONGO-1194 - Improve DBRef resolution for collections.
We now bulk load collections of referenced objects as long as they are stored in the same collection. This reduces database roundtrips and network traffic.

Original pull request: #377.
2016-07-26 15:30:30 +02:00
Mark Paluch
1ba137b98a DATAMONGO-1464 - Polishing.
Added JavaDoc. Simplified if-check in MongoQueryExecution.isListOfGeoResult(…).

Original pull request: #379.
2016-07-26 14:51:55 +02:00
Mark Paluch
353b836a77 DATAMONGO-1464 - Optimize query execution for pagination queries.
We now execute paged queries in an optimized way. The data is obtained for each paged execution but the count query is deferred. We determine the total from the pageable and the results when we don't hit the page size bounds (i.e. results are less than a full page without offset, or results are greater than 0 and less than a full page with offset). In all other cases we issue an additional count query.

Original pull request: #379.
2016-07-26 14:51:29 +02:00
Oliver Gierke
325bcd11b9 DATAMONGO-1431 - Added MongoOperations.stream(…) with explicit collection. 2016-07-20 15:59:52 +02:00
Mark Paluch
9db2dde19b DATAMONGO-1463 - Upgrade to mongo-java-driver 2.14.3.
Upgrade to mongo-java-driver 2.14.3 and upgrade the mongo33 profile to use 3.3.0 (release).
2016-07-20 10:33:23 +02:00
Mark Paluch
318ba53e2f DATAMONGO-1462 - Integrate version badge from spring.io.
Add version badge from spring.io and replace fixed version numbers with a placeholder.
2016-07-20 08:43:37 +02:00
Mark Paluch
3db30bd4a6 DATAMONGO-1460 - Use placeholder property for JSR-303 API. 2016-07-15 12:48:37 +02:00
Oliver Gierke
dd1fbfeb66 DATAMONGO-1459 - Polishing.
Added missing @Overrides to MongoRepository interface and polished non-JavaDoc references.
2016-07-14 15:52:32 +02:00
Oliver Gierke
3fa17272bb DATAMONGO-1459 - Added support for any-match mode in Query-by-example.
MongoExampleMapper now $or-concatenates the predicates derived from the example in case the ExampleMatcher expresses any-match binding to be desired.
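
A minimal sketch of an any-matching example (the Person probe, its properties and the repository are hypothetical):

Person probe = new Person();
probe.setFirstname("Dave");
probe.setLastname("Matthews");

// predicates derived from the probe are $or-concatenated
List<Person> result = repository.findAll(Example.of(probe, ExampleMatcher.matchingAny()));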

Moved integration tests for Query-by-example to the appropriate package and polished the code a little.

Related ticket: DATACMNS-879.
2016-07-14 15:52:32 +02:00
Christoph Strobl
9361fc3c71 DATAMONGO-1455, DATAMONGO-1456 - Polishing.
Use static imports for org.junit.Assert and org.hamcrest.core. Fix spelling.

Original pull request: #375.
2016-07-12 15:25:49 +02:00
Christoph Strobl
ac55f5e77f DATAMONGO-1455, DATAMONGO-1456 - Add support for $caseSensitive and $diacriticSensitive to $text.
We added methods to set values for $caseSensitive and $diacriticSensitive when using TextCriteria. Both operators are optional and will not be used until explicitly set.

    // { "$text" : { "$search" : "coffee" , "$caseSensitive" : true } }
    TextCriteria.forDefaultLanguage().matching("coffee").caseSensitive(true);

    // { "$text" : { "$search" : "coffee" , "$diacriticSensitive" : true } }
    TextCriteria.forDefaultLanguage().matching("coffee").diacriticSensitive(true);

Original pull request: #375.
2016-07-12 15:25:49 +02:00
Christoph Strobl
116baf9a92 DATAMONGO-832 - Polishing
Moved newly introduced types into order. Added missing @since tag and additional test.
Updated reference documentation for update operators and added $slice operator to "what’s new" section.

Original Pull Request: #374
2016-07-11 10:02:22 +02:00
Mark Paluch
026dce2612 DATAMONGO-832 - Add support for $slice in Update.push.
We now support $slice in Update operations via the PushOperatorBuilder.

    new Update().push("key").slice(5).each(Arrays.asList("one", "two", "three"));

Original Pull Request: #374
2016-07-11 09:49:45 +02:00
Mark Paluch
eae32be568 DATAMONGO-1457 - Polishing.
Add missing Javadoc to size operator. Mention Array Aggregation Operators in reference docs. Fix typos in reference docs.

Original pull request: #372.
2016-07-08 10:34:08 +02:00
Christoph Strobl
9d51ea4c01 DATAMONGO-1457 - Add support for $slice in aggregation.
We now support $slice in aggregation projections via the ProjectionOperationBuilder.

    Aggregation.project().and("field").slice(10, 20)

Original pull request: #372.
2016-07-08 10:08:29 +02:00
Mark Paluch
f4a5482005 DATAMONGO-1418 - Polishing.
Added ticket references. Simplified code.

Original pull request: #361.
2016-06-24 15:32:58 +02:00
Nikolai Bogdanov
0db36aff8f DATAMONGO-1418 - Add support for $out operator in aggregations.
We now support the $out operator via Aggregation.out(…) to store aggregation results in a collection. When $out is used, AggregationResults contains an empty list.
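
For illustration, a pipeline writing its result to a collection (the field and collection names are made up):

    // appends { "$out" : "authorNames" } as the last pipeline stage
    Aggregation agg = Aggregation.newAggregation(
            Aggregation.project("name"),
            Aggregation.out("authorNames"));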

Original pull request: #361.
CLA: 172720160413124705 (Nikolai Bogdanov)
2016-06-24 15:28:45 +02:00
Christoph Strobl
ba8ece334a DATAMONGO-1453 - Fix GeoJson conversion when coordinates are Integers.
We now use Number instead of Double for reading "coordinates" from GeoJSON representations.

Original pull request: #369.
2016-06-24 14:04:58 +02:00
Oliver Gierke
a5394074c5 DATAMONGO-1410 - Updated changelog. 2016-06-15 14:31:49 +02:00
Kevin Dosey
fe5bb515b7 DATAMONGO-1449 - Switched to foreach loop in collection handling of MappingMongoConverter.
This should result in a minor to moderate performance improvement when iterating over Collections/Arrays during DBObject-to-object mapping.

Original pull request: #368.
2016-06-11 18:29:32 +02:00
Mark Paluch
c84bfbccf4 DATAMONGO-1437 - Polishing.
Renamed test. Added JavaDoc. Simplified throws declaration.

Original pull request: #367.
2016-06-02 08:55:09 +02:00
Christoph Strobl
d147f80a39 DATAMONGO-1437 - Preserve non-translatable Exception cause when lazily resolving DBRefs.
We now preserve the cause of Exceptions that cannot be translated into DataAccessExceptions when an error occurs while lazily loading DBRefs.

Original pull request: #367.
2016-06-02 08:52:53 +02:00
Christoph Strobl
7b8dadeb74 DATAMONGO-1271 - Polishing.
Removed non-Java 6 language features; reworked and added a few tests.

Original Pull Request: #322
2016-05-27 07:53:53 +02:00
Jordi Llach Fernandez
d1251c42ca DATAMONGO-1271 - Provide lifecycle events for DBRefs.
We now publish lifecycle events when loading DBRefs.

Original Pull Request: #322
CLA: 121620150519031801 (Jordi Llach Fernandez)
2016-05-27 07:53:45 +02:00
Oliver Gierke
4140dd573f DATAMONGO-1423 - Polishing.
Original pull request: #365.
2016-05-25 17:26:31 +02:00
Christoph Strobl
0e60630393 DATAMONGO-1423 - Map keys now get registered conversions applied for Updates.
We now pipe map keys through the potentially registered conversions when mapping Updates.

Original pull request: #365.
2016-05-25 17:26:30 +02:00
Oliver Gierke
9bc35512fd DATAMONGO-1416 - Polishing.
Just use instanceOf(…) from Hamcrest's Matchers class instead of the dedicated class.

Original pull request: #362.
2016-05-24 15:57:21 +02:00
Christoph Strobl
b626c2f82b DATAMONGO-1416 - Get rid of the warnings for Atomic… type conversions.
We now use explicit converters instead of a ConverterFactory. This reduces noise in the log when registering converters.

Original pull request: #362.
2016-05-24 15:57:21 +02:00
Mark Paluch
c3e894ee8d DATAMONGO-1424 - Polishing.
Remove EndingWith from NotLike. Remove superfluous whitespace. Split combined highlighted keywords into individual highlights.

Original pull request: #364.
2016-05-09 11:06:44 +02:00
Christoph Strobl
2f713bede5 DATAMONGO-1424 - Add support for NOT_LIKE.
We now support `notLike` and `isNotLike` in query derivation.
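
For illustration, derived repository query methods using the new keywords (the Person domain type and firstname property are made up):

    // both translate into a negated regular-expression criteria on firstname
    List<Person> findByFirstnameNotLike(String firstname);
    List<Person> findByFirstnameIsNotLike(String firstname);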

Original pull request: #364.
2016-05-09 11:05:44 +02:00
Mark Paluch
e03520d2fb DATAMONGO-1425 - Polishing.
Add NotContaining to documentation. Add integration test for Containing/NotContaining on collection properties.

Original pull request: #363.
2016-05-09 10:42:19 +02:00
Christoph Strobl
3829d58dc2 DATAMONGO-1425 - Fix query derivation for notContaining on String properties.
We now correctly build up the criteria for derived queries using the notContaining keyword on String properties.
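
For illustration (domain type and property are made up):

    // now yields a negated regex matching documents whose firstname contains the given fragment
    List<Person> findByFirstnameNotContaining(String fragment);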

Original pull request: #363.
2016-05-09 10:42:15 +02:00
Mark Paluch
7b87fa9509 DATAMONGO-1412 - Fix backticks and code element highlighting.
Fixed broken highlighting of backticks followed by chars/single quotes. Converted single-quote emphasis of id to backtick code fences. Added missing spaces between words and backticks.

Original Pull Request: #359
2016-05-02 13:20:29 +02:00
Mark Paluch
b2b9f3406a DATAMONGO-1412 - Document mapping rules for Java types to MongoDB representation.
Original Pull Request: #359
Related pull request: #353
Related ticket: DATAMONGO-1404
2016-05-02 13:20:22 +02:00
Mark Paluch
d610761019 DATAMONGO-1404 - Polishing.
Add author and since tags. Update license headers. Reformat code. Replace FQCN with import and simple class name. Remove final keyword in test methods. Add tests for numeric values. Update documentation.

Original pull request: #353.
2016-05-02 13:20:10 +02:00
Alexey Plotnik
8983bd26ce DATAMONGO-1404 - Add support for $min and $max update operators.
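
For illustration (field names and values are made up):

    // { "$min" : { "lowestScore" : 30 } , "$max" : { "highestScore" : 90 } }
    new Update().min("lowestScore", 30).max("highestScore", 90);
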
Original pull request: #353.
CLA: 169820160330091912 (Alexey Plotnik)
2016-05-02 13:19:31 +02:00
Christoph Strobl
5485f2fcd4 DATAMONGO-1391 - Polishing.
Removed whitespace, updated the Javadoc, and added an early return when non-3.2 $unwind options are used.

Original Pull Request: #355
2016-05-02 12:56:38 +02:00
Mark Paluch
f8681fec66 DATAMONGO-1391 - Support Mongo 3.2 syntax for $unwind in aggregation.
We now support both the simple {$unwind: path} syntax and the MongoDB 3.2 {$unwind: {…}} syntax.
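
For illustration, assuming the unwind(…) overloads introduced with this change (field and index names are made up):

    // simple syntax: { $unwind : "$items" }
    Aggregation.unwind("items");

    // MongoDB 3.2 syntax: { $unwind : { path : "$items", includeArrayIndex : "itemIndex", preserveNullAndEmptyArrays : true } }
    Aggregation.unwind("items", "itemIndex", true);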

Original Pull Request: #355
2016-05-02 12:52:39 +02:00
Mark Paluch
0dd904894d DATAMONGO-1399 - Polishing.
Update since version to 1.10. Remove trailing whitespaces.

Original pull request: #352.
2016-05-02 09:44:37 +02:00
Christoph Strobl
7d70a8677e DATAMONGO-1399 - Allow adding hole to GeoJSON Polygon.
We now allow creating a GeoJsonPolygon with an outer ring and multiple inner rings.
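
For illustration, assuming the withInnerRing(…) builder added here (coordinates are made up; each ring is closed, i.e. first and last point are equal):

    List<Point> outerRing = Arrays.asList(new Point(0, 0), new Point(100, 0), new Point(100, 100), new Point(0, 100), new Point(0, 0));
    List<Point> innerRing = Arrays.asList(new Point(10, 10), new Point(20, 10), new Point(20, 20), new Point(10, 20), new Point(10, 10));
    GeoJsonPolygon polygonWithHole = new GeoJsonPolygon(outerRing).withInnerRing(innerRing);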

Original pull request: #352.
2016-05-02 09:40:44 +02:00
Mark Paluch
13a52b5ac9 DATAMONGO-1403 - Add maxExecutionTimeMs alias for @Meta(maxExcecutionTime).
We added maxExecutionTimeMs as an alias for maxExcecutionTime, which has been deprecated because of its misspelling.
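
For illustration, on a repository query method (the entity, method, and timeout value are made up):

    // caps server-side execution of the backing query at 1500 ms
    @Meta(maxExecutionTimeMs = 1500)
    List<Person> findByLastname(String lastname);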

Original pull request: #356.
2016-04-26 12:43:37 +02:00
Mark Paluch
f5cfcda673 DATAMONGO-1411 - Enable build on TravisCI.
We now start the MongoDB server via apt-get instead of relying on the TravisCI-managed 2.4.2 installation. In doing so, we altered tests to check only the port, not the host part, of the URIs.

Additionally, we upgraded the build profiles, removed promoted snapshot versions, renamed mongo32-next to mongo32, and added a mongo33-next build profile.

Original pull request: #358
2016-04-26 10:40:29 +02:00
Raja Dilip Kolli
cf44a7105f DATAMONGO-1420 - Update version numbers in Github readme.
Bumped to the latest available versions.

Original pull request: #354
2016-04-15 08:30:32 +02:00
Oliver Gierke
0228255d2b DATAMONGO-1356 - AuditingEventListener now has an explicit order.
AuditingEventListener now has a fixed order of 100. This allows other listeners to be registered for execution before or after it.
2016-04-14 22:27:46 +02:00
Oliver Gierke
50e37355d4 DATAMONGO-1419 - Removed deprecated methods of AbstractMongoEventListener. 2016-04-14 22:27:42 +02:00
Oliver Gierke
a15dababfa DATAMONGO-1408 - Updated changelog. 2016-04-06 23:14:25 +02:00
Oliver Gierke
9942451017 DATAMONGO-1405 - After release cleanups. 2016-04-06 16:37:03 +02:00
Oliver Gierke
e144c29316 DATAMONGO-1405 - Prepare next development iteration. 2016-04-06 16:36:59 +02:00
692 changed files with 45290 additions and 16457 deletions

View File

@@ -1,9 +1,12 @@
<!--
Thank you for proposing a pull request. This template will guide you through the essential steps necessary for a pull request. Thank you for proposing a pull request. This template will guide you through the essential steps necessary for a pull request.
Make sure that: Make sure that:
-->
- [ ] You have read the [Spring Data contribution guidelines](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc). - [ ] You have read the [Spring Data contribution guidelines](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc).
- [ ] There is a ticket in the bug tracker for the project in our [JIRA](https://jira.spring.io/browse/DATAMONGO). - [ ] There is a ticket in the bug tracker for the project in our [JIRA](https://jira.spring.io/browse/DATAMONGO).
- [ ] You use the code formatters provided [here](https://github.com/spring-projects/spring-data-build/tree/master/etc/ide) and have them applied to your changes. Don't submit any formatting related changes. - [ ] You use the code formatters provided [here](https://github.com/spring-projects/spring-data-build/tree/master/etc/ide) and have them applied to your changes. Don't submit any formatting related changes.
- [ ] You submit test cases (unit or integration tests) that back your changes. - [ ] You submit test cases (unit or integration tests) that back your changes.
- [ ] You added yourself as author in the headers of the classes you touched. Amend the date range in the Apache license header if needed. For new types, add the license header (copy from another file and set the current year only). - [ ] You added yourself as author in the headers of the classes you touched. Amend the date range in the Apache license header if needed. For new types, add the license header (copy from another file and set the current year only).
- [ ] You provide your full name and an email address registered with your GitHub account. If you're a first-time submitter, make sure you have completed the [Contributors License Agreement form](https://support.springsource.com/spring_committer_signup).

View File

@@ -9,23 +9,19 @@ before_script:
env: env:
matrix: matrix:
- PROFILE=ci - PROFILE=ci
- PROFILE=mongo-next
- PROFILE=mongo3
- PROFILE=mongo3-next
- PROFILE=mongo31
- PROFILE=mongo32
- PROFILE=mongo33
- PROFILE=mongo34-next - PROFILE=mongo34-next
- PROFILE=mongo35-next
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694 # Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script # apt-get starts a MongoDB instance so it's not started using before_script
addons: addons:
apt: apt:
sources: sources:
- mongodb-3.2-precise - mongodb-3.4-precise
packages: packages:
- mongodb-org-server - mongodb-org-server
- mongodb-org-shell - mongodb-org-shell
- oracle-java8-installer
sudo: false sudo: false

View File

@@ -1,3 +1,6 @@
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/ga.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/snapshot.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
# Spring Data MongoDB # Spring Data MongoDB
The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services. The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
@@ -26,7 +29,7 @@ Add the Maven dependency:
<dependency> <dependency>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId> <artifactId>spring-data-mongodb</artifactId>
<version>1.5.0.RELEASE</version> <version>${version}.RELEASE</version>
</dependency> </dependency>
``` ```
@@ -36,7 +39,7 @@ If you'd rather like the latest snapshots of the upcoming major version, use our
<dependency> <dependency>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId> <artifactId>spring-data-mongodb</artifactId>
<version>1.6.0.BUILD-SNAPSHOT</version> <version>${version}.BUILD-SNAPSHOT</version>
</dependency> </dependency>
<repository> <repository>
@@ -140,8 +143,8 @@ public class MyService {
Here are some ways for you to get involved in the community: Here are some ways for you to get involved in the community:
* Get involved with the Spring community on Stackoverflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate. * Get involved with the Spring community on Stackoverflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Create [JIRA](https://jira.springframework.org/browse/DATADOC) tickets for bugs and new features and comment and vote on the ones that you are interested in. * Create [JIRA](https://jira.spring.io/browse/DATAMONGO) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing. * Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io. * Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.
Before we accept a non-trivial patch or pull request we will need you to sign the [contributor's agreement](https://support.springsource.com/spring_committer_signup). Signing the contributor's agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. Active contributors might be asked to join the core team, and given the ability to merge pull requests. Before we accept a non-trivial patch or pull request we will need you to [sign the Contributor License Agreement](https://cla.pivotal.io/sign/spring). Signing the contributor's agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. If you forget to do so, you'll be reminded when you submit a pull request. Active contributors might be asked to join the core team, and given the ability to merge pull requests.

View File

@@ -1,291 +0,0 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<profiles version="12">
<profile kind="CodeFormatterProfile" name="Spring Data" version="12">
<setting id="org.eclipse.jdt.core.formatter.comment.insert_new_line_before_root_tags" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.disabling_tag" value="@formatter:off"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_annotation" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_type_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_anonymous_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_case" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.new_lines_at_block_boundaries" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_annotation_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_closing_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_field" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_while" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.use_on_off_tags" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_annotation_type_member_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_else_in_if_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_prefix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_else_statement_on_same_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_ellipsis" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.insert_new_line_for_parameter" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_annotation_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_breaks_compare_to_cases" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_at_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_multiple_fields" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_expressions_in_array_initializer" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_conditional_expression" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_binary_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_question_in_wildcard" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_array_initializer" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_finally_in_try_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_local_variable" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_catch_in_try_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_while" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_after_package" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.continuation_indentation" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_postfix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_method_invocation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_superinterfaces" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_new_chunk" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_binary_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_package" value="0"/>
<setting id="org.eclipse.jdt.core.compiler.source" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_enum_constant_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_line_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_enum_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.join_wrapped_lines" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_explicit_constructor_call" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_invocation_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_member_type" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.align_type_members_on_columns" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_method_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_selector_in_method_invocation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_switch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_unary_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_case" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_parameter_description" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_switch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_block_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.lineSplit" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_if" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_brackets_in_array_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_explicitconstructorcall_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_constructor_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_first_class_body_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_method" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indentation.size" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.enabling_tag" value="@formatter:on"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superclass_in_type_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_assignment" value="0"/>
<setting id="org.eclipse.jdt.core.compiler.problem.assertIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.tabulation.char" value="tab"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_constructor_declaration_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_semicolon_in_try_resources" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_prefix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_statements_compare_to_body" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_method" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_outer_expressions_when_nested" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.format_guardian_clause_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_cast" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_labeled_statement" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_annotation_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_method_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_try" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_constant" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_at_in_annotation_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_switch" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_declaration_throws" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_return" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_question_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_question_in_wildcard" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_try" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.preserve_white_space_between_code_and_line_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_throw" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.problem.enumIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_ellipsis" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_inits" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_method_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.compact_else_if" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_before_or_operator_multicatch" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_increments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.format_line_comment_starting_on_first_column" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_root_tags" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_union_type_in_multicatch" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_explicitconstructorcall_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_switch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_superinterfaces" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.tabulation.size" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_opening_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_empty_lines" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_synchronized" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_paren_in_cast" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_declaration_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block_in_case" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.number_of_empty_lines_to_preserve" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_catch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_qualified_allocation_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_and_in_type_parameter" value="insert"/>
<setting id="org.eclipse.jdt.core.compiler.compliance" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.continuation_indentation_for_array_initializer" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_brackets_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_at_in_annotation_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_allocation_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_cast" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_unary_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_empty_array_initializer_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_imple_if_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_labeled_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superinterfaces_in_type_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_binary_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_type" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_while" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode" value="enabled"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_try" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.put_empty_statement_on_new_line" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_label" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_parameter" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_while_in_do_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_enum_constant" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_javadoc_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.comment.line_length" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_package" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_between_import_groups" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_constant_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_constructor_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.number_of_blank_lines_at_beginning_of_method_body" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_type_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_annotation_type_member_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_before_binary_operator" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_enum_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_between_type_declarations" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_synchronized" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_statements_compare_to_block" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superinterfaces_in_enum_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.join_lines_in_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_question_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_multiple_field_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_compact_if" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_inits" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_cases" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_default" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_and_in_type_parameter" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_imports" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_assert" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_html" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_postfix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_source_code" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_synchronized" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_allocation_expression" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_constructor_declaration_throws" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.compiler.codegen.targetPlatform" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_resources_in_try" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.use_tabs_only_for_leading_indentations" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_header" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_block_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_enum_constants" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_annotation_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_catch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_multiple_local_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_switch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_increments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_assert" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_braces_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_catch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_multiple_field_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_parameterized_type_reference" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_invocation_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.new_lines_at_javadoc_boundaries" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_after_imports" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_multiple_local_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_enum_constant_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_semicolon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.never_indent_line_comments_on_first_column" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon_in_try_resources" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.never_indent_block_comments_on_first_column" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.keep_then_statement_on_same_line" value="false"/>
</profile>
</profiles>

View File

@@ -1,2 +0,0 @@
lombok.nonNull.exceptionType = IllegalArgumentException
lombok.log.fieldName = LOG

133
pom.xml
View File

@@ -1,21 +1,21 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?> <?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd"> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version> <version>2.0.0.M2</version>
<packaging>pom</packaging> <packaging>pom</packaging>
<name>Spring Data MongoDB</name> <name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description> <description>MongoDB support for Spring Data</description>
<url>https://projects.spring.io/spring-data-mongodb</url> <url>http://projects.spring.io/spring-data-mongodb</url>
<parent> <parent>
<groupId>org.springframework.data.build</groupId> <groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId> <artifactId>spring-data-parent</artifactId>
<version>1.8.12.BUILD-SNAPSHOT</version> <version>2.0.0.M2</version>
</parent> </parent>
<modules> <modules>
@@ -28,9 +28,9 @@
<properties> <properties>
<project.type>multi</project.type> <project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id> <dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.12.12.BUILD-SNAPSHOT</springdata.commons> <springdata.commons>2.0.0.M2</springdata.commons>
<mongo>2.14.3</mongo> <mongo>3.4.2</mongo>
<mongo.osgi>2.13.0</mongo.osgi> <mongo.reactivestreams>1.3.0</mongo.reactivestreams>
</properties> </properties>
<developers> <developers>
@@ -39,7 +39,7 @@
<name>Oliver Gierke</name> <name>Oliver Gierke</name>
<email>ogierke at gopivotal.com</email> <email>ogierke at gopivotal.com</email>
<organization>Pivotal</organization> <organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl> <organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles> <roles>
<role>Project Lead</role> <role>Project Lead</role>
</roles> </roles>
@@ -50,7 +50,7 @@
<name>Thomas Risberg</name> <name>Thomas Risberg</name>
<email>trisberg at vmware.com</email> <email>trisberg at vmware.com</email>
<organization>Pivotal</organization> <organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl> <organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles> <roles>
<role>Developer</role> <role>Developer</role>
</roles> </roles>
@@ -61,7 +61,7 @@
<name>Mark Pollack</name> <name>Mark Pollack</name>
<email>mpollack at gopivotal.com</email> <email>mpollack at gopivotal.com</email>
<organization>Pivotal</organization> <organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl> <organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles> <roles>
<role>Developer</role> <role>Developer</role>
</roles> </roles>
@@ -72,7 +72,7 @@
<name>Jon Brisbin</name> <name>Jon Brisbin</name>
<email>jbrisbin at gopivotal.com</email> <email>jbrisbin at gopivotal.com</email>
<organization>Pivotal</organization> <organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl> <organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles> <roles>
<role>Developer</role> <role>Developer</role>
</roles> </roles>
@@ -83,7 +83,7 @@
<name>Thomas Darimont</name> <name>Thomas Darimont</name>
<email>tdarimont at gopivotal.com</email> <email>tdarimont at gopivotal.com</email>
<organization>Pivotal</organization> <organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl> <organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles> <roles>
<role>Developer</role> <role>Developer</role>
</roles> </roles>
@@ -94,7 +94,18 @@
<name>Christoph Strobl</name> <name>Christoph Strobl</name>
<email>cstrobl at gopivotal.com</email> <email>cstrobl at gopivotal.com</email>
<organization>Pivotal</organization> <organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl> <organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>mpaluch</id>
<name>Mark Paluch</name>
<email>mpaluch at pivotal.io</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.pivotal.io</organizationUrl>
<roles> <roles>
<role>Developer</role> <role>Developer</role>
</roles> </roles>
@@ -103,86 +114,28 @@
</developers> </developers>
<profiles> <profiles>
<profile>
<id>mongo-next</id>
<properties>
<mongo>2.15.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo3</id>
<properties>
<mongo>3.0.4</mongo>
</properties>
</profile>
<profile>
<id>mongo3-next</id>
<properties>
<mongo>3.0.5-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo31</id>
<properties>
<mongo>3.1.1</mongo>
</properties>
</profile>
<profile>
<id>mongo32</id>
<properties>
<mongo>3.2.2</mongo>
</properties>
</profile>
<profile>
<id>mongo33</id>
<properties>
<mongo>3.3.0</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile> <profile>
<id>mongo34-next</id> <id>mongo34-next</id>
<properties> <properties>
<mongo>3.4.0-SNAPSHOT</mongo> <mongo>3.4.3-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo35-next</id>
<properties>
<mongo>3.5.0-SNAPSHOT</mongo>
</properties> </properties>
<repositories> <repositories>
@@ -220,8 +173,8 @@
<repositories> <repositories>
<repository> <repository>
<id>spring-libs-snapshot</id> <id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-snapshot</url> <url>https://repo.spring.io/libs-milestone</url>
</repository> </repository>
</repositories> </repositories>

View File

@@ -1,12 +1,12 @@
<?xml version="1.0" encoding="UTF-8"?> <?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd"> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
<parent> <parent>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version> <version>2.0.0.M2</version>
<relativePath>../pom.xml</relativePath> <relativePath>../pom.xml</relativePath>
</parent> </parent>
@@ -14,8 +14,8 @@
<name>Spring Data MongoDB - Cross-Store Support</name> <name>Spring Data MongoDB - Cross-Store Support</name>
<properties> <properties>
<jpa>2.0.0</jpa> <jpa>2.1.1</jpa>
<hibernate>3.6.10.Final</hibernate> <hibernate>5.2.1.Final</hibernate>
</properties> </properties>
<dependencies> <dependencies>
@@ -48,7 +48,15 @@
<dependency> <dependency>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId> <artifactId>spring-data-mongodb</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version> <version>2.0.0.M2</version>
</dependency>
<!-- reactive -->
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
<version>${reactor}</version>
<optional>true</optional>
</dependency> </dependency>
<dependency> <dependency>
@@ -81,13 +89,13 @@
<dependency> <dependency>
<groupId>javax.validation</groupId> <groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId> <artifactId>validation-api</artifactId>
<version>1.0.0.GA</version> <version>${validation}</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
<dependency> <dependency>
<groupId>org.hibernate</groupId> <groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId> <artifactId>hibernate-validator</artifactId>
<version>4.0.2.GA</version> <version>5.2.4.Final</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory; import javax.persistence.EntityManagerFactory;
import org.bson.Document;
import org.slf4j.Logger; import org.slf4j.Logger;
import org.slf4j.LoggerFactory; import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException; import org.springframework.dao.DataAccessException;
@@ -29,10 +30,10 @@ import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate; import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils; import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException; import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
import com.mongodb.client.result.DeleteResult;
/** /**
* @author Thomas Risberg * @author Thomas Risberg
@@ -74,15 +75,15 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
String collName = getCollectionNameForEntity(entityClass); String collName = getCollectionNameForEntity(entityClass);
final DBObject dbk = new BasicDBObject(); final Document dbk = new Document();
dbk.put(ENTITY_ID, id); dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName()); dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for {}", dbk); log.debug("Loading MongoDB data for {}", dbk);
} }
mongoTemplate.execute(collName, new CollectionCallback<Object>() { mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException { public Object doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) { for (Document dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME); String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Processing key: {}", key); log.debug("Processing key: {}", key);
@@ -143,27 +144,31 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
for (String key : cs.getValues().keySet()) { for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) { if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key); Object value = cs.getValues().get(key);
final DBObject dbQuery = new BasicDBObject(); final Document dbQuery = new Document();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs)); dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName()); dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key); dbQuery.put(ENTITY_FIELD_NAME, key);
DBObject dbId = mongoTemplate.execute(collName, new CollectionCallback<DBObject>() { final Document dbId = mongoTemplate.execute(collName, new CollectionCallback<Document>() {
public DBObject doInCollection(DBCollection collection) throws MongoException, DataAccessException { public Document doInCollection(MongoCollection<Document> collection)
return collection.findOne(dbQuery); throws MongoException, DataAccessException {
Document id = collection.find(dbQuery).first();
return id;
} }
}); });
if (value == null) { if (value == null) {
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Flush: removing: {}", dbQuery); log.debug("Flush: removing: {}", dbQuery);
} }
mongoTemplate.execute(collName, new CollectionCallback<Object>() { mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException { public Object doInCollection(MongoCollection<Document> collection)
collection.remove(dbQuery); throws MongoException, DataAccessException {
DeleteResult dr = collection.deleteMany(dbQuery);
return null; return null;
} }
}); });
} else { } else {
final DBObject dbDoc = new BasicDBObject(); final Document dbDoc = new Document();
dbDoc.putAll(dbQuery); dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) { if (log.isDebugEnabled()) {
log.debug("Flush: saving: {}", dbQuery); log.debug("Flush: saving: {}", dbQuery);
@@ -174,8 +179,18 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
dbDoc.put("_id", dbId.get("_id")); dbDoc.put("_id", dbId.get("_id"));
} }
mongoTemplate.execute(collName, new CollectionCallback<Object>() { mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException { public Object doInCollection(MongoCollection<Document> collection)
collection.save(dbDoc); throws MongoException, DataAccessException {
if (dbId != null) {
collection.replaceOne(Filters.eq("_id", dbId.get("_id")), dbDoc);
} else {
if (dbDoc.containsKey("_id") && dbDoc.get("_id") == null) {
dbDoc.remove("_id");
}
collection.insertOne(dbDoc);
}
return null; return null;
} }
}); });
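Note (illustrative, not part of the commit): a minimal sketch of invoking MongoTemplate.execute(...) with the 2.0-style CollectionCallback shown in the hunk above, which now receives a MongoCollection<Document> instead of the legacy DBCollection. Class, method and field names below are invented for the example.

import org.bson.Document;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;

class ChangeSetCallbackSketch {

	// Loads a single change-set document by its entity id, mirroring the
	// find(dbQuery).first() pattern introduced above.
	static Document loadByEntityId(MongoTemplate template, String collectionName, final long entityId) {
		return template.execute(collectionName, new CollectionCallback<Document>() {
			public Document doInCollection(MongoCollection<Document> collection)
					throws MongoException, DataAccessException {
				return collection.find(Filters.eq("_entity_id", entityId)).first();
			}
		});
	}
}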

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManager; import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext; import javax.persistence.PersistenceContext;
import org.bson.Document;
import org.junit.After; import org.junit.After;
import org.junit.Assert; import org.junit.Assert;
import org.junit.Before; import org.junit.Before;
@@ -36,8 +37,6 @@ import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback; import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate; import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.DBObject;
/** /**
* Integration tests for MongoDB cross-store persistence (mainly {@link MongoChangeSetPersister}). * Integration tests for MongoDB cross-store persistence (mainly {@link MongoChangeSetPersister}).
* *
@@ -48,14 +47,11 @@ import com.mongodb.DBObject;
@ContextConfiguration("classpath:/META-INF/spring/applicationContext.xml") @ContextConfiguration("classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests { public class CrossStoreMongoTests {
@Autowired @Autowired MongoTemplate mongoTemplate;
MongoTemplate mongoTemplate;
@PersistenceContext @PersistenceContext EntityManager entityManager;
EntityManager entityManager;
@Autowired @Autowired PlatformTransactionManager transactionManager;
PlatformTransactionManager transactionManager;
TransactionTemplate txTemplate; TransactionTemplate txTemplate;
@Before @Before
@@ -187,7 +183,7 @@ public class CrossStoreMongoTests {
boolean weFound3 = false; boolean weFound3 = false;
for (DBObject dbo : this.mongoTemplate.getCollection(mongoTemplate.getCollectionName(Person.class)).find()) { for (Document dbo : this.mongoTemplate.getCollection(mongoTemplate.getCollectionName(Person.class)).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L)); Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) { if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true; weFound3 = true;
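Note (illustrative, not part of the commit): the same iteration pattern in isolation. With the 3.x driver API, MongoTemplate.getCollection(...) hands back a MongoCollection<Document>, so the loop yields org.bson.Document rather than DBObject. The helper below is a hypothetical sketch.

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoTemplate;

class CollectionIterationSketch {

	// Prints the _entity_id of every document in the given collection.
	static void printEntityIds(MongoTemplate template, String collectionName) {
		for (Document document : template.getCollection(collectionName).find()) {
			System.out.println(document.get("_entity_id"));
		}
	}
}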

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -20,13 +20,13 @@
<mongo:mapping-converter/> <mongo:mapping-converter/>
<!-- Mongo config --> <!-- Mongo config -->
<bean id="mongo" class="org.springframework.data.mongodb.core.MongoFactoryBean"> <bean id="mongoClient" class="org.springframework.data.mongodb.core.MongoClientFactoryBean">
<property name="host" value="localhost"/> <property name="host" value="localhost"/>
<property name="port" value="27017"/> <property name="port" value="27017"/>
</bean> </bean>
<bean id="mongoDbFactory" class="org.springframework.data.mongodb.core.SimpleMongoDbFactory"> <bean id="mongoDbFactory" class="org.springframework.data.mongodb.core.SimpleMongoDbFactory">
<constructor-arg name="mongo" ref="mongo"/> <constructor-arg name="mongoClient" ref="mongoClient"/>
<constructor-arg name="databaseName" value="database"/> <constructor-arg name="databaseName" value="database"/>
</bean> </bean>

View File

@@ -1,18 +0,0 @@
Bundle-SymbolicName: org.springframework.data.mongodb.crossstore
Bundle-Name: Spring Data MongoDB Cross Store Support
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Export-Template:
org.springframework.data.mongodb.crossstore.*;version="${project.version}"
Import-Template:
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
javax.persistence.*;version="${jpa:[=.=.=,+1.0.0)}",
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
org.bson.*;version="0",
org.slf4j.*;version="${slf4j:[=.=.=,+1.0.0)}",
org.springframework.*;version="${spring:[=.=.=.=,+1.0.0)}",
org.springframework.data.*;version="${springdata.commons:[=.=.=.=,+1.0.0)}",
org.springframework.data.mongodb.*;version="${project.version:[=.=.=.=,+1.0.0)}",
org.w3c.dom.*;version="0"

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?> <?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd"> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
@@ -13,7 +13,7 @@
<parent> <parent>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version> <version>2.0.0.M2</version>
<relativePath>../pom.xml</relativePath> <relativePath>../pom.xml</relativePath>
</parent> </parent>

View File

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?> <?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd"> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
<parent> <parent>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version> <version>2.0.0.M2</version>
<relativePath>../pom.xml</relativePath> <relativePath>../pom.xml</relativePath>
</parent> </parent>

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -18,6 +18,8 @@ package org.springframework.data.mongodb.log4j;
import java.net.UnknownHostException; import java.net.UnknownHostException;
import java.util.Arrays; import java.util.Arrays;
import java.util.Calendar; import java.util.Calendar;
import java.util.Collections;
import java.util.List;
import java.util.Map; import java.util.Map;
import org.apache.log4j.AppenderSkeleton; import org.apache.log4j.AppenderSkeleton;
@@ -31,14 +33,17 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DB; import com.mongodb.DB;
import com.mongodb.Mongo; import com.mongodb.Mongo;
import com.mongodb.MongoClient; import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import com.mongodb.WriteConcern; import com.mongodb.WriteConcern;
/** /**
* Log4j appender writing log entries into a MongoDB instance. * Log4j appender writing log entries into a MongoDB instance.
* *
* @author Jon Brisbin * @author Jon Brisbin
* @author Oliver Gierke * @author Oliver Gierke
* @auhtor Christoph Strobl * @author Christoph Strobl
* @author Ricardo Espirito Santo
*/ */
public class MongoLog4jAppender extends AppenderSkeleton { public class MongoLog4jAppender extends AppenderSkeleton {
@@ -56,6 +61,9 @@ public class MongoLog4jAppender extends AppenderSkeleton {
protected String host = "localhost"; protected String host = "localhost";
protected int port = 27017; protected int port = 27017;
protected String username;
protected String password;
protected String authenticationDatabase;
protected String database = "logs"; protected String database = "logs";
protected String collectionPattern = "%c"; protected String collectionPattern = "%c";
protected PatternLayout collectionLayout = new PatternLayout(collectionPattern); protected PatternLayout collectionLayout = new PatternLayout(collectionPattern);
@@ -65,8 +73,7 @@ public class MongoLog4jAppender extends AppenderSkeleton {
protected Mongo mongo; protected Mongo mongo;
protected DB db; protected DB db;
public MongoLog4jAppender() { public MongoLog4jAppender() {}
}
public MongoLog4jAppender(boolean isActive) { public MongoLog4jAppender(boolean isActive) {
super(isActive); super(isActive);
@@ -88,6 +95,53 @@ public class MongoLog4jAppender extends AppenderSkeleton {
this.port = port; this.port = port;
} }
/**
* @return
* @since 1.10
*/
public String getUsername() {
return username;
}
/**
* @param username may be {@literal null} for unauthenticated access.
* @since 1.10
*/
public void setUsername(String username) {
this.username = username;
}
/**
* @return
* @since 1.10
*/
public String getPassword() {
return password;
}
/**
* @param password may be {@literal null} for unauthenticated access.
* @since 1.10
*/
public void setPassword(String password) {
this.password = password;
}
/**
* @return
*/
public String getAuthenticationDatabase() {
return authenticationDatabase;
}
/**
* @param authenticationDatabase may be {@literal null} to use {@link #getDatabase()} as authentication database.
* @since 1.10
*/
public void setAuthenticationDatabase(String authenticationDatabase) {
this.authenticationDatabase = authenticationDatabase;
}
public String getDatabase() { public String getDatabase() {
return database; return database;
} }
@@ -113,14 +167,14 @@ public class MongoLog4jAppender extends AppenderSkeleton {
this.applicationId = applicationId; this.applicationId = applicationId;
} }
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public String getWarnOrHigherWriteConcern() { public String getWarnOrHigherWriteConcern() {
return warnOrHigherWriteConcern.toString(); return warnOrHigherWriteConcern.toString();
} }
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public String getInfoOrLowerWriteConcern() { public String getInfoOrLowerWriteConcern() {
return infoOrLowerWriteConcern.toString(); return infoOrLowerWriteConcern.toString();
} }
@@ -130,10 +184,26 @@ public class MongoLog4jAppender extends AppenderSkeleton {
} }
protected void connectToMongo() throws UnknownHostException { protected void connectToMongo() throws UnknownHostException {
this.mongo = new MongoClient(host, port);
this.mongo = createMongoClient();
this.db = mongo.getDB(database); this.db = mongo.getDB(database);
} }
private MongoClient createMongoClient() throws UnknownHostException {
ServerAddress serverAddress = new ServerAddress(host, port);
if (null == password || null == username) {
return new MongoClient(serverAddress);
}
String authenticationDatabaseToUse = authenticationDatabase == null ? this.database : authenticationDatabase;
MongoCredential mongoCredential = MongoCredential.createCredential(username,
authenticationDatabaseToUse, password.toCharArray());
List<MongoCredential> credentials = Collections.singletonList(mongoCredential);
return new MongoClient(serverAddress, credentials);
}
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#append(org.apache.log4j.spi.LoggingEvent) * @see org.apache.log4j.AppenderSkeleton#append(org.apache.log4j.spi.LoggingEvent)
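Note (illustrative, not part of the commit): a sketch of configuring the appender programmatically with the new authentication properties. It assumes the host/port/database setters that the property-based configuration further below relies on; all values are placeholders for a local test instance.

import org.springframework.data.mongodb.log4j.MongoLog4jAppender;

class MongoAppenderSetupSketch {

	static MongoLog4jAppender authenticatedAppender() {
		MongoLog4jAppender appender = new MongoLog4jAppender();
		appender.setHost("localhost");
		appender.setPort(27017);
		appender.setDatabase("logs");
		// New since 1.10: credentials and an optional authentication database.
		appender.setUsername("admin");
		appender.setPassword("test");
		appender.setAuthenticationDatabase("logs"); // falls back to the log database when not set
		return appender;
	}
}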

View File

@@ -0,0 +1,114 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.Calendar;
import java.util.Collections;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.apache.log4j.PropertyConfigurator;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
/**
* Integration tests for {@link MongoLog4jAppender} using authentication.
*
* @author Mark Paluch
*/
public class MongoLog4jAppenderAuthenticationIntegrationTests {
private final static String username = "admin";
private final static String password = "test";
private final static String authenticationDatabase = "logs";
MongoClient mongo;
DB db;
String collection;
ServerAddress serverLocation;
Logger log;
@Before
public void setUp() throws Exception {
serverLocation = new ServerAddress("localhost", 27017);
mongo = new MongoClient(serverLocation);
db = mongo.getDB("logs");
BasicDBList roles = new BasicDBList();
roles.add("dbOwner");
db.command(new BasicDBObjectBuilder().add("createUser", username).add("pwd", password).add("roles", roles).get());
mongo.close();
mongo = new MongoClient(serverLocation, Collections
.singletonList(MongoCredential.createCredential(username, authenticationDatabase, password.toCharArray())));
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
LogManager.resetConfiguration();
PropertyConfigurator.configure(getClass().getResource("/log4j-with-authentication.properties"));
log = Logger.getLogger(MongoLog4jAppenderIntegrationTests.class.getName());
}
@After
public void tearDown() {
if (db != null) {
db.getCollection(collection).remove(new BasicDBObject());
db.command(new BasicDBObject("dropUser", username));
}
LogManager.resetConfiguration();
PropertyConfigurator.configure(getClass().getResource("/log4j.properties"));
}
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
}

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -20,8 +20,10 @@ import static org.junit.Assert.*;
import java.util.Calendar; import java.util.Calendar;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger; import org.apache.log4j.Logger;
import org.apache.log4j.MDC; import org.apache.log4j.MDC;
import org.apache.log4j.PropertyConfigurator;
import org.junit.After; import org.junit.After;
import org.junit.Before; import org.junit.Before;
import org.junit.Test; import org.junit.Test;
@@ -30,31 +32,34 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DB; import com.mongodb.DB;
import com.mongodb.DBCursor; import com.mongodb.DBCursor;
import com.mongodb.MongoClient; import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
/** /**
* Integration tests for {@link MongoLog4jAppender}. * Integration tests for {@link MongoLog4jAppender}.
* *
* @author Jon Brisbin * @author Jon Brisbin
* @author Oliver Gierke * @author Oliver Gierke
* @author Christoph Strobl * @author Christoph Strobl
*/ */
public class MongoLog4jAppenderIntegrationTests { public class MongoLog4jAppenderIntegrationTests {
static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName();
private static final Logger log = Logger.getLogger(NAME);
MongoClient mongo; MongoClient mongo;
DB db; DB db;
String collection; String collection;
ServerAddress serverLocation;
Logger log;
@Before @Before
public void setUp() throws Exception { public void setUp() throws Exception {
serverLocation = new ServerAddress("localhost", 27017);
mongo = new MongoClient("localhost", 27017); mongo = new MongoClient(serverLocation);
db = mongo.getDB("logs"); db = mongo.getDB("logs");
Calendar now = Calendar.getInstance(); Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1); collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
log = Logger.getLogger(MongoLog4jAppenderIntegrationTests.class.getName());
} }
@After @After
@@ -76,7 +81,6 @@ public class MongoLog4jAppenderIntegrationTests {
@Test @Test
public void testProperties() { public void testProperties() {
MDC.put("property", "one"); MDC.put("property", "one");
log.debug("DEBUG message"); log.debug("DEBUG message");
} }

View File

@@ -1,11 +1,11 @@
/* /*
* Copyright 2013 the original author or authors. * Copyright 2013-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -24,10 +24,7 @@ import org.junit.Test;
*/ */
public class MongoLog4jAppenderUnitTests { public class MongoLog4jAppenderUnitTests {
/** @Test // DATAMONGO-641
* @see DATAMONGO-641
*/
@Test
public void closesWithoutMongoInstancePresent() { public void closesWithoutMongoInstancePresent() {
new MongoLog4jAppender().close(); new MongoLog4jAppender().close();
} }

View File

@@ -0,0 +1,16 @@
log4j.rootCategory=INFO, mongo
log4j.appender.mongo=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.mongo.layout=org.apache.log4j.PatternLayout
log4j.appender.mongo.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.mongo.host = localhost
log4j.appender.mongo.port = 27017
log4j.appender.mongo.database = logs
log4j.appender.mongo.username = admin
log4j.appender.mongo.password = test
log4j.appender.mongo.authenticationDatabase = logs
log4j.appender.mongo.collectionPattern = %X{year}%X{month}
log4j.appender.mongo.applicationId = my.application
log4j.appender.mongo.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.springframework.data.mongodb=DEBUG

View File

@@ -1,13 +1,13 @@
log4j.rootCategory=INFO, stdout log4j.rootCategory=INFO, mongo
log4j.appender.stdout=org.springframework.data.mongodb.log4j.MongoLog4jAppender log4j.appender.mongo=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout log4j.appender.mongo.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n log4j.appender.mongo.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.host = localhost log4j.appender.mongo.host = localhost
log4j.appender.stdout.port = 27017 log4j.appender.mongo.port = 27017
log4j.appender.stdout.database = logs log4j.appender.mongo.database = logs
log4j.appender.stdout.collectionPattern = %X{year}%X{month} log4j.appender.mongo.collectionPattern = %X{year}%X{month}
log4j.appender.stdout.applicationId = my.application log4j.appender.mongo.applicationId = my.application
log4j.appender.stdout.warnOrHigherWriteConcern = FSYNC_SAFE log4j.appender.mongo.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.springframework.data.mongodb=DEBUG log4j.category.org.springframework.data.mongodb=DEBUG

View File

@@ -1,9 +0,0 @@
Bundle-SymbolicName: org.springframework.data.mongodb.log4j
Bundle-Name: Spring Data Mongo DB Log4J Appender
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
org.apache.log4j.*;version="${log4j:[=.=.=,+1.0.0)}"

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?> <?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd"> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion> <modelVersion>4.0.0</modelVersion>
@@ -11,12 +11,11 @@
<parent> <parent>
<groupId>org.springframework.data</groupId> <groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId> <artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.12.BUILD-SNAPSHOT</version> <version>2.0.0.M2</version>
<relativePath>../pom.xml</relativePath> <relativePath>../pom.xml</relativePath>
</parent> </parent>
<properties> <properties>
<validation>1.0.0.GA</validation>
<objenesis>1.3</objenesis> <objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier> <equalsverifier>1.5</equalsverifier>
</properties> </properties>
@@ -79,6 +78,66 @@
<optional>true</optional> <optional>true</optional>
</dependency> </dependency>
<!-- reactive -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-reactivestreams</artifactId>
<version>${mongo.reactivestreams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-async</artifactId>
<version>${mongo}</version>
<optional>true</optional>
<exclusions>
<exclusion>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-core</artifactId>
</exclusion>
<exclusion>
<groupId>org.mongodb</groupId>
<artifactId>bson</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
<version>${reactor}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.projectreactor.addons</groupId>
<artifactId>reactor-test</artifactId>
<version>${reactor}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava-reactive-streams</artifactId>
<version>${rxjava-reactive-streams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava2</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava2}</version>
<optional>true</optional>
</dependency>
<!-- CDI --> <!-- CDI -->
<dependency> <dependency>
<groupId>javax.enterprise</groupId> <groupId>javax.enterprise</groupId>
@@ -127,7 +186,7 @@
<dependency> <dependency>
<groupId>org.hibernate</groupId> <groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId> <artifactId>hibernate-validator</artifactId>
<version>4.2.0.Final</version> <version>5.2.4.Final</version>
<scope>test</scope> <scope>test</scope>
</dependency> </dependency>
@@ -213,9 +272,11 @@
</includes> </includes>
<excludes> <excludes>
<exclude>**/PerformanceTests.java</exclude> <exclude>**/PerformanceTests.java</exclude>
<exclude>**/ReactivePerformanceTests.java</exclude>
</excludes> </excludes>
<systemPropertyVariables> <systemPropertyVariables>
<java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file> <java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
<reactor.trace.cancel>true</reactor.trace.cancel>
</systemPropertyVariables> </systemPropertyVariables>
<properties> <properties>
<property> <property>

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -20,6 +20,7 @@ import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator; import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.DB; import com.mongodb.DB;
import com.mongodb.client.MongoDatabase;
/** /**
* Interface for factories creating {@link DB} instances. * Interface for factories creating {@link DB} instances.
@@ -35,7 +36,7 @@ public interface MongoDbFactory {
* @return * @return
* @throws DataAccessException * @throws DataAccessException
*/ */
DB getDb() throws DataAccessException; MongoDatabase getDb() throws DataAccessException;
/** /**
* Creates a {@link DB} instance to access the database with the given name. * Creates a {@link DB} instance to access the database with the given name.
@@ -44,7 +45,7 @@ public interface MongoDbFactory {
* @return * @return
* @throws DataAccessException * @throws DataAccessException
*/ */
DB getDb(String dbName) throws DataAccessException; MongoDatabase getDb(String dbName) throws DataAccessException;
/** /**
* Exposes a shared {@link MongoExceptionTranslator}. * Exposes a shared {@link MongoExceptionTranslator}.
@@ -52,4 +53,6 @@ public interface MongoDbFactory {
* @return will never be {@literal null}. * @return will never be {@literal null}.
*/ */
PersistenceExceptionTranslator getExceptionTranslator(); PersistenceExceptionTranslator getExceptionTranslator();
DB getLegacyDb();
} }
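Note (illustrative, not part of the commit): a minimal sketch of the revised MongoDbFactory contract in use. getDb() now returns com.mongodb.client.MongoDatabase, while the old DB type remains reachable via getLegacyDb(). Connection details are placeholders.

import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;

class MongoDbFactorySketch {

	public static void main(String[] args) {
		// Assumes a MongoDB instance on localhost:27017.
		MongoDbFactory factory = new SimpleMongoDbFactory(new MongoClient("localhost", 27017), "database");

		MongoDatabase database = factory.getDb(); // 3.x driver API
		System.out.println(database.getName());
	}
}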

View File

@@ -0,0 +1,56 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.reactivestreams.client.MongoDatabase;
/**
* Interface for factories creating reactive {@link MongoDatabase} instances.
*
* @author Mark Paluch
* @since 2.0
*/
public interface ReactiveMongoDatabaseFactory {
/**
* Creates a default {@link MongoDatabase} instance.
*
* @return
* @throws DataAccessException
*/
MongoDatabase getMongoDatabase() throws DataAccessException;
/**
* Creates a {@link MongoDatabase} instance to access the database with the given name.
*
* @param dbName must not be {@literal null} or empty.
* @return
* @throws DataAccessException
*/
MongoDatabase getMongoDatabase(String dbName) throws DataAccessException;
/**
* Exposes a shared {@link MongoExceptionTranslator}.
*
* @return will never be {@literal null}.
*/
PersistenceExceptionTranslator getExceptionTranslator();
}
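Note (illustrative, not part of the commit): a sketch of obtaining a reactive MongoDatabase through the new factory abstraction, assuming the Reactive Streams driver is on the classpath and a MongoDB instance runs on the default localhost port.

import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory;

import com.mongodb.reactivestreams.client.MongoClients;
import com.mongodb.reactivestreams.client.MongoDatabase;

class ReactiveMongoDatabaseFactorySketch {

	public static void main(String[] args) {
		ReactiveMongoDatabaseFactory factory =
				new SimpleReactiveMongoDatabaseFactory(MongoClients.create(), "database");

		MongoDatabase database = factory.getMongoDatabase();
		System.out.println(database.getName());
	}
}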

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/* /*
* Copyright 2011-2015 the original author or authors. * Copyright 2011-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,85 +15,45 @@
*/ */
package org.springframework.data.mongodb.config; package org.springframework.data.mongodb.config;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.context.annotation.Configuration; import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory; import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate; import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory; import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver; import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver; import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter; import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document; import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient; import com.mongodb.MongoClient;
/** /**
* Base class for Spring Data MongoDB configuration using JavaConfig. * Base class for Spring Data MongoDB configuration using JavaConfig.
* *
* @author Mark Pollack * @author Mark Pollack
* @author Oliver Gierke * @author Oliver Gierke
* @author Thomas Darimont * @author Thomas Darimont
* @author Ryan Tenney * @author Ryan Tenney
* @author Christoph Strobl * @author Christoph Strobl
* @author Mark Paluch
* @see MongoConfigurationSupport
*/ */
@Configuration @Configuration
public abstract class AbstractMongoConfiguration { public abstract class AbstractMongoConfiguration extends MongoConfigurationSupport {
/** /**
* Return the name of the database to connect to. * Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* * {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
* @return must not be {@literal null}. *
*/
protected abstract String getDatabaseName();
/**
* Return the name of the authentication database to use. Defaults to {@literal null} and will turn into the value
* returned by {@link #getDatabaseName()} later on effectively.
*
* @return * @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/ */
@Deprecated public abstract MongoClient mongoClient();
protected String getAuthenticationDatabaseName() {
return null;
}
/**
* Return the {@link Mongo} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link Mongo} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
* @throws Exception
*/
public abstract Mongo mongo() throws Exception;
/** /**
* Creates a {@link MongoTemplate}. * Creates a {@link MongoTemplate}.
* *
* @return * @return
* @throws Exception
*/ */
@Bean @Bean
public MongoTemplate mongoTemplate() throws Exception { public MongoTemplate mongoTemplate() throws Exception {
@@ -101,92 +61,39 @@ public abstract class AbstractMongoConfiguration {
} }
/** /**
* Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link Mongo} instance * Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link MongoClient}
* configured in {@link #mongo()}. * instance configured in {@link #mongo()}.
* *
* @see #mongo() * @see #mongoClient()
* @see #mongoTemplate() * @see #mongoTemplate()
* @return * @return
* @throws Exception
*/ */
@Bean @Bean
public MongoDbFactory mongoDbFactory() throws Exception { public MongoDbFactory mongoDbFactory() {
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), getUserCredentials(), getAuthenticationDatabaseName()); return new SimpleMongoDbFactory(mongoClient(), getDatabaseName());
} }
/** /**
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration * Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending * class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration} the base package will be considered {@code com.acme} unless the method is * {@link AbstractMongoConfiguration} the base package will be considered {@code com.acme} unless the method is
* overriden to implement alternate behaviour. * overridden to implement alternate behavior.
* *
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for * @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities. * entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/ */
@Deprecated
protected String getMappingBasePackage() { protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage(); Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName(); return mappingBasePackage == null ? null : mappingBasePackage.getName();
} }
/**
* Return {@link UserCredentials} to be used when connecting to the MongoDB instance or {@literal null} if none shall
* be used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected UserCredentials getUserCredentials() {
return null;
}
/**
* Creates a {@link MongoMappingContext} equipped with entity classes scanned from the mapping base package.
*
* @see #getMappingBasePackage()
* @return
* @throws ClassNotFoundException
*/
@Bean
public MongoMappingContext mongoMappingContext() throws ClassNotFoundException {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
return mappingContext;
}
/**
* Returns a {@link MappingContextIsNewStrategyFactory} wrapped into a {@link CachingIsNewStrategyFactory}.
*
* @return
* @throws ClassNotFoundException
*/
@Bean
public IsNewStrategyFactory isNewStrategyFactory() throws ClassNotFoundException {
return new CachingIsNewStrategyFactory(new MappingContextIsNewStrategyFactory(mongoMappingContext()));
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and
* {@link #mongoMappingContext()}. Returns an empty {@link CustomConversions} instance by default.
*
* @return must not be {@literal null}.
*/
@Bean
public CustomConversions customConversions() {
return new CustomConversions(Collections.emptyList());
}
/** /**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and * Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied. * {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
* *
* @see #customConversions() * @see #customConversions()
* @see #mongoMappingContext() * @see #mongoMappingContext()
* @see #mongoDbFactory() * @see #mongoDbFactory()
@@ -202,53 +109,4 @@ public abstract class AbstractMongoConfiguration {
return converter; return converter;
} }
/**
* Scans the mapping base package for classes annotated with {@link Document}.
*
* @see #getMappingBasePackage()
* @return
* @throws ClassNotFoundException
*/
protected Set<Class<?>> getInitialEntitySet() throws ClassNotFoundException {
String basePackage = getMappingBasePackage();
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
if (StringUtils.hasText(basePackage)) {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
initialEntitySet.add(ClassUtils.forName(candidate.getBeanClassName(),
AbstractMongoConfiguration.class.getClassLoader()));
}
}
return initialEntitySet;
}
/**
* Configures whether to abbreviate field names for domain objects by configuring a
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
* customization needs, consider overriding {@link #mappingMongoConverter()}.
*
* @return
*/
protected boolean abbreviateFieldNames() {
return false;
}
/**
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
*
* @return
* @since 1.5
*/
protected FieldNamingStrategy fieldNamingStrategy() {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
} }
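Note (illustrative, not part of the commit): a minimal configuration class against the reworked base class. mongoClient() replaces the former mongo() method, and getDatabaseName() is now inherited from MongoConfigurationSupport. Names and connection details are placeholders.

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

import com.mongodb.MongoClient;

@Configuration
class ApplicationMongoConfiguration extends AbstractMongoConfiguration {

	@Override
	public MongoClient mongoClient() {
		return new MongoClient("localhost", 27017);
	}

	@Override
	protected String getDatabaseName() {
		return "database";
	}
}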

View File

@@ -0,0 +1,92 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import com.mongodb.reactivestreams.client.MongoClient;
/**
* Base class for reactive Spring Data MongoDB configuration using JavaConfig.
*
* @author Mark Paluch
* @since 2.0
* @see MongoConfigurationSupport
*/
@Configuration
public abstract class AbstractReactiveMongoConfiguration extends MongoConfigurationSupport {
/**
* Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
*/
public abstract MongoClient mongoClient();
/**
* Creates a {@link ReactiveMongoTemplate}.
*
* @return
*/
@Bean
public ReactiveMongoOperations reactiveMongoTemplate() throws Exception {
return new ReactiveMongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
/**
* Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link Mongo} instance
* configured in {@link #mongoClient()}.
*
* @see #mongoClient()
* @see #reactiveMongoTemplate()
* @return
* @throws Exception
*/
@Bean
public ReactiveMongoDatabaseFactory mongoDbFactory() {
return new SimpleReactiveMongoDatabaseFactory(mongoClient(), getDatabaseName());
}
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext()
* @see #mongoDbFactory()
* @return
* @throws Exception
*/
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
MappingMongoConverter converter = new MappingMongoConverter(ReactiveMongoTemplate.NO_OP_REF_RESOLVER,
mongoMappingContext());
converter.setCustomConversions(customConversions());
return converter;
}
}
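Note (illustrative, not part of the commit): the reactive counterpart, a minimal configuration extending AbstractReactiveMongoConfiguration with a Reactive Streams MongoClient. Names and connection details are placeholders.

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractReactiveMongoConfiguration;

import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;

@Configuration
class ApplicationReactiveMongoConfiguration extends AbstractReactiveMongoConfiguration {

	@Override
	public MongoClient mongoClient() {
		return MongoClients.create();
	}

	@Override
	protected String getDatabaseName() {
		return "database";
	}
}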

View File

@@ -1,11 +1,11 @@
/*
- * Copyright 2011-2014 the original author or authors.
+ * Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -21,13 +21,14 @@ package org.springframework.data.mongodb.config;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Martin Baumgartner
+ * @author Christoph Strobl
*/
public abstract class BeanNames {
public static final String MAPPING_CONTEXT_BEAN_NAME = "mongoMappingContext";
static final String INDEX_HELPER_BEAN_NAME = "indexCreationHelper";
- static final String MONGO_BEAN_NAME = "mongo";
+ static final String MONGO_BEAN_NAME = "mongoClient";
static final String DB_FACTORY_BEAN_NAME = "mongoDbFactory";
static final String VALIDATING_EVENT_LISTENER_BEAN_NAME = "validatingMongoEventListener";
static final String IS_NEW_STRATEGY_FACTORY_BEAN_NAME = "isNewStrategyFactory";

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -55,6 +55,7 @@ import org.springframework.data.mapping.context.MappingContextIsNewStrategyFacto
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
+ import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
@@ -67,12 +68,13 @@ import org.w3c.dom.Element;
/**
* Bean definition parser for the {@code mapping-converter} element.
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Maciej Walkowiak
* @author Thomas Darimont
* @author Christoph Strobl
+ * @author Mark Paluch
*/
public class MappingMongoConverterParser implements BeanDefinitionParser {
@@ -120,16 +122,22 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
converterBuilder.addPropertyValue("customConversions", conversionsDefinition);
}
+ if(!registry.containsBeanDefinition("indexOperationsProvider")){
+ BeanDefinitionBuilder indexOperationsProviderBuilder = BeanDefinitionBuilder.genericBeanDefinition("org.springframework.data.mongodb.core.DefaultIndexOperationsProvider");
+ indexOperationsProviderBuilder.addConstructorArgReference(dbFactoryRef);
+ indexOperationsProviderBuilder.addConstructorArgValue(BeanDefinitionBuilder.genericBeanDefinition(QueryMapper.class).addConstructorArgReference(id).getBeanDefinition());
+ parserContext.registerBeanComponent(new BeanComponentDefinition(indexOperationsProviderBuilder.getBeanDefinition(), "indexOperationsProvider"));
+ }
try {
registry.getBeanDefinition(INDEX_HELPER_BEAN_NAME);
} catch (NoSuchBeanDefinitionException ignored) {
- if (!StringUtils.hasText(dbFactoryRef)) {
- dbFactoryRef = DB_FACTORY_BEAN_NAME;
- }
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
indexHelperBuilder.addConstructorArgReference(ctxRef);
- indexHelperBuilder.addConstructorArgReference(dbFactoryRef);
+ indexHelperBuilder.addConstructorArgReference("indexOperationsProvider");
indexHelperBuilder.addDependsOn(ctxRef);
parserContext.registerBeanComponent(new BeanComponentDefinition(indexHelperBuilder.getBeanDefinition(),
@@ -348,7 +356,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
/**
* {@link TypeFilter} that returns {@literal false} in case any of the given delegates matches.
*
* @author Oliver Gierke
*/
private static class NegatingFilter implements TypeFilter {
@@ -357,7 +365,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
/**
* Creates a new {@link NegatingFilter} with the given delegates.
*
* @param filters
*/
public NegatingFilter(TypeFilter... filters) {

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -0,0 +1,203 @@
/*
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.context.PersistentEntities;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB to be extended for JavaConfiguration usage.
*
* @author Mark Paluch
* @since 2.0
*/
public abstract class MongoConfigurationSupport {
/**
* Return the name of the database to connect to.
*
* @return must not be {@literal null}.
*/
protected abstract String getDatabaseName();
/**
* Returns the base packages to scan for MongoDB mapped entities at startup. Will return the package name of the
* configuration class' (the concrete class, not this one here) by default. So if you have a
* {@code com.acme.AppConfig} extending {@link MongoConfigurationSupport} the base package will be considered
* {@code com.acme} unless the method is overridden to implement alternate behavior.
*
* @return the base packages to scan for mapped {@link Document} classes or an empty collection to not enable scanning
* for entities.
* @since 1.10
*/
protected Collection<String> getMappingBasePackages() {
Package mappingBasePackage = getClass().getPackage();
return Collections.singleton(mappingBasePackage == null ? null : mappingBasePackage.getName());
}
/**
* Creates a {@link MongoMappingContext} equipped with entity classes scanned from the mapping base package.
*
* @see #getMappingBasePackages()
* @return
* @throws ClassNotFoundException
*/
@Bean
public MongoMappingContext mongoMappingContext() throws ClassNotFoundException {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
return mappingContext;
}
/**
* Returns a {@link MappingContextIsNewStrategyFactory} wrapped into a {@link CachingIsNewStrategyFactory}.
*
* @return
* @throws ClassNotFoundException
*/
@Bean
public IsNewStrategyFactory isNewStrategyFactory() throws ClassNotFoundException {
return new CachingIsNewStrategyFactory(new MappingContextIsNewStrategyFactory(
new PersistentEntities(Arrays.<MappingContext<?, ?>> asList(new MappingContext[] { mongoMappingContext() }))));
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and
* {@link #mongoMappingContext()}. Returns an empty {@link CustomConversions} instance by default.
*
* @return must not be {@literal null}.
*/
@Bean
public CustomConversions customConversions() {
return new CustomConversions(Collections.emptyList());
}
/**
* Scans the mapping base package for classes annotated with {@link Document}. By default, it scans for entities in
* all packages returned by {@link #getMappingBasePackages()}.
*
* @see #getMappingBasePackages()
* @return
* @throws ClassNotFoundException
*/
protected Set<Class<?>> getInitialEntitySet() throws ClassNotFoundException {
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
for (String basePackage : getMappingBasePackages()) {
initialEntitySet.addAll(scanForEntities(basePackage));
}
return initialEntitySet;
}
/**
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document} and
* {@link Persistent}.
*
* @param basePackage must not be {@literal null}.
* @return
* @throws ClassNotFoundException
* @since 1.10
*/
protected Set<Class<?>> scanForEntities(String basePackage) throws ClassNotFoundException {
if (!StringUtils.hasText(basePackage)) {
return Collections.emptySet();
}
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
if (StringUtils.hasText(basePackage)) {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
initialEntitySet
.add(ClassUtils.forName(candidate.getBeanClassName(), MongoConfigurationSupport.class.getClassLoader()));
}
}
return initialEntitySet;
}
/**
* Configures whether to abbreviate field names for domain objects by configuring a
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
* customization needs, consider overriding {@link #mappingMongoConverter()}.
*
* @return
*/
protected boolean abbreviateFieldNames() {
return false;
}
/**
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
*
* @return
* @since 1.5
*/
protected FieldNamingStrategy fieldNamingStrategy() {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
}
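
As a sketch of how these hooks are meant to be overridden (the class name, package names and database name are illustrative assumptions; the subclass extends the reactive variant shown earlier):

import java.util.Arrays;
import java.util.Collection;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractReactiveMongoConfiguration;

import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;

@Configuration
public class ScanningMongoConfiguration extends AbstractReactiveMongoConfiguration {

	@Override
	public MongoClient mongoClient() {
		return MongoClients.create("mongodb://localhost:27017");
	}

	@Override
	protected String getDatabaseName() {
		return "example";
	}

	// Scan explicit packages instead of defaulting to the package of this class.
	@Override
	protected Collection<String> getMappingBasePackages() {
		return Arrays.asList("com.example.orders", "com.example.customers");
	}

	// Store abbreviated field names via CamelCaseAbbreviatingFieldNamingStrategy.
	@Override
	protected boolean abbreviateFieldNames() {
		return true;
	}
}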

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
- * Copyright 2011-2015 by the original author(s).
+ * Copyright 2011-2017 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -30,9 +30,8 @@ import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
- import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
- import org.springframework.data.mongodb.core.MongoFactoryBean;
+ import org.springframework.data.mongodb.core.MongoClientFactoryBean;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
@@ -99,8 +98,6 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
- BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Defaulting
if (StringUtils.hasText(mongoRef)) {
dbFactoryBuilder.addConstructorArgReference(mongoRef);
@@ -109,8 +106,6 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
}
dbFactoryBuilder.addConstructorArgValue(StringUtils.hasText(dbname) ? dbname : "db");
- dbFactoryBuilder.addConstructorArgValue(userCredentials);
- dbFactoryBuilder.addConstructorArgValue(element.getAttribute("authentication-dbname"));
BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();
@@ -131,35 +126,13 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
*/
private BeanDefinition registerMongoBeanDefinition(Element element, ParserContext parserContext) {
- BeanDefinitionBuilder mongoBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoFactoryBean.class);
+ BeanDefinitionBuilder mongoBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoClientFactoryBean.class);
setPropertyValue(mongoBuilder, element, "host");
setPropertyValue(mongoBuilder, element, "port");
return getSourceBeanDefinition(mongoBuilder, parserContext, element);
}
- /**
- * Returns a {@link BeanDefinition} for a {@link UserCredentials} object.
- *
- * @param element
- * @return the {@link BeanDefinition} or {@literal null} if neither username nor password given.
- */
- private BeanDefinition getUserCredentialsBeanDefinition(Element element, ParserContext context) {
- String username = element.getAttribute("username");
- String password = element.getAttribute("password");
- if (!StringUtils.hasText(username) && !StringUtils.hasText(password)) {
- return null;
- }
- BeanDefinitionBuilder userCredentialsBuilder = BeanDefinitionBuilder.genericBeanDefinition(UserCredentials.class);
- userCredentialsBuilder.addConstructorArgValue(StringUtils.hasText(username) ? username : null);
- userCredentialsBuilder.addConstructorArgValue(StringUtils.hasText(password) ? password : null);
- return getSourceBeanDefinition(userCredentialsBuilder, context, element);
- }
/**
* Creates a {@link BeanDefinition} for a {@link MongoURI} or {@link MongoClientURI} depending on configured
* attributes. <br />
@@ -193,7 +166,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
parserContext.extractSource(element));
}
- Class<?> type = hasClientUri ? MongoClientURI.class : MongoURI.class;
+ Class<?> type = MongoClientURI.class;
String uri = hasClientUri ? element.getAttribute("client-uri") : element.getAttribute("uri");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(type);

View File

@@ -1,11 +1,11 @@
/*
- * Copyright 2011 the original author or authors.
+ * Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -26,12 +26,19 @@ import org.springframework.data.mongodb.monitor.*;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
+ /**
+ * @author Mark Pollack
+ * @author Thomas Risberg
+ * @author John Brisbin
+ * @author Oliver Gierke
+ * @author Christoph Strobl
+ */
public class MongoJmxParser implements BeanDefinitionParser {
public BeanDefinition parse(Element element, ParserContext parserContext) {
String name = element.getAttribute("mongo-ref");
if (!StringUtils.hasText(name)) {
- name = "mongo";
+ name = BeanNames.MONGO_BEAN_NAME;
}
registerJmxComponents(name, element, parserContext);
return null;

View File

@@ -1,11 +1,11 @@
/*
- * Copyright 2011-2015 the original author or authors.
+ * Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -33,7 +33,6 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
public void init() {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
- registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("mongo-client", new MongoClientParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());

View File

@@ -1,76 +0,0 @@
/*
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* Parser for &lt;mongo;gt; definitions.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
String id = element.getAttribute("id");
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoFactoryBean.class);
ParsingUtils.setPropertyValue(builder, element, "port", "port");
ParsingUtils.setPropertyValue(builder, element, "host", "host");
ParsingUtils.setPropertyValue(builder, element, "write-concern", "writeConcern");
MongoParsingUtils.parseMongoOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernPropertyEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(writeConcernPropertyEditor);
parserContext.popAndRegisterContainingComponent();
return mongoComponent.getBeanDefinition();
}
}

View File

@@ -1,11 +1,11 @@
/*
- * Copyright 2011-2015 the original author or authors.
+ * Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -25,7 +25,6 @@ import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
- import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -54,42 +53,6 @@ abstract class MongoParsingUtils {
setPropertyValue(mongoBuilder, element, "replica-set", "replicaSetSeeds");
}
- /**
- * Parses the {@code mongo:options} sub-element. Populates the given attribute factory with the proper attributes.
- *
- * @return true if parsing actually occured, {@literal false} otherwise
- */
- static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
- Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
- if (optionsElement == null) {
- return false;
- }
- BeanDefinitionBuilder optionsDefBuilder = BeanDefinitionBuilder
- .genericBeanDefinition(MongoOptionsFactoryBean.class);
- setPropertyValue(optionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
- setPropertyValue(optionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
- "threadsAllowedToBlockForConnectionMultiplier");
- setPropertyValue(optionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
- setPropertyValue(optionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
- setPropertyValue(optionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
- setPropertyValue(optionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
- setPropertyValue(optionsDefBuilder, optionsElement, "auto-connect-retry", "autoConnectRetry");
- setPropertyValue(optionsDefBuilder, optionsElement, "max-auto-connect-retry-time", "maxAutoConnectRetryTime");
- setPropertyValue(optionsDefBuilder, optionsElement, "write-number", "writeNumber");
- setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
- setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
- setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
- setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
- setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
- mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
- return true;
- }
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
@@ -129,6 +92,7 @@ abstract class MongoParsingUtils {
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
+ setPropertyValue(clientOptionsDefBuilder, optionsElement, "server-selection-timeout", "serverSelectionTimeout");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
- * Copyright 2015-2017 the original author or authors.
+ * Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -21,14 +21,14 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
- import com.mongodb.BulkWriteResult;
+ import com.mongodb.bulk.BulkWriteResult;
/**
* Bulk operations for insert/update/remove actions on a collection. These bulks operation are available since MongoDB
* 2.6 and make use of low level bulk commands on the protocol level. This interface defines a fluent API to add
* multiple single operations or list of similar operations in sequence which can then eventually be executed by calling
* {@link #execute()}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
@@ -38,7 +38,7 @@ public interface BulkOperations {
/**
* Mode for bulk operation.
**/
- public enum BulkMode {
+ enum BulkMode {
/** Perform bulk operations in sequence. The first error will cancel processing. */
ORDERED,
@@ -49,7 +49,7 @@ public interface BulkOperations {
/**
* Add a single insert to the bulk operation.
*
* @param documents the document to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
@@ -57,7 +57,7 @@ public interface BulkOperations {
/**
* Add a list of inserts to the bulk operation.
*
* @param documents List of documents to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
@@ -65,7 +65,7 @@ public interface BulkOperations {
/**
* Add a single update to the bulk operation. For the update request, only the first matching document is updated.
*
* @param query update criteria, must not be {@literal null}.
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -74,7 +74,7 @@ public interface BulkOperations {
/**
* Add a list of updates to the bulk operation. For each update request, only the first matching document is updated.
*
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
@@ -82,7 +82,7 @@ public interface BulkOperations {
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -91,7 +91,7 @@ public interface BulkOperations {
/**
* Add a list of updates to the bulk operation. For each update request, all matching documents are updated.
*
* @param updates Update operations to perform.
* @return The bulk operation.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -101,7 +101,7 @@ public interface BulkOperations {
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return The bulk operation.
@@ -112,7 +112,7 @@ public interface BulkOperations {
/**
* Add a list of upserts to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param updates Updates/insert operations to perform.
* @return The bulk operation.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -121,7 +121,7 @@ public interface BulkOperations {
/**
* Add a single remove operation to the bulk operation.
*
* @param remove the {@link Query} to select the documents to be removed, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
@@ -129,7 +129,7 @@ public interface BulkOperations {
/**
* Add a list of remove operations to the bulk operation.
*
* @param removes the remove operations to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
@@ -137,9 +137,9 @@ public interface BulkOperations {
/**
* Execute all bulk operations using the default write concern.
*
* @return Result of the bulk operation providing counters for inserts/updates etc.
- * @throws org.springframework.data.mongodb.BulkOperationException if an error occurred during bulk processing.
+ * @throws {@link BulkOperationException} if an error occurred during bulk processing.
*/
BulkWriteResult execute();
}
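
For orientation, a hedged usage sketch of the interface as exposed through MongoOperations.bulkOps(…); the collection name, field names and values are assumptions, and the result type follows the new com.mongodb.bulk.BulkWriteResult import shown above.

import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

import com.mongodb.bulk.BulkWriteResult;

class BulkOperationsSketch {

	// Queues one update and one remove against a hypothetical "person" collection
	// and executes them as a single ordered bulk.
	BulkWriteResult run(MongoOperations operations) {
		return operations.bulkOps(BulkMode.ORDERED, "person")
				.updateOne(new Query(where("name").is("Bob")), new Update().set("active", true))
				.remove(new Query(where("active").is(false)))
				.execute();
	}
}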

View File

@@ -1,11 +1,11 @@
/*
- * Copyright 2010-2011 the original author or authors.
+ * Copyright 2010-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,12 +15,30 @@
*/
package org.springframework.data.mongodb.core;
- import com.mongodb.DBCollection;
- import com.mongodb.MongoException;
+ import org.bson.Document;
import org.springframework.dao.DataAccessException;
+ import com.mongodb.MongoException;
+ import com.mongodb.client.MongoCollection;
+ /**
+ * Callback interface for executing actions against a {@link MongoCollection}.
+ *
+ * @author Mark Pollack
+ * @author Graeme Rocher
+ * @author Oliver Gierke
+ * @author John Brisbin
+ * @author Christoph Strobl
+ * @since 1.0
+ */
public interface CollectionCallback<T> {
- T doInCollection(DBCollection collection) throws MongoException, DataAccessException;
+ /**
+ * @param collection never {@literal null}.
+ * @return
+ * @throws MongoException
+ * @throws DataAccessException
+ */
+ T doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException;
}
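
A sketch of the reworked callback in use via MongoOperations.execute(String, CollectionCallback); the collection name and filter are assumptions, and the lambda form relies on the interface having a single abstract method.

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoOperations;

class CollectionCallbackSketch {

	// Counts documents matching { active: true } in a hypothetical "people" collection.
	long countActive(MongoOperations operations) {
		return operations.execute("people", collection -> collection.count(new Document("active", true)));
	}
}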

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
- * Copyright 2002-2010 the original author or authors.
+ * Copyright 2002-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,12 +15,15 @@
*/
package org.springframework.data.mongodb.core;
- import com.mongodb.DBCursor;
+ import org.bson.Document;
+ import com.mongodb.client.FindIterable;
/**
- * Simple callback interface to allow customization of a {@link DBCursor}.
+ * Simple callback interface to allow customization of a {@link FindIterable}.
*
* @author Oliver Gierke
+ * @author Christoph Strobl
*/
interface CursorPreparer {
@@ -29,5 +32,5 @@ interface CursorPreparer {
*
* @param cursor
*/
- DBCursor prepare(DBCursor cursor);
+ FindIterable<Document> prepare(FindIterable<Document> cursor);
}
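
CursorPreparer is package-private, so implementations live inside the core package (MongoTemplate applies it to queries). A sketch of what a prepare(…) implementation now does with the FindIterable, with illustrative values only:

import java.util.concurrent.TimeUnit;

import org.bson.Document;

import com.mongodb.client.FindIterable;

class CursorPreparerSketch {

	// Mirrors the prepare(…) contract: tweak the iterable and hand it back.
	FindIterable<Document> prepare(FindIterable<Document> cursor) {
		return cursor.batchSize(100).maxTime(5, TimeUnit.SECONDS);
	}
}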

View File

@@ -1,11 +1,11 @@
/*
- * Copyright 2010-2011 the original author or authors.
+ * Copyright 2010-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,11 +15,16 @@
*/
package org.springframework.data.mongodb.core;
- import com.mongodb.DB;
- import com.mongodb.MongoException;
import org.springframework.dao.DataAccessException;
+ import com.mongodb.MongoException;
+ import com.mongodb.client.MongoDatabase;
+ /**
+ *
+ * @param <T>
+ */
public interface DbCallback<T> {
- T doInDB(DB db) throws MongoException, DataAccessException;
+ T doInDB(MongoDatabase db) throws MongoException, DataAccessException;
}
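
A sketch of the database-level callback via MongoOperations.execute(DbCallback); the command document is illustrative.

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoOperations;

class DbCallbackSketch {

	// Runs the buildInfo command against the configured MongoDatabase.
	Document buildInfo(MongoOperations operations) {
		return operations.execute(db -> db.runCommand(new Document("buildInfo", 1)));
	}
}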

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
- * https://www.apache.org/licenses/LICENSE-2.0
+ * http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,79 +15,80 @@
*/
package org.springframework.data.mongodb.core;
- import lombok.NonNull;
- import lombok.Value;
+ import java.util.ArrayList;
+ import java.util.Arrays;
- import java.util.Collections;
import java.util.List;
+ import org.bson.Document;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
- import org.springframework.data.mongodb.core.convert.QueryMapper;
- import org.springframework.data.mongodb.core.convert.UpdateMapper;
- import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import org.springframework.util.Assert;
- import com.mongodb.BasicDBObject;
import com.mongodb.BulkWriteException;
- import com.mongodb.BulkWriteOperation;
- import com.mongodb.BulkWriteRequestBuilder;
- import com.mongodb.BulkWriteResult;
- import com.mongodb.DBCollection;
- import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
+ import com.mongodb.client.MongoCollection;
+ import com.mongodb.client.model.BulkWriteOptions;
+ import com.mongodb.client.model.DeleteManyModel;
+ import com.mongodb.client.model.InsertOneModel;
+ import com.mongodb.client.model.UpdateManyModel;
+ import com.mongodb.client.model.UpdateOneModel;
+ import com.mongodb.client.model.UpdateOptions;
+ import com.mongodb.client.model.WriteModel;
/**
* Default implementation for {@link BulkOperations}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Christoph Strobl
- * @author Mark Paluch
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
private final MongoOperations mongoOperations;
+ private final BulkMode bulkMode;
private final String collectionName;
- private final BulkOperationContext bulkOperationContext;
private PersistenceExceptionTranslator exceptionTranslator;
private WriteConcernResolver writeConcernResolver;
private WriteConcern defaultWriteConcern;
- private BulkWriteOperation bulk;
+ private BulkWriteOptions bulkOptions;
+ List<WriteModel<Document>> models = new ArrayList<WriteModel<Document>>();
/**
- * Creates a new {@link DefaultBulkOperations} for the given {@link MongoOperations}, collection name and
- * {@link BulkOperationContext}.
+ * Creates a new {@link DefaultBulkOperations} for the given {@link MongoOperations}, {@link BulkMode}, collection
+ * name and {@link WriteConcern}.
*
- * @param mongoOperations must not be {@literal null}.
- * @param collectionName must not be {@literal null}.
- * @param bulkOperationContext must not be {@literal null}.
- * @since 1.9.12
+ * @param mongoOperations The underlying {@link MongoOperations}, must not be {@literal null}.
+ * @param bulkMode must not be {@literal null}.
+ * @param collectionName Name of the collection to work on, must not be {@literal null} or empty.
+ * @param entityType the entity type, can be {@literal null}.
*/
- DefaultBulkOperations(MongoOperations mongoOperations, String collectionName,
- BulkOperationContext bulkOperationContext) {
+ DefaultBulkOperations(MongoOperations mongoOperations, BulkMode bulkMode, String collectionName,
+ Class<?> entityType) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
- Assert.hasText(collectionName, "CollectionName must not be null nor empty!");
- Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null!");
+ Assert.notNull(bulkMode, "BulkMode must not be null!");
+ Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.mongoOperations = mongoOperations;
+ this.bulkMode = bulkMode;
this.collectionName = collectionName;
- this.bulkOperationContext = bulkOperationContext;
this.exceptionTranslator = new MongoExceptionTranslator();
this.writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
- this.bulk = getBulkWriteOptions(bulkOperationContext.getBulkMode());
+ this.bulkOptions = initBulkOperation();
}
/**
* Configures the {@link PersistenceExceptionTranslator} to be used. Defaults to {@link MongoExceptionTranslator}.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
@@ -96,7 +97,7 @@ class DefaultBulkOperations implements BulkOperations {
/**
* Configures the {@link WriteConcernResolver} to be used. Defaults to {@link DefaultWriteConcernResolver}.
*
* @param writeConcernResolver can be {@literal null}.
*/
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) {
@@ -106,10 +107,10 @@ class DefaultBulkOperations implements BulkOperations {
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
* @param defaultWriteConcern can be {@literal null}.
*/
- void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
+ public void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
this.defaultWriteConcern = defaultWriteConcern;
}
@@ -122,15 +123,16 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(document, "Document must not be null!");
- if (document instanceof DBObject) {
- bulk.insert((DBObject) document);
+ if (document instanceof Document) {
+ models.add(new InsertOneModel<>((Document) document));
return this;
}
- DBObject sink = new BasicDBObject();
+ Document sink = new Document();
mongoOperations.getConverter().write(document, sink);
- bulk.insert(sink);
+ models.add(new InsertOneModel<>(sink));
return this;
}
@@ -161,7 +163,7 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
- return updateOne(Collections.singletonList(Pair.of(query, update)));
+ return updateOne(Arrays.asList(Pair.of(query, update)));
} }
/* /*
@@ -191,7 +193,7 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!"); Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!"); Assert.notNull(update, "Update must not be null!");
return updateMulti(Collections.singletonList(Pair.of(query, update))); return updateMulti(Arrays.asList(Pair.of(query, update)));
} }
/* /*
@@ -242,8 +244,7 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!"); Assert.notNull(query, "Query must not be null!");
bulk.find(getMappedQuery(query.getQueryObject())).remove(); models.add(new DeleteManyModel<Document>(query.getQueryObject()));
return this; return this;
} }
@@ -268,27 +269,30 @@ class DefaultBulkOperations implements BulkOperations {
* @see org.springframework.data.mongodb.core.BulkOperations#executeBulk() * @see org.springframework.data.mongodb.core.BulkOperations#executeBulk()
*/ */
@Override @Override
public BulkWriteResult execute() { public com.mongodb.bulk.BulkWriteResult execute() {
MongoAction action = new MongoAction(defaultWriteConcern, MongoActionOperation.BULK, collectionName,
bulkOperationContext.getEntityType(), null, null);
WriteConcern writeConcern = writeConcernResolver.resolve(action);
try { try {
return writeConcern == null ? bulk.execute() : bulk.execute(writeConcern);
MongoCollection<Document> collection = mongoOperations.getCollection(collectionName);
if (defaultWriteConcern != null) {
collection = collection.withWriteConcern(defaultWriteConcern);
}
return collection.bulkWrite(models, bulkOptions);
} catch (BulkWriteException o_O) { } catch (BulkWriteException o_O) {
DataAccessException toThrow = exceptionTranslator.translateExceptionIfPossible(o_O); DataAccessException toThrow = exceptionTranslator.translateExceptionIfPossible(o_O);
throw toThrow == null ? o_O : toThrow; throw toThrow == null ? o_O : toThrow;
} finally { } finally {
this.bulk = getBulkWriteOptions(bulkOperationContext.getBulkMode()); this.bulkOptions = initBulkOperation();
} }
} }
/** /**
* Performs update and upsert bulk operations. * Performs update and upsert bulk operations.
* *
* @param query the {@link Query} to determine documents to update. * @param query the {@link Query} to determine documents to update.
* @param update the {@link Update} to perform, must not be {@literal null}. * @param update the {@link Update} to perform, must not be {@literal null}.
* @param upsert whether to upsert. * @param upsert whether to upsert.
@@ -300,68 +304,26 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!"); Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!"); Assert.notNull(update, "Update must not be null!");
BulkWriteRequestBuilder builder = bulk.find(getMappedQuery(query.getQueryObject())); UpdateOptions options = new UpdateOptions();
options.upsert(upsert);
if (upsert) {
if (multi) {
builder.upsert().update(getMappedUpdate(update.getUpdateObject()));
} else {
builder.upsert().updateOne(getMappedUpdate(update.getUpdateObject()));
}
if (multi) {
models.add(new UpdateManyModel<Document>(query.getQueryObject(), update.getUpdateObject(), options));
} else { } else {
models.add(new UpdateOneModel<Document>(query.getQueryObject(), update.getUpdateObject(), options));
if (multi) {
builder.update(getMappedUpdate(update.getUpdateObject()));
} else {
builder.updateOne(getMappedUpdate(update.getUpdateObject()));
}
} }
return this; return this;
} }
private DBObject getMappedUpdate(DBObject update) { private final BulkWriteOptions initBulkOperation() {
return bulkOperationContext.getUpdateMapper().getMappedObject(update, bulkOperationContext.getEntity());
}
private DBObject getMappedQuery(DBObject query) {
return bulkOperationContext.getQueryMapper().getMappedObject(query, bulkOperationContext.getEntity());
}
private BulkWriteOperation getBulkWriteOptions(BulkMode bulkMode) {
DBCollection collection = mongoOperations.getCollection(collectionName);
BulkWriteOptions options = new BulkWriteOptions();
switch (bulkMode) { switch (bulkMode) {
case ORDERED: case ORDERED:
return collection.initializeOrderedBulkOperation(); return options.ordered(true);
case UNORDERED: case UNORDERED:
return collection.initializeUnorderedBulkOperation(); return options.ordered(false);
} }
throw new IllegalStateException("BulkMode was null!"); throw new IllegalStateException("BulkMode was null!");
} }
/**
* {@link BulkOperationContext} holds information about
* {@link org.springframework.data.mongodb.core.BulkOperations.BulkMode} the entity in use as well as references to
* {@link QueryMapper} and {@link UpdateMapper}.
*
* @author Christoph Strobl
* @since 2.0
*/
@Value
static class BulkOperationContext {
@NonNull BulkMode bulkMode;
MongoPersistentEntity<?> entity;
@NonNull QueryMapper queryMapper;
@NonNull UpdateMapper updateMapper;
Class<?> getEntityType() {
return entity != null ? entity.getType() : null;
}
}
} }
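The rewrite above replaces the removed BulkWriteOperation builder with a list of WriteModel instances that is flushed through MongoCollection.bulkWrite(...). A minimal stand-alone sketch of that driver-level pattern (database, collection and field names are made up for illustration):

import java.util.ArrayList;
import java.util.List;

import org.bson.Document;

import com.mongodb.MongoClient;
import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.DeleteManyModel;
import com.mongodb.client.model.InsertOneModel;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.WriteModel;

class BulkWriteSketch {

    public static void main(String[] args) {

        MongoCollection<Document> collection = new MongoClient().getDatabase("test").getCollection("person");

        List<WriteModel<Document>> models = new ArrayList<>();

        // queue an insert, an upsert-style update and a delete, then flush them in a single round trip
        models.add(new InsertOneModel<>(new Document("firstName", "Dave")));
        models.add(new UpdateOneModel<>(new Document("firstName", "Carter"),
                new Document("$set", new Document("lastName", "Beauford")), new UpdateOptions().upsert(true)));
        models.add(new DeleteManyModel<>(new Document("lastName", "Matthews")));

        // ordered(false) corresponds to BulkMode.UNORDERED in the Spring Data API
        BulkWriteResult result = collection.bulkWrite(models, new BulkWriteOptions().ordered(false));
        System.out.println("inserted=" + result.getInsertedCount() + ", deleted=" + result.getDeletedCount());
    }
}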

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,20 +15,25 @@
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.MongoTemplate.*;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.Collection; import java.util.Collection;
import java.util.List; import java.util.List;
import org.bson.Document;
import org.springframework.dao.DataAccessException; import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.convert.QueryMapper; import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.index.IndexDefinition; import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo; import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity; import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException; import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoCursor;
import com.mongodb.client.model.IndexOptions;
/** /**
* Default implementation of {@link IndexOperations}. * Default implementation of {@link IndexOperations}.
@@ -37,12 +42,13 @@ import com.mongodb.MongoException;
* @author Oliver Gierke * @author Oliver Gierke
* @author Komi Innocent * @author Komi Innocent
* @author Christoph Strobl * @author Christoph Strobl
* @author Mark Paluch
*/ */
public class DefaultIndexOperations implements IndexOperations { public class DefaultIndexOperations implements IndexOperations {
private static final String PARTIAL_FILTER_EXPRESSION_KEY = "partialFilterExpression"; private static final String PARTIAL_FILTER_EXPRESSION_KEY = "partialFilterExpression";
private final MongoOperations mongoOperations; private final MongoDbFactory mongoDbFactory;
private final String collectionName; private final String collectionName;
private final QueryMapper mapper; private final QueryMapper mapper;
private final Class<?> type; private final Class<?> type;
@@ -50,29 +56,34 @@ public class DefaultIndexOperations implements IndexOperations {
/** /**
* Creates a new {@link DefaultIndexOperations}. * Creates a new {@link DefaultIndexOperations}.
* *
* @param mongoOperations must not be {@literal null}. * @param mongoDbFactory must not be {@literal null}.
* @param collectionName must not be {@literal null}. * @param collectionName must not be {@literal null}.
* @param queryMapper must not be {@literal null}.
*/ */
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName) { public DefaultIndexOperations(MongoDbFactory mongoDbFactory, String collectionName, QueryMapper queryMapper) {
this(mongoOperations, collectionName, null);
this(mongoDbFactory, collectionName, queryMapper, null);
} }
/** /**
* Creates a new {@link DefaultIndexOperations}. * Creates a new {@link DefaultIndexOperations}.
* *
* @param mongoOperations must not be {@literal null}. * @param mongoDbFactory must not be {@literal null}.
* @param collectionName must not be {@literal null}. * @param collectionName must not be {@literal null}.
* @param queryMapper must not be {@literal null}.
* @param type Type used for mapping potential partial index filter expression. Can be {@literal null}. * @param type Type used for mapping potential partial index filter expression. Can be {@literal null}.
* @since 1.10 * @since 1.10
*/ */
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName, Class<?> type) { public DefaultIndexOperations(MongoDbFactory mongoDbFactory, String collectionName, QueryMapper queryMapper,
Class<?> type) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!"); Assert.notNull(mongoDbFactory, "MongoDbFactory must not be null!");
Assert.notNull(collectionName, "Collection name can not be null!"); Assert.notNull(collectionName, "Collection name can not be null!");
Assert.notNull(queryMapper, "QueryMapper must not be null!");
this.mongoOperations = mongoOperations; this.mongoDbFactory = mongoDbFactory;
this.collectionName = collectionName; this.collectionName = collectionName;
this.mapper = new QueryMapper(mongoOperations.getConverter()); this.mapper = queryMapper;
this.type = type; this.type = type;
} }
@@ -80,47 +91,47 @@ public class DefaultIndexOperations implements IndexOperations {
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition) * @see org.springframework.data.mongodb.core.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/ */
public void ensureIndex(final IndexDefinition indexDefinition) { public String ensureIndex(final IndexDefinition indexDefinition) {
mongoOperations.execute(collectionName, new CollectionCallback<Object>() { return execute(collection -> {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null && indexOptions.containsField(PARTIAL_FILTER_EXPRESSION_KEY)) { Document indexOptions = indexDefinition.getIndexOptions();
Assert.isInstanceOf(DBObject.class, indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY)); if (indexOptions != null) {
indexOptions.put(PARTIAL_FILTER_EXPRESSION_KEY, IndexOptions ops = IndexConverters.indexDefinitionToIndexOptionsConverter().convert(indexDefinition);
mapper.getMappedObject((DBObject) indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY),
lookupPersistentEntity(type, collectionName))); if (indexOptions.containsKey(PARTIAL_FILTER_EXPRESSION_KEY)) {
Assert.isInstanceOf(Document.class, indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));
ops.partialFilterExpression( mapper.getMappedObject(
(Document) indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY), lookupPersistentEntity(type, collectionName)));
} }
if (indexOptions != null) { return collection.createIndex(indexDefinition.getIndexKeys(), ops);
collection.createIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.createIndex(indexDefinition.getIndexKeys());
}
return null;
} }
return collection.createIndex(indexDefinition.getIndexKeys());
}
private MongoPersistentEntity<?> lookupPersistentEntity(Class<?> entityType, String collection) { );
}
if (entityType != null) { private MongoPersistentEntity<?> lookupPersistentEntity(Class<?> entityType, String collection) {
return mongoOperations.getConverter().getMappingContext().getPersistentEntity(entityType);
}
Collection<? extends MongoPersistentEntity<?>> entities = mongoOperations.getConverter().getMappingContext() if (entityType != null) {
.getPersistentEntities(); return mapper.getMappingContext().getRequiredPersistentEntity(entityType);
}
for (MongoPersistentEntity<?> entity : entities) { Collection<? extends MongoPersistentEntity<?>> entities = mapper.getMappingContext().getPersistentEntities();
if (entity.getCollection().equals(collection)) {
return entity;
}
}
return null; for (MongoPersistentEntity<?> entity : entities) {
if (entity.getCollection().equals(collection)) {
return entity;
} }
}); }
return null;
} }
/* /*
@@ -128,11 +139,10 @@ public class DefaultIndexOperations implements IndexOperations {
* @see org.springframework.data.mongodb.core.IndexOperations#dropIndex(java.lang.String) * @see org.springframework.data.mongodb.core.IndexOperations#dropIndex(java.lang.String)
*/ */
public void dropIndex(final String name) { public void dropIndex(final String name) {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException { execute(collection -> {
collection.dropIndex(name); collection.dropIndex(name);
return null; return null;
}
}); });
} }
@@ -145,45 +155,46 @@ public class DefaultIndexOperations implements IndexOperations {
dropIndex("*"); dropIndex("*");
} }
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#resetIndexCache()
*/
@Deprecated
public void resetIndexCache() {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
ReflectiveDBCollectionInvoker.resetIndexCache(collection);
return null;
}
});
}
/* /*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#getIndexInfo() * @see org.springframework.data.mongodb.core.IndexOperations#getIndexInfo()
*/ */
public List<IndexInfo> getIndexInfo() { public List<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, new CollectionCallback<List<IndexInfo>>() { return execute(new CollectionCallback<List<IndexInfo>>() {
public List<IndexInfo> doInCollection(DBCollection collection) throws MongoException, DataAccessException { public List<IndexInfo> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
List<DBObject> dbObjectList = collection.getIndexInfo(); MongoCursor<Document> cursor = collection.listIndexes(Document.class).iterator();
return getIndexData(dbObjectList); return getIndexData(cursor);
} }
private List<IndexInfo> getIndexData(List<DBObject> dbObjectList) { private List<IndexInfo> getIndexData(MongoCursor<Document> cursor) {
List<IndexInfo> indexInfoList = new ArrayList<IndexInfo>(); List<IndexInfo> indexInfoList = new ArrayList<IndexInfo>();
for (DBObject ix : dbObjectList) { while (cursor.hasNext()) {
indexInfoList.add(IndexInfo.indexInfoOf(ix));
Document ix = cursor.next();
IndexInfo indexInfo = IndexConverters.documentToIndexInfoConverter().convert(ix);
indexInfoList.add(indexInfo);
} }
return indexInfoList; return indexInfoList;
} }
}); });
} }
public <T> T execute(CollectionCallback<T> callback) {
Assert.notNull(callback, "CollectionCallback must not be null!");
try {
MongoCollection<Document> collection = mongoDbFactory.getDb().getCollection(collectionName);
return callback.doInCollection(collection);
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e, mongoDbFactory.getExceptionTranslator());
}
}
} }
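DefaultIndexOperations now talks to MongoCollection directly: createIndex(...) takes the keys plus an IndexOptions object and returns the index name (which is why ensureIndex now returns a String), and listIndexes(...) replaces DBCollection.getIndexInfo(). A hedged driver-level sketch with made-up names:

import org.bson.Document;

import com.mongodb.MongoClient;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.IndexOptions;

class IndexSketch {

    public static void main(String[] args) {

        MongoCollection<Document> collection = new MongoClient().getDatabase("test").getCollection("person");

        // createIndex returns the index name, which ensureIndex now propagates to the caller
        String name = collection.createIndex(new Document("lastName", 1), new IndexOptions().unique(true).sparse(true));
        System.out.println("created index: " + name);

        // listIndexes replaces DBCollection.getIndexInfo()
        for (Document index : collection.listIndexes(Document.class)) {
            System.out.println(index.toJson());
        }
    }
}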

View File

@@ -0,0 +1,47 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.QueryMapper;
/**
* {@link IndexOperationsProvider} to obtain {@link IndexOperations} from a given {@link MongoDbFactory}. TODO: Review
* me
*
* @author Mark Paluch
* @since 2.0
*/
class DefaultIndexOperationsProvider implements IndexOperationsProvider {
private final MongoDbFactory mongoDbFactory;
private final QueryMapper mapper;
/**
* @param mongoDbFactory must not be {@literal null}.
*/
DefaultIndexOperationsProvider(MongoDbFactory mongoDbFactory, QueryMapper mapper) {
this.mongoDbFactory = mongoDbFactory; this.mapper = mapper;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperationsProvider#reactiveIndexOps(java.lang.String)
*/
@Override
public IndexOperations indexOps(String collectionName) {
return new DefaultIndexOperations(mongoDbFactory, collectionName, mapper);
}
}

View File

@@ -0,0 +1,102 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.bson.Document;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.util.Assert;
import com.mongodb.reactivestreams.client.ListIndexesPublisher;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
/**
* Default implementation of {@link IndexOperations}.
*
* @author Mark Paluch
* @since 1.11
*/
public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
private final ReactiveMongoOperations mongoOperations;
private final String collectionName;
/**
* Creates a new {@link DefaultReactiveIndexOperations}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
*/
public DefaultReactiveIndexOperations(ReactiveMongoOperations mongoOperations, String collectionName) {
Assert.notNull(mongoOperations, "ReactiveMongoOperations must not be null!");
Assert.notNull(collectionName, "Collection must not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveIndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public Mono<String> ensureIndex(final IndexDefinition indexDefinition) {
return mongoOperations.execute(collectionName, (ReactiveCollectionCallback<String>) collection -> {
Document indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null) {
return collection.createIndex(indexDefinition.getIndexKeys(),
IndexConverters.indexDefinitionToIndexOptionsConverter().convert(indexDefinition));
}
return collection.createIndex(indexDefinition.getIndexKeys());
}).next();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveIndexOperations#dropIndex(java.lang.String)
*/
public Mono<Void> dropIndex(final String name) {
return mongoOperations.execute(collectionName, collection -> {
return Mono.from(collection.dropIndex(name));
}).flatMap(success -> Mono.<Void>empty()).next();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveIndexOperations#dropAllIndexes()
*/
public Mono<Void> dropAllIndexes() {
return dropIndex("*");
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveIndexOperations#getIndexInfo()
*/
public Flux<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, collection -> {
ListIndexesPublisher<Document> indexesPublisher = collection.listIndexes(Document.class);
return Flux.from(indexesPublisher).map(IndexConverters.documentToIndexInfoConverter()::convert);
});
}
}
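A short usage sketch for the new reactive index operations; it assumes an already configured ReactiveMongoOperations instance, and the "person" collection name is illustrative:

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.DefaultReactiveIndexOperations;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.index.Index;

class ReactiveIndexSketch {

    // "operations" must point at a configured ReactiveMongoTemplate
    void createAndListIndexes(ReactiveMongoOperations operations) {

        DefaultReactiveIndexOperations indexOps = new DefaultReactiveIndexOperations(operations, "person");

        indexOps.ensureIndex(new Index("lastName", Direction.ASC).unique()) //
                .doOnNext(name -> System.out.println("created index: " + name)) //
                .thenMany(indexOps.getIndexInfo()) //
                .subscribe(info -> System.out.println(info.getName()));
    }
}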

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -20,11 +20,13 @@ import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*; import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections; import java.util.Collections;
import java.util.HashSet; import java.util.HashSet;
import java.util.List; import java.util.List;
import java.util.Set; import java.util.Set;
import org.bson.Document;
import org.bson.types.ObjectId; import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException; import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript; import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
@@ -34,8 +36,9 @@ import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils; import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils; import org.springframework.util.StringUtils;
import com.mongodb.DB; import com.mongodb.BasicDBList;
import com.mongodb.MongoException; import com.mongodb.MongoException;
import com.mongodb.client.MongoDatabase;
/** /**
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}. * Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
@@ -97,8 +100,13 @@ class DefaultScriptOperations implements ScriptOperations {
return mongoOperations.execute(new DbCallback<Object>() { return mongoOperations.execute(new DbCallback<Object>() {
@Override @Override
public Object doInDB(DB db) throws MongoException, DataAccessException { public Object doInDB(MongoDatabase db) throws MongoException, DataAccessException {
return db.eval(script.getCode(), convertScriptArgs(false, args));
Document command = new Document("$eval", script.getCode());
BasicDBList commandArgs = new BasicDBList();
commandArgs.addAll(Arrays.asList(convertScriptArgs(false, args)));
command.append("args", commandArgs);
return db.runCommand(command).get("retval");
} }
}); });
} }
@@ -115,8 +123,10 @@ class DefaultScriptOperations implements ScriptOperations {
return mongoOperations.execute(new DbCallback<Object>() { return mongoOperations.execute(new DbCallback<Object>() {
@Override @Override
public Object doInDB(DB db) throws MongoException, DataAccessException { public Object doInDB(MongoDatabase db) throws MongoException, DataAccessException {
return db.eval(String.format("%s(%s)", scriptName, convertAndJoinScriptArgs(args)));
return db.runCommand(new Document("eval", String.format("%s(%s)", scriptName, convertAndJoinScriptArgs(args))))
.get("retval");
} }
}); });
} }
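With db.eval(...) gone from MongoDatabase, DefaultScriptOperations issues the eval command through runCommand and reads the retval field. A stand-alone sketch of that call (it only works against servers that still support server-side eval, which was removed in later MongoDB releases; the script is a placeholder):

import java.util.Arrays;

import org.bson.Document;

import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;

class EvalCommandSketch {

    public static void main(String[] args) {

        MongoDatabase db = new MongoClient().getDatabase("test");

        // the script and its arguments travel as a command document instead of a dedicated eval() API
        Document command = new Document("$eval", "function(x) { return x; }") //
                .append("args", Arrays.asList("echo me"));

        Object retval = db.runCommand(command).get("retval");
        System.out.println(retval);
    }
}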

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/* /*
* Copyright 2010-2011 the original author or authors. * Copyright 2010-2016 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,24 +15,26 @@
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import org.bson.Document;
import org.springframework.dao.DataAccessException; import org.springframework.dao.DataAccessException;
import com.mongodb.DBObject;
import com.mongodb.MongoException; import com.mongodb.MongoException;
/** /**
* An interface used by {@link MongoTemplate} for processing documents returned from a MongoDB query on a per-document * An interface used by {@link MongoTemplate} for processing documents returned from a MongoDB query on a per-document
* basis. Implementations of this interface perform the actual work of processing each document but don't need to worry * basis. Implementations of this interface perform the actual work of processing each document but don't need to worry
* about exception handling. {@MongoException}s will be caught and translated by the calling * about exception handling. {@MongoException}s will be caught and translated by the calling MongoTemplate An
* MongoTemplate * DocumentCallbackHandler is typically stateful: It keeps the result state within the object, to be available later for
* * later inspection.
* An DocumentCallbackHandler is typically stateful: It keeps the result state within the object, to be available later
* for later inspection.
* *
* @author Mark Pollack * @author Mark Pollack
* * @author Graeme Rocher
* @author Oliver Gierke
* @author John Brisbin
* @author Christoph Strobl
* @since 1.0
*/ */
public interface DocumentCallbackHandler { public interface DocumentCallbackHandler {
void processDocument(DBObject dbObject) throws MongoException, DataAccessException; void processDocument(Document document) throws MongoException, DataAccessException;
} }
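As the javadoc notes, implementations keep their result state, so a typical handler simply accumulates values per processed document. A small illustrative implementation (the field name is made up); such a handler would usually be passed to MongoTemplate.executeQuery(query, collectionName, handler):

import java.util.ArrayList;
import java.util.List;

import org.bson.Document;
import org.springframework.data.mongodb.core.DocumentCallbackHandler;

class FirstNameCollector implements DocumentCallbackHandler {

    private final List<String> firstNames = new ArrayList<>();

    @Override
    public void processDocument(Document document) {
        // state stays in the handler and can be inspected after the query has run
        firstNames.add(document.getString("firstName"));
    }

    List<String> getFirstNames() {
        return firstNames;
    }
}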

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DBCursor;
import com.mongodb.reactivestreams.client.FindPublisher;
/**
* Simple callback interface to allow customization of a {@link FindPublisher}.
*
* @author Mark Paluch
*/
interface FindPublisherPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
*
* @param cursor
*/
<T> FindPublisher<T> prepare(FindPublisher<T> findPublisher);
}
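FindPublisherPreparer is package-private, so a concrete preparer has to live in org.springframework.data.mongodb.core; the sketch below shows the kind of skip/limit adjustment the reactive template applies through this hook (the paging values are arbitrary):

package org.springframework.data.mongodb.core;

import com.mongodb.reactivestreams.client.FindPublisher;

// applies a fixed page window to every query handed to it
class PagingFindPublisherPreparer implements FindPublisherPreparer {

    @Override
    public <T> FindPublisher<T> prepare(FindPublisher<T> findPublisher) {
        return findPublisher.skip(20).limit(10);
    }
}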

View File

@@ -1,11 +1,11 @@
/* /*
* Copyright 2016 the original author or authors. * Copyright 2016-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,29 +15,29 @@
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import org.bson.Document;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/** /**
* Value object to mitigate different representations of geo command execution results in MongoDB. * Value object to mitigate different representations of geo command execution results in MongoDB.
* *
* @author Oliver Gierke * @author Oliver Gierke
* @author Christoph Strobl
* @soundtrack Fruitcake - Jeff Coffin (The Inside of the Outside) * @soundtrack Fruitcake - Jeff Coffin (The Inside of the Outside)
* @since 1.9
*/ */
class GeoCommandStatistics { class GeoCommandStatistics {
private static final GeoCommandStatistics NONE = new GeoCommandStatistics(new BasicDBObject()); private static final GeoCommandStatistics NONE = new GeoCommandStatistics(new Document());
private final DBObject source; private final Document source;
/** /**
* Creates a new {@link GeoCommandStatistics} instance with the given source document. * Creates a new {@link GeoCommandStatistics} instance with the given source document.
* *
* @param source must not be {@literal null}. * @param source must not be {@literal null}.
*/ */
private GeoCommandStatistics(DBObject source) { private GeoCommandStatistics(Document source) {
Assert.notNull(source, "Source document must not be null!"); Assert.notNull(source, "Source document must not be null!");
this.source = source; this.source = source;
@@ -49,12 +49,12 @@ class GeoCommandStatistics {
* @param commandResult must not be {@literal null}. * @param commandResult must not be {@literal null}.
* @return * @return
*/ */
public static GeoCommandStatistics from(DBObject commandResult) { public static GeoCommandStatistics from(Document commandResult) {
Assert.notNull(commandResult, "Command result must not be null!"); Assert.notNull(commandResult, "Command result must not be null!");
Object stats = commandResult.get("stats"); Object stats = commandResult.get("stats");
return stats == null ? NONE : new GeoCommandStatistics((DBObject) stats); return stats == null ? NONE : new GeoCommandStatistics((Document) stats);
} }
/** /**
@@ -62,7 +62,7 @@ class GeoCommandStatistics {
* didn't return any result introduced in MongoDB 3.2 RC1. * didn't return any result introduced in MongoDB 3.2 RC1.
* *
* @return * @return
* @see https://jira.mongodb.org/browse/SERVER-21024 * @see <a href="https://jira.mongodb.org/browse/SERVER-21024">MongoDB Jira SERVER-21024</a>
*/ */
public double getAverageDistance() { public double getAverageDistance() {

View File

@@ -1,11 +1,11 @@
/* /*
* Copyright 2015 the original author or authors. * Copyright 2015-2016 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,15 +17,14 @@ package org.springframework.data.mongodb.core;
import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.geo.GeoJsonModule; import org.springframework.data.mongodb.core.geo.GeoJsonModule;
import org.springframework.data.web.config.SpringDataWebConfigurationMixin; import org.springframework.data.web.config.SpringDataJacksonModules;
/** /**
* Configuration class to expose {@link GeoJsonModule} as a Spring bean. * Configuration class to expose {@link GeoJsonModule} as a Spring bean.
* *
* @author Oliver Gierke * @author Oliver Gierke
*/ */
@SpringDataWebConfigurationMixin public class GeoJsonConfiguration implements SpringDataJacksonModules {
public class GeoJsonConfiguration {
@Bean @Bean
public GeoJsonModule geoJsonModule() { public GeoJsonModule geoJsonModule() {

View File

@@ -0,0 +1,134 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.domain.Sort.Direction.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.concurrent.TimeUnit;
import org.bson.Document;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.IndexOptions;
/**
* {@link Converter Converters} for index-related MongoDB documents/types.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
*/
abstract class IndexConverters {
private static final Converter<IndexDefinition, IndexOptions> DEFINITION_TO_MONGO_INDEX_OPTIONS;
private static final Converter<Document, IndexInfo> DOCUMENT_INDEX_INFO;
private static final Double ONE = Double.valueOf(1);
private static final Double MINUS_ONE = Double.valueOf(-1);
private static final Collection<String> TWO_D_IDENTIFIERS = Arrays.asList("2d", "2dsphere");
static {
DEFINITION_TO_MONGO_INDEX_OPTIONS = getIndexDefinitionIndexOptionsConverter();
DOCUMENT_INDEX_INFO = getDocumentIndexInfoConverter();
}
private IndexConverters() {
}
static Converter<IndexDefinition, IndexOptions> indexDefinitionToIndexOptionsConverter() {
return DEFINITION_TO_MONGO_INDEX_OPTIONS;
}
static Converter<Document, IndexInfo> documentToIndexInfoConverter() {
return DOCUMENT_INDEX_INFO;
}
private static Converter<IndexDefinition, IndexOptions> getIndexDefinitionIndexOptionsConverter() {
return indexDefinition -> {
Document indexOptions = indexDefinition.getIndexOptions();
IndexOptions ops = new IndexOptions();
if (indexOptions.containsKey("name")) {
ops = ops.name(indexOptions.get("name").toString());
}
if (indexOptions.containsKey("unique")) {
ops = ops.unique((Boolean) indexOptions.get("unique"));
}
if (indexOptions.containsKey("sparse")) {
ops = ops.sparse((Boolean) indexOptions.get("sparse"));
}
if (indexOptions.containsKey("background")) {
ops = ops.background((Boolean) indexOptions.get("background"));
}
if (indexOptions.containsKey("expireAfterSeconds")) {
ops = ops.expireAfter((Long) indexOptions.get("expireAfterSeconds"), TimeUnit.SECONDS);
}
if (indexOptions.containsKey("min")) {
ops = ops.min(((Number) indexOptions.get("min")).doubleValue());
}
if (indexOptions.containsKey("max")) {
ops = ops.max(((Number) indexOptions.get("max")).doubleValue());
}
if (indexOptions.containsKey("bits")) {
ops = ops.bits((Integer) indexOptions.get("bits"));
}
if (indexOptions.containsKey("bucketSize")) {
ops = ops.bucketSize(((Number) indexOptions.get("bucketSize")).doubleValue());
}
if (indexOptions.containsKey("default_language")) {
ops = ops.defaultLanguage(indexOptions.get("default_language").toString());
}
if (indexOptions.containsKey("language_override")) {
ops = ops.languageOverride(indexOptions.get("language_override").toString());
}
if (indexOptions.containsKey("weights")) {
ops = ops.weights((org.bson.Document) indexOptions.get("weights"));
}
for (String key : indexOptions.keySet()) {
if (ObjectUtils.nullSafeEquals("2dsphere", indexOptions.get(key))) {
ops = ops.sphereVersion(2);
}
}
if(indexOptions.containsKey("partialFilterExpression")) {
ops = ops.partialFilterExpression((org.bson.Document)indexOptions.get("partialFilterExpression"));
}
return ops;
};
}
private static Converter<Document, IndexInfo> getDocumentIndexInfoConverter() {
return ix -> {
return IndexInfo.indexInfoOf(ix);
};
}
}
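IndexConverters turns the options Document of an IndexDefinition into the driver's IndexOptions. A same-package sketch of that conversion (the index name and keys are made up; the class is package-private, hence the package declaration):

package org.springframework.data.mongodb.core;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.IndexDefinition;

import com.mongodb.client.model.IndexOptions;

class IndexConverterSketch {

    public static void main(String[] args) {

        IndexDefinition definition = new Index().on("lastName", Direction.ASC).named("lastname_idx").unique().sparse();

        Converter<IndexDefinition, IndexOptions> converter = IndexConverters.indexDefinitionToIndexOptionsConverter();
        IndexOptions options = converter.convert(definition);

        System.out.println(options.getName());  // lastname_idx
        System.out.println(options.isUnique()); // true
    }
}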

View File

@@ -1,11 +1,11 @@
/* /*
* Copyright 2011-2015 the original author or authors. * Copyright 2011-2016 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -35,7 +35,7 @@ public interface IndexOperations {
* *
* @param indexDefinition must not be {@literal null}. * @param indexDefinition must not be {@literal null}.
*/ */
void ensureIndex(IndexDefinition indexDefinition); String ensureIndex(IndexDefinition indexDefinition);
/** /**
* Drops an index from this collection. * Drops an index from this collection.
@@ -49,15 +49,6 @@ public interface IndexOperations {
*/ */
void dropAllIndexes(); void dropAllIndexes();
/**
* Clears all indices that have not yet been applied to this collection.
*
* @deprecated since 1.7. The MongoDB Java driver version 3.0 no longer supports resetting the index cache.
* @throws {@link UnsupportedOperationException} when used with MongoDB Java driver version 3.0.
*/
@Deprecated
void resetIndexCache();
/** /**
* Returns the index information on the collection. * Returns the index information on the collection.
* *

View File

@@ -0,0 +1,66 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.util.Assert;
/**
* Adapter for creating synchronous {@link IndexOperations}.
*
* @author Christoph Strobl
* @since 2.0
*/
public interface IndexOperationsAdapter extends IndexOperations {
/**
* Obtain a blocking variant of {@link IndexOperations} wrapping {@link ReactiveIndexOperations}.
*
* @param reactiveIndexOperations must not be {@literal null}.
* @return never {@literal null}
*/
static IndexOperationsAdapter blocking(ReactiveIndexOperations reactiveIndexOperations) {
Assert.notNull(reactiveIndexOperations, "ReactiveIndexOperations must not be null!");
return new IndexOperationsAdapter() {
@Override
public String ensureIndex(IndexDefinition indexDefinition) {
return reactiveIndexOperations.ensureIndex(indexDefinition).block();
}
@Override
public void dropIndex(String name) {
reactiveIndexOperations.dropIndex(name).block();
}
@Override
public void dropAllIndexes() {
reactiveIndexOperations.dropAllIndexes().block();
}
@Override
public List<IndexInfo> getIndexInfo() {
return reactiveIndexOperations.getIndexInfo().collectList().block();
}
};
}
}
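The adapter simply blocks on each reactive call, which makes the reactive index operations usable from synchronous code paths. A hedged usage sketch, assuming a ReactiveIndexOperations instance obtained elsewhere:

import java.util.List;

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.IndexOperationsAdapter;
import org.springframework.data.mongodb.core.ReactiveIndexOperations;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.IndexInfo;

class BlockingIndexOpsSketch {

    void ensureAndList(ReactiveIndexOperations reactiveIndexOps) {

        IndexOperationsAdapter blockingOps = IndexOperationsAdapter.blocking(reactiveIndexOps);

        // each call delegates to the reactive variant and blocks until it completes
        String name = blockingOps.ensureIndex(new Index("lastName", Direction.ASC));
        List<IndexInfo> indexes = blockingOps.getIndexInfo();

        System.out.println(name + " is one of " + indexes.size() + " indexes");
    }
}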

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.mongodb.core.convert.QueryMapper;
/**
* TODO: Revisit for a better pattern.
* @author Mark Paluch
* @since 2.0
*/
public interface IndexOperationsProvider {
/**
* Returns the operations that can be performed on indexes
*
* @return index operations on the named collection
*/
IndexOperations indexOps(String collectionName);
}

View File

@@ -1,11 +1,11 @@
/* /*
* Copyright 2011-2012 the original author or authors. * Copyright 2011-2016 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,9 +15,9 @@
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import org.bson.Document;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern; import com.mongodb.WriteConcern;
/** /**
@@ -31,6 +31,7 @@ import com.mongodb.WriteConcern;
* *
* @author Mark Pollack * @author Mark Pollack
* @author Oliver Gierke * @author Oliver Gierke
* @author Christoph Strobl
*/ */
public class MongoAction { public class MongoAction {
@@ -38,8 +39,8 @@ public class MongoAction {
private final WriteConcern defaultWriteConcern; private final WriteConcern defaultWriteConcern;
private final Class<?> entityType; private final Class<?> entityType;
private final MongoActionOperation mongoActionOperation; private final MongoActionOperation mongoActionOperation;
private final DBObject query; private final Document query;
private final DBObject document; private final Document document;
/** /**
* Create an instance of a {@link MongoAction}. * Create an instance of a {@link MongoAction}.
@@ -48,11 +49,11 @@ public class MongoAction {
* @param mongoActionOperation action being taken against the collection * @param mongoActionOperation action being taken against the collection
* @param collectionName the collection name, must not be {@literal null} or empty. * @param collectionName the collection name, must not be {@literal null} or empty.
* @param entityType the POJO that is being operated against * @param entityType the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object * @param document the converted Document from the POJO or Spring Update object
* @param query the converted DBObject from the Spring Query object * @param query the converted Document from the Spring Query object
*/ */
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation, public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation, String collectionName,
String collectionName, Class<?> entityType, DBObject document, DBObject query) { Class<?> entityType, Document document, Document query) {
Assert.hasText(collectionName, "Collection name must not be null or empty!"); Assert.hasText(collectionName, "Collection name must not be null or empty!");
@@ -72,14 +73,6 @@ public class MongoAction {
return defaultWriteConcern; return defaultWriteConcern;
} }
/**
* @deprecated use {@link #getEntityType()} instead.
*/
@Deprecated
public Class<?> getEntityClass() {
return entityType;
}
public Class<?> getEntityType() { public Class<?> getEntityType() {
return entityType; return entityType;
} }
@@ -88,11 +81,11 @@ public class MongoAction {
return mongoActionOperation; return mongoActionOperation;
} }
public DBObject getQuery() { public Document getQuery() {
return query; return query;
} }
public DBObject getDocument() { public Document getDocument() {
return document; return document;
} }
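MongoAction is what a WriteConcernResolver gets to inspect when deciding on a per-operation WriteConcern. A small illustrative resolver (the policy itself is made up):

import org.springframework.data.mongodb.core.MongoAction;
import org.springframework.data.mongodb.core.MongoActionOperation;
import org.springframework.data.mongodb.core.WriteConcernResolver;

import com.mongodb.WriteConcern;

// raises the write concern for bulk operations and keeps the configured default otherwise
class BulkAwareWriteConcernResolver implements WriteConcernResolver {

    @Override
    public WriteConcern resolve(MongoAction action) {
        return MongoActionOperation.BULK.equals(action.getMongoActionOperation()) //
                ? WriteConcern.MAJORITY
                : action.getDefaultWriteConcern();
    }
}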

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,33 +15,30 @@
*/ */
package org.springframework.data.mongodb.core; package org.springframework.data.mongodb.core;
import org.springframework.data.authentication.UserCredentials; import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;
import org.springframework.jmx.export.annotation.ManagedOperation; import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource; import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.util.Assert; import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
/** /**
* Mongo server administration exposed via JMX annotations * Mongo server administration exposed via JMX annotations
* *
* @author Mark Pollack * @author Mark Pollack
* @author Thomas Darimont * @author Thomas Darimont
* @author Mark Paluch * @author Mark Paluch
* @author Christoph Strobl
*/ */
@ManagedResource(description = "Mongo Admin Operations") @ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations { public class MongoAdmin implements MongoAdminOperations {
private final Mongo mongo; private final MongoClient mongoClient;
private String username;
private String password;
private String authenticationDatabaseName;
public MongoAdmin(Mongo mongo) { public MongoAdmin(MongoClient mongoClient) {
Assert.notNull(mongo, "Mongo must not be null!"); Assert.notNull(mongoClient, "MongoClient must not be null!");
this.mongo = mongo; this.mongoClient = mongoClient;
} }
/* (non-Javadoc) /* (non-Javadoc)
@@ -49,7 +46,7 @@ public class MongoAdmin implements MongoAdminOperations {
*/ */
@ManagedOperation @ManagedOperation
public void dropDatabase(String databaseName) { public void dropDatabase(String databaseName) {
getDB(databaseName).dropDatabase(); getDB(databaseName).drop();
} }
/* (non-Javadoc) /* (non-Javadoc)
@@ -65,37 +62,16 @@ public class MongoAdmin implements MongoAdminOperations {
*/ */
@ManagedOperation @ManagedOperation
public String getDatabaseStats(String databaseName) { public String getDatabaseStats(String databaseName) {
return getDB(databaseName).getStats().toString(); return getDB(databaseName).runCommand(new Document("dbStats", 1).append("scale" , 1024)).toJson();
} }
/** @ManagedOperation
* Sets the username to use to connect to the Mongo database public String getServerStatus() {
* return getDB("admin").runCommand(new Document("serverStatus", 1).append("rangeDeleter", 1).append("repl", 1)).toJson();
* @param username The username to use
*/
public void setUsername(String username) {
this.username = username;
} }
/**
* Sets the password to use to authenticate with the Mongo database.
*
* @param password The password to use
*/
public void setPassword(String password) {
this.password = password;
}
/** MongoDatabase getDB(String databaseName) {
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database. return mongoClient.getDatabase(databaseName);
*
* @param authenticationDatabaseName The authenticationDatabaseName to use.
*/
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
this.authenticationDatabaseName = authenticationDatabaseName;
}
DB getDB(String databaseName) {
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
} }
} }
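Since MongoDatabase exposes no getStats(), MongoAdmin now gathers the same information via the dbStats command. The equivalent raw driver call looks roughly like this (the database name is illustrative):

import org.bson.Document;

import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;

class DbStatsSketch {

    public static void main(String[] args) {

        MongoDatabase database = new MongoClient().getDatabase("test");

        // scale: 1024 reports sizes in KB, mirroring the JMX operation above
        Document stats = database.runCommand(new Document("dbStats", 1).append("scale", 1024));
        System.out.println(stats.toJson());
    }
}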

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/* /*
* Copyright 2015 the original author or authors. * Copyright 2015-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -26,7 +26,6 @@ import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.util.CollectionUtils; import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils; import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient; import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions; import com.mongodb.MongoClientOptions;
import com.mongodb.MongoCredential; import com.mongodb.MongoCredential;
@@ -38,7 +37,7 @@ import com.mongodb.ServerAddress;
* @author Christoph Strobl * @author Christoph Strobl
* @since 1.7 * @since 1.7
*/ */
public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator { public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> implements PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator(); private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
@@ -108,8 +107,8 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implement
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType() * @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/ */
public Class<? extends Mongo> getObjectType() { public Class<? extends MongoClient> getObjectType() {
return Mongo.class; return MongoClient.class;
} }
/* /*
@@ -125,7 +124,7 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implement
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance() * @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/ */
@Override @Override
protected Mongo createInstance() throws Exception { protected MongoClient createInstance() throws Exception {
if (mongoClientOptions == null) { if (mongoClientOptions == null) {
mongoClientOptions = MongoClientOptions.builder().build(); mongoClientOptions = MongoClientOptions.builder().build();
@@ -143,7 +142,7 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implement
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object) * @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/ */
@Override @Override
protected void destroyInstance(Mongo instance) throws Exception { protected void destroyInstance(MongoClient instance) throws Exception {
instance.close(); instance.close();
} }
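
With the factory bean now typed to MongoClient, Java-based wiring no longer references com.mongodb.Mongo at all. A hedged sketch of programmatic use follows; the host, port and pool size are example values, and setHost/setPort/setMongoClientOptions are assumed to be the setters backing the fields visible in the hunks above.

import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;

import org.springframework.data.mongodb.core.MongoClientFactoryBean;

public class MongoClientFactoryBeanSample {

	public static void main(String[] args) throws Exception {

		MongoClientFactoryBean factory = new MongoClientFactoryBean();
		factory.setHost("localhost"); // example host
		factory.setPort(27017);
		factory.setMongoClientOptions(MongoClientOptions.builder() //
				.connectionsPerHost(50) // example pool size
				.build());

		factory.afterPropertiesSet(); // triggers createInstance()
		MongoClient client = factory.getObject(); // typed to MongoClient since this change

		System.out.println(client.getDatabase("demo").getName()); // placeholder database
		factory.destroy(); // delegates to destroyInstance(...), closing the client
	}
}
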


@@ -1,11 +1,11 @@
/* /*
* Copyright 2015 the original author or authors. * Copyright 2015-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -30,9 +30,10 @@ import com.mongodb.WriteConcern;
/** /**
* A factory bean for construction of a {@link MongoClientOptions} instance. * A factory bean for construction of a {@link MongoClientOptions} instance.
* *
* @author Christoph Strobl * @author Christoph Strobl
* @author Oliver Gierke * @author Oliver Gierke
* @author Mark Paluch
* @since 1.7 * @since 1.7
*/ */
public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClientOptions> { public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClientOptions> {
@@ -62,13 +63,14 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private int heartbeatConnectTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatConnectTimeout(); private int heartbeatConnectTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatConnectTimeout();
private int heartbeatSocketTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatSocketTimeout(); private int heartbeatSocketTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatSocketTimeout();
private String requiredReplicaSetName = DEFAULT_MONGO_OPTIONS.getRequiredReplicaSetName(); private String requiredReplicaSetName = DEFAULT_MONGO_OPTIONS.getRequiredReplicaSetName();
private int serverSelectionTimeout = DEFAULT_MONGO_OPTIONS.getServerSelectionTimeout();
private boolean ssl; private boolean ssl;
private SSLSocketFactory sslSocketFactory; private SSLSocketFactory sslSocketFactory;
/** /**
* Set the {@link MongoClient} description. * Set the {@link MongoClient} description.
* *
* @param description * @param description
*/ */
public void setDescription(String description) { public void setDescription(String description) {
@@ -77,7 +79,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the minimum number of connections per host. * Set the minimum number of connections per host.
* *
* @param minConnectionsPerHost * @param minConnectionsPerHost
*/ */
public void setMinConnectionsPerHost(int minConnectionsPerHost) { public void setMinConnectionsPerHost(int minConnectionsPerHost) {
@@ -87,7 +89,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the number of connections allowed per host. Will block if run out. Default is 10. System property * Set the number of connections allowed per host. Will block if run out. Default is 10. System property
* {@code MONGO.POOLSIZE} can override * {@code MONGO.POOLSIZE} can override
* *
* @param connectionsPerHost * @param connectionsPerHost
*/ */
public void setConnectionsPerHost(int connectionsPerHost) { public void setConnectionsPerHost(int connectionsPerHost) {
@@ -98,7 +100,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
* Set the multiplier for connectionsPerHost for # of threads that can block. Default is 5. If connectionsPerHost is * Set the multiplier for connectionsPerHost for # of threads that can block. Default is 5. If connectionsPerHost is
* 10, and threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block more than that and an * 10, and threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block more than that and an
* exception will be thrown. * exception will be thrown.
* *
* @param threadsAllowedToBlockForConnectionMultiplier * @param threadsAllowedToBlockForConnectionMultiplier
*/ */
public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) { public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
@@ -107,7 +109,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the max wait time of a blocking thread for a connection. Default is 120000 ms (2 minutes) * Set the max wait time of a blocking thread for a connection. Default is 120000 ms (2 minutes)
* *
* @param maxWaitTime * @param maxWaitTime
*/ */
public void setMaxWaitTime(int maxWaitTime) { public void setMaxWaitTime(int maxWaitTime) {
@@ -116,7 +118,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* The maximum idle time for a pooled connection. * The maximum idle time for a pooled connection.
* *
* @param maxConnectionIdleTime * @param maxConnectionIdleTime
*/ */
public void setMaxConnectionIdleTime(int maxConnectionIdleTime) { public void setMaxConnectionIdleTime(int maxConnectionIdleTime) {
@@ -125,7 +127,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the maximum life time for a pooled connection. * Set the maximum life time for a pooled connection.
* *
* @param maxConnectionLifeTime * @param maxConnectionLifeTime
*/ */
public void setMaxConnectionLifeTime(int maxConnectionLifeTime) { public void setMaxConnectionLifeTime(int maxConnectionLifeTime) {
@@ -134,7 +136,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the connect timeout in milliseconds. 0 is default and infinite. * Set the connect timeout in milliseconds. 0 is default and infinite.
* *
* @param connectTimeout * @param connectTimeout
*/ */
public void setConnectTimeout(int connectTimeout) { public void setConnectTimeout(int connectTimeout) {
@@ -143,7 +145,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the socket timeout. 0 is default and infinite. * Set the socket timeout. 0 is default and infinite.
* *
* @param socketTimeout * @param socketTimeout
*/ */
public void setSocketTimeout(int socketTimeout) { public void setSocketTimeout(int socketTimeout) {
@@ -152,7 +154,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the keep alive flag, controls whether or not to have socket keep alive timeout. Defaults to false. * Set the keep alive flag, controls whether or not to have socket keep alive timeout. Defaults to false.
* *
* @param socketKeepAlive * @param socketKeepAlive
*/ */
public void setSocketKeepAlive(boolean socketKeepAlive) { public void setSocketKeepAlive(boolean socketKeepAlive) {
@@ -161,7 +163,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the {@link ReadPreference}. * Set the {@link ReadPreference}.
* *
* @param readPreference * @param readPreference
*/ */
public void setReadPreference(ReadPreference readPreference) { public void setReadPreference(ReadPreference readPreference) {
@@ -171,7 +173,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the {@link WriteConcern} that will be the default value used when asking the {@link MongoDbFactory} for a DB * Set the {@link WriteConcern} that will be the default value used when asking the {@link MongoDbFactory} for a DB
* object. * object.
* *
* @param writeConcern * @param writeConcern
*/ */
public void setWriteConcern(WriteConcern writeConcern) { public void setWriteConcern(WriteConcern writeConcern) {
@@ -187,7 +189,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the frequency that the driver will attempt to determine the current state of each server in the cluster. * Set the frequency that the driver will attempt to determine the current state of each server in the cluster.
* *
* @param heartbeatFrequency * @param heartbeatFrequency
*/ */
public void setHeartbeatFrequency(int heartbeatFrequency) { public void setHeartbeatFrequency(int heartbeatFrequency) {
@@ -197,7 +199,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* In the event that the driver has to frequently re-check a server's availability, it will wait at least this long * In the event that the driver has to frequently re-check a server's availability, it will wait at least this long
* since the previous check to avoid wasted effort. * since the previous check to avoid wasted effort.
* *
* @param minHeartbeatFrequency * @param minHeartbeatFrequency
*/ */
public void setMinHeartbeatFrequency(int minHeartbeatFrequency) { public void setMinHeartbeatFrequency(int minHeartbeatFrequency) {
@@ -206,7 +208,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the connect timeout for connections used for the cluster heartbeat. * Set the connect timeout for connections used for the cluster heartbeat.
* *
* @param heartbeatConnectTimeout * @param heartbeatConnectTimeout
*/ */
public void setHeartbeatConnectTimeout(int heartbeatConnectTimeout) { public void setHeartbeatConnectTimeout(int heartbeatConnectTimeout) {
@@ -215,7 +217,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the socket timeout for connections used for the cluster heartbeat. * Set the socket timeout for connections used for the cluster heartbeat.
* *
* @param heartbeatSocketTimeout * @param heartbeatSocketTimeout
*/ */
public void setHeartbeatSocketTimeout(int heartbeatSocketTimeout) { public void setHeartbeatSocketTimeout(int heartbeatSocketTimeout) {
@@ -224,7 +226,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Configures the name of the replica set. * Configures the name of the replica set.
* *
* @param requiredReplicaSetName * @param requiredReplicaSetName
*/ */
public void setRequiredReplicaSetName(String requiredReplicaSetName) { public void setRequiredReplicaSetName(String requiredReplicaSetName) {
@@ -233,7 +235,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* This controls whether the driver should use an SSL connection. Defaults to {@literal false}. * This controls whether the driver should use an SSL connection. Defaults to {@literal false}.
* *
* @param ssl * @param ssl
*/ */
public void setSsl(boolean ssl) { public void setSsl(boolean ssl) {
@@ -243,22 +245,35 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/** /**
* Set the {@link SSLSocketFactory} to use for the {@literal SSL} connection. If none is configured here, * Set the {@link SSLSocketFactory} to use for the {@literal SSL} connection. If none is configured here,
* {@link SSLSocketFactory#getDefault()} will be used. * {@link SSLSocketFactory#getDefault()} will be used.
* *
* @param sslSocketFactory * @param sslSocketFactory
*/ */
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) { public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
this.sslSocketFactory = sslSocketFactory; this.sslSocketFactory = sslSocketFactory;
this.ssl = sslSocketFactory != null;
} }
/* /**
* Set the {@literal server selection timeout} in msec for a 3.x MongoDB Java driver. If not set the default value of
* 30 sec will be used. A value of 0 means that it will timeout immediately if no server is available. A negative
* value means to wait indefinitely.
*
* @param serverSelectionTimeout in msec.
*/
public void setServerSelectionTimeout(int serverSelectionTimeout) {
this.serverSelectionTimeout = serverSelectionTimeout;
}
/*
* (non-Javadoc) * (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance() * @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/ */
@Override @Override
protected MongoClientOptions createInstance() throws Exception { protected MongoClientOptions createInstance() throws Exception {
SocketFactory socketFactoryToUse = ssl ? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory SocketFactory socketFactoryToUse = ssl
.getDefault()) : this.socketFactory; ? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault()) : this.socketFactory;
return MongoClientOptions.builder() // return MongoClientOptions.builder() //
.alwaysUseMBeans(this.alwaysUseMBeans) // .alwaysUseMBeans(this.alwaysUseMBeans) //
@@ -278,6 +293,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
.minHeartbeatFrequency(minHeartbeatFrequency) // .minHeartbeatFrequency(minHeartbeatFrequency) //
.readPreference(readPreference) // .readPreference(readPreference) //
.requiredReplicaSetName(requiredReplicaSetName) // .requiredReplicaSetName(requiredReplicaSetName) //
.serverSelectionTimeout(serverSelectionTimeout) //
.socketFactory(socketFactoryToUse) // .socketFactory(socketFactoryToUse) //
.socketKeepAlive(socketKeepAlive) // .socketKeepAlive(socketKeepAlive) //
.socketTimeout(socketTimeout) // .socketTimeout(socketTimeout) //
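
The new serverSelectionTimeout property feeds straight into MongoClientOptions.Builder#serverSelectionTimeout(int). A small sketch using the factory bean programmatically; the timeout values are examples only, not recommendations.

import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.ServerAddress;

import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;

public class MongoClientOptionsSample {

	public static void main(String[] args) throws Exception {

		MongoClientOptionsFactoryBean factory = new MongoClientOptionsFactoryBean();
		factory.setServerSelectionTimeout(5000); // example: fail fast if no server is reachable
		factory.setConnectTimeout(2000);         // example value
		factory.setSsl(false);

		factory.afterPropertiesSet();
		MongoClientOptions options = factory.getObject();

		MongoClient client = new MongoClient(new ServerAddress("localhost"), options); // placeholder host
		client.close();
	}
}
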


@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,216 +0,0 @@
/*
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the
* framework.
*
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @author Randy Watler
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
*/
public abstract class MongoDbUtils {
private static final Logger LOGGER = LoggerFactory.getLogger(MongoDbUtils.class);
/**
* Private constructor to prevent instantiation.
*/
private MongoDbUtils() {}
/**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
*
* @param mongo the {@link Mongo} instance, must not be {@literal null}.
* @param databaseName the database name, must not be {@literal null} or empty.
* @return the {@link DB} connection
*/
public static DB getDB(Mongo mongo, String databaseName) {
return doGetDB(mongo, databaseName, UserCredentials.NO_CREDENTIALS, true, databaseName);
}
/**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
*
* @param mongo the {@link Mongo} instance, must not be {@literal null}.
* @param databaseName the database name, must not be {@literal null} or empty.
* @param credentials the credentials to use, must not be {@literal null}.
* @return the {@link DB} connection
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials) {
return getDB(mongo, databaseName, credentials, databaseName);
}
/**
* @param mongo
* @param databaseName
* @param credentials
* @param authenticationDatabaseName
* @return
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
Assert.notNull(mongo, "No Mongo instance specified!");
Assert.hasText(databaseName, "Database name must be given!");
Assert.notNull(credentials, "Credentials must not be null, use UserCredentials.NO_CREDENTIALS!");
Assert.hasText(authenticationDatabaseName, "Authentication database name must not be null or empty!");
return doGetDB(mongo, databaseName, credentials, true, authenticationDatabaseName);
}
private static DB doGetDB(Mongo mongo, String databaseName, UserCredentials credentials, boolean allowCreate,
String authenticationDatabaseName) {
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
// Do we have a populated holder and TX sync active?
if (dbHolder != null && !dbHolder.isEmpty() && TransactionSynchronizationManager.isSynchronizationActive()) {
DB db = dbHolder.getDB(databaseName);
// DB found but not yet synchronized
if (db != null && !dbHolder.isSynchronizedWithTransaction()) {
LOGGER.debug("Registering Spring transaction synchronization for existing MongoDB {}.", databaseName);
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(dbHolder, mongo));
dbHolder.setSynchronizedWithTransaction(true);
}
if (db != null) {
return db;
}
}
// Lookup fresh database instance
LOGGER.debug("Getting Mongo Database name=[{}]", databaseName);
DB db = mongo.getDB(databaseName);
if (!(mongo instanceof MongoClient) && requiresAuthDbAuthentication(credentials)) {
ReflectiveDbInvoker.authenticate(mongo, db, credentials, authenticationDatabaseName);
}
// TX sync active, bind new database to thread
if (TransactionSynchronizationManager.isSynchronizationActive()) {
LOGGER.debug("Registering Spring transaction synchronization for MongoDB instance {}.", databaseName);
DbHolder holderToUse = dbHolder;
if (holderToUse == null) {
holderToUse = new DbHolder(databaseName, db);
} else {
holderToUse.addDB(databaseName, db);
}
// synchronize holder only if not yet synchronized
if (!holderToUse.isSynchronizedWithTransaction()) {
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
}
if (holderToUse != dbHolder) {
TransactionSynchronizationManager.bindResource(mongo, holderToUse);
}
}
// Check whether we are allowed to return the DB.
if (!allowCreate && !isDBTransactional(db, mongo)) {
throw new IllegalStateException("No Mongo DB bound to thread, "
+ "and configuration does not allow creation of non-transactional one here");
}
return db;
}
/**
* Return whether the given DB instance is transactional, that is, bound to the current thread by Spring's transaction
* facilities.
*
* @param db the DB to check
* @param mongo the Mongo instance that the DB was created with (may be <code>null</code>)
* @return whether the DB is transactional
*/
public static boolean isDBTransactional(DB db, Mongo mongo) {
if (mongo == null) {
return false;
}
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
return dbHolder != null && dbHolder.containsDB(db);
}
/**
* Perform actual closing of the Mongo DB object, catching and logging any cleanup exceptions thrown.
*
* @param db the DB to close (may be <code>null</code>)
* @deprecated since 1.7. The main use case for this method is to ensure that applications can read their own
* unacknowledged writes, but this is no longer so prevalent since the MongoDB Java driver version 3
* started defaulting to acknowledged writes.
*/
@Deprecated
public static void closeDB(DB db) {
if (db != null) {
LOGGER.debug("Closing Mongo DB object");
try {
ReflectiveDbInvoker.requestDone(db);
} catch (Throwable ex) {
LOGGER.debug("Unexpected exception on closing Mongo DB object", ex);
}
}
}
/**
* Check if credentials present. In case we're using a mongo-java-driver version 3 or above we do not have the need
* for authentication as the auth data has to be provided within the MongoClient
*
* @param credentials
* @return
*/
private static boolean requiresAuthDbAuthentication(UserCredentials credentials) {
if (credentials == null || !credentials.hasUsername()) {
return false;
}
return !MongoClientVersion.isMongo3Driver();
}
}
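
With MongoDbUtils removed, the database handle comes straight from the client, and credentials travel with the MongoClient as MongoCredential instances instead of being applied per getDB(...) call. A rough replacement sketch; user, password, host and database name are placeholders.

import java.util.Collections;

import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import com.mongodb.client.MongoDatabase;

public class GetDatabaseSample {

	public static void main(String[] args) {

		// placeholder credentials; previously passed to MongoDbUtils.getDB(...) on every call
		MongoCredential credential = MongoCredential.createCredential("user", "admin", "secret".toCharArray());
		MongoClient client = new MongoClient(new ServerAddress("localhost"),
				Collections.singletonList(credential));

		// replaces MongoDbUtils.getDB(mongo, "demo", credentials, "admin")
		MongoDatabase database = client.getDatabase("demo");
		System.out.println(database.getName());
		client.close();
	}
}
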


@@ -1,11 +1,11 @@
/* /*
* Copyright 2010-2015 the original author or authors. * Copyright 2010-2016 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -19,6 +19,7 @@ import java.util.Arrays;
import java.util.HashSet; import java.util.HashSet;
import java.util.Set; import java.util.Set;
import org.bson.BsonInvalidOperationException;
import org.springframework.dao.DataAccessException; import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException; import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException; import org.springframework.dao.DataIntegrityViolationException;
@@ -33,7 +34,10 @@ import org.springframework.data.mongodb.util.MongoDbErrorCodes;
import org.springframework.util.ClassUtils; import org.springframework.util.ClassUtils;
import com.mongodb.BulkWriteException; import com.mongodb.BulkWriteException;
import com.mongodb.MongoBulkWriteException;
import com.mongodb.MongoException; import com.mongodb.MongoException;
import com.mongodb.MongoServerException;
import com.mongodb.bulk.BulkWriteError;
/** /**
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate * Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
@@ -57,7 +61,7 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
Arrays.asList("MongoInternalException")); Arrays.asList("MongoInternalException"));
private static final Set<String> DATA_INTEGRETY_EXCEPTIONS = new HashSet<String>( private static final Set<String> DATA_INTEGRETY_EXCEPTIONS = new HashSet<String>(
Arrays.asList("WriteConcernException")); Arrays.asList("WriteConcernException", "MongoWriteException", "MongoBulkWriteException"));
/* /*
* (non-Javadoc) * (non-Javadoc)
@@ -67,6 +71,10 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
// Check for well-known MongoException subclasses. // Check for well-known MongoException subclasses.
if (ex instanceof BsonInvalidOperationException) {
throw new InvalidDataAccessApiUsageException(ex.getMessage(), ex);
}
String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass())); String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
if (DULICATE_KEY_EXCEPTIONS.contains(exception)) { if (DULICATE_KEY_EXCEPTIONS.contains(exception)) {
@@ -82,6 +90,20 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
} }
if (DATA_INTEGRETY_EXCEPTIONS.contains(exception)) { if (DATA_INTEGRETY_EXCEPTIONS.contains(exception)) {
if (ex instanceof MongoServerException) {
if (((MongoServerException) ex).getCode() == 11000) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
if (ex instanceof MongoBulkWriteException) {
for (BulkWriteError x : ((MongoBulkWriteException) ex).getWriteErrors()) {
if (x.getCode() == 11000) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
}
}
}
return new DataIntegrityViolationException(ex.getMessage(), ex); return new DataIntegrityViolationException(ex.getMessage(), ex);
} }
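
The widened DATA_INTEGRETY_EXCEPTIONS set means the 3.x write exceptions are now translated, and server error code 11000 maps to Spring's DuplicateKeyException. A sketch that exercises the translator directly; the WriteError content is fabricated for illustration and assumes the public MongoWriteException(WriteError, ServerAddress) constructor of the 3.x driver.

import org.bson.BsonDocument;

import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;

import com.mongodb.MongoWriteException;
import com.mongodb.ServerAddress;
import com.mongodb.WriteError;

public class TranslationSample {

	public static void main(String[] args) {

		MongoExceptionTranslator translator = new MongoExceptionTranslator();

		// fabricated E11000 duplicate key error, as the 3.x driver would raise it
		MongoWriteException duplicateKey = new MongoWriteException(
				new WriteError(11000, "E11000 duplicate key error", new BsonDocument()), new ServerAddress());

		DataAccessException translated = translator.translateExceptionIfPossible(duplicateKey);
		System.out.println(translated.getClass().getSimpleName()); // expected: DuplicateKeyException
	}
}
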


@@ -1,199 +0,0 @@
/*
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoOptions;
import com.mongodb.ServerAddress;
import com.mongodb.WriteConcern;
/**
* Convenient factory for configuring MongoDB.
*
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
* @deprecated since 1.7. Please use {@link MongoClientFactoryBean} instead.
*/
@Deprecated
public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private MongoOptions mongoOptions;
private String host;
private Integer port;
private WriteConcern writeConcern;
private List<ServerAddress> replicaSetSeeds;
private List<ServerAddress> replicaPair;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/**
* @param mongoOptions
*/
public void setMongoOptions(MongoOptions mongoOptions) {
this.mongoOptions = mongoOptions;
}
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
}
/**
* @deprecated use {@link #setReplicaSetSeeds(ServerAddress[])} instead
* @param replicaPair
*/
@Deprecated
public void setReplicaPair(ServerAddress[] replicaPair) {
this.replicaPair = filterNonNullElementsAsList(replicaPair);
}
/**
* Configures the host to connect to.
*
* @param host
*/
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
/**
* Sets the {@link WriteConcern} to be configured for the {@link Mongo} instance to be created.
*
* @param writeConcern
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends Mongo> getObjectType() {
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected Mongo createInstance() throws Exception {
Mongo mongo;
ServerAddress defaultOptions = new ServerAddress();
if (mongoOptions == null) {
mongoOptions = new MongoOptions();
}
if (!isNullOrEmpty(replicaPair)) {
if (replicaPair.size() < 2) {
throw new CannotGetMongoDbConnectionException("A replica pair must have two server entries");
}
mongo = new Mongo(replicaPair.get(0), replicaPair.get(1), mongoOptions);
} else if (!isNullOrEmpty(replicaSetSeeds)) {
mongo = new Mongo(replicaSetSeeds, mongoOptions);
} else {
String mongoHost = StringUtils.hasText(host) ? host : defaultOptions.getHost();
mongo = port != null ? new Mongo(new ServerAddress(mongoHost, port), mongoOptions) : new Mongo(mongoHost,
mongoOptions);
}
if (writeConcern != null) {
mongo.setWriteConcern(writeConcern);
}
return mongo;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override
protected void destroyInstance(Mongo mongo) throws Exception {
mongo.close();
}
private static boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
* @param elements the elements to filter <T>
* @return a new unmodifiable {@link List#} from the given elements without nulls
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
}
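
The deprecated MongoFactoryBean and its MongoOptions wiring are gone; equivalent setups go through MongoClient plus MongoClientOptions (or the MongoClientFactoryBean shown earlier). A hedged sketch of the replica-set-seed case; node names, port and write concern are example values.

import java.util.Arrays;

import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.ServerAddress;
import com.mongodb.WriteConcern;

public class ReplicaSetClientSample {

	public static void main(String[] args) {

		// replaces MongoFactoryBean#setReplicaSetSeeds(...) plus setWriteConcern(...)
		MongoClientOptions options = MongoClientOptions.builder() //
				.writeConcern(WriteConcern.MAJORITY) // example write concern
				.build();

		MongoClient client = new MongoClient(
				Arrays.asList(new ServerAddress("node1", 27017), new ServerAddress("node2", 27017)), options); // example seeds

		System.out.println(client.getMongoClientOptions().getWriteConcern());
		client.close();
	}
}
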


@@ -1,11 +1,11 @@
/* /*
* Copyright 2011-2015 the original author or authors. * Copyright 2011-2017 the original author or authors.
* *
* Licensed under the Apache License, Version 2.0 (the "License"); * Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,
@@ -19,9 +19,11 @@ import java.util.Collection;
import java.util.List; import java.util.List;
import java.util.Set; import java.util.Set;
import org.bson.Document;
import org.springframework.data.geo.GeoResults; import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode; import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.aggregation.Aggregation; import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.AggregationResults; import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation; import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter; import org.springframework.data.mongodb.core.convert.MongoConverter;
@@ -36,13 +38,11 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update; import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.CloseableIterator; import org.springframework.data.util.CloseableIterator;
import com.mongodb.CommandResult;
import com.mongodb.Cursor; import com.mongodb.Cursor;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.ReadPreference; import com.mongodb.ReadPreference;
import com.mongodb.WriteResult; import com.mongodb.client.MongoCollection;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
/** /**
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but * Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but
@@ -56,6 +56,7 @@ import com.mongodb.WriteResult;
* @author Chuong Ngo * @author Chuong Ngo
* @author Christoph Strobl * @author Christoph Strobl
* @author Thomas Darimont * @author Thomas Darimont
* @author Maninder Singh
*/ */
public interface MongoOperations { public interface MongoOperations {
@@ -69,12 +70,12 @@ public interface MongoOperations {
/** /**
* Execute a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the * Execute a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* MongoDB driver to convert the JSON string to a DBObject. Any errors that result from executing this command will be * MongoDB driver to convert the JSON string to a Document. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy. * converted into Spring's DAO exception hierarchy.
* *
* @param jsonCommand a MongoDB command expressed as a JSON string. * @param jsonCommand a MongoDB command expressed as a JSON string.
*/ */
CommandResult executeCommand(String jsonCommand); Document executeCommand(String jsonCommand);
/** /**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO * Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
@@ -82,19 +83,7 @@ public interface MongoOperations {
* *
* @param command a MongoDB command * @param command a MongoDB command
*/ */
CommandResult executeCommand(DBObject command); Document executeCommand(Document command);
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy.
*
* @param command a MongoDB command
* @param options query options to use
* @deprecated since 1.7. Please use {@link #executeCommand(DBObject, ReadPreference)}, as the MongoDB Java driver
* version 3 no longer supports this operation.
*/
@Deprecated
CommandResult executeCommand(DBObject command, int options);
/** /**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's data * Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's data
@@ -105,7 +94,7 @@ public interface MongoOperations {
* @return * @return
* @since 1.7 * @since 1.7
*/ */
CommandResult executeCommand(DBObject command, ReadPreference readPreference); Document executeCommand(Document command, ReadPreference readPreference);
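
executeCommand(...) now accepts and returns org.bson.Document instead of DBObject/CommandResult, and the options-based variant is gone. A minimal usage sketch; the MongoTemplate construction and the database name are placeholders.

import org.bson.Document;

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.MongoClient;
import com.mongodb.ReadPreference;

public class ExecuteCommandSample {

	public static void main(String[] args) {

		MongoOperations ops = new MongoTemplate(new MongoClient(), "demo"); // placeholder database

		// JSON string variant, parsed via the driver's JSON support
		Document buildInfo = ops.executeCommand("{ buildInfo: 1 }");

		// Document variant, optionally routed through a ReadPreference
		Document stats = ops.executeCommand(new Document("dbStats", 1), ReadPreference.primaryPreferred());

		System.out.println(buildInfo.toJson());
		System.out.println(stats.toJson());
	}
}
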
/** /**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler. * Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
@@ -152,22 +141,6 @@ public interface MongoOperations {
*/ */
<T> T execute(String collectionName, CollectionCallback<T> action); <T> T execute(String collectionName, CollectionCallback<T> action);
/**
* Executes the given {@link DbCallback} within the same connection to the database so as to ensure consistency in a
* write heavy environment where you may read the data that you wrote. See the comments on {@see <a
* href=http://www.mongodb.org/display/DOCS/Java+Driver+Concurrency>Java Driver Concurrency</a>}
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param <T> return type
* @param action callback that specified the MongoDB actions to perform on the DB instance
* @return a result object returned by the action or <tt>null</tt>
* @deprecated since 1.7 as the MongoDB Java driver version 3 does not longer support request boundaries via
* {@link DB#requestStart()} and {@link DB#requestDone()}.
*/
@Deprecated
<T> T executeInSession(DbCallback<T> action);
/** /**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB * Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB
* {@link Cursor}. * {@link Cursor}.
@@ -175,20 +148,35 @@ public interface MongoOperations {
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} that needs to be closed. * Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} that needs to be closed.
* *
* @param <T> element return type * @param <T> element return type
* @param query * @param query must not be {@literal null}.
* @param entityType * @param entityType must not be {@literal null}.
* @return * @return will never be {@literal null}.
* @since 1.7 * @since 1.7
*/ */
<T> CloseableIterator<T> stream(Query query, Class<T> entityType); <T> CloseableIterator<T> stream(Query query, Class<T> entityType);
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} and collection backed
* by a Mongo DB {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} that needs to be closed.
*
* @param <T> element return type
* @param query must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return will never be {@literal null}.
* @since 1.10
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType, String collectionName);
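
The new three-argument stream(...) overload targets an explicit collection. Since CloseableIterator is a Closeable wrapper around the server-side cursor, try-with-resources is the natural way to release it. A sketch against a hypothetical Person type and a hypothetical "people" collection.

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.util.CloseableIterator;

import com.mongodb.MongoClient;

public class StreamSample {

	static class Person { // hypothetical domain type
		String id;
		String lastname;
	}

	public static void main(String[] args) throws Exception {

		MongoOperations ops = new MongoTemplate(new MongoClient(), "demo"); // placeholder database

		Query query = new Query(Criteria.where("lastname").is("Matthews")); // example criteria

		// the iterator wraps a cursor and must be closed when done
		try (CloseableIterator<Person> it = ops.stream(query, Person.class, "people")) {
			while (it.hasNext()) {
				System.out.println(it.next().lastname);
			}
		}
	}
}
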
/** /**
* Create an uncapped collection with a name based on the provided entity class. * Create an uncapped collection with a name based on the provided entity class.
* *
* @param entityClass class that determines the collection to create * @param entityClass class that determines the collection to create
* @return the created collection * @return the created collection
*/ */
<T> DBCollection createCollection(Class<T> entityClass); <T> MongoCollection<Document> createCollection(Class<T> entityClass);
/** /**
* Create a collection with a name based on the provided entity class using the options. * Create a collection with a name based on the provided entity class using the options.
@@ -197,7 +185,7 @@ public interface MongoOperations {
* @param collectionOptions options to use when creating the collection. * @param collectionOptions options to use when creating the collection.
* @return the created collection * @return the created collection
*/ */
<T> DBCollection createCollection(Class<T> entityClass, CollectionOptions collectionOptions); <T> MongoCollection<Document> createCollection(Class<T> entityClass, CollectionOptions collectionOptions);
/** /**
* Create an uncapped collection with the provided name. * Create an uncapped collection with the provided name.
@@ -205,7 +193,7 @@ public interface MongoOperations {
* @param collectionName name of the collection * @param collectionName name of the collection
* @return the created collection * @return the created collection
*/ */
DBCollection createCollection(String collectionName); MongoCollection<Document> createCollection(String collectionName);
/** /**
* Create a collection with the provided name and options. * Create a collection with the provided name and options.
@@ -214,7 +202,7 @@ public interface MongoOperations {
* @param collectionOptions options to use when creating the collection. * @param collectionOptions options to use when creating the collection.
* @return the created collection * @return the created collection
*/ */
DBCollection createCollection(String collectionName, CollectionOptions collectionOptions); MongoCollection<Document> createCollection(String collectionName, CollectionOptions collectionOptions);
/** /**
* A set of collection names. * A set of collection names.
@@ -231,7 +219,7 @@ public interface MongoOperations {
* @param collectionName name of the collection * @param collectionName name of the collection
* @return an existing collection or a newly created one. * @return an existing collection or a newly created one.
*/ */
DBCollection getCollection(String collectionName); MongoCollection<Document> getCollection(String collectionName);
/** /**
* Check to see if a collection with a name indicated by the entity class exists. * Check to see if a collection with a name indicated by the entity class exists.
@@ -294,10 +282,7 @@ public interface MongoOperations {
ScriptOperations scriptOps(); ScriptOperations scriptOps();
/** /**
* Returns a new {@link BulkOperations} for the given collection. <br /> * Returns a new {@link BulkOperations} for the given collection.
* <strong>NOTE:</strong> Any additional support for field mapping, etc. is not available for {@literal update} or
* {@literal remove} operations in bulk mode due to the lack of domain type information. Use
* {@link #bulkOps(BulkMode, Class, String)} to get full type specific support.
* *
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}. * @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty. * @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
@@ -433,6 +418,81 @@ public interface MongoOperations {
*/ */
<O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType); <O> AggregationResults<O> aggregate(Aggregation aggregation, String collectionName, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} that needs to be closed. The raw
* results will be mapped to the given entity class. The name of the inputCollection is derived from the inputType of
* the aggregation.
* <p>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName The name of the input collection to use for the aggregation.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(TypedAggregation<?> aggregation, String collectionName, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link Cursor}.
* <p/>
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} that needs to be closed. The raw
* results will be mapped to the given entity class and are returned as stream. The name of the inputCollection is
* derived from the inputType of the aggregation.
* <p/>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(TypedAggregation<?> aggregation, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link Cursor}.
* <p/>
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} that needs to be closed. The raw
* results will be mapped to the given entity class.
* <p/>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param inputType the inputType where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(Aggregation aggregation, Class<?> inputType, Class<O> outputType);
/**
* Execute an aggregation operation backed by a Mongo DB {@link Cursor}.
* <p/>
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} that needs to be closed. The raw
* results will be mapped to the given entity class.
* <p/>
* Aggregation streaming can't be used with {@link AggregationOptions#isExplain() aggregation explain}. Enabling
* explanation mode will throw an {@link IllegalArgumentException}.
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName the collection where the aggregation operation will read from, must not be {@literal null} or
* empty.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
* @return The results of the aggregation operation.
* @since 2.0
*/
<O> CloseableIterator<O> aggregateStream(Aggregation aggregation, String collectionName, Class<O> outputType);
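
aggregateStream(...) mirrors the existing aggregate(...) overloads but hands back a cursor-backed CloseableIterator instead of materialising every result, and, as noted above, cannot be combined with explain mode. A sketch under the same placeholder setup; the "orders" collection, the customerId field and the projection type are hypothetical.

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.util.CloseableIterator;

import com.mongodb.MongoClient;

public class AggregateStreamSample {

	static class OrdersPerCustomer { // hypothetical projection type
		String id;   // bound to the group key (_id)
		long orders;
	}

	public static void main(String[] args) throws Exception {

		MongoOperations ops = new MongoTemplate(new MongoClient(), "demo"); // placeholder database

		// count orders per customer, streamed rather than collected into a list
		Aggregation aggregation = Aggregation.newAggregation(
				Aggregation.group("customerId").count().as("orders"));

		try (CloseableIterator<OrdersPerCustomer> it = ops.aggregateStream(aggregation, "orders", OrdersPerCustomer.class)) {
			while (it.hasNext()) {
				OrdersPerCustomer row = it.next();
				System.out.println(row.id + " -> " + row.orders);
			}
		}
	}
}
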
/** /**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE * Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
* *
@@ -549,9 +609,7 @@ public interface MongoOperations {
<T> T findOne(Query query, Class<T> entityClass, String collectionName); <T> T findOne(Query query, Class<T> entityClass, String collectionName);
/** /**
* Determine whether the result of the given {@link Query} contains at least one element. <br /> * Determine whether the result of the given {@link Query} contains at least one element.
* <strong>NOTE:</strong> Any additional support for query/field mapping, etc. is not available due to the lack of
* domain type information. Use {@link #exists(Query, Class, String)} to get full type specific support.
* *
* @param query the {@link Query} class that specifies the criteria used to find a record. * @param query the {@link Query} class that specifies the criteria used to find a record.
* @param collectionName name of the collection to check for objects. * @param collectionName name of the collection to check for objects.
@@ -634,8 +692,8 @@ public interface MongoOperations {
<T> T findById(Object id, Class<T> entityClass, String collectionName); <T> T findById(Object id, Class<T> entityClass, String collectionName);
/** /**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify * Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}. * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* *
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional * @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification. * fields specification.
@@ -646,8 +704,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass); <T> T findAndModify(Query query, Update update, Class<T> entityClass);
/** /**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify * Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}. * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* *
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional * @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification. * fields specification.
@@ -659,8 +717,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName); <T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/** /**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify * Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account. * {@link FindAndModifyOptions} into account.
* *
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional * @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -673,8 +731,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass); <T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
/** /**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify * Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking * to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account. * {@link FindAndModifyOptions} into account.
* *
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional * @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -735,7 +793,7 @@ public interface MongoOperations {
/** /**
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query} * Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references * must solely consist of document field references as we lack type information to map potential property references
* onto document fields. Use {@link #count(Query, Class, String)} to get full type specific support. * onto document fields. To make sure the query gets mapped, use {@link #count(Query, Class, String)}.
* *
* @param query * @param query
* @param collectionName must not be {@literal null} or empty. * @param collectionName must not be {@literal null} or empty.
@@ -854,13 +912,11 @@ public interface MongoOperations {
* @param entityClass class that determines the collection to use * @param entityClass class that determines the collection to use
* @return the WriteResult which lets you access the results of the previous write. * @return the WriteResult which lets you access the results of the previous write.
*/ */
WriteResult upsert(Query query, Update update, Class<?> entityClass); UpdateResult upsert(Query query, Update update, Class<?> entityClass);
/** /**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by * Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document. <br /> * combining the query document and the update document.
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #upsert(Query, Update, Class, String)} to get full type specific support.
* *
* @param query the query document that specifies the criteria used to select a record to be updated * @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing * @param update the update document that contains the updated object or $ operators to manipulate the existing
@@ -868,7 +924,7 @@ public interface MongoOperations {
* @param collectionName name of the collection to update the object in * @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write. * @return the WriteResult which lets you access the results of the previous write.
*/ */
WriteResult upsert(Query query, Update update, String collectionName); UpdateResult upsert(Query query, Update update, String collectionName);
/** /**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by * Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
@@ -880,7 +936,7 @@ public interface MongoOperations {
* @param collectionName name of the collection to update the object in * @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write. * @return the WriteResult which lets you access the results of the previous write.
*/ */
WriteResult upsert(Query query, Update update, Class<?> entityClass, String collectionName); UpdateResult upsert(Query query, Update update, Class<?> entityClass, String collectionName);
/** /**
* Updates the first object that is found in the collection of the entity class that matches the query document with * Updates the first object that is found in the collection of the entity class that matches the query document with
@@ -892,13 +948,11 @@ public interface MongoOperations {
* @param entityClass class that determines the collection to use * @param entityClass class that determines the collection to use
* @return the WriteResult which lets you access the results of the previous write. * @return the WriteResult which lets you access the results of the previous write.
*/ */
WriteResult updateFirst(Query query, Update update, Class<?> entityClass); UpdateResult updateFirst(Query query, Update update, Class<?> entityClass);
/** /**
* Updates the first object that is found in the specified collection that matches the query document criteria with * Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document. <br /> * the provided updated document.
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateFirst(Query, Update, Class, String)} to get full type specific support.
* *
* @param query the query document that specifies the criteria used to select a record to be updated * @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing * @param update the update document that contains the updated object or $ operators to manipulate the existing
@@ -906,7 +960,7 @@ public interface MongoOperations {
* @param collectionName name of the collection to update the object in * @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write. * @return the WriteResult which lets you access the results of the previous write.
*/ */
WriteResult updateFirst(Query query, Update update, String collectionName); UpdateResult updateFirst(Query query, Update update, String collectionName);
/** /**
* Updates the first object that is found in the specified collection that matches the query document criteria with * Updates the first object that is found in the specified collection that matches the query document criteria with
@@ -919,7 +973,7 @@ public interface MongoOperations {
* @param collectionName name of the collection to update the object in * @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write. * @return the WriteResult which lets you access the results of the previous write.
*/ */
WriteResult updateFirst(Query query, Update update, Class<?> entityClass, String collectionName); UpdateResult updateFirst(Query query, Update update, Class<?> entityClass, String collectionName);
/** /**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria * Updates all objects that are found in the collection for the entity class that matches the query document criteria
@@ -931,13 +985,11 @@ public interface MongoOperations {
* @param entityClass class that determines the collection to use * @param entityClass class that determines the collection to use
* @return the WriteResult which lets you access the results of the previous write. * @return the WriteResult which lets you access the results of the previous write.
*/ */
WriteResult updateMulti(Query query, Update update, Class<?> entityClass); UpdateResult updateMulti(Query query, Update update, Class<?> entityClass);
/** /**
* Updates all objects that are found in the specified collection that matches the query document criteria with the * Updates all objects that are found in the specified collection that matches the query document criteria with the
* provided updated document. <br /> * provided updated document.
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateMulti(Query, Update, Class, String)} to get full type specific support.
* *
* @param query the query document that specifies the criteria used to select a record to be updated * @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing * @param update the update document that contains the updated object or $ operators to manipulate the existing
@@ -945,7 +997,7 @@ public interface MongoOperations {
* @param collectionName name of the collection to update the object in * @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write. * @return the WriteResult which lets you access the results of the previous write.
*/ */
WriteResult updateMulti(Query query, Update update, String collectionName); UpdateResult updateMulti(Query query, Update update, String collectionName);
/** /**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria * Updates all objects that are found in the collection for the entity class that matches the query document criteria
@@ -958,14 +1010,14 @@ public interface MongoOperations {
* @param collectionName name of the collection to update the object in * @param collectionName name of the collection to update the object in
* @return the WriteResult which lets you access the results of the previous write. * @return the WriteResult which lets you access the results of the previous write.
*/ */
WriteResult updateMulti(final Query query, final Update update, Class<?> entityClass, String collectionName); UpdateResult updateMulti(final Query query, final Update update, Class<?> entityClass, String collectionName);
/** /**
* Remove the given object from the collection by id. * Remove the given object from the collection by id.
* *
* @param object * @param object
*/ */
WriteResult remove(Object object); DeleteResult remove(Object object);
/** /**
* Removes the given object from the given collection. * Removes the given object from the given collection.
@@ -973,7 +1025,7 @@ public interface MongoOperations {
* @param object * @param object
* @param collection must not be {@literal null} or empty. * @param collection must not be {@literal null} or empty.
*/ */
WriteResult remove(Object object, String collection); DeleteResult remove(Object object, String collection);
/** /**
* Remove all documents that match the provided query document criteria from the collection used to store the * Remove all documents that match the provided query document criteria from the collection used to store the
@@ -982,7 +1034,7 @@ public interface MongoOperations {
* @param query * @param query
* @param entityClass * @param entityClass
*/ */
WriteResult remove(Query query, Class<?> entityClass); DeleteResult remove(Query query, Class<?> entityClass);
/** /**
* Remove all documents that match the provided query document criteria from the collection used to store the * Remove all documents that match the provided query document criteria from the collection used to store the
@@ -992,26 +1044,22 @@ public interface MongoOperations {
* @param entityClass * @param entityClass
* @param collectionName * @param collectionName
*/ */
WriteResult remove(Query query, Class<?> entityClass, String collectionName); DeleteResult remove(Query query, Class<?> entityClass, String collectionName);
/** /**
* Remove all documents from the specified collection that match the provided query document criteria. There is no * Remove all documents from the specified collection that match the provided query document criteria. There is no
* conversion/mapping done for any criteria using the id field. <br /> * conversion/mapping done for any criteria using the id field.
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #remove(Query, Class, String)} to get full type specific support.
* *
* @param query the query document that specifies the criteria used to remove a record * @param query the query document that specifies the criteria used to remove a record
* @param collectionName name of the collection where the objects will be removed * @param collectionName name of the collection where the objects will be removed
*/ */
WriteResult remove(Query query, String collectionName); DeleteResult remove(Query query, String collectionName);
/** /**
* Returns and removes all documents from the specified collection that match the provided query. <br /> * Returns and removes all documents from the specified collection that match the provided query.
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #findAllAndRemove(Query, Class, String)} to get full type specific support.
* *
* @param query must not be {@literal null}. * @param query
* @param collectionName must not be {@literal null}. * @param collectionName
* @return * @return
* @since 1.5 * @since 1.5
*/ */
@@ -1046,5 +1094,4 @@ public interface MongoOperations {
* @return * @return
*/ */
MongoConverter getConverter(); MongoConverter getConverter();
} }
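The diff above swaps the driver-agnostic WriteResult for the MongoDB driver's more specific UpdateResult and DeleteResult return types. A minimal usage sketch, assuming a hypothetical Person document class and an already configured MongoOperations instance named template (the names are illustrative, not taken from the diff):

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Update;

class WriteResultMigrationSketch {

	// Person is a hypothetical mapped document; template would normally be injected.
	void updateAndRemove(MongoOperations template) {

		UpdateResult updated = template.updateFirst(
				query(where("lastname").is("Matthews")), new Update().set("age", 40), Person.class);
		long modifiedCount = updated.getModifiedCount(); // driver result type instead of WriteResult

		DeleteResult deleted = template.remove(query(where("age").lt(18)), Person.class);
		long deletedCount = deleted.getDeletedCount();
	}

	static class Person {
		String lastname;
		int age;
	}
}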

View File

@@ -1,255 +0,0 @@
/*
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.MongoOptions;
/**
* A factory bean for construction of a {@link MongoOptions} instance. When used with MongoDB Java driver version 3,
* properties not supported by the driver will be ignored.
*
* @author Graeme Rocher
* @author Mark Pollack
* @author Mike Saavedra
* @author Thomas Darimont
* @author Christoph Strobl
* @deprecated since 1.7. Please use {@link MongoClientOptionsFactoryBean} instead.
*/
@Deprecated
public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
private static final MongoOptions DEFAULT_MONGO_OPTIONS = new MongoOptions();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private int writeNumber = DEFAULT_MONGO_OPTIONS.getW();
private int writeTimeout = DEFAULT_MONGO_OPTIONS.getWtimeout();
private boolean writeFsync = DEFAULT_MONGO_OPTIONS.isFsync();
private boolean autoConnectRetry = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getAutoConnectRetry(DEFAULT_MONGO_OPTIONS) : false;
private long maxAutoConnectRetryTime = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getMaxAutoConnectRetryTime(DEFAULT_MONGO_OPTIONS) : -1;
private boolean slaveOk = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getSlaveOk(DEFAULT_MONGO_OPTIONS) : false;
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
/**
* Configures the maximum number of connections allowed per host until we will block.
*
* @param connectionsPerHost
*/
public void setConnectionsPerHost(int connectionsPerHost) {
this.connectionsPerHost = connectionsPerHost;
}
/**
* A multiplier for connectionsPerHost for # of threads that can block a connection. If connectionsPerHost is 10, and
* threadsAllowedToBlockForConnectionMultiplier is 5, then 50 threads can block. If more threads try to block, an
* exception will be thrown.
*
* @param threadsAllowedToBlockForConnectionMultiplier
*/
public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
this.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
}
/**
* Max wait time of a blocking thread for a connection.
*
* @param maxWaitTime
*/
public void setMaxWaitTime(int maxWaitTime) {
this.maxWaitTime = maxWaitTime;
}
/**
* Configures the connect timeout in milliseconds. Defaults to 0 (infinite time).
*
* @param connectTimeout
*/
public void setConnectTimeout(int connectTimeout) {
this.connectTimeout = connectTimeout;
}
/**
* Configures the socket timeout. Defaults to 0 (infinite time).
*
* @param socketTimeout
*/
public void setSocketTimeout(int socketTimeout) {
this.socketTimeout = socketTimeout;
}
/**
* Configures whether or not to have socket keep alive turned on (SO_KEEPALIVE). Defaults to {@literal false}.
*
* @param socketKeepAlive
*/
public void setSocketKeepAlive(boolean socketKeepAlive) {
this.socketKeepAlive = socketKeepAlive;
}
/**
* This specifies the number of servers to wait for on the write operation, and exception raising behavior. The 'w'
* option to the getlasterror command. Defaults to 0.
* <ul>
* <li>-1 = don't even report network errors</li>
* <li>0 = default, don't call getLastError by default</li>
* <li>1 = basic, call getLastError, but don't wait for slaves</li>
* <li>2 += wait for slaves</li>
* </ul>
*
* @param writeNumber the number of servers to wait for on the write operation, and exception raising behavior.
*/
public void setWriteNumber(int writeNumber) {
this.writeNumber = writeNumber;
}
/**
* Configures the timeout for write operations in milliseconds. This defaults to {@literal 0} (indefinite).
*
* @param writeTimeout
*/
public void setWriteTimeout(int writeTimeout) {
this.writeTimeout = writeTimeout;
}
/**
* Configures whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to {@literal false}.
*
* @param writeFsync to fsync on <code>write (true)</code>, otherwise {@literal false}.
*/
public void setWriteFsync(boolean writeFsync) {
this.writeFsync = writeFsync;
}
/**
* Configures whether or not the system retries automatically on a failed connect. This defaults to {@literal false}.
*
* @deprecated since 1.7.
*/
@Deprecated
public void setAutoConnectRetry(boolean autoConnectRetry) {
this.autoConnectRetry = autoConnectRetry;
}
/**
* Configures the maximum amount of time in milliseconds to spend retrying to open a connection to the same server. This
* defaults to {@literal 0}, which means to use the default {@literal 15s} if {@link #autoConnectRetry} is on.
*
* @param maxAutoConnectRetryTime the maxAutoConnectRetryTime to set
* @deprecated since 1.7
*/
@Deprecated
public void setMaxAutoConnectRetryTime(long maxAutoConnectRetryTime) {
this.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
}
/**
* Specifies if the driver is allowed to read from secondaries or slaves. Defaults to {@literal false}.
*
* @param slaveOk true if the driver should read from secondaries or slaves.
* @deprecated since 1.7
*/
@Deprecated
public void setSlaveOk(boolean slaveOk) {
this.slaveOk = slaveOk;
}
/**
* Specifies if the driver should use an SSL connection to Mongo. This defaults to {@literal false}. By default
* {@link SSLSocketFactory#getDefault()} will be used. See {@link #setSslSocketFactory(SSLSocketFactory)} if you want
* to configure a custom factory.
*
* @param ssl true if the driver should use an SSL connection.
* @see #setSslSocketFactory(SSLSocketFactory)
*/
public void setSsl(boolean ssl) {
this.ssl = ssl;
}
/**
* Specifies the {@link SSLSocketFactory} to use for creating SSL connections to Mongo. Defaults to
* {@link SSLSocketFactory#getDefault()}. Implicitly activates {@link #setSsl(boolean)} if a non-{@literal null} value
* is given.
*
* @param sslSocketFactory the sslSocketFactory to use.
* @see #setSsl(boolean)
*/
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
setSsl(sslSocketFactory != null);
this.sslSocketFactory = sslSocketFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoOptions createInstance() throws Exception {
if (MongoClientVersion.isMongo3Driver()) {
throw new IllegalArgumentException(
String
.format("Usage of 'mongo-options' is no longer supported for MongoDB Java driver version 3 and above. Please use 'mongo-client-options' and refer to chapter 'MongoDB 3.0 Support' for details."));
}
MongoOptions options = new MongoOptions();
options.setConnectionsPerHost(connectionsPerHost);
options.setThreadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier);
options.setMaxWaitTime(maxWaitTime);
options.setConnectTimeout(connectTimeout);
options.setSocketTimeout(socketTimeout);
options.setSocketKeepAlive(socketKeepAlive);
options.setW(writeNumber);
options.setWtimeout(writeTimeout);
options.setFsync(writeFsync);
if (ssl) {
options.setSocketFactory(sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault());
}
ReflectiveMongoOptionsInvoker.setAutoConnectRetry(options, autoConnectRetry);
ReflectiveMongoOptionsInvoker.setMaxAutoConnectRetryTime(options, maxAutoConnectRetryTime);
ReflectiveMongoOptionsInvoker.setSlaveOk(options, slaveOk);
return options;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<?> getObjectType() {
return MongoOptions.class;
}
}
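With this deprecated factory bean removed, equivalent settings move to MongoClientOptionsFactoryBean, as its Javadoc suggests. A rough migration sketch; the setter names are assumed to mirror the old property names and should be checked against the actual MongoClientOptionsFactoryBean API:

import com.mongodb.MongoClientOptions;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;

class MongoClientOptionsMigrationSketch {

	// Produces MongoClientOptions roughly equivalent to the removed MongoOptions configuration.
	MongoClientOptions clientOptions() throws Exception {
		MongoClientOptionsFactoryBean factory = new MongoClientOptionsFactoryBean();
		factory.setConnectionsPerHost(100);
		factory.setConnectTimeout(10000);
		factory.setSocketTimeout(0);
		factory.setMaxWaitTime(120000);
		factory.afterPropertiesSet();
		return factory.getObject();
	}
}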

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License. * you may not use this file except in compliance with the License.
* You may obtain a copy of the License at * You may obtain a copy of the License at
* *
* https://www.apache.org/licenses/LICENSE-2.0 * http://www.apache.org/licenses/LICENSE-2.0
* *
* Unless required by applicable law or agreed to in writing, software * Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, * distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.dao.DataAccessException;
import com.mongodb.MongoException;
import com.mongodb.reactivestreams.client.MongoCollection;
import org.bson.Document;
import org.reactivestreams.Publisher;
/**
* @author Mark Paluch
* @param <T>
* @since 2.0
*/
public interface ReactiveCollectionCallback<T> {
Publisher<T> doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException;
}
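Since the interface has a single abstract method, it can be supplied as a lambda to ReactiveMongoOperations.execute. A small sketch; the "people" collection name and the template parameter are illustrative:

import org.bson.Document;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Flux;

class ReactiveCollectionCallbackSketch {

	// Counts matching raw documents via the reactive driver collection; nothing runs until subscription.
	Flux<Long> countActive(ReactiveMongoOperations template) {
		return template.execute("people", collection -> collection.count(new Document("active", true)));
	}
}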

View File

@@ -0,0 +1,32 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.dao.DataAccessException;
import com.mongodb.MongoException;
import com.mongodb.reactivestreams.client.MongoDatabase;
import org.reactivestreams.Publisher;
/**
* @author Mark Paluch
* @param <T>
* @since 2.0
*/
public interface ReactiveDatabaseCallback<T> {
Publisher<T> doInDB(MongoDatabase db) throws MongoException, DataAccessException;
}
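The database-level counterpart works the same way and is handy for ad-hoc commands. A sketch using the ping command; the template parameter is illustrative:

import org.bson.Document;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import reactor.core.publisher.Flux;

class ReactiveDatabaseCallbackSketch {

	// Runs a ping command against the underlying database; deferred until a subscriber subscribes.
	Flux<Document> ping(ReactiveMongoOperations template) {
		return template.execute(db -> db.runCommand(new Document("ping", 1)));
	}
}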

View File

@@ -0,0 +1,62 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
/**
* Index operations on a collection.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
*/
public interface ReactiveIndexOperations {
/**
* Ensure that an index for the provided {@link IndexDefinition} exists for the collection indicated by the entity
* class. If not, it will be created.
*
* @param indexDefinition must not be {@literal null}.
*/
Mono<String> ensureIndex(IndexDefinition indexDefinition);
/**
* Drops an index from this collection.
*
* @param name name of index to drop
*/
Mono<Void> dropIndex(String name);
/**
* Drops all indices from this collection.
*/
Mono<Void> dropAllIndexes();
/**
* Returns the index information on the collection.
*
* @return index information on the collection
*/
Flux<IndexInfo> getIndexInfo();
}
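A short sketch of ensuring an index through these operations; the "people" collection and the lastname key are illustrative:

import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.index.Index;
import reactor.core.publisher.Mono;

class ReactiveIndexOperationsSketch {

	// Creates an ascending index on "lastname" if it does not exist yet and emits the index name.
	Mono<String> ensureLastnameIndex(ReactiveMongoOperations template) {
		return template.indexOps("people").ensureIndex(new Index("lastname", Direction.ASC));
	}
}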

View File

@@ -0,0 +1,129 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.util.StringUtils;
import com.mongodb.async.client.MongoClientSettings;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;
/**
* Convenient factory for configuring a reactive streams {@link MongoClient}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
*/
public class ReactiveMongoClientFactoryBean extends AbstractFactoryBean<MongoClient>
implements PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private String connectionString;
private String host;
private Integer port;
private MongoClientSettings mongoClientSettings;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/**
* Configures the host to connect to.
*
* @param host
*/
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
/**
* Configures the connection string.
*
* @param connectionString
*/
public void setConnectionString(String connectionString) {
this.connectionString = connectionString;
}
/**
* Configures the mongo client settings.
*
* @param mongoClientSettings
*/
public void setMongoClientSettings(MongoClientSettings mongoClientSettings) {
this.mongoClientSettings = mongoClientSettings;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
@Override
public Class<?> getObjectType() {
return MongoClient.class;
}
@Override
protected MongoClient createInstance() throws Exception {
if (mongoClientSettings != null) {
return MongoClients.create(mongoClientSettings);
}
if (StringUtils.hasText(connectionString)) {
return MongoClients.create(connectionString);
}
if (StringUtils.hasText(host)) {
if (port != null) {
return MongoClients.create(String.format("mongodb://%s:%d", host, port));
}
return MongoClients.create(String.format("mongodb://%s", host));
}
throw new IllegalStateException(
"Cannot create MongoClients. One of the following is required: mongoClientSettings, connectionString or host/port");
}
@Override
protected void destroyInstance(MongoClient instance) throws Exception {
instance.close();
}
@Override
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
}
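A minimal Java configuration sketch registering the factory bean; the host and port are placeholders, and a connection string or MongoClientSettings could be set instead:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.ReactiveMongoClientFactoryBean;

@Configuration
class ReactiveMongoClientConfigSketch {

	// Spring resolves the FactoryBean and exposes the reactive MongoClient it creates.
	@Bean
	ReactiveMongoClientFactoryBean reactiveMongoClient() {
		ReactiveMongoClientFactoryBean factory = new ReactiveMongoClientFactoryBean();
		factory.setHost("localhost");
		factory.setPort(27017);
		return factory;
	}
}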

View File

@@ -0,0 +1,206 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.ArrayList;
import java.util.List;
import org.bson.codecs.configuration.CodecRegistry;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.util.Assert;
import com.mongodb.MongoCredential;
import com.mongodb.ReadConcern;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.async.client.MongoClientSettings;
import com.mongodb.connection.ClusterSettings;
import com.mongodb.connection.ConnectionPoolSettings;
import com.mongodb.connection.ServerSettings;
import com.mongodb.connection.SocketSettings;
import com.mongodb.connection.SslSettings;
import com.mongodb.connection.StreamFactoryFactory;
/**
* A factory bean for construction of a {@link MongoClientSettings} instance to be used with the async MongoDB driver.
*
* @author Mark Paluch
* @since 2.0
*/
public class ReactiveMongoClientSettingsFactoryBean extends AbstractFactoryBean<MongoClientSettings> {
private static final MongoClientSettings DEFAULT_MONGO_SETTINGS = MongoClientSettings.builder().build();
private ReadPreference readPreference = DEFAULT_MONGO_SETTINGS.getReadPreference();
private WriteConcern writeConcern = DEFAULT_MONGO_SETTINGS.getWriteConcern();
private ReadConcern readConcern = DEFAULT_MONGO_SETTINGS.getReadConcern();
private List<MongoCredential> credentialList = new ArrayList<>();
private StreamFactoryFactory streamFactoryFactory = DEFAULT_MONGO_SETTINGS.getStreamFactoryFactory();
private CodecRegistry codecRegistry = DEFAULT_MONGO_SETTINGS.getCodecRegistry();
private ClusterSettings clusterSettings = DEFAULT_MONGO_SETTINGS.getClusterSettings();
private SocketSettings socketSettings = DEFAULT_MONGO_SETTINGS.getSocketSettings();
private SocketSettings heartbeatSocketSettings = DEFAULT_MONGO_SETTINGS.getHeartbeatSocketSettings();
private ConnectionPoolSettings connectionPoolSettings = DEFAULT_MONGO_SETTINGS.getConnectionPoolSettings();
private ServerSettings serverSettings = DEFAULT_MONGO_SETTINGS.getServerSettings();
private SslSettings sslSettings = DEFAULT_MONGO_SETTINGS.getSslSettings();
/**
* Set the {@link ReadPreference}.
*
* @param readPreference
*/
public void setReadPreference(ReadPreference readPreference) {
this.readPreference = readPreference;
}
/**
* Set the {@link WriteConcern}.
*
* @param writeConcern
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/**
* Set the {@link ReadConcern}.
*
* @param readConcern
*/
public void setReadConcern(ReadConcern readConcern) {
this.readConcern = readConcern;
}
/**
* Set the List of {@link MongoCredential}s.
*
* @param credentialList must not be {@literal null}.
*/
public void setCredentialList(List<MongoCredential> credentialList) {
Assert.notNull(credentialList, "CredendialList must not be null!");
this.credentialList.addAll(credentialList);
}
/**
* Adds the {@link MongoCredential} to the list of credentials.
*
* @param mongoCredential must not be {@literal null}.
*/
public void addMongoCredential(MongoCredential mongoCredential) {
Assert.notNull(mongoCredential, "MongoCredential must not be null!");
this.credentialList.add(mongoCredential);
}
/**
* Set the {@link StreamFactoryFactory}.
*
* @param streamFactoryFactory
*/
public void setStreamFactoryFactory(StreamFactoryFactory streamFactoryFactory) {
this.streamFactoryFactory = streamFactoryFactory;
}
/**
* Set the {@link CodecRegistry}.
*
* @param codecRegistry
*/
public void setCodecRegistry(CodecRegistry codecRegistry) {
this.codecRegistry = codecRegistry;
}
/**
* Set the {@link ClusterSettings}.
*
* @param clusterSettings
*/
public void setClusterSettings(ClusterSettings clusterSettings) {
this.clusterSettings = clusterSettings;
}
/**
* Set the {@link SocketSettings}.
*
* @param socketSettings
*/
public void setSocketSettings(SocketSettings socketSettings) {
this.socketSettings = socketSettings;
}
/**
* Set the heartbeat {@link SocketSettings}.
*
* @param heartbeatSocketSettings
*/
public void setHeartbeatSocketSettings(SocketSettings heartbeatSocketSettings) {
this.heartbeatSocketSettings = heartbeatSocketSettings;
}
/**
* Set the {@link ConnectionPoolSettings}.
*
* @param connectionPoolSettings
*/
public void setConnectionPoolSettings(ConnectionPoolSettings connectionPoolSettings) {
this.connectionPoolSettings = connectionPoolSettings;
}
/**
* Set the {@link ServerSettings}.
*
* @param serverSettings
*/
public void setServerSettings(ServerSettings serverSettings) {
this.serverSettings = serverSettings;
}
/**
* Set the {@link SslSettings}.
*
* @param sslSettings
*/
public void setSslSettings(SslSettings sslSettings) {
this.sslSettings = sslSettings;
}
@Override
public Class<?> getObjectType() {
return MongoClientSettings.class;
}
@Override
protected MongoClientSettings createInstance() throws Exception {
return MongoClientSettings.builder() //
.readPreference(readPreference) //
.writeConcern(writeConcern) //
.readConcern(readConcern) //
.credentialList(credentialList) //
.streamFactoryFactory(streamFactoryFactory) //
.codecRegistry(codecRegistry) //
.clusterSettings(clusterSettings) //
.socketSettings(socketSettings) //
.heartbeatSocketSettings(heartbeatSocketSettings) //
.connectionPoolSettings(connectionPoolSettings) //
.serverSettings(serverSettings) //
.sslSettings(sslSettings) //
.build();
}
}
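A sketch of building customized MongoClientSettings through this factory bean, here with a non-default read preference and write concern; the chosen values are illustrative:

import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.async.client.MongoClientSettings;
import org.springframework.data.mongodb.core.ReactiveMongoClientSettingsFactoryBean;

class ReactiveMongoClientSettingsSketch {

	// All other settings fall back to the defaults captured from MongoClientSettings.builder().build().
	MongoClientSettings settings() throws Exception {
		ReactiveMongoClientSettingsFactoryBean factory = new ReactiveMongoClientSettingsFactoryBean();
		factory.setReadPreference(ReadPreference.secondaryPreferred());
		factory.setWriteConcern(WriteConcern.MAJORITY);
		factory.afterPropertiesSet();
		return factory.getObject();
	}
}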

View File

@@ -0,0 +1,51 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoDatabase;
/**
* Helper class featuring helper methods for internal MongoDB classes. Mainly intended for internal use within the
* framework.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
*/
public abstract class ReactiveMongoDbUtils {
/**
* Private constructor to prevent instantiation.
*/
private ReactiveMongoDbUtils() {}
/**
* Obtains a {@link MongoDatabase} connection for the given {@link MongoClient} instance and database name
*
* @param mongo the {@link MongoClient} instance, must not be {@literal null}.
* @param databaseName the database name, must not be {@literal null} or empty.
* @return the {@link MongoDatabase} connection
*/
public static MongoDatabase getMongoDatabase(MongoClient mongo, String databaseName) {
return doGetMongoDatabase(mongo, databaseName, true);
}
private static MongoDatabase doGetMongoDatabase(MongoClient mongo, String databaseName, boolean allowCreate) {
return mongo.getDatabase(databaseName);
}
}
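A brief usage sketch; the connection string and database name are placeholders:

import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;
import com.mongodb.reactivestreams.client.MongoDatabase;
import org.springframework.data.mongodb.core.ReactiveMongoDbUtils;

class ReactiveMongoDbUtilsSketch {

	// Resolves a database handle from a reactive MongoClient.
	MongoDatabase database() {
		MongoClient client = MongoClients.create("mongodb://localhost:27017");
		return ReactiveMongoDbUtils.getMongoDatabase(client, "test");
	}
}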

View File

@@ -0,0 +1,945 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Collection;
import org.bson.Document;
import org.reactivestreams.Publisher;
import org.reactivestreams.Subscription;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import com.mongodb.ReadPreference;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
import com.mongodb.reactivestreams.client.MongoCollection;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
/**
* Interface that specifies a basic set of MongoDB operations executed in a reactive way.
* <p>
* Implemented by {@link ReactiveMongoTemplate}. Not often used but a useful option for extensibility and testability
* (as it can be easily mocked, stubbed, or be the target of a JDK proxy). Command execution using
* {@link ReactiveMongoOperations} is deferred until a subscriber subscribes to the {@link Publisher}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
* @see Flux
* @see Mono
* @see <a href="http://projectreactor.io/docs/">Project Reactor</a>
*/
public interface ReactiveMongoOperations {
/**
* Returns the reactive operations that can be performed on indexes
*
* @return index operations on the named collection
*/
ReactiveIndexOperations indexOps(String collectionName);
/**
* Returns the reactive operations that can be performed on indexes
*
* @return index operations on the named collection associated with the given entity class
*/
ReactiveIndexOperations indexOps(Class<?> entityClass);
/**
* Execute a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* MongoDB driver to convert the JSON string to a DBObject. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy.
*
* @param jsonCommand a MongoDB command expressed as a JSON string.
* @return a result object returned by the action
*/
Mono<Document> executeCommand(String jsonCommand);
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy.
*
* @param command a MongoDB command
* @return a result object returned by the action
*/
Mono<Document> executeCommand(Document command);
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's data
* access exception hierarchy.
*
* @param command a MongoDB command, must not be {@literal null}.
* @param readPreference read preferences to use, can be {@literal null}.
* @return a result object returned by the action
*/
Mono<Document> executeCommand(Document command, ReadPreference readPreference);
/**
* Executes a {@link ReactiveDatabaseCallback} translating any exceptions as necessary.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param <T> return type
* @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance.
* @return a result object returned by the action
*/
<T> Flux<T> execute(ReactiveDatabaseCallback<T> action);
/**
* Executes the given {@link ReactiveCollectionCallback} on the entity collection of the specified class.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param entityClass class that determines the collection to use
* @param <T> return type
* @param action callback object that specifies the MongoDB action
* @return a result object returned by the action or <tt>null</tt>
*/
<T> Flux<T> execute(Class<?> entityClass, ReactiveCollectionCallback<T> action);
/**
* Executes the given {@link ReactiveCollectionCallback} on the collection of the given name.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
* @param <T> return type
* @param collectionName the name of the collection that specifies which DBCollection instance will be passed into
*          the callback action.
* @param action callback object that specifies the MongoDB action.
* @return a result object returned by the action or <tt>null</tt>
*/
<T> Flux<T> execute(String collectionName, ReactiveCollectionCallback<T> action);
/**
* Create an uncapped collection with a name based on the provided entity class.
*
* @param entityClass class that determines the collection to create
* @return the created collection
*/
<T> Mono<MongoCollection<Document>> createCollection(Class<T> entityClass);
/**
* Create a collection with a name based on the provided entity class using the options.
*
* @param entityClass class that determines the collection to create
* @param collectionOptions options to use when creating the collection.
* @return the created collection
*/
<T> Mono<MongoCollection<Document>> createCollection(Class<T> entityClass, CollectionOptions collectionOptions);
/**
* Create an uncapped collection with the provided name.
*
* @param collectionName name of the collection
* @return the created collection
*/
Mono<MongoCollection<Document>> createCollection(String collectionName);
/**
* Create a collection with the provided name and options.
*
* @param collectionName name of the collection
* @param collectionOptions options to use when creating the collection.
* @return the created collection
*/
Mono<MongoCollection<Document>> createCollection(String collectionName, CollectionOptions collectionOptions);
/**
* A set of collection names.
*
* @return Flux of collection names
*/
Flux<String> getCollectionNames();
/**
* Get a collection by name, creating it if it doesn't exist.
* <p/>
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection
* @return an existing collection or a newly created one.
*/
MongoCollection<Document> getCollection(String collectionName);
/**
* Check to see if a collection with a name indicated by the entity class exists.
* <p/>
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the name of the collection
* @return true if a collection with the given name is found, false otherwise.
*/
<T> Mono<Boolean> collectionExists(Class<T> entityClass);
/**
* Check to see if a collection with a given name exists.
* <p/>
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection
* @return true if a collection with the given name is found, false otherwise.
*/
Mono<Boolean> collectionExists(String collectionName);
/**
* Drop the collection with the name indicated by the entity class.
* <p/>
* Translate any exceptions as necessary.
*
* @param entityClass class that determines the collection to drop/delete.
*/
<T> Mono<Void> dropCollection(Class<T> entityClass);
/**
* Drop the collection with the given name.
* <p/>
* Translate any exceptions as necessary.
*
* @param collectionName name of the collection to drop/delete.
*/
Mono<Void> dropCollection(String collectionName);
/**
* Query for a {@link Flux} of objects of type T from the collection used by the entity class.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
* @param entityClass the parametrized type of the returned {@link Flux}.
* @return the converted collection
*/
<T> Flux<T> findAll(Class<T> entityClass);
/**
* Query for a {@link Flux} of objects of type T from the specified collection.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
* @param entityClass the parametrized type of the returned {@link Flux}.
* @param collectionName name of the collection to retrieve the objects from
* @return the converted collection
*/
<T> Flux<T> findAll(Class<T> entityClass, String collectionName);
/**
* Map the results of an ad-hoc query on the collection for the entity class to a single instance of an object of the
* specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parametrized type of the returned {@link Mono}.
* @return the converted object
*/
<T> Mono<T> findOne(Query query, Class<T> entityClass);
/**
* Map the results of an ad-hoc query on the specified collection to a single instance of an object of the specified
* type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parametrized type of the returned {@link Mono}.
* @param collectionName name of the collection to retrieve the objects from
* @return the converted object
*/
<T> Mono<T> findOne(Query query, Class<T> entityClass, String collectionName);
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param collectionName name of the collection to check for objects.
* @return
*/
Mono<Boolean> exists(Query query, String collectionName);
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parametrized type.
* @return
*/
Mono<Boolean> exists(Query query, Class<?> entityClass);
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parametrized type.
* @param collectionName name of the collection to check for objects.
* @return
*/
Mono<Boolean> exists(Query query, Class<?> entityClass, String collectionName);
/**
* Map the results of an ad-hoc query on the collection for the entity class to a {@link Flux} of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parametrized type of the returned {@link Flux}.
* @return the {@link Flux} of converted objects
*/
<T> Flux<T> find(Query query, Class<T> entityClass);
/**
* Map the results of an ad-hoc query on the specified collection to a {@link Flux} of the specified type.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parametrized type of the returned {@link Flux}.
* @param collectionName name of the collection to retrieve the objects from
* @return the {@link Flux} of converted objects
*/
<T> Flux<T> find(Query query, Class<T> entityClass, String collectionName);
/**
* Returns a document with the given id mapped onto the given class. The collection the query is run against will be
* derived from the given target class as well.
*
* @param <T>
* @param id the id of the document to return.
* @param entityClass the type the document shall be converted into.
* @return the document with the given id mapped onto the given target class.
*/
<T> Mono<T> findById(Object id, Class<T> entityClass);
/**
* Returns the document with the given id from the given collection mapped onto the given target class.
*
* @param id the id of the document to return
* @param entityClass the type to convert the document to
* @param collectionName the collection to query for the document
* @param <T>
* @return
*/
<T> Mono<T> findById(Object id, Class<T> entityClass, String collectionName);
/**
* Returns {@link Flux} of {@link GeoResult} for all entities matching the given {@link NearQuery}. Will consider entity mapping
* information to determine the collection the query is run against. Note that MongoDB limits the number of results
* by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a particular number of
* results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
* @return
*/
<T> Flux<GeoResult<T>> geoNear(NearQuery near, Class<T> entityClass);
/**
* Returns {@link Flux} of {@link GeoResult} for all entities matching the given {@link NearQuery}. Note that MongoDB limits the
* number of results by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a
* particular number of results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
* @param collectionName the collection to trigger the query against. If no collection name is given the entity class
* will be inspected.
* @return
*/
<T> Flux<GeoResult<T>> geoNear(NearQuery near, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param entityClass the parametrized type.
* @return
*/
<T> Mono<T> findAndModify(Query query, Update update, Class<T> entityClass);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param entityClass the parametrized type.
* @param collectionName the collection to query.
* @return
*/
<T> Mono<T> findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param options the {@link FindAndModifyOptions} holding additional information.
* @param entityClass the parametrized type.
* @return
*/
<T> Mono<T> findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
/**
* Triggers <a href="https://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify <a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
* @param options the {@link FindAndModifyOptions} holding additional information.
* @param entityClass the parametrized type.
* @param collectionName the collection to query.
* @return
*/
<T> Mono<T> findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass,
String collectionName);
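// Usage sketch (illustrative; assumes a Person document in a "people" collection): return the modified document
// by passing FindAndModifyOptions with returnNew.
//
//   Mono<Person> updated = template.findAndModify(
//       new Query(Criteria.where("firstname").is("Tom")), new Update().inc("visits", 1),
//       new FindAndModifyOptions().returnNew(true), Person.class, "people");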
/**
* Map the results of an ad-hoc query on the collection for the entity type to a single instance of an object of the
* specified type. The first document that matches the query is returned and also removed from the collection in the
* database.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parametrized type of the returned {@link Mono}.
* @return the converted object
*/
<T> Mono<T> findAndRemove(Query query, Class<T> entityClass);
/**
* Map the results of an ad-hoc query on the specified collection to a single instance of an object of the specified
* type. The first document that matches the query is returned and also removed from the collection in the database.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@see MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parametrized type of the returned {@link Mono}.
* @param collectionName name of the collection to retrieve the objects from.
* @return the converted object
*/
<T> Mono<T> findAndRemove(Query query, Class<T> entityClass, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the collection of the given entity class.
*
* @param query
* @param entityClass must not be {@literal null}.
* @return
*/
Mono<Long> count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. To make sure the query gets mapped, use {@link #count(Query, Class, String)}.
*
* @param query
* @param collectionName must not be {@literal null} or empty.
* @return
* @see #count(Query, Class, String)
*/
Mono<Long> count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}.
*
* @param query
* @param entityClass must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return
*/
Mono<Long> count(Query query, Class<?> entityClass, String collectionName);
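// A usage sketch (illustrative, reusing the "template" and Person assumptions from above); the three-argument
// variant both maps property references via the entity class and targets an explicit collection:
//
//   Mono<Long> count = template.count(
//       Query.query(Criteria.where("lastname").is("Doe")), Person.class, "people");
//   count.subscribe(n -> System.out.println(n + " matching documents"));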
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}.
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String, then a MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class, which leverages the Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">
* Spring's Type Conversion</a> for more details.
* <p/>
* Insert is used to initially store the object into the database. To update an existing object, use the save method.
*
* @param objectToSave the object to store in the collection.
* @return
*/
<T> Mono<T> insert(T objectToSave);
/**
* Insert the object into the specified collection.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* Insert is used to initially store the object into the database. To update an existing object, use the save method.
*
* @param objectToSave the object to store in the collection
* @param collectionName name of the collection to store the object in
* @return
*/
<T> Mono<T> insert(T objectToSave, String collectionName);
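// A usage sketch (assuming a hypothetical Person with a String id property); the generated ObjectId is written
// back to the returned instance, and nothing is inserted until the Mono is subscribed to:
//
//   template.insert(new Person("Jane", "Doe"), "people")
//       .doOnNext(saved -> System.out.println("generated id: " + saved.getId()))
//       .subscribe();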
/**
* Insert a Collection of objects into a collection in a single batch write to the database.
*
* @param batchToSave the batch of objects to save.
* @param entityClass class that determines the collection to use
* @return
*/
<T> Flux<T> insert(Collection<? extends T> batchToSave, Class<?> entityClass);
/**
* Insert a batch of objects into the specified collection in a single batch write to the database.
*
* @param batchToSave the list of objects to save.
* @param collectionName name of the collection to store the object in
* @return
*/
<T> Flux<T> insert(Collection<? extends T> batchToSave, String collectionName);
/**
* Insert a mixed Collection of objects into a database collection determining the collection name to use based on the
* class.
*
* @param objectsToSave the list of objects to save.
* @return
*/
<T> Flux<T> insertAll(Collection<? extends T> objectsToSave);
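// A usage sketch of the batch variants (assuming java.util.Arrays and the Person instances "jane" and "john");
// a single batch write is issued and the inserted objects are emitted back on the Flux:
//
//   Flux<Person> inserted = template.insert(Arrays.asList(jane, john), Person.class);
//   inserted.subscribe(p -> System.out.println("inserted: " + p));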
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}.
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String, then a MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class, which leverages the Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">
* Spring's Type Conversion</a> for more details.
* <p/>
* Insert is used to initially store the object into the database. To update an existing object, use the save method.
*
* @param objectToSave the object to store in the collection.
* @return
*/
<T> Mono<T> insert(Mono<? extends T> objectToSave);
/**
* Insert a Collection of objects into a collection in a single batch write to the database.
*
* @param batchToSave the publisher which provides objects to save.
* @param entityClass class that determines the collection to use
* @return
*/
<T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, Class<?> entityClass);
/**
* Insert objects into the specified collection in a single batch write to the database.
*
* @param batchToSave the publisher which provides objects to save.
* @param collectionName name of the collection to store the object in
* @return
*/
<T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> batchToSave, String collectionName);
/**
* Insert a mixed Collection of objects into a database collection determining the collection name to use based on the
* class.
*
* @param objectsToSave the publisher which provides objects to save.
* @return
*/
<T> Flux<T> insertAll(Mono<? extends Collection<? extends T>> objectsToSave);
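// A usage sketch of the deferred variants, which accept the objects to save wrapped in a Publisher so they can
// themselves be produced reactively (same hypothetical Person assumptions as above):
//
//   Flux<Person> inserted = template.insertAll(Mono.just(Arrays.asList(jane, john)), "people");
//   template.insert(Mono.just(new Person("Jane", "Doe"))).subscribe();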
/**
* Save the object to the collection for the entity type of the object to save. This will perform an insert if the
* object is not already present, that is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String, then a MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class, which leverages the Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">
* Spring's Type Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection
* @return
*/
<T> Mono<T> save(T objectToSave);
/**
* Save the object to the specified collection. This will perform an insert if the object is not already present, that
* is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String, then a MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class, which leverages the Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">
* Spring's Type Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection
* @param collectionName name of the collection to store the object in
* @return
*/
<T> Mono<T> save(T objectToSave, String collectionName);
/**
* Save the object to the collection for the entity type of the object to save. This will perform an insert if the
* object is not already present, that is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String, then a MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class, which leverages the Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">
* Spring's Type Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection
* @return
*/
<T> Mono<T> save(Mono<? extends T> objectToSave);
/**
* Save the object to the specified collection. This will perform an insert if the object is not already present, that
* is an 'upsert'.
* <p/>
* The object is converted to the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String, then a MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class, which leverages the Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">
* Spring's Type Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection
* @param collectionName name of the collection to store the object in
* @return
*/
<T> Mono<T> save(Mono<? extends T> objectToSave, String collectionName);
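// A usage sketch of the 'upsert' semantics (assuming a Person instance loaded earlier whose id is already set);
// save(...) updates the existing document for that id, or inserts a new one if none exists:
//
//   person.setLastname("Smith");
//   template.save(person).subscribe(saved -> System.out.println("saved: " + saved));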
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class that determines the collection to use
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/
Mono<UpdateResult> upsert(Query query, Update update, Class<?> entityClass);
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param collectionName name of the collection to update the object in
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/
Mono<UpdateResult> upsert(Query query, Update update, String collectionName);
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class of the pojo to be operated on
* @param collectionName name of the collection to update the object in
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/
Mono<UpdateResult> upsert(Query query, Update update, Class<?> entityClass, String collectionName);
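// A query/update driven upsert sketch (reusing the "template" and Person assumptions from above); if no document
// matches, a new one is created by combining the query document and the update document:
//
//   Mono<UpdateResult> result = template.upsert(
//       Query.query(Criteria.where("lastname").is("Doe")),
//       new Update().set("visits", 0),
//       Person.class);
//   result.subscribe(r -> System.out.println("upserted id: " + r.getUpsertedId()));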
/**
* Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class that determines the collection to use
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/
Mono<UpdateResult> updateFirst(Query query, Update update, Class<?> entityClass);
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param collectionName name of the collection to update the object in
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/
Mono<UpdateResult> updateFirst(Query query, Update update, String collectionName);
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class of the pojo to be operated on
* @param collectionName name of the collection to update the object in
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/
Mono<UpdateResult> updateFirst(Query query, Update update, Class<?> entityClass, String collectionName);
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class that determines the collection to use
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/
Mono<UpdateResult> updateMulti(Query query, Update update, Class<?> entityClass);
/**
* Updates all objects that are found in the specified collection that matches the query document criteria with the
* provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param collectionName name of the collection to update the object in
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/
Mono<UpdateResult> updateMulti(Query query, Update update, String collectionName);
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
* @param entityClass class of the pojo to be operated on
* @param collectionName name of the collection to update the object in
* @return the {@link UpdateResult} which lets you access the results of the previous write.
*/
Mono<UpdateResult> updateMulti(Query query, Update update, Class<?> entityClass, String collectionName);
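// A sketch contrasting the two update flavors (same assumptions as above): updateFirst touches only the first
// matching document, while updateMulti touches all of them:
//
//   Query query = Query.query(Criteria.where("lastname").is("Doe"));
//   Update update = new Update().set("retired", true);
//   template.updateFirst(query, update, Person.class).subscribe();
//   template.updateMulti(query, update, Person.class)
//       .subscribe(r -> System.out.println(r.getModifiedCount() + " documents updated"));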
/**
* Remove the given object from the collection by id.
*
* @param object
* @return
*/
Mono<DeleteResult> remove(Object object);
/**
* Removes the given object from the given collection.
*
* @param object
* @param collection must not be {@literal null} or empty.
*/
Mono<DeleteResult> remove(Object object, String collection);
/**
* Remove the given object from the collection by id.
*
* @param objectToRemove
* @return
*/
Mono<DeleteResult> remove(Mono<? extends Object> objectToRemove);
/**
* Removes the given object from the given collection.
*
* @param objectToRemove
* @param collection must not be {@literal null} or empty.
* @return
*/
Mono<DeleteResult> remove(Mono<? extends Object> objectToRemove, String collection);
/**
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param query
* @param entityClass
* @return
*/
Mono<DeleteResult> remove(Query query, Class<?> entityClass);
/**
* Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
* @param query
* @param entityClass
* @param collectionName
* @return
*/
Mono<DeleteResult> remove(Query query, Class<?> entityClass, String collectionName);
/**
* Remove all documents from the specified collection that match the provided query document criteria. There is no
* conversion/mapping done for any criteria using the id field.
*
* @param query the query document that specifies the criteria used to remove a record
* @param collectionName name of the collection where the objects will be removed
*/
Mono<DeleteResult> remove(Query query, String collectionName);
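// A usage sketch of query-based removal (same assumptions as above); the entity-class variant maps property
// references and converts id values before issuing the delete:
//
//   template.remove(Query.query(Criteria.where("lastname").is("Doe")), Person.class)
//       .subscribe(r -> System.out.println(r.getDeletedCount() + " documents removed"));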
/**
* Returns and removes all documents from the specified collection that match the provided query.
*
* @param query
* @param collectionName
* @return
*/
<T> Flux<T> findAllAndRemove(Query query, String collectionName);
/**
* Returns and removes all documents matching the given query from the collection used to store the entityClass.
*
* @param query
* @param entityClass
* @return
*/
<T> Flux<T> findAllAndRemove(Query query, Class<T> entityClass);
/**
* Returns and removes all documents that match the provided query document criteria from the collection used to
* store the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in
* the query.
*
* @param query
* @param entityClass
* @param collectionName
* @return
*/
<T> Flux<T> findAllAndRemove(Query query, Class<T> entityClass, String collectionName);
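// A usage sketch (same assumptions as above); unlike remove(...), the deleted documents are converted and emitted
// back to the caller:
//
//   Flux<Person> removed = template.findAllAndRemove(
//       Query.query(Criteria.where("lastname").is("Doe")), Person.class);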
/**
* Map the results of an ad-hoc query on the collection for the entity class to a stream of objects of the specified
* type. The stream uses a {@link com.mongodb.CursorType#TailableAwait tailable} cursor that may be an infinite
* stream. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* The query is specified as a {@link Query} which can be created using either the {@link BasicQuery} or the more
* feature-rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parametrized type of the returned {@link Flux}.
* @return the {@link Flux} of converted objects
*/
<T> Flux<T> tail(Query query, Class<T> entityClass);
/**
* Map the results of an ad-hoc query on the collection for the entity class to a stream of objects of the specified
* type. The stream uses a {@link com.mongodb.CursorType#TailableAwait tailable} cursor that may be an infinite
* stream. The stream will not be completed unless the {@link org.reactivestreams.Subscription} is
* {@link Subscription#cancel() canceled}.
* <p/>
* The object is converted from the MongoDB native representation using an instance of {@link MongoConverter}. Unless
* configured otherwise, an instance of {@link MappingMongoConverter} will be used.
* <p/>
* The query is specified as a {@link Query} which can be created using either the {@link BasicQuery} or the more
* feature-rich {@link Query}.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parametrized type of the returned {@link Flux}.
* @param collectionName name of the collection to retrieve the objects from
* @return the {@link Flux} of converted objects
*/
<T> Flux<T> tail(Query query, Class<T> entityClass, String collectionName);
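// A usage sketch of the tailable cursor variants (assuming "people" is a capped collection, which tailable
// cursors require); the Flux keeps emitting newly inserted matches until the subscription is cancelled:
//
//   Disposable subscription = template.tail(
//           Query.query(Criteria.where("lastname").is("Doe")), Person.class, "people")
//       .doOnNext(System.out::println)
//       .subscribe();
//   // later: subscription.dispose();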
/**
* Returns the underlying {@link MongoConverter}.
*
* @return
*/
MongoConverter getConverter();
}


@@ -1,109 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* {@link ReflectiveDBCollectionInvoker} provides reflective access to {@link DBCollection} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveDBCollectionInvoker {
private static final Method GEN_INDEX_NAME_METHOD;
private static final Method RESET_INDEX_CHACHE_METHOD;
static {
GEN_INDEX_NAME_METHOD = findMethod(DBCollection.class, "genIndexName", DBObject.class);
RESET_INDEX_CHACHE_METHOD = findMethod(DBCollection.class, "resetIndexCache");
}
private ReflectiveDBCollectionInvoker() {}
/**
* Convenience method to generate an index name from the set of fields it is over. Falls back to a MongoDB Java
* driver version 2 compatible way of generating the index name in case {@link MongoClientVersion#isMongo3Driver()} returns {@literal true}.
*
* @param keys the names of the fields used in this index
* @return
*/
public static String generateIndexName(DBObject keys) {
if (isMongo3Driver()) {
return genIndexName(keys);
}
return (String) invokeMethod(GEN_INDEX_NAME_METHOD, null, keys);
}
/**
* In case of MongoDB Java driver version 2, all indices that have not yet been applied to this collection will be
* cleared. Since this method is not available for MongoDB Java driver version 3, the operation will throw an
* {@link UnsupportedOperationException}.
*
* @param dbCollection
* @throws UnsupportedOperationException
*/
public static void resetIndexCache(DBCollection dbCollection) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException("The mongo java driver 3 does no loger support resetIndexCache!");
}
invokeMethod(RESET_INDEX_CHACHE_METHOD, dbCollection);
}
/**
* Borrowed from MongoDB Java driver version 2. See <a
* href="http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754"
* >http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754</a>
*
* @param keys
* @return
*/
private static String genIndexName(DBObject keys) {
StringBuilder name = new StringBuilder();
for (String s : keys.keySet()) {
if (name.length() > 0) {
name.append('_');
}
name.append(s).append('_');
Object val = keys.get(s);
if (val instanceof Number || val instanceof String) {
name.append(val.toString().replace(' ', '_'));
}
}
return name.toString();
}
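// Illustrative example (derived from the loop above): for keys {"lastname": 1, "age": -1} the generated index
// name is "lastname_1_age_-1".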
}


@@ -1,134 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* {@link ReflectiveDbInvoker} provides reflective access to {@link DB} API that is not consistently available for
* various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveDbInvoker {
private static final Method DB_IS_AUTHENTICATED_METHOD;
private static final Method DB_AUTHENTICATE_METHOD;
private static final Method DB_REQUEST_DONE_METHOD;
private static final Method DB_ADD_USER_METHOD;
private static final Method DB_REQUEST_START_METHOD;
static {
DB_IS_AUTHENTICATED_METHOD = findMethod(DB.class, "isAuthenticated");
DB_AUTHENTICATE_METHOD = findMethod(DB.class, "authenticate", String.class, char[].class);
DB_REQUEST_DONE_METHOD = findMethod(DB.class, "requestDone");
DB_ADD_USER_METHOD = findMethod(DB.class, "addUser", String.class, char[].class);
DB_REQUEST_START_METHOD = findMethod(DB.class, "requestStart");
}
private ReflectiveDbInvoker() {}
/**
* Authenticate against the database using the provided credentials in case of MongoDB Java driver version 2.
*
* @param mongo must not be {@literal null}.
* @param db must not be {@literal null}.
* @param credentials must not be {@literal null}.
* @param authenticationDatabaseName
*/
public static void authenticate(Mongo mongo, DB db, UserCredentials credentials, String authenticationDatabaseName) {
String databaseName = db.getName();
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
synchronized (authDb) {
Boolean isAuthenticated = (Boolean) invokeMethod(DB_IS_AUTHENTICATED_METHOD, authDb);
if (!isAuthenticated) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
Boolean authenticated = (Boolean) invokeMethod(DB_AUTHENTICATE_METHOD, authDb, username,
password == null ? null : password.toCharArray());
if (!authenticated) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
}
}
/**
* Starts a new 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java driver
* version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestStart(DB db) {
if (isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_START_METHOD, db);
}
/**
* Ends the current 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java
* driver version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestDone(DB db) {
if (MongoClientVersion.isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_DONE_METHOD, db);
}
/**
* @param db
* @param username
* @param password
* @throws UnsupportedOperationException
*/
public static void addUser(DB db, String username, char[] password) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Please use DB.command(…) to call either the createUser or updateUser command!");
}
invokeMethod(DB_ADD_USER_METHOD, db, username, password);
}
}

Some files were not shown because too many files have changed in this diff.