Compare commits


155 Commits

Author SHA1 Message Date
Oliver Gierke
bf61c5782e DATAMONGO-1535 - Release version 2.0 M2 (Kay). 2017-04-04 21:12:35 +02:00
Oliver Gierke
431e71f4a0 DATAMONGO-1535 - Prepare 2.0 M2 (Kay). 2017-04-04 21:12:02 +02:00
Oliver Gierke
3192d7dd78 DATAMONGO-1535 - Updated changelog. 2017-04-04 21:11:53 +02:00
Christoph Strobl
d7a6594933 DATAMONGO-1655 - Remove obsolete build profiles.
And add snapshot profile for 3.5.0.
2017-04-03 18:37:19 +02:00
Christoph Strobl
609d6d5a19 DATAMONGO-1656 - Upgrade to MongoDB Driver 3.4.2 and Reactive Streams Driver 1.3.0.
Prepare issue branch.
2017-04-03 18:37:06 +02:00
Mark Paluch
19c8788376 DATAMONGO-1643 - Polishing.
Fix documentation for namespace types. Remove unused ReflectiveDBCollectionInvoker. Simplify tests, align bean name to mongoClient. Remove injection of Mongo in favor of injecting MongoTemplate.

Remove throws declaration from AbstractMongoConfiguration.mongoClient(). MongoClient creation does not throw checked exceptions so throwing an Exception is no longer required. Replace Mongo by MongoClient in reference documentation.

Original pull request: #451.
2017-04-03 15:36:58 +02:00
Christoph Strobl
db9934c7d8 DATAMONGO-1643 - Replace references to Mongo by MongoClient.
Remove and replace usage of "mongo" by "mongoClient". This involves xsd schema, bean names, constructor and parameter types. This required some API changes as some server commands are no longer directly available through the api, but have to be invoked via runCommand.

Also remove references to outdated API using Credentials and an authentication DB instead of MongoCredentials for authentication.

Updated and removed (unused) tests; Altered documentation.

Original pull request: #451.
2017-04-03 15:35:43 +02:00
Christoph Strobl
b5679744e7 DATAMONGO-1643 - Add and use 2.0 namespace xsd.
Original pull request: #451.
2017-04-03 15:35:27 +02:00
Mark Paluch
b1acda4188 DATAMONGO-1610 - Support RxJava 2 repositories.
Add RxJava 2 dependency. Add test to verify RxJava 2 interoperability.
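
A minimal sketch of what such a repository could look like (the Person type and query methods are illustrative; RxJava2CrudRepository is the corresponding Spring Data Commons base interface):

import io.reactivex.Flowable;
import io.reactivex.Maybe;
import io.reactivex.Single;

public interface RxJava2PersonRepository extends RxJava2CrudRepository<Person, String> {

	// Derived query emitting all matches as an RxJava 2 Flowable.
	Flowable<Person> findByLastname(String lastname);

	// Zero-or-one result mapped onto Maybe.
	Maybe<Person> findFirstByFirstname(String firstname);

	// Scalar projection wrapped in a Single.
	Single<Long> countByLastname(String lastname);
}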

Original Pull Request: #440
2017-04-03 14:57:11 +02:00
Christoph Strobl
2ab466eb35 DATAMONGO-1447 - Add support for isolations on Update.
We now allow usage of the $isolated update operator via Update.isolated().
When isolated is set, the query involved in MongoOperations.updateMulti is enhanced with '$isolated' : 1, unless the isolation level has already been set explicitly, e.g. via new BasicQuery("{'$isolated' : 0}").
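
For illustration, a minimal usage sketch (the Order type and field names are hypothetical):

Query query = new Query(Criteria.where("status").is("PENDING"));
Update update = new Update().set("status", "PROCESSED").isolated();

// updateMulti adds { '$isolated' : 1 } to the query unless set explicitly.
mongoOperations.updateMulti(query, update, Order.class);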

Original pull request: #371.
2017-04-03 13:53:00 +02:00
Mark Paluch
e9da449667 DATAMONGO-1559 - Drop collections used in ReactiveMongoTemplateExecuteTests before tests.
We now drop the collections used in ReactiveMongoTemplateExecuteTests on test start to create a clean state for the tests. Previously, collections were dropped after the tests only so existing data in the collections could interfere with the tests themselves.
2017-03-27 08:40:10 +02:00
John Blum
6bc72a654f DATAMONGO-1648 - Rename getRepositoryFactoryClassName to getRepositoryFactoryBeanClassName. 2017-03-24 13:36:11 -07:00
Mark Paluch
4fd9edf585 DATAMONGO-1559 - Polishing.
Migrate off deprecated Cancellation API to Disposable.
2017-03-24 17:46:41 +01:00
Mark Paluch
955597bb54 DATAMONGO-1559 - Migrate reactive tests from TestSubscriber to StepVerifier. 2017-03-24 17:42:31 +01:00
Oliver Gierke
f59bbd351d DATAMONGO-1647 - Polishing. 2017-03-24 12:25:21 +01:00
Oliver Gierke
43ab3cad13 DATAMONGO-1647 - Switched to use IdentifierAccessor.getRequiredIdentifier() in MongoTemplate.doSaveVersioned(…). 2017-03-24 12:24:57 +01:00
Oliver Gierke
5ba46dadb8 DATAMONGO-1609 - Switched to Mockito from parent POM.
Moved off deprecated runner declarations.
2017-03-24 08:54:29 +01:00
Oliver Gierke
a245c0f280 DATAMONGO-1609 - Adapt to API changes in Spring Data Commons. 2017-03-24 08:54:29 +01:00
Christoph Strobl
288f244c34 DATAMONGO-1609 - Fix failing tests.
Fix issues pointed out by failing tests. The main focus was to restore functionality, not a Java 8 code cleanup, so this one still needs some love and polishing.
2017-03-24 08:54:29 +01:00
Christoph Strobl
90bb6262f9 DATAMONGO-1609 - Fix compile errors.
Still a way to go:
  - Failures: 113, Errors: 836, Skipped: 16
2017-03-24 08:54:29 +01:00
Oliver Gierke
826d00afa7 DATAMONGO-1609 - Hacking. 2017-03-24 08:54:29 +01:00
Mark Paluch
ac3f7dbf99 DATAMONGO-1645 - Polishing.
Clean up appender and log level after test run. Suppress log output during tests.

Original pull request: #450.
2017-03-21 10:52:14 +01:00
Christoph Strobl
f70e1fa291 DATAMONGO-1645 - Safely serialize JSON output for log message in LoggingEventListener.
We now make sure to safely serialize JSON output for mapped documents. This prevents the logger from rendering false exception messages to the log appender.

Original pull request: #450.
2017-03-21 10:45:13 +01:00
Mark Paluch
a84c4b064d DATAMONGO-1637 - Polishing.
Move aggregation options conversion to AggregationOptions.getMongoAggregationOptions(). Allow cursor options to control cursor batch size. Add command logging to stream execution. Rearrange method order. Close cursor in tests. Change author name from user name to full name.

Original pull request: #447.
2017-03-21 09:57:25 +01:00
Mainder Singh
1a65828365 DATAMONGO-1637 - Add support for aggregation result streaming.
We now support aggregation result streaming backed by a MongoDB cursor. Result streaming fetches aggregation results in batches from MongoDB and converts results as they are retrieved through the iterator.

Aggregation aggregation = …

CloseableIterator<TagCount> results = mongoOperations.aggregateStream(aggregation, "inputCollection", TagCount.class);

List<TagCount> tagCount = new ArrayList<TagCount>();
while (results.hasNext()) {
	tagCount.add(results.next());
}

results.close();

Original pull request: #447.
2017-03-21 09:56:53 +01:00
Mark Paluch
f4f5e02e66 DATAMONGO-1620 - Polishing.
Fix test method name. Add JavaDoc.

Original pull request: #449.
2017-03-13 16:28:54 +01:00
Christoph Strobl
4b8d35262f DATAMONGO-1620 - Add server-selection-timeout to XML MongoClientOptions config.
We now allow a server-selection-timeout attribute on the MongoClientOptions XML configuration for a MongoDB 3.x client.

Original pull request: #449.
2017-03-13 16:28:49 +01:00
Mark Paluch
b486fec048 DATAMONGO-1641 - Remove formatting config from the repository.
See https://github.com/spring-projects/spring-data-build/tree/master/etc/ide for most recent IDE settings.
2017-03-08 11:45:42 +01:00
Mark Paluch
50fcbd18f2 DATAMONGO-1421 - Polishing.
Remove trailing whitespaces. Construct exception message with String.format(…).

Original pull request: #448.
2017-03-08 08:56:03 +01:00
Christoph Strobl
c37dfd9688 DATAMONGO-1421 - Fix serialization in error message causing error itself.
We now make sure to safely serialize the criteria object used for creating the error message when raising an `InvalidMongoDbApiUsageException` in cases where `addCriteria` is used to add multiple entries for the same property.
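
As an illustration, the kind of misuse that triggers the exception (field and values are arbitrary):

Query query = new Query();
query.addCriteria(Criteria.where("lastname").is("Matthews"));

// A second criteria entry for the same key is rejected; the exception message
// is now built from a safely serialized form of the criteria document.
query.addCriteria(Criteria.where("lastname").is("Beauford"));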

Original pull request: #448.
2017-03-08 08:55:43 +01:00
Oliver Gierke
98d655f4e2 DATAMONGO-1639 - Make sure BeforeConvertEvent sees new version for updates.
The changes for DATAMONGO-1617 subtly changed the behavior of entity updates in terms of the version value they see for entities using optimistic locking. Previously, updates already saw the new version value, whereas after the fix for DATAMONGO-1617 they saw the old one. That prevented BeforeConvertEvent listeners from distinguishing between an original insert and the first update.

That change has now been rolled back, and we introduced a test case that encodes this expectation explicitly.
2017-03-06 16:32:19 +01:00
Oliver Gierke
98a9b66e6b DATAMONGO-1617 - Reinstate version property initialization before BeforeConvertEvent publication.
Related pull request: #443.
2017-03-03 08:46:28 +01:00
Oliver Gierke
827b6204a9 DATAMONGO-1617 - Polishing.
Some cleanups in MongoTemplateTests. Removed manual ID assignment in the general id handling test to make sure we use the id generation. Removed unnecessary code from the domain type in favor of Lombok.

Original pull request: #443.
2017-03-03 08:36:21 +01:00
Laszlo Csontos
781717faa8 DATAMONGO-1617 - BeforeConvertEvent is now emitted before updatable identifier assertion.
We now make sure the BeforeConvertEvent is published before we check for identifier types that can potentially be auto-generated. That allows the event listeners to populate identifiers. Previously, the identifier check kicked in before that and thus prevented the listener from populating the property.

Original pull request: #443.
2017-03-03 08:36:05 +01:00
Oliver Gierke
084a167e20 DATAMONGO-1598 - Updated changelog. 2017-03-02 11:11:03 +01:00
Mark Paluch
456ae2459f DATAMONGO-1605 - Polishing.
Remove additional quoting around JSON serialization because JSON serialization adds quotes to a string. Reformat code.
2017-03-01 16:14:03 +01:00
Christoph Strobl
d079edb92c DATAMONGO-1605 - Retain type of SpEL expression result when used in @Query.
Fix issue where any result of a SpEL expression had been treated as quoted String within the resulting MongoDB query.
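
For illustration, a hypothetical repository method using a SpEL expression whose numeric result is now bound as a number rather than a quoted String:

public interface PersonRepository extends Repository<Person, String> {

	@Query("{ 'age' : ?#{[0] + 1} }")
	List<Person> findOlderThan(int age);
}
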
2017-03-01 16:14:00 +01:00
Mark Paluch
eb97944fd4 DATAMONGO-1600 - Make GraphLookupOperationBuilder public.
Make GraphLookupOperationBuilder public so it can be used in types outside the aggregation package.

Original Pull Request: #437
2017-03-01 12:55:39 +01:00
Mark Paluch
e2d6f187c2 DATAMONGO-1603 - Polishing.
Remove code that became unused. Reformat code. Extend years in copyright header.

Original pull request: #441.
2017-03-01 08:54:32 +01:00
Christoph Strobl
8068e36679 DATAMONGO-1603 - Fix Placeholder not replaced correctly in @Query.
Fix issues when placeholders are followed by other characters, e.g. '?0xyz', or are reused multiple times within the query. Additional tests and fixes for complex quoted replacements, e.g. in regex queries. Rely on the placeholder's quotation indication instead of the binding's, which might be misleading when a placeholder is used more than once.

Original pull request: #441.
2017-03-01 08:52:18 +01:00
Christoph Strobl
c62d13154f DATAMONGO-1608 - Polishing.
Throw an IllegalArgumentException when trying to create a query using 'null' as an argument for queries resulting in a $regex query operator.

Original Pull Request: #439
2017-02-13 08:33:12 +01:00
Edward Prentice
7d53f21d58 DATAMONGO-1608 - Add guard against NPE in MongoQueryCreator when using IgnoreCase.
Original Pull Request: #439
2017-02-13 08:28:26 +01:00
Christoph Strobl
0000a8fd11 DATAMONGO-1607 - Polishing.
Move coordinate conversion to a dedicated method. Additionally, fix an issue with assertions applied too late in the chain and add some tests.

Original Pull Request: #438
2017-02-10 14:24:37 +01:00
Thiago Diniz da Silveira
fad9756a93 DATAMONGO-1607 - Fix ClassCastException in Circle, Point and Sphere when coordinates are not Double.
Original Pull Request: #438
2017-02-10 14:19:06 +01:00
Mark Paluch
54acc4934c DATAMONGO-1602 - Remove references to Assert single-arg methods.
Replace references to Assert single-arg methods with references to methods accepting the test object and message.
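
For illustration, the replacement pattern (variable name and message are arbitrary):

// before: single-argument variant (see SPR-15196)
Assert.notNull(mongoOperations);

// after: test object plus message
Assert.notNull(mongoOperations, "MongoOperations must not be null!");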

Related ticket: SPR-15196.
2017-02-01 11:08:11 +01:00
Oliver Gierke
2e593bb9b2 DATAMONGO-1573 - Updated changelog. 2017-01-26 12:12:37 +01:00
Oliver Gierke
2bf32a25be DATAMONGO-1574 - Updated changelog. 2017-01-26 12:12:07 +01:00
Christoph Strobl
148c4f9e24 DATAMONGO-1517 - Polishing.
Remove ReflectiveSimpleTypes in favor of MongoSimpleTypes.
Add integration test.
2017-01-25 17:12:31 +01:00
Mark Paluch
ae6171802e DATAMONGO-1517 - Add support for Decimal128 BSON type.
Support Decimal128 as Mongo simple type if present. Decimal128 is stored as NumberDecimal.

class Person {

  String id;
  Decimal128 decimal128;

  Person(String id, Decimal128 decimal128) {
    this.id = id;
    this.decimal128 = decimal128;
  }
}

mongoTemplate.save(new Person("foo", new Decimal128(new BigDecimal("123.456"))));

is represented as:

{ "_id" : "foo", "decimal128" : NumberDecimal("123.456") }
2017-01-25 17:12:31 +01:00
Mark Paluch
1b97d1d1d0 DATAMONGO-1596 - Fix typo in JavaDoc.
Use correct @RelatedDocument annotation in MongoDB cross store reference documentation.
2017-01-25 16:53:41 +01:00
Mark Paluch
91495825a5 DATAMONGO-1575 - Polishing.
Extend year range in license headers. Use MongoDB JSON serializer for String escaping. Move unquoting/quote checking to inner QuotedString utility class. Reformat code.
2017-01-25 11:44:12 +01:00
Christoph Strobl
18e6b9cfa7 DATAMONGO-1575 - Escape Strings correctly.
Use regex groups and parameter index values for replacement in string based queries.
2017-01-25 11:44:09 +01:00
Christoph Strobl
0e0b8d5f79 DATAMONGO-1594 - Update "what’s new" section in reference documentation. 2017-01-23 08:23:51 +01:00
Oliver Gierke
11882724fa DATAMONGO-1592 - Adapt AuditingEventListenerUnitTests to changes in core auditing.
The core auditing implementation now skips the invocation of auditing in case the candidate aggregate doesn't need any auditing in the first place. We needed to adapt the sample class we use to actually carry the necessary auditing annotations.

Related ticket: DATACMNS-957.
2017-01-20 16:36:45 +01:00
Oliver Gierke
eb2a58cdbe DATAMONGO-1590 - Polishing.
Removed some compiler warnings. Hide newly introduced class in package scope and made use of Lombok annotations to avoid boilerplate code.

Original pull request: #436.
2017-01-18 19:42:46 +01:00
Christoph Strobl
8c9bbf7f91 DATAMONGO-1590 - EntityInformation selected now correctly considers Persistable.
We now wrap the MappingMongoEntityInformation into one that delegates the methods implemented by Persistable to the actual entity in case it implements said interface.
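
A hypothetical entity implementing Persistable whose getId()/isNew() are now consulted again by the repository infrastructure:

class Sample implements Persistable<String> {

	private String id;

	@Override
	public String getId() {
		return id;
	}

	@Override
	public boolean isNew() {
		// the entity decides itself whether save(…) performs an insert or an update
		return id == null;
	}
}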

Original pull request: #436.
2017-01-18 19:42:42 +01:00
Mark Paluch
63fc047160 DATAMONGO-1581 - Expose ReactiveMongoRepositoryConfigurationExtension as public type.
Expose ReactiveMongoRepositoryConfigurationExtension so configuration extensions such as Spring Boot's ReactiveMongoRepositoriesAutoConfigureRegistrar can pick it up and reuse the repository configuration extension.
2017-01-17 14:14:08 +01:00
Mark Paluch
44e872c7df DATAMONGO-1588 - Polishing.
Remove unused fields. Fix typo in method name. Reformat inner class to align formatting.

Original pull request: #435.
2017-01-16 09:29:10 +01:00
Christoph Strobl
1135e90be0 DATAMONGO-1588 - Fix derived finder not accepting subclass of parameter type.
We now allow using subtypes as arguments for derived queries. This makes it possible to use e.g. a GeoJsonPoint for querying while the declared property type in the domain object remains a regular (legacy) Point.
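
A rough sketch of the now supported combination (domain type and repository are hypothetical):

class Venue {
	Point location; // declared using the legacy Point type
}

interface VenueRepository extends Repository<Venue, String> {
	List<Venue> findByLocationNear(Point point, Distance distance);
}

// GeoJsonPoint is a subtype of Point and is now accepted as argument.
venueRepository.findByLocationNear(new GeoJsonPoint(-73.99, 40.73), new Distance(5, Metrics.KILOMETERS));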

Original pull request: #435.
2017-01-16 09:29:06 +01:00
Mark Paluch
65da90afd8 DATAMONGO-1589 - Update project documentation with the CLA tool integration. 2017-01-13 11:46:58 +01:00
Mark Paluch
4856c9b4f5 DATAMONGO-1587 - Polishing.
Convert @see http://… links to valid format using @see <a href=…>…</a>
2017-01-12 17:24:03 +01:00
Mark Paluch
644c1a2c89 DATAMONGO-1587 - Migrate ticket references in test code to Spring Framework style. 2017-01-12 16:51:57 +01:00
Mark Paluch
8df9d30d2e DATAMONGO-1586 - Consider field name in TypeBasedAggregationOperationContext.getReferenceFor(…).
We now consider the provided field name (alias) with which a mapped field is exposed. The field name applies to the exposed field after property path resolution in TypeBasedAggregationOperationContext. Previously, the field reference used the property name, which caused fields to be considered non-aliased, so aggregation projection operations dropped the alias and exposed the field with its leaf property name.

Original Pull Request: #434
2017-01-12 10:34:02 +01:00
Christoph Strobl
90ae6d1805 DATAMONGO-1585 - Polishing.
Update documentation for better readability in html and pdf format.

Original Pull Request: #433
2017-01-12 10:19:55 +01:00
Mark Paluch
1fe79f1194 DATAMONGO-1585 - Expose synthetic fields in $project aggregation stage.
Field projections now expose their fields as synthetic simple fields. The projection aggregation stage entirely redefines the field set available to later aggregation stages, so projected fields are considered synthetic. A simple synthetic field has no target field, which causes later aggregation stages to pick up the exposed field name rather than the underlying target when rendering aggregation operations to Mongo documents.

The change is motivated by a bug where an aggregation consisting of a projection of an aliased field followed by a sort caused the sort stage to render with the original field name instead of the alias. The sort did not apply any sorting since the projection redefines the available field set entirely and the original field is no longer accessible.
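
A minimal sketch of the now working combination (field names are arbitrary):

newAggregation(
	project().and("last_name").as("lastname"),
	sort(Sort.Direction.ASC, "lastname")); // renders against the exposed alias "lastname"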

Original Pull Request: #433
2017-01-12 10:19:10 +01:00
Christoph Strobl
2c6bd6ecea DATAMONGO-1576 - Update lifecycle event documentation.
Add note on lifecycle event handling for property types.
2017-01-11 13:10:25 +01:00
Mark Paluch
fc5bb3f1d3 DATAMONGO-1578 - Polishing.
Add ticket references to test methods. Extend license years in copyright header.

Original pull request: #398.
2017-01-02 11:46:57 +01:00
Martin Macko
d4d9c7673a DATAMONGO-1578 - Add missing @Test annotation to ProjectionOperationUnitTests.
Original pull request: #398.
2017-01-02 11:45:04 +01:00
Mark Paluch
20e2cfb273 DATAMONGO-1508 - Improve reference documentation.
Replace Spring Data Document with Spring Data MongoDB. Extend copyright year range. Replace static Spring version leftover with variable. Fix typos.
2017-01-02 11:19:59 +01:00
Lukasz Kryger
634f618eb1 DATAMONGO-1577 - Fix wording repetition in MongoRepository JavaDoc.
Original pull request: #407.
2017-01-02 11:19:59 +01:00
Ken Dombeck
c6e2662151 DATAMONGO-1577 - Fix Reference and JavaDoc spelling issues.
Replaced invalid class name MongoMappingConverter with actual class name of MappingMongoConverter. Fix typos.

Original pull request: #425.
2017-01-02 11:19:56 +01:00
Mark Paluch
51ae618585 DATAMONGO-1508 - Polishing.
Highlight attribute name. Replace tabs with spaces.

Original pull request: #399.
2017-01-02 11:14:25 +01:00
John Lilley
b584f04a41 DATAMONGO-1508 - Document authentication-dbname attribute in db-factory.
Original pull request: #399.
2017-01-02 11:14:25 +01:00
Oliver Gierke
6e1e9967af DATAMONGO-1522 - Updated changelog. 2016-12-21 19:35:37 +01:00
Oliver Gierke
af49230093 DATAMONGO-1469 - Updated changelog. 2016-12-21 18:43:02 +01:00
Oliver Gierke
3355a436c8 DATAMONGO-1467 - Polishing.
Original pull request: #431.
2016-12-20 14:26:40 +01:00
Christoph Strobl
9a0241992e DATAMONGO-1467 - Add support for MongoDB 3.2 partialFilterExpression for index creation.
We now support partial filter expressions on indexes via Index.partial(…). This allows creating partial indexes that only index the documents in a collection that meet a specified filter expression.

new Index().named("idx").on("k3y", ASC).partial(filter(where("age").gte(10)))

The filter expression can be set via a plain DBObject or a CriteriaDefinition and is mapped against the associated domain type.

Original pull request: #431.
2016-12-20 14:19:28 +01:00
Oliver Gierke
779145645d DATAMONGO-1565 - Polishing.
Formatting in ExpressionEvaluatingParameterBinder and StringBasedMongoQueryUnitTests. Turned Placeholder into value object.
2016-12-20 11:37:28 +01:00
Mark Paluch
c0f3255e26 DATAMONGO-1565 - Polishing.
Consider quoted/unquoted parameter use with the same parameter reference. Extend date range in license headers.
2016-12-20 11:37:05 +01:00
Christoph Strobl
b1de52f05a DATAMONGO-1565 - Ignore placeholder pattern in replacement values for annotated queries.
We now make sure to quote single and double quotes in the replacement values before actually appending them to the query. We also replace single quotes around parameters in the raw annotated query with double quotes to make sure they are treated as a single string parameter.
2016-12-20 11:30:02 +01:00
Mark Paluch
646b525d86 DATAMONGO-1564 - Polishing.
Fix JavaDoc references and a minor import formatting issue.

Original pull request: #429.
2016-12-16 14:10:15 +01:00
Christoph Strobl
2c29f204c3 DATAMONGO-1564 - Split up AggregationExpressions.
Refactored into multiple smaller aggregation operator classes reflecting the grouping (array operators, string operators, …) predefined by MongoDB.

Original pull request: #429.
2016-12-16 14:00:22 +01:00
Mark Paluch
9cd44faeb7 DATAMONGO-1567 - Use newer Java 8 on Travis CI. 2016-12-16 10:31:22 +01:00
Mark Paluch
362de45664 DATAMONGO-1533 - Polishing.
Extend JavaDoc. Minor reformatting.

Original pull request: #428.
2016-12-16 09:24:42 +01:00
Christoph Strobl
b7317892a2 DATAMONGO-1533 - Add AggregationExpression derived from SpEL AST.
We added an AggregationExpression that renders a MongoDB Aggregation Framework expression from the AST of a SpEL expression. This allows usage with various stages (eg. $project, $group) throughout the aggregation support.

  // { $and: [ { $gt: [ "$qty", 100 ] }, { $lt: [ "$qty", 250 ] } ] }
  expressionOf("qty > 100 && qty < 250);

  // { $cond : { if : { $gte : [ "$a", 42 ]}, then : "answer", else : "no-answer" } }
  expressionOf("cond(a >= 42, 'answer', 'no-answer')");

Original pull request: #428.
2016-12-16 09:21:27 +01:00
Mark Paluch
35f43e9ab8 DATAMONGO-1566 - Adapt API in ReactiveMongoRepositoryFactoryBean.
Related tickets: DATACMNS-891.
2016-12-16 09:21:27 +01:00
Oliver Gierke
a77c5b6e1d DATAMONGO-1566 - Adapt API in MongoRepositoryFactoryBean.
Related tickets: DATACMNS-891.
2016-12-15 16:17:54 +01:00
Christoph Strobl
5cf8ec3e55 DATAMONGO-1552 - Polishing.
Updated doc, removed whitespaces, minor method wording changes.

Original Pull Request: #426
2016-12-14 13:09:06 +01:00
Mark Paluch
450549150d DATAMONGO-1552 - Add $facet aggregation stage.
Original Pull Request: #426
2016-12-14 12:00:30 +01:00
Mark Paluch
b7a0b1d523 DATAMONGO-1552 - Add $bucketAuto aggregation stage.
Original Pull Request: #426
2016-12-14 11:46:54 +01:00
Mark Paluch
5c5c616be9 DATAMONGO-1552 - Add $bucket aggregation stage.
Original Pull Request: #426
2016-12-14 11:32:50 +01:00
Mark Paluch
0bd98d67c8 DATAMONGO-442 - Polishing.
Reformat code according to Spring Data style. Add test for authenticated use. Add JavaDoc to newly introduced methods. Allow configuration of an authentication database. Update reference documentation.

Original pull request: #419.
2016-12-13 16:51:02 +01:00
Ricardo Espirito Santo
8302727b23 DATAMONGO-442 - Support username/password authentication with MongoLog4jAppender.
Added optional username and password for authentication support on Log4jAppender.

Original pull request: #419.
2016-12-13 16:48:02 +01:00
Christoph Strobl
86dbd95220 DATAMONGO-1551 - Polishing.
Add a startWith overload allowing expressions to be mixed, removed whitespace, updated doc.

Original Pull Request: #424
2016-12-13 15:15:00 +01:00
Mark Paluch
3d0750afc5 DATAMONGO-1551 - Add $graphLookup aggregation stage.
We now support the $graphLookup aggregation pipeline stage via Aggregation to perform a recursive lookup, adding the lookup result as an array to the documents entering $graphLookup.

TypedAggregation<Employee> agg = Aggregation.newAggregation(Employee.class,
		graphLookup("employee")
			.startWith("reportsTo")
			.connectFrom("reportsTo")
			.connectTo("name")
			.depthField("depth")
			.maxDepth(5)
			.as("reportingHierarchy"));

Original Pull Request: #424
2016-12-13 14:44:53 +01:00
Christoph Strobl
1bf8eb09ca DATAMONGO-1550 - Polishing $replaceRoot (aggregation stage).
Original Pull Request: #422.
2016-12-13 08:18:40 +01:00
Mark Paluch
ae4cfaa58c DATAMONGO-1550 - Add $replaceRoot aggregation stage.
We now support the $replaceRoot stage in aggregation pipelines. $replaceRoot can reference either a field, an aggregation expression or it can be used to compose a replacement document.

newAggregation(
	replaceRoot().withDocument()
		.andValue("value").as("field")
		.and(MULTIPLY.of(field("total"), field("discounted")))
);

newAggregation(
	replaceRoot("item")));

Original Pull Request: #422
2016-12-13 08:18:40 +01:00
Christoph Strobl
1dea009e32 DATAMONGO-1549 - Polishing $count (aggregation stage).
Original Pull Request: #422
2016-12-13 08:18:27 +01:00
Mark Paluch
4c07235107 DATAMONGO-1549 - Add $count aggregation stage.
We now support the $count stage in aggregation pipelines.

newAggregation(
	match(where("hotelCode").is("0360")),
	count().as("documents"));

Original Pull Request: #422
2016-12-13 08:18:10 +01:00
Christoph Strobl
5ebbf93cf9 DATAMONGO-1558 - Upgrade MongoDB server version to, and add build profile for, MongoDB 3.4.
Added MongoDB 3.4 profile to pom.xml and upgraded to MongoDB 3.4 on travis-ci.
Delete assertion checking a property that has been removed in MongoDB 3.4 (see: https://jira.mongodb.org/browse/SERVER-24928)
2016-12-13 08:18:10 +01:00
Mark Paluch
794756d055 DATAMONGO-1548 - Polishing.
Enhance JavaDoc. Minor formatting. Fix typos.

Original pull request: #423.
2016-12-12 12:58:36 +01:00
Christoph Strobl
4fcc09c6c1 DATAMONGO-1548 - Add support for MongoDB 3.4 aggregation operators.
We now support the following MongoDB 3.4 aggregation operators:

$indexOfBytes, $indexOfCP, $split, $strLenBytes, $strLenCP, $substrCP, $indexOfArray, $range, $reverseArray, $reduce, $zip, $in, $isoDayOfWeek, $isoWeek, $isoWeekYear, $switch and $type.

Original pull request: #423.
2016-12-12 12:58:32 +01:00
Mark Paluch
d297f5a253 DATAMONGO-1538 - Polishing.
Use InheritingExposedFieldsAggregationOperationContext instead of anonymous context class for condition mapping. Drop aggregation input collections before tests. Minor reformatting.

Original pull request: #417.
2016-12-07 10:01:20 +01:00
Christoph Strobl
696e53ff60 DATAMONGO-1538 - Add support for $let to aggregation.
We now support $let in aggregation $project stage.

ExpressionVariable total = newExpressionVariable("total").forExpression(ADD.of(field("price"), field("tax")));
ExpressionVariable discounted = newExpressionVariable("discounted").forExpression(Cond.when("applyDiscount").then(0.9D).otherwise(1.0D));

newAggregation(Sales.class,
	project()
		.and(define(total, discounted)
			.andApply(MULTIPLY.of(field("total"), field("discounted"))))
		.as("finalTotal"));

Original pull request: #417.
2016-12-07 10:01:17 +01:00
Christoph Strobl
f512d8cb16 DATAMONGO-1542 - Polishing.
Added some static entry points for better readability.

Original Pull Request: #421
2016-12-06 10:08:05 +01:00
Mark Paluch
b98bc0e2bf DATAMONGO-1542 - Refactor CondOperator and IfNullOperator to children of AggregationExpressions.
Renamed CondOperator to Cond and IfNullOperator to IfNull. Both conditional operations are now available from ConditionalOperators.when and ConditionalOperators.ifNull and accept AggregationExpressions for conditions and values.
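
A sketch of the new entry points (field names and values are arbitrary):

newAggregation(Sales.class,
	project("item")
		.and(ConditionalOperators.when("applyDiscount").then(0.9D).otherwise(1.0D)).as("discount")
		.and(ConditionalOperators.ifNull("description").then("Unspecified")).as("description"));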

Original Pull Request: #421
2016-12-06 10:06:58 +01:00
Christoph Strobl
f64d205522 DATAMONGO-1520 - Add overload for aggregation $match accepting CriteriaDefinition.
We now also accept a CriteriaDefinition in addition to Criteria for Aggregation.match. The existing match(Criteria) method remains to preserve binary compatibility.
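
For illustration, passing a CriteriaDefinition that is not a Criteria, e.g. a TextCriteria (collection and field names are arbitrary):

newAggregation(
	match(TextCriteria.forDefaultLanguage().matching("coffee")),
	project("title"));
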
2016-12-06 09:05:39 +01:00
Mark Paluch
853b2b2d5c DATAMONGO-1540 - Polishing.
Reduce Map aggregation expression builder entrypoint. Fix JavaDoc.

Original pull request: #420.
2016-12-05 16:46:50 +01:00
Christoph Strobl
c3f9af01e6 DATAMONGO-1540 - Add support for $map (aggregation).
We now support the $map operator in aggregation.

Original pull request: #420.
2016-12-05 16:46:41 +01:00
Oliver Gierke
655a1e0351 DATAMONGO-1547 - Register MongoRepositoryFactory in spring.factories.
This is required for the switch in support for multi-store detection.

Related ticket: DATACMNS-952.
2016-12-05 14:28:26 +01:00
Oliver Gierke
26395f1b78 DATAMONGO-1546 - Register GeoJsonConfiguration via spring.factories.
Related tickets: DATACMNS-952.
2016-12-05 14:24:18 +01:00
Oliver Gierke
ecd8dae876 DATAMONGO-1141 - Polishing.
Aligned assertion messages for consistency. Fixed imports in UpdateMapperUnitTests.

Original pull request: #405.
2016-12-02 18:33:08 +01:00
Mark Paluch
4b59736f82 DATAMONGO-1141 - Polishing.
Add property to field name mapping for Sort orders by moving Sort mapping to UpdateMapper. Fix typo. Add JavaDoc. Reformat code. Remove trailing whitespaces.

Original pull request: #405.
2016-12-02 18:32:50 +01:00
Pavel Vodrážka
31d4434562 DATAMONGO-1141 - Add support for $push $sort in Update.
Sorting update modifier added. Supports sorting arrays by document fields and element values.

Original pull request: #405.
2016-12-02 18:26:09 +01:00
Oliver Gierke
f5a339bfe4 DATAMONGO-1454 - Polishing.
Formatting in test case.

Original pull request: #381.
2016-12-02 16:46:06 +01:00
Mark Paluch
5e60867750 DATAMONGO-1454 - Add support for exists projection in repository query methods.
We now support exists projections for derived and string-based repository query methods.

public interface PersonRepository extends Repository<Person, String> {

  boolean existsByFirstname(String firstname);

  @ExistsQuery(value = "{ 'lastname' : ?0 }")
  boolean someExistQuery(String lastname);

  @Query(value = "{ 'lastname' : ?0 }", exists = true)
  boolean anotherExistQuery(String lastname);
}

Original pull request: #381.
2016-12-02 16:45:58 +01:00
Mark Paluch
8a5da0e737 DATAMONGO-1536 - Polishing.
Add JavaDoc. Change visibility of AbstractAggregationExpression.getMongoMethod() to protected.

Original pull request: #418.
2016-12-02 15:43:11 +01:00
Christoph Strobl
e01ebcf605 DATAMONGO-1536 - Add aggregation operators for array, arithmetic, date and set operations.
We now support the following aggregation framework operators:

- setEquals, setIntersection, setUnion, setDifference, setIsSubset, anyElementTrue, allElementsTrue
- stdDevPop, stdDevSamp
- abs, ceil, exp, floor, ln, log, log10, pow, sqrt, trunc
- arrayElementAt, concatArrays, isArray
- literal
- dayOfYear, dayOfMonth, dayOfWeek, year, month, week, hour, minute, second, millisecond, dateToString

Original pull request: #418.
2016-12-02 15:43:05 +01:00
Oliver Gierke
b01a34b2b1 DATAMONGO-1539 - Polishing.
Renamed @Count and @Delete to @CountQuery and @DeleteQuery. Minor polishing in test cases and test repository methods. JavaDoc, formatting.

Original pull request: #416.
2016-12-02 11:59:10 +01:00
Fırat KÜÇÜK
5650e35eb6 DATAMONGO-1539 - Introduce @CountQuery and @DeleteQuery.
Introducing dedicated annotations for manually defined count and delete queries to avoid misconfiguration and to generally simplify the declaration.
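
A hypothetical repository using the new annotations:

interface PersonRepository extends Repository<Person, String> {

	@CountQuery("{ 'lastname' : ?0 }")
	long countByLastname(String lastname);

	@DeleteQuery("{ 'lastname' : ?0 }")
	long deleteByLastname(String lastname);
}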

Original pull request: #416.
2016-12-02 11:59:10 +01:00
Christoph Strobl
e5a41ad7f9 DATAMONGO-1534 - Fix bulk operations not writing type info.
We now correctly convert entities into their MongoDB representation, including type information via the _class property.

Original pull request: #415.
2016-11-28 09:13:57 +01:00
Christoph Strobl
dd7d25cdb3 DATAMONGO-1530 - Polishing.
Add missing transformations for ConstructorReference, OperatorNot, OpNE, OpEQ, OpGT, OpGE, OpLT, OpLE, OperatorPower, OpOr and OpAnd. This allows usage of the logical operators &, || and ! as part of the expression, while ConstructorReference allows instantiating e.g. arrays via an expression like `new int[]{4,5,6}`. This can be useful e.g. when comparing arrays using $setEquals.

More complex aggregation operators like $filter can be created by defining the variable references as a string inside the expression, like filter(a, 'num', '$$num' > 10).
Commands like $let require usage of InlineMap to pass in required arguments, e.g. let({low:1, high:'$$low'}, gt('$$low', '$$high')).

Original Pull Request: #410
2016-11-25 17:14:17 +01:00
Sebastien Gerard
0ddd7e3afd DATAMONGO-1530 - Add support for missing MongoDB 3.2 aggregation pipeline operators.
Original Pull Request: #410
2016-11-25 15:46:49 +01:00
Mark Paluch
81c501dea3 DATAMONGO-784 - Polishing.
Add JavaDoc for compareValue.

Original pull request: #414.
2016-11-24 13:49:12 +01:00
Christoph Strobl
b1cd7cfa53 DATAMONGO-784 - Add support for comparison aggregation operators to group & project.
We now directly support comparison aggregation operators ($cmp, $eq, $gt, $gte, $lt, $lte and $ne) on both group and project stages.

Original pull request: #414.
2016-11-24 13:49:09 +01:00
Mark Paluch
2ffac0a74e DATAMONGO-1491 - Polishing.
Remove variable before returning value. Add generics for list creation.

Original pull request: #412.
2016-11-24 12:49:30 +01:00
Christoph Strobl
ded99a74a3 DATAMONGO-1491 - Add support for $filter (aggregation).
We now support $filter in the aggregation pipeline.

Aggregation.newAggregation(Sales.class,
	Aggregation.project()
		.and(filter("items").as("item").by(GTE.of(field("item.price"), 100)))
		.as("items"))

Original pull request: #412.
2016-11-24 12:45:56 +01:00
Christoph Strobl
f275aaad7f DATAMONGO-1327 - Polishing.
Just added overloads for stdDevSamp and stdDevPop taking an AggregationExpression and updated the doc.
Also replaced String-based MongoDB operation building with direct use of the operators.

Original Pull Request: #360
2016-11-24 07:57:17 +01:00
gustavodegeus
5d2faef072 DATAMONGO-1327 - Added support for $stdDevSamp and $stdDevPop to aggregation $group stage.
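
A sketch of the new group stage methods (field names are arbitrary):

newAggregation(
	group("category")
		.stdDevSamp("score").as("scoreStdDevSamp")
		.stdDevPop("score").as("scoreStdDevPop"));
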
Original Pull Request: #360
CLA: 171720160409030719 (Gustavo de Geus)
2016-11-24 07:56:33 +01:00
Oliver Gierke
2c3cbb3613 DATAMONGO-1527 - After release cleanups. 2016-11-23 10:59:11 +01:00
Oliver Gierke
2b635fa151 DATAMONGO-1527 - Prepare next development iteration. 2016-11-23 10:59:08 +01:00
Oliver Gierke
909fda2076 DATAMONGO-1527 - Release version 2.0 M1 (Kay). 2016-11-23 10:35:48 +01:00
Oliver Gierke
3b5d231529 DATAMONGO-1527 - Prepare 2.0 M1 (Kay). 2016-11-23 10:35:08 +01:00
Oliver Gierke
e90fe70ae1 DATAMONGO-1527 - Updated changelog. 2016-11-23 10:35:01 +01:00
Oliver Gierke
27d379ba71 DATAMONGO-1527 - Removed explicit reference to Spring Framework 5 snapshots. 2016-11-19 15:30:23 +01:00
Mark Paluch
e987a853ac DATAMONGO-1509 - Polishing.
Adopt type hint assertion for existing _class field checks. Simplify test code to use Collections.singletonList instead of Arrays.asList. Replace BasicDBList with List in JavaDoc. Use type inference for DocumentTestUtils.getAsDBList to avoid casts in test code. Extend documentation.

Original pull request: #411.
2016-11-17 15:54:10 +01:00
Christoph Strobl
3c16b4db7f DATAMONGO-1509 - Write type hint as last element of a Document.
Always add the type hint as the last property of a Document.
This is necessary to ensure document equality within MongoDB in cases where the query contains full document comparisons. Unfortunately, this also might break existing code since the order of properties within a Document changes with this commit.

Original pull request: #411.
2016-11-17 15:12:36 +01:00
Mark Paluch
070d784be3 DATAMONGO-1176 - Exclude null id fields in bulk insert.
We no longer map null _id fields into the Document, in order to enforce MongoDB ObjectId generation on batch inserts like insertAll.
2016-11-17 13:56:45 +01:00
Mark Paluch
af4f0e0913 DATAMONGO-1444 - Accept Collection and subtypes in ReactiveMongoOperations.insertAll(…). 2016-11-14 18:13:35 +01:00
Oliver Gierke
474af92075 DATAMONGO-1444 - Moved to new base class for reactive repository factories. 2016-11-14 18:13:34 +01:00
Oliver Gierke
99838c02fd DATAMONGO-1444 - Adapt to new changes in reactive repository configuration.
We now use the newly introduced ….useRepositoryConfiguration(…) in the module specific RepositoryConfigurationExtension implementations to distinguish between reactive and non-reactive repositories.

Removed the RepositoryType class as it was only used by the previous repository type detection.
2016-11-14 18:13:34 +01:00
Oliver Gierke
8060ebae6a DATAMONGO-1444 - Polishing.
Removed unused references to ConversionService from repository query implementations.
2016-11-14 18:13:34 +01:00
Mark Paluch
9e9495ee54 DATAMONGO-1444 - Make Project Reactor dependency no longer required for blocking repository usage.
Use ReactiveWrapperConverters for reactive wrapper type conversion to not require Project Reactor for blocking repository usage.
2016-11-14 18:13:34 +01:00
Christoph Strobl
098aae41b7 DATAMONGO-1444 - Update bulk insert operations.
Accept Mono<Collection<T>> instead of Publisher<T> to get rid of the hidden buffer call inside MongoTemplate.
2016-11-14 18:13:28 +01:00
Christoph Strobl
df859d0f3a DATAMONGO-1444 - Adopt changes in Spring Data Commons.
- Adopt RxJava to RxJava1 repository interface renaming.
- Remove ReactiveChunk, Slice and Page.
- Update documentation.
- Prevent sliced/paged query execution.
2016-11-14 18:13:25 +01:00
Christoph Strobl
e0f371f648 DATAMONGO-1444 - Polishing.
- Update Javadoc comments and reference documentation.
- Introduce Adapter for blocking IndexOperations.
- Remove transaction synchronization.
- Remove unused types.
- Remove dropDups option from indexes.
- Prevent usage of Querydsl along with reactive repository.
- Use ReactiveQueryMethod in ReactiveMongoQuery.
2016-11-14 18:13:25 +01:00
Mark Paluch
2145e212ca DATAMONGO-1444 - Add support for RxJava wrapper types and slice queries.
Reactive MongoDB repositories can now be composed using Project Reactor and RxJava types for method arguments and return types. Query methods and methods from the base/implementation classes can be invoked with a conversion of input/output types.
2016-11-14 18:13:25 +01:00
Mark Paluch
c814073441 DATAMONGO-1176 - Test code cleanup.
Replace DbObject in documentation with Document. Fix typos. Change dbObject variable names to document.
2016-11-14 18:13:24 +01:00
Mark Paluch
59573b10e6 DATAMONGO-1176 - Cleanup.
Replace DbObject in documentation with Document. Fix typos. Change dbObject variable names to document. Improve error messages. Remove unused code.
2016-11-14 18:13:24 +01:00
Christoph Strobl
054274392e DATAMONGO-1176 - Cleanup.
- Update licenses headers.
- Rename variables and methods from dbo -> document.
- Remove deprecations.
- Remove unused code blocks.
- Upgrade to MongoDB Java Driver 3.3
2016-11-14 18:13:24 +01:00
Mark Paluch
2d3efdc0b4 DATAMONGO-1176 - Polishing.
- Remove dropDups assertion as the MongoDB 3 driver no longer provides dropDups, even when running against a MongoDB v2.6.7.
- Remove mongo-next build profile as we're based on the Mongo 3 driver now.
- Map update object and merge set operations.
2016-11-14 18:13:23 +01:00
Christoph Strobl
2461575c52 DATAMONGO-1176 - Switch to Document API.
We use the Document API when interacting with the MongoDB Java Driver. This allows us to make use of new features, enables us to use the Codec API, and prepares the project for future enhancements concerning the driver's reactive API.
2016-11-14 18:13:23 +01:00
Mark Paluch
4371760272 DATAMONGO-1461 - Upgrade Hibernate/Validator/JPA dependencies to match Spring 5 baseline.
Upgrade:
* Hibernate Validator to 5.2.4.Final
* JPA API to 2.1.1
* Hibernate Core to 5.2.1.Final
2016-11-14 17:49:48 +01:00
Mark Paluch
da5289fc18 DATAMONGO-1448 - Prepare 2.0 development.
Upgraded to Spring Data Build parent 2.0 snapshots and Spring Data Commons 2.0 snapshots. Removed obsolete distribution key property. Removed obsolete template.mf.
2016-11-14 17:49:48 +01:00
Oliver Gierke
ea9b402547 DATAMONGO-1525 - Improved creation of empty collections, esp. EnumSet.
We now use more type information to create a better empty collection in the first place. The previous algorithm always used an empty HashSet plus a subsequent conversion using the raw collection type. Especially the latter caused problems for EnumSets as the conversion into one requires the presence of component type information.

We now use Spring's collection factory and more available type information to create a proper collection in the first place and only rely on a subsequent conversion for arrays.
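
The gist of it, as a rough sketch using Spring's CollectionFactory (the types are illustrative, not the actual framework code):

// Given collection type EnumSet and component type DayOfWeek, a proper EnumSet is
// created up front instead of a HashSet that would need a subsequent conversion.
Collection<Object> target = CollectionFactory.createCollection(EnumSet.class, DayOfWeek.class, 2);
target.add(DayOfWeek.MONDAY);
target.add(DayOfWeek.FRIDAY);
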
2016-11-13 18:33:45 +01:00
709 changed files with 26712 additions and 24752 deletions

Binary file not shown.


@@ -1 +0,0 @@
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.5.4/apache-maven-3.5.4-bin.zip

.travis.yml Normal file

@@ -0,0 +1,34 @@
language: java
jdk:
  - oraclejdk8
before_script:
  - mongod --version
env:
  matrix:
    - PROFILE=ci
    - PROFILE=mongo34-next
    - PROFILE=mongo35-next
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script
addons:
  apt:
    sources:
      - mongodb-3.4-precise
    packages:
      - mongodb-org-server
      - mongodb-org-shell
      - oracle-java8-installer
sudo: false
cache:
  directories:
    - $HOME/.m2
install: true
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

CI.adoc

@@ -1,43 +0,0 @@
= Continuous Integration
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Moore%20(master)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F2.1.x&subject=Lovelace%20(2.1.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2F1.10.x&subject=Ingalls%20(1.10.x)[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/]
== Running CI tasks locally
Since this pipeline is purely Docker-based, it's easy to:
* Debug what went wrong on your local machine.
* Test out a tweak to your test routine before sending it out.
* Experiment against a new image before submitting your pull request.
All of these use cases are great reasons to essentially run what the CI server does on your local machine.
IMPORTANT: To do this you must have Docker installed on your machine.
1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk8-with-mongodb-4.0:latest /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
2. `cd spring-data-mongodb-github`
+
Next, run the tests from inside the container:
+
3. `./mvnw clean dependency:list test -Dsort -Dbundlor.enabled=false -B` (or with whatever profile you need to test out)
Since the container is binding to your source, you can make edits from your IDE and continue to run build jobs.
If you need to package things up, do this:
1. `docker run -it --mount type=bind,source="$(pwd)",target=/spring-data-mongodb-github springci/spring-data-openjdk8-with-mongodb-4.0:latest /bin/bash`
+
This will launch the Docker image and mount your source code at `spring-data-mongodb-github`.
+
2. `cd spring-data-mongodb-github`
+
Next, package things from inside the container doing this:
+
3. `./mvnw clean dependency:list package -Dsort -Dbundlor.enabled=false -B`
NOTE: Docker containers can eat up disk space fast! From time to time, run `docker system prune` to clean out old images.

Jenkinsfile vendored

@@ -1,123 +0,0 @@
pipeline {
	agent none

	triggers {
		pollSCM 'H/10 * * * *'
		upstream(upstreamProjects: "spring-data-commons/1.13.x", threshold: hudson.model.Result.SUCCESS)
	}

	options {
		disableConcurrentBuilds()
		buildDiscarder(logRotator(numToKeepStr: '14'))
	}

	stages {
		stage("Test") {
			when {
				anyOf {
					branch '1.10.x'
					not { triggeredBy 'UpstreamCause' }
				}
			}
			parallel {
				stage("test: baseline") {
					agent {
						docker {
							image 'springci/spring-data-openjdk8-with-mongodb-4.0:latest'
							label 'data'
							args '-v $HOME:/tmp/jenkins-home'
						}
					}
					options { timeout(time: 30, unit: 'MINUTES') }
					steps {
						sh 'rm -rf ?'
						sh 'mkdir -p /tmp/mongodb/db /tmp/mongodb/log'
						sh 'mongod --dbpath /tmp/mongodb/db --replSet rs0 --fork --logpath /tmp/mongodb/log/mongod.log &'
						sh 'sleep 10'
						sh 'mongo --eval "rs.initiate({_id: \'rs0\', members:[{_id: 0, host: \'127.0.0.1:27017\'}]});"'
						sh 'sleep 15'
						sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw clean dependency:list test -Dsort -U -B'
					}
				}
			}
		}

		stage('Release to artifactory') {
			when {
				branch 'issue/*'
				not { triggeredBy 'UpstreamCause' }
			}
			agent {
				docker {
					image 'adoptopenjdk/openjdk8:latest'
					label 'data'
					args '-v $HOME:/tmp/jenkins-home'
				}
			}
			options { timeout(time: 20, unit: 'MINUTES') }
			environment {
				ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
			}
			steps {
				sh 'rm -rf ?'
				sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory ' +
						'-Dartifactory.server=https://repo.spring.io ' +
						"-Dartifactory.username=${ARTIFACTORY_USR} " +
						"-Dartifactory.password=${ARTIFACTORY_PSW} " +
						"-Dartifactory.staging-repository=libs-snapshot-local " +
						"-Dartifactory.build-name=spring-data-mongodb-1.10 " +
						"-Dartifactory.build-number=${BUILD_NUMBER} " +
						'-Dmaven.test.skip=true clean deploy -U -B'
			}
		}

		stage('Release to artifactory with docs') {
			when {
				branch '1.10.x'
			}
			agent {
				docker {
					image 'adoptopenjdk/openjdk8:latest'
					label 'data'
					args '-v $HOME:/tmp/jenkins-home'
				}
			}
			options { timeout(time: 20, unit: 'MINUTES') }
			environment {
				ARTIFACTORY = credentials('02bd1690-b54f-4c9f-819d-a77cb7a9822c')
			}
			steps {
				sh 'rm -rf ?'
				sh 'MAVEN_OPTS="-Duser.name=jenkins -Duser.home=/tmp/jenkins-home" ./mvnw -Pci,artifactory ' +
						'-Dartifactory.server=https://repo.spring.io ' +
						"-Dartifactory.username=${ARTIFACTORY_USR} " +
						"-Dartifactory.password=${ARTIFACTORY_PSW} " +
						"-Dartifactory.staging-repository=libs-snapshot-local " +
						"-Dartifactory.build-name=spring-data-mongodb-1.10 " +
						"-Dartifactory.build-number=${BUILD_NUMBER} " +
						'-Dmaven.test.skip=true clean deploy -U -B'
			}
		}
	}

	post {
		changed {
			script {
				slackSend(
						color: (currentBuild.currentResult == 'SUCCESS') ? 'good' : 'danger',
						channel: '#spring-data-dev',
						message: "${currentBuild.fullDisplayName} - `${currentBuild.currentResult}`\n${env.BUILD_URL}")
				emailext(
						subject: "[${currentBuild.fullDisplayName}] ${currentBuild.currentResult}",
						mimeType: 'text/html',
						recipientProviders: [[$class: 'CulpritsRecipientProvider'], [$class: 'RequesterRecipientProvider']],
						body: "<a href=\"${env.BUILD_URL}\">${currentBuild.fullDisplayName} is reported as ${currentBuild.currentResult}</a>")
			}
		}
	}
}


@@ -1,159 +0,0 @@
image:https://spring.io/badges/spring-data-mongodb/ga.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start] image:https://spring.io/badges/spring-data-mongodb/snapshot.svg[Spring Data MongoDB,link=https://projects.spring.io/spring-data-mongodb#quick-start]
= Spring Data MongoDB image:https://jenkins.spring.io/buildStatus/icon?job=spring-data-mongodb%2Fmaster&subject=Build[link=https://jenkins.spring.io/view/SpringData/job/spring-data-mongodb/] https://gitter.im/spring-projects/spring-data[image:https://badges.gitter.im/spring-projects/spring-data.svg[Gitter]]
The primary goal of the https://projects.spring.io/spring-data[Spring Data] project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities.
The Spring Data MongoDB project provides integration with the MongoDB document database.
Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB `+Document+` and easily writing a repository style data access layer.
== Code of Conduct
This project is governed by the link:CODE_OF_CONDUCT.adoc[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
== Getting Started
Here is a quick teaser of an application using Spring Data Repositories in Java:
[source,java]
----
public interface PersonRepository extends CrudRepository<Person, Long> {
List<Person> findByLastname(String lastname);
List<Person> findByFirstnameLike(String firstname);
}
@Service
public class MyService {
private final PersonRepository repository;
public MyService(PersonRepository repository) {
this.repository = repository;
}
public void doWork() {
repository.deleteAll();
Person person = new Person();
person.setFirstname("Oliver");
person.setLastname("Gierke");
repository.save(person);
List<Person> lastNameResults = repository.findByLastname("Gierke");
List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
}
}
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {
@Override
public MongoClient mongoClient() {
return new MongoClient();
}
@Override
protected String getDatabaseName() {
return "springdata";
}
}
----
=== Maven configuration
Add the Maven dependency:
[source,xml]
----
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.RELEASE</version>
</dependency>
----
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
[source,xml]
----
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.BUILD-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>https://repo.spring.io/libs-snapshot</url>
</repository>
----
== Getting Help
Having trouble with Spring Data? We'd love to help!
* Check the
https://docs.spring.io/spring-data/mongodb/docs/current/reference/html/[reference documentation], and https://docs.spring.io/spring-data/mongodb/docs/current/api/[Javadocs].
* Learn the Spring basics - Spring Data builds on Spring Framework, check the https://spring.io[spring.io] web-site for a wealth of reference documentation.
If you are just starting out with Spring, try one of the https://spring.io/guides[guides].
* If you are upgrading, check out the https://docs.spring.io/spring-data/mongodb/docs/current/changelog.txt[changelog] for "`new and noteworthy`" features.
* Ask a question - we monitor https://stackoverflow.com[stackoverflow.com] for questions tagged with https://stackoverflow.com/tags/spring-data[`spring-data-mongodb`].
You can also chat with the community on https://gitter.im/spring-projects/spring-data[Gitter].
* Report bugs with Spring Data MongoDB at https://jira.spring.io/browse/DATAMONGO[jira.spring.io/browse/DATAMONGO].
== Reporting Issues
Spring Data uses JIRA as issue tracking system to record bugs and feature requests. If you want to raise an issue, please follow the recommendations below:
* Before you log a bug, please search the
https://jira.spring.io/browse/DATAMONGO[issue tracker] to see if someone has already reported the problem.
* If the issue doesn't already exist, https://jira.spring.io/browse/DATAMONGO[create a new issue].
* Please provide as much information as possible with the issue report; we like to know the version of Spring Data that you are using and the JVM version.
* If you need to paste code, or include a stack trace use JIRA `{code}…{code}` escapes before and after your text.
* If possible try to create a test-case or project that replicates the issue. Attach a link to your code or a compressed file containing your code.
== Building from Source
You don't need to build from source to use Spring Data (binaries in https://repo.spring.io[repo.spring.io]), but if you want to try out the latest and greatest, Spring Data can be easily built with the https://github.com/takari/maven-wrapper[maven wrapper].
You also need JDK 1.8.
[source,bash]
----
$ ./mvnw clean install
----
If you want to build with the regular `mvn` command, you will need https://maven.apache.org/run-maven/index.html[Maven v3.5.0 or above].
_Also see link:CONTRIBUTING.adoc[CONTRIBUTING.adoc] if you wish to submit pull requests, and in particular please sign the https://cla.pivotal.io/sign/spring[Contributor's Agreement] before your first non-trivial change._
=== Building reference documentation
Building the documentation builds also the project without running tests.
[source,bash]
----
$ ./mvnw clean install -Pdistribute
----
The generated documentation is available from `target/site/reference/html/index.html`.
== Guides
The https://spring.io/[spring.io] site contains several guides that show how to use Spring Data step-by-step:
* https://spring.io/guides/gs/accessing-data-mongodb/[Accessing Data with MongoDB] is a very basic guide that shows you how to create a simple application and how to access data using repositories.
* https://spring.io/guides/gs/accessing-mongodb-data-rest/[Accessing MongoDB Data with REST] is a guide to creating a REST web service exposing data stored in MongoDB through repositories.
== Examples
* https://github.com/spring-projects/spring-data-examples/[Spring Data Examples] contains example projects that explain specific features in more detail.
== License
Spring Data MongoDB is Open Source software released under the https://www.apache.org/licenses/LICENSE-2.0.html[Apache 2.0 license].

README.md Normal file

@@ -0,0 +1,150 @@
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/ga.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/snapshot.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
# Spring Data MongoDB
The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
The Spring Data MongoDB project aims to provide a familiar and consistent Spring-based programming model for new datastores while retaining store-specific features and capabilities. The Spring Data MongoDB project provides integration with the MongoDB document database. Key functional areas of Spring Data MongoDB are a POJO centric model for interacting with a MongoDB DBCollection and easily writing a repository style data access layer.
## Getting Help
For a comprehensive treatment of all the Spring Data MongoDB features, please refer to:
* the [User Guide](http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/)
* the [JavaDocs](http://docs.spring.io/spring-data/mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://projects.spring.io/spring-data-mongodb) contains links to articles and other resources.
* for more detailed questions, use [Spring Data Mongodb on Stackoverflow](http://stackoverflow.com/questions/tagged/spring-data-mongodb).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://projects.spring.io/).
## Quick Start
### Maven configuration
Add the Maven dependency:
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.RELEASE</version>
</dependency>
```
If you'd rather like the latest snapshots of the upcoming major version, use our Maven snapshot repository and declare the appropriate dependency version.
```xml
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.BUILD-SNAPSHOT</version>
</dependency>
<repository>
<id>spring-libs-snapshot</id>
<name>Spring Snapshot Repository</name>
<url>http://repo.spring.io/libs-snapshot</url>
</repository>
```
### MongoTemplate
MongoTemplate is the central support class for Mongo database operations. It provides:
* Basic POJO mapping support to and from BSON
* Convenience methods to interact with the store (insert object, update objects) and MongoDB specific ones (geo-spatial operations, upserts, map-reduce etc.)
* Connection affinity callback
* Exception translation into Spring's [technology agnostic DAO exception hierarchy](http://docs.spring.io/spring/docs/current/spring-framework-reference/html/dao.html#dao-exceptions).
### Spring Data repositories
To simplify the creation of data repositories Spring Data MongoDB provides a generic repository programming model. It will automatically create a repository proxy for you that adds implementations of finder methods you specify on an interface.
For example, given a `Person` class with first and last name properties, a `PersonRepository` interface that can query for `Person` by last name and when the first name matches a like expression is shown below:
```java
public interface PersonRepository extends CrudRepository<Person, Long> {
List<Person> findByLastname(String lastname);
List<Person> findByFirstnameLike(String firstname);
}
```
The queries issued on execution will be derived from the method name. Extending `CrudRepository` causes CRUD methods to be pulled into the interface so that you can easily save and find single entities and collections of them.
You can have Spring automatically create a proxy for the interface by using the following JavaConfig:
```java
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {

  @Override
  public Mongo mongo() throws Exception {
    return new MongoClient();
  }

  @Override
  protected String getDatabaseName() {
    return "springdata";
  }
}
```
This sets up a connection to a local MongoDB instance and enables the detection of Spring Data repositories (through `@EnableMongoRepositories`). The same configuration would look like this in XML:
```xml
<bean id="template" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg>
<bean class="com.mongodb.MongoClient">
<constructor-arg value="localhost" />
<constructor-arg value="27017" />
</bean>
</constructor-arg>
<constructor-arg value="database" />
</bean>
<mongo:repositories base-package="com.acme.repository" />
```
This will find the repository interface and register a proxy object in the container. You can use it as shown below:
```java
@Service
public class MyService {

  private final PersonRepository repository;

  @Autowired
  public MyService(PersonRepository repository) {
    this.repository = repository;
  }

  public void doWork() {

    repository.deleteAll();

    Person person = new Person();
    person.setFirstname("Oliver");
    person.setLastname("Gierke");
    person = repository.save(person);

    List<Person> lastNameResults = repository.findByLastname("Gierke");
    List<Person> firstNameResults = repository.findByFirstnameLike("Oli*");
  }
}
```
## Contributing to Spring Data
Here are some ways for you to get involved in the community:
* Get involved with the Spring community on Stack Overflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Create [JIRA](https://jira.spring.io/browse/DATAMONGO) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* GitHub is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please also reference a JIRA ticket covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.
Before we accept a non-trivial patch or pull request, we will need you to [sign the Contributor License Agreement](https://cla.pivotal.io/sign/spring). Signing the contributor's agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. If you forget to do so, you'll be reminded when you submit a pull request. Active contributors might be asked to join the core team and be given the ability to merge pull requests.


@@ -1,14 +0,0 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.0 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.0.list
RUN apt-get update
RUN apt-get install -y mongodb-org=4.0.9 mongodb-org-server=4.0.9 mongodb-org-shell=4.0.9 mongodb-org-mongos=4.0.9 mongodb-org-tools=4.0.9
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*


@@ -1,14 +0,0 @@
FROM adoptopenjdk/openjdk8:latest
RUN apt-get update && apt-get install -y apt-transport-https apt-utils gnupg2
RUN apt-key adv --keyserver hkps://keyserver.ubuntu.com:443 --recv 4B7C549A058F8B6B
RUN echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.1 multiverse" | tee /etc/apt/sources.list.d/mongodb-org-4.1.list
RUN apt-get update
RUN apt-get install -y mongodb-org-unstable=4.1.13 mongodb-org-unstable-server=4.1.13 mongodb-org-unstable-shell=4.1.13 mongodb-org-unstable-mongos=4.1.13 mongodb-org-unstable-tools=4.1.13
RUN apt-get clean \
&& rm -rf /var/lib/apt/lists/*


@@ -1,291 +0,0 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<profiles version="12">
<profile kind="CodeFormatterProfile" name="Spring Data" version="12">
<setting id="org.eclipse.jdt.core.formatter.comment.insert_new_line_before_root_tags" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.disabling_tag" value="@formatter:off"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_annotation" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_type_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_anonymous_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_case" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.new_lines_at_block_boundaries" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_annotation_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_closing_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_field" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_while" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.use_on_off_tags" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_annotation_type_member_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_else_in_if_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_prefix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_else_statement_on_same_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_ellipsis" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.insert_new_line_for_parameter" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_annotation_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_breaks_compare_to_cases" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_at_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_multiple_fields" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_expressions_in_array_initializer" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_conditional_expression" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_binary_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_question_in_wildcard" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_array_initializer" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_finally_in_try_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_local_variable" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_catch_in_try_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_while" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_after_package" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.continuation_indentation" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_postfix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_method_invocation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_superinterfaces" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_new_chunk" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_binary_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_package" value="0"/>
<setting id="org.eclipse.jdt.core.compiler.source" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_enum_constant_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_line_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_enum_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.join_wrapped_lines" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_explicit_constructor_call" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_invocation_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_member_type" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.align_type_members_on_columns" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_method_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_selector_in_method_invocation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_switch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_unary_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_case" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_parameter_description" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_switch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_block_comment" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.lineSplit" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_if" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_brackets_in_array_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_explicitconstructorcall_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_constructor_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_first_class_body_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_method" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indentation.size" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.enabling_tag" value="@formatter:on"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superclass_in_type_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_assignment" value="0"/>
<setting id="org.eclipse.jdt.core.compiler.problem.assertIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.tabulation.char" value="tab"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_constructor_declaration_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_semicolon_in_try_resources" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_prefix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_statements_compare_to_body" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_method" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_outer_expressions_when_nested" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.format_guardian_clause_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_cast" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_labeled_statement" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_annotation_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_method_body" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_method_declaration" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_try" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_enum_constant" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_at_in_annotation_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_switch" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_declaration_throws" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_return" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_question_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_question_in_wildcard" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_try" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.preserve_white_space_between_code_and_line_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_parenthesized_expression_in_throw" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.problem.enumIdentifier" value="error"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_switch" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_ellipsis" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_inits" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_method_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.compact_else_if" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_before_or_operator_multicatch" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_increments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.format_line_comment_starting_on_first_column" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_field" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.indent_root_tags" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_constant" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_union_type_in_multicatch" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_explicitconstructorcall_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_switch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_method_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_superinterfaces" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.tabulation.size" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_opening_brace_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_brace_in_block" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_throws" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_if" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.clear_blank_lines_in_javadoc_comment" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_constructor_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_assignment_operator" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_empty_lines" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_synchronized" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_paren_in_cast" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_declaration_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_block_in_case" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.number_of_empty_lines_to_preserve" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_catch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_bracket_in_array_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_qualified_allocation_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_and_in_type_parameter" value="insert"/>
<setting id="org.eclipse.jdt.core.compiler.compliance" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.continuation_indentation_for_array_initializer" value="2"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_brackets_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_at_in_annotation_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_allocation_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_cast" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_unary_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_anonymous_type_declaration" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_empty_array_initializer_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_enum_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.keep_imple_if_on_one_line" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_constructor_declaration_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_closing_angle_bracket_in_type_parameters" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_at_end_of_file_if_missing" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_labeled_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superinterfaces_in_type_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_binary_expression" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_enum_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_type" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_while" value="do not insert"/>
<setting id="org.eclipse.jdt.core.compiler.codegen.inlineJsrBytecode" value="enabled"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_try" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.put_empty_statement_on_new_line" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_label" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_parameter" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_before_while_in_do_statement" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_enum_constant" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_javadoc_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.comment.line_length" value="120"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_after_annotation_on_package" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_between_import_groups" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_enum_constant_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_constructor_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.number_of_blank_lines_at_beginning_of_method_body" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_type_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_annotation_type_member_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.wrap_before_binary_operator" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_enum_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_between_type_declarations" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_synchronized" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_statements_compare_to_block" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_superinterfaces_in_enum_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.join_lines_in_comments" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_question_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_multiple_field_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_compact_if" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_for_inits" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_switchstatements_compare_to_cases" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_default" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_and_in_type_parameter" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_parens_in_constructor_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_before_imports" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_assert" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_html" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_throws_clause_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_angle_bracket_in_type_parameters" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_bracket_in_array_allocation_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_anonymous_type_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_colon_in_conditional" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_postfix_operator" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_source_code" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_synchronized" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_allocation_expression" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_constructor_declaration_throws" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_parameters_in_method_declaration" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.compiler.codegen.targetPlatform" value="1.7"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_resources_in_try" value="80"/>
<setting id="org.eclipse.jdt.core.formatter.use_tabs_only_for_leading_indentations" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_arguments_in_annotation" value="16"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_header" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.comment.format_block_comments" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_enum_constant" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.alignment_for_enum_constants" value="0"/>
<setting id="org.eclipse.jdt.core.formatter.insert_new_line_in_empty_block" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_annotation_declaration_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_parenthesized_expression" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_catch" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_multiple_local_declarations" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_switch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_comma_in_for_increments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_method_invocation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_colon_in_assert" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.brace_position_for_type_declaration" value="end_of_line"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_brace_in_array_initializer" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_between_empty_braces_in_array_initializer" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_opening_paren_in_method_declaration" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon_in_for" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_paren_in_catch" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_parameterized_type_reference" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_multiple_field_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_closing_paren_in_annotation" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_parameterized_type_reference" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_method_invocation_arguments" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.comment.new_lines_at_javadoc_boundaries" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.blank_lines_after_imports" value="1"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_comma_in_multiple_local_declarations" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.indent_body_declarations_compare_to_enum_constant_header" value="true"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_after_semicolon_in_for" value="insert"/>
<setting id="org.eclipse.jdt.core.formatter.never_indent_line_comments_on_first_column" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_semicolon_in_try_resources" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.insert_space_before_opening_angle_bracket_in_type_arguments" value="do not insert"/>
<setting id="org.eclipse.jdt.core.formatter.never_indent_block_comments_on_first_column" value="false"/>
<setting id="org.eclipse.jdt.core.formatter.keep_then_statement_on_same_line" value="false"/>
</profile>
</profiles>


@@ -1,2 +0,0 @@
lombok.nonNull.exceptionType = IllegalArgumentException
lombok.log.fieldName = LOG

mvnw

@@ -1,286 +0,0 @@
#!/bin/sh
# ----------------------------------------------------------------------------
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# ----------------------------------------------------------------------------
# ----------------------------------------------------------------------------
# Maven2 Start Up Batch script
#
# Required ENV vars:
# ------------------
# JAVA_HOME - location of a JDK home dir
#
# Optional ENV vars
# -----------------
# M2_HOME - location of maven2's installed home dir
# MAVEN_OPTS - parameters passed to the Java VM when running Maven
# e.g. to debug Maven itself, use
# set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000
# MAVEN_SKIP_RC - flag to disable loading of mavenrc files
# ----------------------------------------------------------------------------
if [ -z "$MAVEN_SKIP_RC" ] ; then
if [ -f /etc/mavenrc ] ; then
. /etc/mavenrc
fi
if [ -f "$HOME/.mavenrc" ] ; then
. "$HOME/.mavenrc"
fi
fi
# OS specific support. $var _must_ be set to either true or false.
cygwin=false;
darwin=false;
mingw=false
case "`uname`" in
CYGWIN*) cygwin=true ;;
MINGW*) mingw=true;;
Darwin*) darwin=true
# Use /usr/libexec/java_home if available, otherwise fall back to /Library/Java/Home
# See https://developer.apple.com/library/mac/qa/qa1170/_index.html
if [ -z "$JAVA_HOME" ]; then
if [ -x "/usr/libexec/java_home" ]; then
export JAVA_HOME="`/usr/libexec/java_home`"
else
export JAVA_HOME="/Library/Java/Home"
fi
fi
;;
esac
if [ -z "$JAVA_HOME" ] ; then
if [ -r /etc/gentoo-release ] ; then
JAVA_HOME=`java-config --jre-home`
fi
fi
if [ -z "$M2_HOME" ] ; then
## resolve links - $0 may be a link to maven's home
PRG="$0"
# need this for relative symlinks
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG="`dirname "$PRG"`/$link"
fi
done
saveddir=`pwd`
M2_HOME=`dirname "$PRG"`/..
# make it fully qualified
M2_HOME=`cd "$M2_HOME" && pwd`
cd "$saveddir"
# echo Using m2 at $M2_HOME
fi
# For Cygwin, ensure paths are in UNIX format before anything is touched
if $cygwin ; then
[ -n "$M2_HOME" ] &&
M2_HOME=`cygpath --unix "$M2_HOME"`
[ -n "$JAVA_HOME" ] &&
JAVA_HOME=`cygpath --unix "$JAVA_HOME"`
[ -n "$CLASSPATH" ] &&
CLASSPATH=`cygpath --path --unix "$CLASSPATH"`
fi
# For Mingw, ensure paths are in UNIX format before anything is touched
if $mingw ; then
[ -n "$M2_HOME" ] &&
M2_HOME="`(cd "$M2_HOME"; pwd)`"
[ -n "$JAVA_HOME" ] &&
JAVA_HOME="`(cd "$JAVA_HOME"; pwd)`"
# TODO classpath?
fi
if [ -z "$JAVA_HOME" ]; then
javaExecutable="`which javac`"
if [ -n "$javaExecutable" ] && ! [ "`expr \"$javaExecutable\" : '\([^ ]*\)'`" = "no" ]; then
# readlink(1) is not available as standard on Solaris 10.
readLink=`which readlink`
if [ ! `expr "$readLink" : '\([^ ]*\)'` = "no" ]; then
if $darwin ; then
javaHome="`dirname \"$javaExecutable\"`"
javaExecutable="`cd \"$javaHome\" && pwd -P`/javac"
else
javaExecutable="`readlink -f \"$javaExecutable\"`"
fi
javaHome="`dirname \"$javaExecutable\"`"
javaHome=`expr "$javaHome" : '\(.*\)/bin'`
JAVA_HOME="$javaHome"
export JAVA_HOME
fi
fi
fi
if [ -z "$JAVACMD" ] ; then
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
else
JAVACMD="`which java`"
fi
fi
if [ ! -x "$JAVACMD" ] ; then
echo "Error: JAVA_HOME is not defined correctly." >&2
echo " We cannot execute $JAVACMD" >&2
exit 1
fi
if [ -z "$JAVA_HOME" ] ; then
echo "Warning: JAVA_HOME environment variable is not set."
fi
CLASSWORLDS_LAUNCHER=org.codehaus.plexus.classworlds.launcher.Launcher
# traverses directory structure from process work directory to filesystem root
# first directory with .mvn subdirectory is considered project base directory
find_maven_basedir() {
if [ -z "$1" ]
then
echo "Path not specified to find_maven_basedir"
return 1
fi
basedir="$1"
wdir="$1"
while [ "$wdir" != '/' ] ; do
if [ -d "$wdir"/.mvn ] ; then
basedir=$wdir
break
fi
# workaround for JBEAP-8937 (on Solaris 10/Sparc)
if [ -d "${wdir}" ]; then
wdir=`cd "$wdir/.."; pwd`
fi
# end of workaround
done
echo "${basedir}"
}
# concatenates all lines of a file
concat_lines() {
if [ -f "$1" ]; then
echo "$(tr -s '\n' ' ' < "$1")"
fi
}
BASE_DIR=`find_maven_basedir "$(pwd)"`
if [ -z "$BASE_DIR" ]; then
exit 1;
fi
##########################################################################################
# Extension to allow automatically downloading the maven-wrapper.jar from Maven-central
# This allows using the maven wrapper in projects that prohibit checking in binary data.
##########################################################################################
if [ -r "$BASE_DIR/.mvn/wrapper/maven-wrapper.jar" ]; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found .mvn/wrapper/maven-wrapper.jar"
fi
else
if [ "$MVNW_VERBOSE" = true ]; then
echo "Couldn't find .mvn/wrapper/maven-wrapper.jar, downloading it ..."
fi
jarUrl="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar"
while IFS="=" read key value; do
case "$key" in (wrapperUrl) jarUrl="$value"; break ;;
esac
done < "$BASE_DIR/.mvn/wrapper/maven-wrapper.properties"
if [ "$MVNW_VERBOSE" = true ]; then
echo "Downloading from: $jarUrl"
fi
wrapperJarPath="$BASE_DIR/.mvn/wrapper/maven-wrapper.jar"
if command -v wget > /dev/null; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found wget ... using wget"
fi
wget "$jarUrl" -O "$wrapperJarPath"
elif command -v curl > /dev/null; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found curl ... using curl"
fi
curl -o "$wrapperJarPath" "$jarUrl"
else
if [ "$MVNW_VERBOSE" = true ]; then
echo "Falling back to using Java to download"
fi
javaClass="$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.java"
if [ -e "$javaClass" ]; then
if [ ! -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then
if [ "$MVNW_VERBOSE" = true ]; then
echo " - Compiling MavenWrapperDownloader.java ..."
fi
# Compiling the Java class
("$JAVA_HOME/bin/javac" "$javaClass")
fi
if [ -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then
# Running the downloader
if [ "$MVNW_VERBOSE" = true ]; then
echo " - Running MavenWrapperDownloader.java ..."
fi
("$JAVA_HOME/bin/java" -cp .mvn/wrapper MavenWrapperDownloader "$MAVEN_PROJECTBASEDIR")
fi
fi
fi
fi
##########################################################################################
# End of extension
##########################################################################################
export MAVEN_PROJECTBASEDIR=${MAVEN_BASEDIR:-"$BASE_DIR"}
if [ "$MVNW_VERBOSE" = true ]; then
echo $MAVEN_PROJECTBASEDIR
fi
MAVEN_OPTS="$(concat_lines "$MAVEN_PROJECTBASEDIR/.mvn/jvm.config") $MAVEN_OPTS"
# For Cygwin, switch paths to Windows format before running java
if $cygwin; then
[ -n "$M2_HOME" ] &&
M2_HOME=`cygpath --path --windows "$M2_HOME"`
[ -n "$JAVA_HOME" ] &&
JAVA_HOME=`cygpath --path --windows "$JAVA_HOME"`
[ -n "$CLASSPATH" ] &&
CLASSPATH=`cygpath --path --windows "$CLASSPATH"`
[ -n "$MAVEN_PROJECTBASEDIR" ] &&
MAVEN_PROJECTBASEDIR=`cygpath --path --windows "$MAVEN_PROJECTBASEDIR"`
fi
WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain
exec "$JAVACMD" \
$MAVEN_OPTS \
-classpath "$MAVEN_PROJECTBASEDIR/.mvn/wrapper/maven-wrapper.jar" \
"-Dmaven.home=${M2_HOME}" "-Dmaven.multiModuleProjectDirectory=${MAVEN_PROJECTBASEDIR}" \
${WRAPPER_LAUNCHER} $MAVEN_CONFIG "$@"

mvnw.cmd

@@ -1,161 +0,0 @@
@REM ----------------------------------------------------------------------------
@REM Licensed to the Apache Software Foundation (ASF) under one
@REM or more contributor license agreements. See the NOTICE file
@REM distributed with this work for additional information
@REM regarding copyright ownership. The ASF licenses this file
@REM to you under the Apache License, Version 2.0 (the
@REM "License"); you may not use this file except in compliance
@REM with the License. You may obtain a copy of the License at
@REM
@REM https://www.apache.org/licenses/LICENSE-2.0
@REM
@REM Unless required by applicable law or agreed to in writing,
@REM software distributed under the License is distributed on an
@REM "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@REM KIND, either express or implied. See the License for the
@REM specific language governing permissions and limitations
@REM under the License.
@REM ----------------------------------------------------------------------------
@REM ----------------------------------------------------------------------------
@REM Maven2 Start Up Batch script
@REM
@REM Required ENV vars:
@REM JAVA_HOME - location of a JDK home dir
@REM
@REM Optional ENV vars
@REM M2_HOME - location of maven2's installed home dir
@REM MAVEN_BATCH_ECHO - set to 'on' to enable the echoing of the batch commands
@REM MAVEN_BATCH_PAUSE - set to 'on' to wait for a key stroke before ending
@REM MAVEN_OPTS - parameters passed to the Java VM when running Maven
@REM e.g. to debug Maven itself, use
@REM set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000
@REM MAVEN_SKIP_RC - flag to disable loading of mavenrc files
@REM ----------------------------------------------------------------------------
@REM Begin all REM lines with '@' in case MAVEN_BATCH_ECHO is 'on'
@echo off
@REM set title of command window
title %0
@REM enable echoing my setting MAVEN_BATCH_ECHO to 'on'
@if "%MAVEN_BATCH_ECHO%" == "on" echo %MAVEN_BATCH_ECHO%
@REM set %HOME% to equivalent of $HOME
if "%HOME%" == "" (set "HOME=%HOMEDRIVE%%HOMEPATH%")
@REM Execute a user defined script before this one
if not "%MAVEN_SKIP_RC%" == "" goto skipRcPre
@REM check for pre script, once with legacy .bat ending and once with .cmd ending
if exist "%HOME%\mavenrc_pre.bat" call "%HOME%\mavenrc_pre.bat"
if exist "%HOME%\mavenrc_pre.cmd" call "%HOME%\mavenrc_pre.cmd"
:skipRcPre
@setlocal
set ERROR_CODE=0
@REM To isolate internal variables from possible post scripts, we use another setlocal
@setlocal
@REM ==== START VALIDATION ====
if not "%JAVA_HOME%" == "" goto OkJHome
echo.
echo Error: JAVA_HOME not found in your environment. >&2
echo Please set the JAVA_HOME variable in your environment to match the >&2
echo location of your Java installation. >&2
echo.
goto error
:OkJHome
if exist "%JAVA_HOME%\bin\java.exe" goto init
echo.
echo Error: JAVA_HOME is set to an invalid directory. >&2
echo JAVA_HOME = "%JAVA_HOME%" >&2
echo Please set the JAVA_HOME variable in your environment to match the >&2
echo location of your Java installation. >&2
echo.
goto error
@REM ==== END VALIDATION ====
:init
@REM Find the project base dir, i.e. the directory that contains the folder ".mvn".
@REM Fallback to current working directory if not found.
set MAVEN_PROJECTBASEDIR=%MAVEN_BASEDIR%
IF NOT "%MAVEN_PROJECTBASEDIR%"=="" goto endDetectBaseDir
set EXEC_DIR=%CD%
set WDIR=%EXEC_DIR%
:findBaseDir
IF EXIST "%WDIR%"\.mvn goto baseDirFound
cd ..
IF "%WDIR%"=="%CD%" goto baseDirNotFound
set WDIR=%CD%
goto findBaseDir
:baseDirFound
set MAVEN_PROJECTBASEDIR=%WDIR%
cd "%EXEC_DIR%"
goto endDetectBaseDir
:baseDirNotFound
set MAVEN_PROJECTBASEDIR=%EXEC_DIR%
cd "%EXEC_DIR%"
:endDetectBaseDir
IF NOT EXIST "%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config" goto endReadAdditionalConfig
@setlocal EnableExtensions EnableDelayedExpansion
for /F "usebackq delims=" %%a in ("%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config") do set JVM_CONFIG_MAVEN_PROPS=!JVM_CONFIG_MAVEN_PROPS! %%a
@endlocal & set JVM_CONFIG_MAVEN_PROPS=%JVM_CONFIG_MAVEN_PROPS%
:endReadAdditionalConfig
SET MAVEN_JAVA_EXE="%JAVA_HOME%\bin\java.exe"
set WRAPPER_JAR="%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.jar"
set WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain
set DOWNLOAD_URL="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.4.2/maven-wrapper-0.4.2.jar"
FOR /F "tokens=1,2 delims==" %%A IN (%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.properties) DO (
IF "%%A"=="wrapperUrl" SET DOWNLOAD_URL=%%B
)
@REM Extension to allow automatically downloading the maven-wrapper.jar from Maven-central
@REM This allows using the maven wrapper in projects that prohibit checking in binary data.
if exist %WRAPPER_JAR% (
echo Found %WRAPPER_JAR%
) else (
echo Couldn't find %WRAPPER_JAR%, downloading it ...
echo Downloading from: %DOWNLOAD_URL%
powershell -Command "(New-Object Net.WebClient).DownloadFile('%DOWNLOAD_URL%', '%WRAPPER_JAR%')"
echo Finished downloading %WRAPPER_JAR%
)
@REM End of extension
%MAVEN_JAVA_EXE% %JVM_CONFIG_MAVEN_PROPS% %MAVEN_OPTS% %MAVEN_DEBUG_OPTS% -classpath %WRAPPER_JAR% "-Dmaven.multiModuleProjectDirectory=%MAVEN_PROJECTBASEDIR%" %WRAPPER_LAUNCHER% %MAVEN_CONFIG% %*
if ERRORLEVEL 1 goto error
goto end
:error
set ERROR_CODE=1
:end
@endlocal & set ERROR_CODE=%ERROR_CODE%
if not "%MAVEN_SKIP_RC%" == "" goto skipRcPost
@REM check for post script, once with legacy .bat ending and once with .cmd ending
if exist "%HOME%\mavenrc_post.bat" call "%HOME%\mavenrc_post.bat"
if exist "%HOME%\mavenrc_post.cmd" call "%HOME%\mavenrc_post.cmd"
:skipRcPost
@REM pause the script if MAVEN_BATCH_PAUSE is set to 'on'
if "%MAVEN_BATCH_PAUSE%" == "on" pause
if "%MAVEN_TERMINATE_CMD%" == "on" exit %ERROR_CODE%
exit /B %ERROR_CODE%

pom.xml

@@ -1,21 +1,21 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.24.BUILD-SNAPSHOT</version>
<version>2.0.0.M2</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>https://projects.spring.io/spring-data-mongodb</url>
<url>http://projects.spring.io/spring-data-mongodb</url>
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.9.24.BUILD-SNAPSHOT</version>
<version>2.0.0.M2</version>
</parent>
<modules>
@@ -28,10 +28,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.13.24.BUILD-SNAPSHOT</springdata.commons>
<mongo>2.14.3</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
<jmh.version>1.19</jmh.version>
<springdata.commons>2.0.0.M2</springdata.commons>
<mongo>3.4.2</mongo>
<mongo.reactivestreams>1.3.0</mongo.reactivestreams>
</properties>
<developers>
@@ -40,7 +39,7 @@
<name>Oliver Gierke</name>
<email>ogierke at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Project Lead</role>
</roles>
@@ -51,7 +50,7 @@
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -62,7 +61,7 @@
<name>Mark Pollack</name>
<email>mpollack at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -73,7 +72,7 @@
<name>Jon Brisbin</name>
<email>jbrisbin at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -84,7 +83,7 @@
<name>Thomas Darimont</name>
<email>tdarimont at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -95,7 +94,18 @@
<name>Christoph Strobl</name>
<email>cstrobl at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>https://pivotal.io</organizationUrl>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>+1</timezone>
</developer>
<developer>
<id>mpaluch</id>
<name>Mark Paluch</name>
<email>mpaluch at pivotal.io</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -104,95 +114,12 @@
</developers>
<profiles>
<profile>
<id>mongo-next</id>
<properties>
<mongo>2.15.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo3</id>
<properties>
<mongo>3.0.4</mongo>
</properties>
</profile>
<profile>
<id>mongo3-next</id>
<properties>
<mongo>3.0.5-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo31</id>
<properties>
<mongo>3.1.1</mongo>
</properties>
</profile>
<profile>
<id>mongo32</id>
<properties>
<mongo>3.2.2</mongo>
</properties>
</profile>
<profile>
<id>mongo33</id>
<properties>
<mongo>3.3.0</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo34</id>
<properties>
<mongo>3.4.0</mongo>
</properties>
</profile>
<profile>
<id>mongo34-next</id>
<properties>
<mongo>3.4.1-SNAPSHOT</mongo>
<mongo>3.4.3-SNAPSHOT</mongo>
</properties>
<repositories>
@@ -205,14 +132,32 @@
</profile>
<profile>
<id>benchmarks</id>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-log4j</module>
<module>spring-data-mongodb-distribution</module>
<module>spring-data-mongodb-benchmarks</module>
</modules>
<id>mongo35-next</id>
<properties>
<mongo>3.5.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.jfrog.buildinfo</groupId>
<artifactId>artifactory-maven-plugin</artifactId>
<inherited>false</inherited>
</plugin>
</plugins>
</build>
</profile>
</profiles>
@@ -228,8 +173,8 @@
<repositories>
<repository>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
</repository>
</repositories>


@@ -1,76 +0,0 @@
# Benchmarks
Benchmarks are based on [JMH](http://openjdk.java.net/projects/code-tools/jmh/).
# Running Benchmarks
Running benchmarks is disabled by default and can be activated via the `benchmarks` profile.
To run the benchmarks with default settings, use:
```bash
mvn -P benchmarks clean test
```
A basic report will be printed to the CLI.
```bash
# Run complete. Total time: 00:00:15
Benchmark Mode Cnt Score Error Units
MappingMongoConverterBenchmark.readObject thrpt 10 1920157,631 ± 64310,809 ops/s
MappingMongoConverterBenchmark.writeObject thrpt 10 782732,857 ± 53804,130 ops/s
```
## Running all Benchmarks of a specific class
To run all benchmarks of a specific class, provide its simple class name via the `benchmark` command line argument.
```bash
mvn -P benchmarks clean test -D benchmark=MappingMongoConverterBenchmark
```
## Running a single Benchmark
To run a single benchmark, provide the simple name of its containing class, followed by `#` and the method name, via the `benchmark` command line argument.
```bash
mvn -P benchmarks clean test -D benchmark=MappingMongoConverterBenchmark#readObjectWith2Properties
```
# Saving Benchmark Results
A detailed benchmark report is stored in JSON format in the `/target/reports/performance` directory.
To store the report in a different location, use the `benchmarkReportDir` command line argument.
## MongoDB
Results can be directly piped to MongoDB by providing a valid [Connection String](https://docs.mongodb.com/manual/reference/connection-string/) via the `publishTo` command line argument.
```bash
mvn -P benchmarks clean test -D publishTo=mongodb://127.0.0.1:27017
```
NOTE: If the URI does not explicitly define a database, the default `spring-data-mongodb-benchmarks` is used.
## HTTP Endpoint
The benchmark report can also be posted as `application/json` to an HTTP endpoint by providing a valid URL via the `publishTo` command line argument.
```bash
mvn -P benchmarks clean test -D publishTo=http://127.0.0.1:8080/capture-benchmarks
```
# Customizing Benchmarks
The following options can be set via the command line; an example combining several of them follows the table.
Option | Default Value
--- | ---
warmupIterations | 10
warmupTime | 1 (seconds)
measurementIterations | 10
measurementTime | 1 (seconds)
forks | 1
benchmarkReportDir | /target/reports/performance (always relative to project root dir)
benchmark | .* (single benchmark via `classname#benchmark`)
publishTo | \[not set\] (mongodb-uri or http-endpoint)
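For example, a run with custom warmup, measurement, and fork settings might look like this; the values are arbitrary illustrations rather than recommended settings.
```bash
mvn -P benchmarks clean test \
    -D warmupIterations=5 -D warmupTime=2 \
    -D measurementIterations=20 -D measurementTime=2 \
    -D forks=2
```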

View File

@@ -1,112 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.24.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-benchmarks</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB - Microbenchmarks</name>
<properties>
<!-- Skip tests by default; run only if -DskipTests=false is specified or benchmarks profile is activated -->
<skipTests>true</skipTests>
<bundlor.enabled>false</bundlor.enabled>
</properties>
<dependencies>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.openjdk.jmh</groupId>
<artifactId>jmh-core</artifactId>
<version>${jmh.version}</version>
</dependency>
<dependency>
<groupId>org.openjdk.jmh</groupId>
<artifactId>jmh-generator-annprocess</artifactId>
<version>${jmh.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
<profiles>
<profile>
<id>benchmarks</id>
<properties>
<skipTests>false</skipTests>
</properties>
</profile>
</profiles>
<build>
<plugins>
<plugin>
<groupId>pl.project13.maven</groupId>
<artifactId>git-commit-id-plugin</artifactId>
<version>2.2.2</version>
<executions>
<execution>
<goals>
<goal>revision</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<executions>
<execution>
<id>default-jar</id>
<phase>never</phase>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<testSourceDirectory>${project.build.sourceDirectory}</testSourceDirectory>
<testClassesDirectory>${project.build.outputDirectory}</testClassesDirectory>
<excludes>
<exclude>**/AbstractMicrobenchmark.java</exclude>
<exclude>**/*$*.class</exclude>
<exclude>**/generated/*.class</exclude>
</excludes>
<includes>
<include>**/*Benchmark*</include>
</includes>
<systemPropertyVariables>
<benchmarkReportDir>${project.build.directory}/reports/performance</benchmarkReportDir>
<project.version>${project.version}</project.version>
<git.dirty>${git.dirty}</git.dirty>
<git.commit.id>${git.commit.id}</git.commit.id>
<git.branch>${git.branch}</git.branch>
</systemPropertyVariables>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -1,111 +0,0 @@
/*
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import lombok.Data;
import java.util.ArrayList;
import java.util.List;
import org.bson.types.ObjectId;
import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.TearDown;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.microbenchmark.AbstractMicrobenchmark;
import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
/**
* @author Christoph Strobl
*/
@State(Scope.Benchmark)
public class DbRefMappingBenchmark extends AbstractMicrobenchmark {
private static final String DB_NAME = "dbref-loading-benchmark";
private MongoClient client;
private MongoTemplate template;
private Query queryObjectWithDBRef;
private Query queryObjectWithDBRefList;
@Setup
public void setUp() throws Exception {
client = new MongoClient(new ServerAddress());
template = new MongoTemplate(client, DB_NAME);
List<RefObject> refObjects = new ArrayList<RefObject>();
for (int i = 0; i < 1; i++) {
RefObject o = new RefObject();
template.save(o);
refObjects.add(o);
}
ObjectWithDBRef singleDBRef = new ObjectWithDBRef();
singleDBRef.ref = refObjects.iterator().next();
template.save(singleDBRef);
ObjectWithDBRef multipleDBRefs = new ObjectWithDBRef();
multipleDBRefs.refList = refObjects;
template.save(multipleDBRefs);
queryObjectWithDBRef = query(where("id").is(singleDBRef.id));
queryObjectWithDBRefList = query(where("id").is(multipleDBRefs.id));
}
@TearDown
public void tearDown() {
client.dropDatabase(DB_NAME);
client.close();
}
@Benchmark // DATAMONGO-1720
public ObjectWithDBRef readSingleDbRef() {
return template.findOne(queryObjectWithDBRef, ObjectWithDBRef.class);
}
@Benchmark // DATAMONGO-1720
public ObjectWithDBRef readMultipleDbRefs() {
return template.findOne(queryObjectWithDBRefList, ObjectWithDBRef.class);
}
@Data
static class ObjectWithDBRef {
private @Id ObjectId id;
private @DBRef RefObject ref;
private @DBRef List<RefObject> refList;
}
@Data
static class RefObject {
private @Id String id;
private String someValue;
}
}

View File

@@ -1,182 +0,0 @@
/*
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.UUID;
import org.bson.types.ObjectId;
import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.TearDown;
import org.springframework.data.annotation.Id;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.microbenchmark.AbstractMicrobenchmark;
import com.mongodb.BasicDBObject;
import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
import com.mongodb.util.JSON;
/**
* @author Christoph Strobl
*/
@State(Scope.Benchmark)
public class MappingMongoConverterBenchmark extends AbstractMicrobenchmark {
private static final String DB_NAME = "mapping-mongo-converter-benchmark";
private MongoClient client;
private MongoMappingContext mappingContext;
private MappingMongoConverter converter;
private BasicDBObject documentWith2Properties, documentWith2PropertiesAnd1Nested;
private Customer objectWith2PropertiesAnd1Nested;
private BasicDBObject documentWithFlatAndComplexPropertiesPlusListAndMap;
private SlightlyMoreComplexObject objectWithFlatAndComplexPropertiesPlusListAndMap;
@Setup
public void setUp() throws Exception {
client = new MongoClient(new ServerAddress());
this.mappingContext = new MongoMappingContext();
this.mappingContext.setInitialEntitySet(Collections.singleton(Customer.class));
this.mappingContext.afterPropertiesSet();
DbRefResolver dbRefResolver = new DefaultDbRefResolver(new SimpleMongoDbFactory(client, DB_NAME));
this.converter = new MappingMongoConverter(dbRefResolver, mappingContext);
this.converter.setCustomConversions(new CustomConversions(Collections.emptyList()));
this.converter.afterPropertiesSet();
// just a flat document
this.documentWith2Properties = new BasicDBObject("firstname", "Dave").append("lastname", "Matthews");
// document with a nested one
BasicDBObject address = new BasicDBObject("zipCode", "ABCDE").append("city", "Some Place");
this.documentWith2PropertiesAnd1Nested = new BasicDBObject("firstname", "Dave").//
append("lastname", "Matthews").//
append("address", address);
// object equivalent of documentWith2PropertiesAnd1Nested
this.objectWith2PropertiesAnd1Nested = new Customer("Dave", "Matthews", new Address("zipCode", "City"));
// a bit more challenging object with list & map conversion.
objectWithFlatAndComplexPropertiesPlusListAndMap = new SlightlyMoreComplexObject();
objectWithFlatAndComplexPropertiesPlusListAndMap.id = UUID.randomUUID().toString();
objectWithFlatAndComplexPropertiesPlusListAndMap.addressList = Arrays.asList(new Address("zip-1", "city-1"),
new Address("zip-2", "city-2"));
objectWithFlatAndComplexPropertiesPlusListAndMap.customer = objectWith2PropertiesAnd1Nested;
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap = new LinkedHashMap<String, Customer>();
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap.put("dave", objectWith2PropertiesAnd1Nested);
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap.put("deborah",
new Customer("Deborah Anne", "Dyer", new Address("?", "london")));
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap.put("eddie",
new Customer("Eddie", "Vedder", new Address("??", "Seattle")));
objectWithFlatAndComplexPropertiesPlusListAndMap.intOne = Integer.MIN_VALUE;
objectWithFlatAndComplexPropertiesPlusListAndMap.intTwo = Integer.MAX_VALUE;
objectWithFlatAndComplexPropertiesPlusListAndMap.location = new Point(-33.865143, 151.209900);
objectWithFlatAndComplexPropertiesPlusListAndMap.renamedField = "supercalifragilisticexpialidocious";
objectWithFlatAndComplexPropertiesPlusListAndMap.stringOne = "¯\\_(ツ)_/¯";
objectWithFlatAndComplexPropertiesPlusListAndMap.stringTwo = " (╯°□°)╯︵ ┻━┻";
// JSON equivalent of objectWithFlatAndComplexPropertiesPlusListAndMap
documentWithFlatAndComplexPropertiesPlusListAndMap = (BasicDBObject) JSON.parse(
"{ \"_id\" : \"517f6aee-e9e0-44f0-88ed-f3694a019f27\", \"intOne\" : -2147483648, \"intTwo\" : 2147483647, \"stringOne\" : \"¯\\\\_(ツ)_/¯\", \"stringTwo\" : \" (╯°□°)╯︵ ┻━┻\", \"explicit-field-name\" : \"supercalifragilisticexpialidocious\", \"location\" : { \"x\" : -33.865143, \"y\" : 151.2099 }, \"objectWith2PropertiesAnd1Nested\" : { \"firstname\" : \"Dave\", \"lastname\" : \"Matthews\", \"address\" : { \"zipCode\" : \"zipCode\", \"city\" : \"City\" } }, \"addressList\" : [{ \"zipCode\" : \"zip-1\", \"city\" : \"city-1\" }, { \"zipCode\" : \"zip-2\", \"city\" : \"city-2\" }], \"customerMap\" : { \"dave\" : { \"firstname\" : \"Dave\", \"lastname\" : \"Matthews\", \"address\" : { \"zipCode\" : \"zipCode\", \"city\" : \"City\" } }, \"deborah\" : { \"firstname\" : \"Deborah Anne\", \"lastname\" : \"Dyer\", \"address\" : { \"zipCode\" : \"?\", \"city\" : \"london\" } }, \"eddie\" : { \"firstname\" : \"Eddie\", \"lastname\" : \"Vedder\", \"address\" : { \"zipCode\" : \"??\", \"city\" : \"Seattle\" } } }, \"_class\" : \"org.springframework.data.mongodb.core.convert.MappingMongoConverterBenchmark$SlightlyMoreComplexObject\" }");
}
@TearDown
public void tearDown() {
client.dropDatabase(DB_NAME);
client.close();
}
@Benchmark // DATAMONGO-1720
public Customer readObjectWith2Properties() {
return converter.read(Customer.class, documentWith2Properties);
}
@Benchmark // DATAMONGO-1720
public Customer readObjectWith2PropertiesAnd1NestedObject() {
return converter.read(Customer.class, documentWith2PropertiesAnd1Nested);
}
@Benchmark // DATAMONGO-1720
public BasicDBObject writeObjectWith2PropertiesAnd1NestedObject() {
BasicDBObject sink = new BasicDBObject();
converter.write(objectWith2PropertiesAnd1Nested, sink);
return sink;
}
@Benchmark // DATAMONGO-1720
public Object readObjectWithListAndMapsOfComplexType() {
return converter.read(SlightlyMoreComplexObject.class, documentWithFlatAndComplexPropertiesPlusListAndMap);
}
@Benchmark // DATAMONGO-1720
public Object writeObjectWithListAndMapsOfComplexType() {
BasicDBObject sink = new BasicDBObject();
converter.write(objectWithFlatAndComplexPropertiesPlusListAndMap, sink);
return sink;
}
@Getter
@RequiredArgsConstructor
static class Customer {
private @Id ObjectId id;
private final String firstname, lastname;
private final Address address;
}
@Getter
@AllArgsConstructor
static class Address {
private String zipCode, city;
}
@Data
static class SlightlyMoreComplexObject {
@Id String id;
int intOne, intTwo;
String stringOne, stringTwo;
@Field("explicit-field-name") String renamedField;
Point location;
Customer customer;
List<Address> addressList;
Map<String, Customer> customerMap;
}
}

View File

@@ -1,329 +0,0 @@
/*
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Collection;
import java.util.Date;
import org.junit.Test;
import org.openjdk.jmh.annotations.Fork;
import org.openjdk.jmh.annotations.Measurement;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.Warmup;
import org.openjdk.jmh.results.RunResult;
import org.openjdk.jmh.results.format.ResultFormatType;
import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.options.ChainedOptionsBuilder;
import org.openjdk.jmh.runner.options.OptionsBuilder;
import org.openjdk.jmh.runner.options.TimeValue;
import org.springframework.core.env.StandardEnvironment;
import org.springframework.data.mongodb.microbenchmark.ResultsWriter.Utils;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
/**
* @author Christoph Strobl
*/
@Warmup(iterations = AbstractMicrobenchmark.WARMUP_ITERATIONS)
@Measurement(iterations = AbstractMicrobenchmark.MEASUREMENT_ITERATIONS)
@Fork(AbstractMicrobenchmark.FORKS)
@State(Scope.Thread)
public class AbstractMicrobenchmark {
static final int WARMUP_ITERATIONS = 5;
static final int MEASUREMENT_ITERATIONS = 10;
static final int FORKS = 1;
static final String[] JVM_ARGS = { "-server", "-XX:+HeapDumpOnOutOfMemoryError", "-Xms1024m", "-Xmx1024m",
"-XX:MaxDirectMemorySize=1024m" };
private final StandardEnvironment environment = new StandardEnvironment();
/**
* Run matching {@link org.openjdk.jmh.annotations.Benchmark} methods with options collected from
* {@link org.springframework.core.env.Environment}.
*
* @throws Exception
* @see #options(String)
*/
@Test
public void run() throws Exception {
String includes = includes();
if (!includes.contains(org.springframework.util.ClassUtils.getShortName(getClass()))) {
return;
}
publishResults(new Runner(options(includes).build()).run());
}
/**
* Get the regex for all benchmarks to be included in the run. By default, every benchmark within classes matching the
* current class's short name is included. <br />
* The {@literal benchmark} command line argument allows overriding the defaults using {@code #} as class / method
* name separator.
*
* @return never {@literal null}.
* @see org.springframework.util.ClassUtils#getShortName(Class)
*/
protected String includes() {
String tests = environment.getProperty("benchmark", String.class);
if (!StringUtils.hasText(tests)) {
return ".*" + org.springframework.util.ClassUtils.getShortName(getClass()) + ".*";
}
if (!tests.contains("#")) {
return ".*" + tests + ".*";
}
String[] args = tests.split("#");
return ".*" + args[0] + "." + args[1];
}
/**
* Collect all options for the {@link Runner}.
*
* @param includes regex for matching benchmarks to be included in the run.
* @return never {@literal null}.
* @throws Exception
*/
protected ChainedOptionsBuilder options(String includes) throws Exception {
ChainedOptionsBuilder optionsBuilder = new OptionsBuilder().include(includes).jvmArgs(jvmArgs());
optionsBuilder = warmup(optionsBuilder);
optionsBuilder = measure(optionsBuilder);
optionsBuilder = forks(optionsBuilder);
optionsBuilder = report(optionsBuilder);
return optionsBuilder;
}
/**
* JVM args to apply to {@link Runner} via its {@link org.openjdk.jmh.runner.options.Options}.
*
* @return {@link #JVM_ARGS} by default.
*/
protected String[] jvmArgs() {
String[] args = new String[JVM_ARGS.length];
System.arraycopy(JVM_ARGS, 0, args, 0, JVM_ARGS.length);
return args;
}
/**
* Read {@code warmupIterations} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected int getWarmupIterations() {
return environment.getProperty("warmupIterations", Integer.class, -1);
}
/**
* Read {@code measurementIterations} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected int getMeasurementIterations() {
return environment.getProperty("measurementIterations", Integer.class, -1);
}
/**
* Read {@code forks} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected int getForksCount() {
return environment.getProperty("forks", Integer.class, -1);
}
/**
* Read {@code benchmarkReportDir} property from {@link org.springframework.core.env.Environment}.
*
* @return {@literal null} if not set.
*/
protected String getReportDirectory() {
return environment.getProperty("benchmarkReportDir");
}
/**
* Read {@code measurementTime} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected long getMeasurementTime() {
return environment.getProperty("measurementTime", Long.class, -1L);
}
/**
* Read {@code warmupTime} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected long getWarmupTime() {
return environment.getProperty("warmupTime", Long.class, -1L);
}
/**
* {@code project.version_yyyy-MM-dd_ClassName.json}, e.g.
* {@literal 1.11.0.BUILD-SNAPSHOT_2017-03-07_MappingMongoConverterBenchmark.json}
*
* @return
*/
protected String reportFilename() {
StringBuilder sb = new StringBuilder();
if (environment.containsProperty("project.version")) {
sb.append(environment.getProperty("project.version"));
sb.append("_");
}
sb.append(new SimpleDateFormat("yyyy-MM-dd").format(new Date()));
sb.append("_");
sb.append(org.springframework.util.ClassUtils.getShortName(getClass()));
sb.append(".json");
return sb.toString();
}
/**
* Apply measurement options to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @see #getMeasurementIterations()
* @see #getMeasurementTime()
*/
private ChainedOptionsBuilder measure(ChainedOptionsBuilder optionsBuilder) {
int measurementIterations = getMeasurementIterations();
long measurementTime = getMeasurementTime();
if (measurementIterations > 0) {
optionsBuilder = optionsBuilder.measurementIterations(measurementIterations);
}
if (measurementTime > 0) {
optionsBuilder = optionsBuilder.measurementTime(TimeValue.seconds(measurementTime));
}
return optionsBuilder;
}
/**
* Apply warmup options to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @see #getWarmupIterations()
* @see #getWarmupTime()
*/
private ChainedOptionsBuilder warmup(ChainedOptionsBuilder optionsBuilder) {
int warmupIterations = getWarmupIterations();
long warmupTime = getWarmupTime();
if (warmupIterations > 0) {
optionsBuilder = optionsBuilder.warmupIterations(warmupIterations);
}
if (warmupTime > 0) {
optionsBuilder = optionsBuilder.warmupTime(TimeValue.seconds(warmupTime));
}
return optionsBuilder;
}
/**
* Apply forks option to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @see #getForksCount()
*/
private ChainedOptionsBuilder forks(ChainedOptionsBuilder optionsBuilder) {
int forks = getForksCount();
if (forks <= 0) {
return optionsBuilder;
}
return optionsBuilder.forks(forks);
}
/**
* Apply report option to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @throws IOException if report file cannot be created.
* @see #getReportDirectory()
*/
private ChainedOptionsBuilder report(ChainedOptionsBuilder optionsBuilder) throws IOException {
String reportDir = getReportDirectory();
if (!StringUtils.hasText(reportDir)) {
return optionsBuilder;
}
String reportFilePath = reportDir + (reportDir.endsWith(File.separator) ? "" : File.separator) + reportFilename();
File file = ResourceUtils.getFile(reportFilePath);
if (file.exists()) {
file.delete();
} else {
file.getParentFile().mkdirs();
file.createNewFile();
}
optionsBuilder.resultFormat(ResultFormatType.JSON);
optionsBuilder.result(reportFilePath);
return optionsBuilder;
}
/**
* Publish results to an external system.
*
* @param results must not be {@literal null}.
*/
private void publishResults(Collection<RunResult> results) {
if (CollectionUtils.isEmpty(results) || !environment.containsProperty("publishTo")) {
return;
}
String uri = environment.getProperty("publishTo");
try {
Utils.forUri(uri).write(results);
} catch (Exception e) {
System.err.println(String.format("Cannot save benchmark results to '%s'. Error was %s.", uri, e));
}
}
}

View File

@@ -1,86 +0,0 @@
/*
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import lombok.SneakyThrows;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.Charset;
import java.util.Collection;
import org.openjdk.jmh.results.RunResult;
import org.springframework.core.env.StandardEnvironment;
import org.springframework.util.CollectionUtils;
/**
* {@link ResultsWriter} implementation based on {@link URLConnection}.
*
* @since 2.0
*/
class HttpResultsWriter implements ResultsWriter {
private final String url;
HttpResultsWriter(String url) {
this.url = url;
}
@Override
@SneakyThrows
public void write(Collection<RunResult> results) {
if (CollectionUtils.isEmpty(results)) {
return;
}
StandardEnvironment env = new StandardEnvironment();
String projectVersion = env.getProperty("project.version", "unknown");
String gitBranch = env.getProperty("git.branch", "unknown");
String gitDirty = env.getProperty("git.dirty", "no");
String gitCommitId = env.getProperty("git.commit.id", "unknown");
HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
connection.setConnectTimeout(1000);
connection.setReadTimeout(1000);
connection.setDoOutput(true);
connection.setRequestMethod("POST");
connection.setRequestProperty("Content-Type", "application/json");
connection.addRequestProperty("X-Project-Version", projectVersion);
connection.addRequestProperty("X-Git-Branch", gitBranch);
connection.addRequestProperty("X-Git-Dirty", gitDirty);
connection.addRequestProperty("X-Git-Commit-Id", gitCommitId);
OutputStream output = null;
try {
output = connection.getOutputStream();
output.write(ResultsWriter.Utils.jsonifyResults(results).getBytes(Charset.forName("UTF-8")));
} finally {
if (output != null) {
output.close();
}
}
if (connection.getResponseCode() >= 400) {
throw new IllegalStateException(
String.format("Status %d %s", connection.getResponseCode(), connection.getResponseMessage()));
}
}
}

View File

@@ -1,135 +0,0 @@
/*
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import java.net.UnknownHostException;
import java.util.Collection;
import java.util.Date;
import java.util.List;
import org.openjdk.jmh.results.RunResult;
import org.springframework.core.env.StandardEnvironment;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.util.JSON;
/**
* MongoDB specific {@link ResultsWriter} implementation.
*
* @author Christoph Strobl
* @since 2.0
*/
class MongoResultsWriter implements ResultsWriter {
private final String uri;
MongoResultsWriter(String uri) {
this.uri = uri;
}
@Override
public void write(Collection<RunResult> results) {
Date now = new Date();
StandardEnvironment env = new StandardEnvironment();
String projectVersion = env.getProperty("project.version", "unknown");
String gitBranch = env.getProperty("git.branch", "unknown");
String gitDirty = env.getProperty("git.dirty", "no");
String gitCommitId = env.getProperty("git.commit.id", "unknown");
MongoClientURI uri = new MongoClientURI(this.uri);
MongoClient client = null;
try {
client = new MongoClient(uri);
} catch (UnknownHostException e) {
throw new RuntimeException(e);
}
String dbName = StringUtils.hasText(uri.getDatabase()) ? uri.getDatabase() : "spring-data-mongodb-benchmarks";
DB db = client.getDB(dbName);
for (BasicDBObject dbo : (List<BasicDBObject>) JSON.parse(Utils.jsonifyResults(results))) {
String collectionName = extractClass(dbo.get("benchmark").toString());
BasicDBObject sink = new BasicDBObject();
sink.append("_version", projectVersion);
sink.append("_branch", gitBranch);
sink.append("_commit", gitCommitId);
sink.append("_dirty", gitDirty);
sink.append("_method", extractBenchmarkName(dbo.get("benchmark").toString()));
sink.append("_date", now);
sink.append("_snapshot", projectVersion.toLowerCase().contains("snapshot"));
sink.putAll(dbo.toMap());
db.getCollection(collectionName).insert(fixDocumentKeys(sink));
}
client.close();
}
/**
* Replace {@code .} by {@code ,}.
*
* @param doc
* @return
*/
private BasicDBObject fixDocumentKeys(BasicDBObject doc) {
BasicDBObject sanitized = new BasicDBObject();
for (Object key : doc.keySet()) {
Object value = doc.get(key);
if (value instanceof BasicDBObject) {
value = fixDocumentKeys((BasicDBObject) value);
}
if (key instanceof String) {
String newKey = (String) key;
if (newKey.contains(".")) {
newKey = newKey.replace('.', ',');
}
sanitized.put(newKey, value);
} else {
sanitized.put(ObjectUtils.nullSafeToString(key).replace('.', ','), value);
}
}
return sanitized;
}
private static String extractClass(String source) {
String tmp = source.substring(0, source.lastIndexOf('.'));
return tmp.substring(tmp.lastIndexOf(".") + 1);
}
private static String extractBenchmarkName(String source) {
return source.substring(source.lastIndexOf(".") + 1);
}
}

View File

@@ -1,71 +0,0 @@
/*
* Copyright 2017-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import lombok.SneakyThrows;
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import java.nio.charset.Charset;
import java.util.Collection;
import org.openjdk.jmh.results.RunResult;
import org.openjdk.jmh.results.format.ResultFormatFactory;
import org.openjdk.jmh.results.format.ResultFormatType;
/**
* @author Christoph Strobl
* @since 2.0
*/
interface ResultsWriter {
/**
* Write the {@link RunResult}s.
*
* @param results can be {@literal null}.
*/
void write(Collection<RunResult> results);
/* non Java8 hack */
class Utils {
/**
* Get the uri specific {@link ResultsWriter}.
*
* @param uri must not be {@literal null}.
* @return
*/
static ResultsWriter forUri(String uri) {
return uri.startsWith("mongodb:") ? new MongoResultsWriter(uri) : new HttpResultsWriter(uri);
}
/**
* Convert {@link RunResult}s to JMH Json representation.
*
* @param results
* @return json string representation of results.
* @see org.openjdk.jmh.results.format.JSONResultFormat
*/
@SneakyThrows
static String jsonifyResults(Collection<RunResult> results) {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ResultFormatFactory.getInstance(ResultFormatType.JSON, new PrintStream(baos, true, "UTF-8")).writeOut(results);
return new String(baos.toByteArray(), Charset.forName("UTF-8"));
}
}
}

View File

@@ -1,14 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d %5p %40.40c:%4L - %m%n</pattern>
</encoder>
</appender>
<root level="error">
<appender-ref ref="console" />
</root>
</configuration>

View File

@@ -1,12 +1,12 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.24.BUILD-SNAPSHOT</version>
<version>2.0.0.M2</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -14,9 +14,8 @@
<name>Spring Data MongoDB - Cross-Store Support</name>
<properties>
<jpa>2.0.0</jpa>
<hibernate>3.6.10.Final</hibernate>
<project.root>${basedir}/..</project.root>
<jpa>2.1.1</jpa>
<hibernate>5.2.1.Final</hibernate>
</properties>
<dependencies>
@@ -49,7 +48,15 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.10.24.BUILD-SNAPSHOT</version>
<version>2.0.0.M2</version>
</dependency>
<!-- reactive -->
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
<version>${reactor}</version>
<optional>true</optional>
</dependency>
<dependency>
@@ -88,7 +95,7 @@
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>4.0.2.GA</version>
<version>5.2.4.Final</version>
<scope>test</scope>
</dependency>

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,197 +1,212 @@
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
*/
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
*/
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final DBObject dbk = new BasicDBObject();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for {}", dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: {}", key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: {}", key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
return entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: {}", cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final DBObject dbQuery = new BasicDBObject();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
DBObject dbId = mongoTemplate.execute(collName, new CollectionCallback<DBObject>() {
public DBObject doInCollection(DBCollection collection) throws MongoException, DataAccessException {
return collection.findOne(dbQuery);
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: {}", dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.remove(dbQuery);
return null;
}
});
} else {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: {}", dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.save(dbDoc);
return null;
}
});
}
}
}
return 0L;
}
/**
* Returns the collection the given entity type shall be persisted to.
*
* @param entityClass must not be {@literal null}.
* @return
*/
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return mongoTemplate.getCollectionName(entityClass);
}
}
/*
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.bson.Document;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;
import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
import com.mongodb.client.result.DeleteResult;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
*/
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentState(java.lang.Class, java.lang.Object, org.springframework.data.crossstore.ChangeSet)
*/
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass, Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final Document dbk = new Document();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for {}", dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {
for (Document dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: {}", key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: {}", key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
return entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
}
/*
* (non-Javadoc)
* @see org.springframework.data.crossstore.ChangeSetPersister#persistState(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: {}", cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final Document dbQuery = new Document();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
final Document dbId = mongoTemplate.execute(collName, new CollectionCallback<Document>() {
public Document doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
Document id = collection.find(dbQuery).first();
return id;
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: {}", dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
DeleteResult dr = collection.deleteMany(dbQuery);
return null;
}
});
} else {
final Document dbDoc = new Document();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: {}", dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
if (dbId != null) {
collection.replaceOne(Filters.eq("_id", dbId.get("_id")), dbDoc);
} else {
if (dbDoc.containsKey("_id") && dbDoc.get("_id") == null) {
dbDoc.remove("_id");
}
collection.insertOne(dbDoc);
}
return null;
}
});
}
}
}
return 0L;
}
/**
* Returns the collection the given entity type shall be persisted to.
*
* @param entityClass must not be {@literal null}.
* @return
*/
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return mongoTemplate.getCollectionName(entityClass);
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.bson.Document;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
@@ -36,8 +37,6 @@ import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.DBObject;
/**
* Integration tests for MongoDB cross-store persistence (mainly {@link MongoChangeSetPersister}).
*
@@ -48,14 +47,11 @@ import com.mongodb.DBObject;
@ContextConfiguration("classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@Autowired
MongoTemplate mongoTemplate;
@Autowired MongoTemplate mongoTemplate;
@PersistenceContext
EntityManager entityManager;
@PersistenceContext EntityManager entityManager;
@Autowired
PlatformTransactionManager transactionManager;
@Autowired PlatformTransactionManager transactionManager;
TransactionTemplate txTemplate;
@Before
@@ -187,7 +183,7 @@ public class CrossStoreMongoTests {
boolean weFound3 = false;
for (DBObject dbo : this.mongoTemplate.getCollection(mongoTemplate.getCollectionName(Person.class)).find()) {
for (Document dbo : this.mongoTemplate.getCollection(mongoTemplate.getCollectionName(Person.class)).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true;

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -20,13 +20,13 @@
<mongo:mapping-converter/>
<!-- Mongo config -->
<bean id="mongo" class="org.springframework.data.mongodb.core.MongoFactoryBean">
<bean id="mongoClient" class="org.springframework.data.mongodb.core.MongoClientFactoryBean">
<property name="host" value="localhost"/>
<property name="port" value="27017"/>
</bean>
<bean id="mongoDbFactory" class="org.springframework.data.mongodb.core.SimpleMongoDbFactory">
<constructor-arg name="mongo" ref="mongo"/>
<constructor-arg name="mongoClient" ref="mongoClient"/>
<constructor-arg name="databaseName" value="database"/>
</bean>

View File

@@ -1,18 +0,0 @@
Bundle-SymbolicName: org.springframework.data.mongodb.crossstore
Bundle-Name: Spring Data MongoDB Cross Store Support
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Export-Template:
org.springframework.data.mongodb.crossstore.*;version="${project.version}"
Import-Template:
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
javax.persistence.*;version="${jpa:[=.=.=,+1.0.0)}",
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
org.bson.*;version="0",
org.slf4j.*;version="${slf4j:[=.=.=,+1.0.0)}",
org.springframework.*;version="${spring:[=.=.=.=,+1.0.0)}",
org.springframework.data.*;version="${springdata.commons:[=.=.=.=,+1.0.0)}",
org.springframework.data.mongodb.*;version="${project.version:[=.=.=.=,+1.0.0)}",
org.w3c.dom.*;version="0"

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.24.BUILD-SNAPSHOT</version>
<version>2.0.0.M2</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -28,6 +28,10 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>

View File

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.24.BUILD-SNAPSHOT</version>
<version>2.0.0.M2</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -14,7 +14,6 @@
<properties>
<log4j>1.2.16</log4j>
<project.root>${basedir}/..</project.root>
</properties>
<dependencies>

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2019 the original author or authors.
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2013-2019 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,9 +0,0 @@
Bundle-SymbolicName: org.springframework.data.mongodb.log4j
Bundle-Name: Spring Data Mongo DB Log4J Appender
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
org.apache.log4j.*;version="${log4j:[=.=.=,+1.0.0)}"


@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -11,14 +11,13 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.24.BUILD-SNAPSHOT</version>
<version>2.0.0.M2</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
<project.root>${basedir}/..</project.root>
</properties>
<dependencies>
@@ -79,6 +78,66 @@
<optional>true</optional>
</dependency>
<!-- reactive -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-reactivestreams</artifactId>
<version>${mongo.reactivestreams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-async</artifactId>
<version>${mongo}</version>
<optional>true</optional>
<exclusions>
<exclusion>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-core</artifactId>
</exclusion>
<exclusion>
<groupId>org.mongodb</groupId>
<artifactId>bson</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
<version>${reactor}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.projectreactor.addons</groupId>
<artifactId>reactor-test</artifactId>
<version>${reactor}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex</groupId>
<artifactId>rxjava-reactive-streams</artifactId>
<version>${rxjava-reactive-streams}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava2</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava2}</version>
<optional>true</optional>
</dependency>
<!-- CDI -->
<dependency>
<groupId>javax.enterprise</groupId>
@@ -127,7 +186,7 @@
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>4.2.0.Final</version>
<version>5.2.4.Final</version>
<scope>test</scope>
</dependency>
@@ -148,6 +207,7 @@
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson}</version>
<optional>true</optional>
</dependency>
@@ -212,9 +272,11 @@
</includes>
<excludes>
<exclude>**/PerformanceTests.java</exclude>
<exclude>**/ReactivePerformanceTests.java</exclude>
</excludes>
<systemPropertyVariables>
<java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
<reactor.trace.cancel>true</reactor.trace.cancel>
</systemPropertyVariables>
<properties>
<property>


@@ -1,11 +1,11 @@
/*
* Copyright 2015-2019 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2010-2019 the original author or authors.
* Copyright 2010-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2010-2019 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2013-2019 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -20,10 +20,11 @@ import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.DB;
import com.mongodb.client.MongoDatabase;
/**
* Interface for factories creating {@link DB} instances.
*
*
* @author Mark Pollack
* @author Thomas Darimont
*/
@@ -31,25 +32,27 @@ public interface MongoDbFactory {
/**
* Creates a default {@link DB} instance.
*
*
* @return
* @throws DataAccessException
*/
DB getDb() throws DataAccessException;
MongoDatabase getDb() throws DataAccessException;
/**
* Creates a {@link DB} instance to access the database with the given name.
*
*
* @param dbName must not be {@literal null} or empty.
* @return
* @throws DataAccessException
*/
DB getDb(String dbName) throws DataAccessException;
MongoDatabase getDb(String dbName) throws DataAccessException;
/**
* Exposes a shared {@link MongoExceptionTranslator}.
*
*
* @return will never be {@literal null}.
*/
PersistenceExceptionTranslator getExceptionTranslator();
DB getLegacyDb();
}
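
The reworked factory contract above now hands out the driver's MongoDatabase instead of the deprecated DB, which stays reachable only through getLegacyDb(). As a minimal sketch of a caller against the revised interface (the repository class and the "people" collection name are illustrative only):

import org.bson.Document;

import org.springframework.data.mongodb.MongoDbFactory;

import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;

class PersonCounter {

	private final MongoDbFactory factory;

	PersonCounter(MongoDbFactory factory) {
		this.factory = factory;
	}

	long countPeople() {

		// getDb() now returns com.mongodb.client.MongoDatabase rather than com.mongodb.DB.
		MongoDatabase database = factory.getDb();
		MongoCollection<Document> people = database.getCollection("people");

		return people.count();
	}
}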


@@ -0,0 +1,56 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.reactivestreams.client.MongoDatabase;
/**
* Interface for factories creating reactive {@link MongoDatabase} instances.
*
* @author Mark Paluch
* @since 2.0
*/
public interface ReactiveMongoDatabaseFactory {
/**
* Creates a default {@link MongoDatabase} instance.
*
* @return
* @throws DataAccessException
*/
MongoDatabase getMongoDatabase() throws DataAccessException;
/**
* Creates a {@link MongoDatabase} instance to access the database with the given name.
*
* @param dbName must not be {@literal null} or empty.
* @return
* @throws DataAccessException
*/
MongoDatabase getMongoDatabase(String dbName) throws DataAccessException;
/**
* Exposes a shared {@link MongoExceptionTranslator}.
*
* @return will never be {@literal null}.
*/
PersistenceExceptionTranslator getExceptionTranslator();
}
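
As a usage sketch only: the reactive factory pairs with the Reactive Streams driver, and SimpleReactiveMongoDatabaseFactory (referenced further down in this change set) serves as its default implementation. The connection and the "test" database name below are placeholders.

import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory;

import com.mongodb.reactivestreams.client.MongoClients;

import reactor.core.publisher.Flux;

class ReactiveFactoryExample {

	public static void main(String[] args) {

		ReactiveMongoDatabaseFactory factory = new SimpleReactiveMongoDatabaseFactory(MongoClients.create(), "test");

		// getMongoDatabase() exposes the reactive MongoDatabase; listCollectionNames() is a Reactive Streams Publisher.
		Flux.from(factory.getMongoDatabase().listCollectionNames())
				.doOnNext(System.out::println)
				.blockLast();
	}
}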


@@ -1,11 +1,11 @@
/*
* Copyright 2010-2019 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,297 +1,112 @@
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
* @author Christoph Strobl
*/
@Configuration
public abstract class AbstractMongoConfiguration {
/**
* Return the name of the database to connect to.
*
* @return must not be {@literal null}.
*/
protected abstract String getDatabaseName();
/**
* Return the name of the authentication database to use. Defaults to {@literal null} and will turn into the value
* returned by {@link #getDatabaseName()} later on effectively.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected String getAuthenticationDatabaseName() {
return null;
}
/**
* Return the {@link Mongo} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link Mongo} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
* @throws Exception
*/
public abstract Mongo mongo() throws Exception;
/**
* Creates a {@link MongoTemplate}.
*
* @return
* @throws Exception
*/
@Bean
public MongoTemplate mongoTemplate() throws Exception {
return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
/**
* Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link Mongo} instance
* configured in {@link #mongo()}.
*
* @see #mongo()
* @see #mongoTemplate()
* @return
* @throws Exception
*/
@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
return new SimpleMongoDbFactory(mongo(), getDatabaseName(), getUserCredentials(), getAuthenticationDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration} the base package will be considered {@code com.acme} unless the method is
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Returns the base packages to scan for MongoDB mapped entities at startup. Will return the package name of the
* configuration class' (the concrete class, not this one here) by default. So if you have a
* {@code com.acme.AppConfig} extending {@link AbstractMongoConfiguration} the base package will be considered
* {@code com.acme} unless the method is overridden to implement alternate behavior.
*
* @return the base packages to scan for mapped {@link Document} classes or an empty collection to not enable scanning
* for entities.
* @since 1.10
*/
protected Collection<String> getMappingBasePackages() {
return Collections.singleton(getMappingBasePackage());
}
/**
* Return {@link UserCredentials} to be used when connecting to the MongoDB instance or {@literal null} if none shall
* be used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected UserCredentials getUserCredentials() {
return null;
}
/**
* Creates a {@link MongoMappingContext} equipped with entity classes scanned from the mapping base package.
*
* @see #getMappingBasePackage()
* @return
* @throws ClassNotFoundException
*/
@Bean
public MongoMappingContext mongoMappingContext() throws ClassNotFoundException {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
return mappingContext;
}
/**
* Returns a {@link MappingContextIsNewStrategyFactory} wrapped into a {@link CachingIsNewStrategyFactory}.
*
* @return
* @throws ClassNotFoundException
*/
@Bean
public IsNewStrategyFactory isNewStrategyFactory() throws ClassNotFoundException {
return new CachingIsNewStrategyFactory(new MappingContextIsNewStrategyFactory(mongoMappingContext()));
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and
* {@link #mongoMappingContext()}. Returns an empty {@link CustomConversions} instance by default.
*
* @return must not be {@literal null}.
*/
@Bean
public CustomConversions customConversions() {
return new CustomConversions(Collections.emptyList());
}
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext()
* @see #mongoDbFactory()
* @return
* @throws Exception
*/
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
return converter;
}
/**
* Scans the mapping base package for classes annotated with {@link Document}. By default, it scans for entities in
* all packages returned by {@link #getMappingBasePackages()}.
*
* @see #getMappingBasePackages()
* @return
* @throws ClassNotFoundException
*/
protected Set<Class<?>> getInitialEntitySet() throws ClassNotFoundException {
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
for (String basePackage : getMappingBasePackages()) {
initialEntitySet.addAll(scanForEntities(basePackage));
}
return initialEntitySet;
}
/**
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document} and
* {@link Persistent}.
*
* @param basePackage must not be {@literal null}.
* @return
* @throws ClassNotFoundException
* @since 1.10
*/
protected Set<Class<?>> scanForEntities(String basePackage) throws ClassNotFoundException {
if (!StringUtils.hasText(basePackage)) {
return Collections.emptySet();
}
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
if (StringUtils.hasText(basePackage)) {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
initialEntitySet
.add(ClassUtils.forName(candidate.getBeanClassName(), AbstractMongoConfiguration.class.getClassLoader()));
}
}
return initialEntitySet;
}
/**
* Configures whether to abbreviate field names for domain objects by configuring a
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
* customization needs, consider overriding {@link #mappingMongoConverter()}.
*
* @return
*/
protected boolean abbreviateFieldNames() {
return false;
}
/**
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
*
* @return
* @since 1.5
*/
protected FieldNamingStrategy fieldNamingStrategy() {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
}
/*
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
* @author Christoph Strobl
* @author Mark Paluch
* @see MongoConfigurationSupport
*/
@Configuration
public abstract class AbstractMongoConfiguration extends MongoConfigurationSupport {
/**
* Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
*/
public abstract MongoClient mongoClient();
/**
* Creates a {@link MongoTemplate}.
*
* @return
*/
@Bean
public MongoTemplate mongoTemplate() throws Exception {
return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
/**
* Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link MongoClient}
* instance configured in {@link #mongoClient()}.
*
* @see #mongoClient()
* @see #mongoTemplate()
* @return
*/
@Bean
public MongoDbFactory mongoDbFactory() {
return new SimpleMongoDbFactory(mongoClient(), getDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration} the base package will be considered {@code com.acme} unless the method is
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext()
* @see #mongoDbFactory()
* @return
* @throws Exception
*/
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
return converter;
}
}
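
With the UserCredentials and authentication-database hooks gone, a subclass of the reworked configuration only needs to supply mongoClient() and getDatabaseName(). A sketch of a typical configuration class; host, port and database name are placeholders:

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

import com.mongodb.MongoClient;

@Configuration
class ApplicationMongoConfiguration extends AbstractMongoConfiguration {

	@Override
	public MongoClient mongoClient() {
		// Credentials, if required, would be supplied via MongoCredential when building the client.
		return new MongoClient("localhost", 27017);
	}

	@Override
	protected String getDatabaseName() {
		return "database";
	}
}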


@@ -0,0 +1,92 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import com.mongodb.reactivestreams.client.MongoClient;
/**
* Base class for reactive Spring Data MongoDB configuration using JavaConfig.
*
* @author Mark Paluch
* @since 2.0
* @see MongoConfigurationSupport
*/
@Configuration
public abstract class AbstractReactiveMongoConfiguration extends MongoConfigurationSupport {
/**
* Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
*/
public abstract MongoClient mongoClient();
/**
* Creates a {@link ReactiveMongoTemplate}.
*
* @return
*/
@Bean
public ReactiveMongoOperations reactiveMongoTemplate() throws Exception {
return new ReactiveMongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
/**
* Creates a {@link SimpleReactiveMongoDatabaseFactory} to be used by the {@link ReactiveMongoTemplate}. Will use the
* {@link MongoClient} instance configured in {@link #mongoClient()}.
*
* @see #mongoClient()
* @see #reactiveMongoTemplate()
* @return
* @throws Exception
*/
@Bean
public ReactiveMongoDatabaseFactory mongoDbFactory() {
return new SimpleReactiveMongoDatabaseFactory(mongoClient(), getDatabaseName());
}
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext()
* @see #mongoDbFactory()
* @return
* @throws Exception
*/
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
MappingMongoConverter converter = new MappingMongoConverter(ReactiveMongoTemplate.NO_OP_REF_RESOLVER,
mongoMappingContext());
converter.setCustomConversions(customConversions());
return converter;
}
}
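
A companion sketch for the reactive variant, again with a placeholder connection string and database name:

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractReactiveMongoConfiguration;

import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;

@Configuration
class ApplicationReactiveMongoConfiguration extends AbstractReactiveMongoConfiguration {

	@Override
	public MongoClient mongoClient() {
		return MongoClients.create("mongodb://localhost:27017");
	}

	@Override
	protected String getDatabaseName() {
		return "database";
	}
}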


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -21,13 +21,14 @@ package org.springframework.data.mongodb.config;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Martin Baumgartner
* @author Christoph Strobl
*/
public abstract class BeanNames {
public static final String MAPPING_CONTEXT_BEAN_NAME = "mongoMappingContext";
static final String INDEX_HELPER_BEAN_NAME = "indexCreationHelper";
static final String MONGO_BEAN_NAME = "mongo";
static final String MONGO_BEAN_NAME = "mongoClient";
static final String DB_FACTORY_BEAN_NAME = "mongoDbFactory";
static final String VALIDATING_EVENT_LISTENER_BEAN_NAME = "validatingMongoEventListener";
static final String IS_NEW_STRATEGY_FACTORY_BEAN_NAME = "isNewStrategyFactory";


@@ -1,11 +1,11 @@
/*
* Copyright 2013-2019 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2013-2019 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -55,6 +55,7 @@ import org.springframework.data.mapping.context.MappingContextIsNewStrategyFacto
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
@@ -67,12 +68,13 @@ import org.w3c.dom.Element;
/**
* Bean definition parser for the {@code mapping-converter} element.
*
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Maciej Walkowiak
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
*/
public class MappingMongoConverterParser implements BeanDefinitionParser {
@@ -120,16 +122,22 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
converterBuilder.addPropertyValue("customConversions", conversionsDefinition);
}
if(!registry.containsBeanDefinition("indexOperationsProvider")){
BeanDefinitionBuilder indexOperationsProviderBuilder = BeanDefinitionBuilder.genericBeanDefinition("org.springframework.data.mongodb.core.DefaultIndexOperationsProvider");
indexOperationsProviderBuilder.addConstructorArgReference(dbFactoryRef);
indexOperationsProviderBuilder.addConstructorArgValue(BeanDefinitionBuilder.genericBeanDefinition(QueryMapper.class).addConstructorArgReference(id).getBeanDefinition());
parserContext.registerBeanComponent(new BeanComponentDefinition(indexOperationsProviderBuilder.getBeanDefinition(), "indexOperationsProvider"));
}
try {
registry.getBeanDefinition(INDEX_HELPER_BEAN_NAME);
} catch (NoSuchBeanDefinitionException ignored) {
if (!StringUtils.hasText(dbFactoryRef)) {
dbFactoryRef = DB_FACTORY_BEAN_NAME;
}
BeanDefinitionBuilder indexHelperBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoPersistentEntityIndexCreator.class);
indexHelperBuilder.addConstructorArgReference(ctxRef);
indexHelperBuilder.addConstructorArgReference(dbFactoryRef);
indexHelperBuilder.addConstructorArgReference("indexOperationsProvider");
indexHelperBuilder.addDependsOn(ctxRef);
parserContext.registerBeanComponent(new BeanComponentDefinition(indexHelperBuilder.getBeanDefinition(),
@@ -348,7 +356,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
/**
* {@link TypeFilter} that returns {@literal false} in case any of the given delegates matches.
*
*
* @author Oliver Gierke
*/
private static class NegatingFilter implements TypeFilter {
@@ -357,7 +365,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
/**
* Creates a new {@link NegatingFilter} with the given delegates.
*
*
* @param filters
*/
public NegatingFilter(TypeFilter... filters) {


@@ -1,11 +1,11 @@
/*
* Copyright 2012-2019 the original author or authors.
* Copyright 2012-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2013-2019 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -1,11 +1,11 @@
/*
* Copyright 2015-2019 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,


@@ -0,0 +1,203 @@
/*
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.context.PersistentEntities;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB to be extended for JavaConfiguration usage.
*
* @author Mark Paluch
* @since 2.0
*/
public abstract class MongoConfigurationSupport {
/**
* Return the name of the database to connect to.
*
* @return must not be {@literal null}.
*/
protected abstract String getDatabaseName();
/**
* Returns the base packages to scan for MongoDB mapped entities at startup. Will return the package name of the
* configuration class' (the concrete class, not this one here) by default. So if you have a
* {@code com.acme.AppConfig} extending {@link MongoConfigurationSupport} the base package will be considered
* {@code com.acme} unless the method is overridden to implement alternate behavior.
*
* @return the base packages to scan for mapped {@link Document} classes or an empty collection to not enable scanning
* for entities.
* @since 1.10
*/
protected Collection<String> getMappingBasePackages() {
Package mappingBasePackage = getClass().getPackage();
return Collections.singleton(mappingBasePackage == null ? null : mappingBasePackage.getName());
}
/**
* Creates a {@link MongoMappingContext} equipped with entity classes scanned from the mapping base package.
*
* @see #getMappingBasePackages()
* @return
* @throws ClassNotFoundException
*/
@Bean
public MongoMappingContext mongoMappingContext() throws ClassNotFoundException {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
return mappingContext;
}
/**
* Returns a {@link MappingContextIsNewStrategyFactory} wrapped into a {@link CachingIsNewStrategyFactory}.
*
* @return
* @throws ClassNotFoundException
*/
@Bean
public IsNewStrategyFactory isNewStrategyFactory() throws ClassNotFoundException {
return new CachingIsNewStrategyFactory(new MappingContextIsNewStrategyFactory(
new PersistentEntities(Arrays.<MappingContext<?, ?>> asList(new MappingContext[] { mongoMappingContext() }))));
}
/**
* Register custom {@link Converter}s in a {@link CustomConversions} object if required. These
* {@link CustomConversions} will be registered with the {@link #mappingMongoConverter()} and
* {@link #mongoMappingContext()}. Returns an empty {@link CustomConversions} instance by default.
*
* @return must not be {@literal null}.
*/
@Bean
public CustomConversions customConversions() {
return new CustomConversions(Collections.emptyList());
}
/**
* Scans the mapping base package for classes annotated with {@link Document}. By default, it scans for entities in
* all packages returned by {@link #getMappingBasePackages()}.
*
* @see #getMappingBasePackages()
* @return
* @throws ClassNotFoundException
*/
protected Set<Class<?>> getInitialEntitySet() throws ClassNotFoundException {
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
for (String basePackage : getMappingBasePackages()) {
initialEntitySet.addAll(scanForEntities(basePackage));
}
return initialEntitySet;
}
/**
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document} and
* {@link Persistent}.
*
* @param basePackage must not be {@literal null}.
* @return
* @throws ClassNotFoundException
* @since 1.10
*/
protected Set<Class<?>> scanForEntities(String basePackage) throws ClassNotFoundException {
if (!StringUtils.hasText(basePackage)) {
return Collections.emptySet();
}
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
if (StringUtils.hasText(basePackage)) {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
initialEntitySet
.add(ClassUtils.forName(candidate.getBeanClassName(), MongoConfigurationSupport.class.getClassLoader()));
}
}
return initialEntitySet;
}
/**
* Configures whether to abbreviate field names for domain objects by configuring a
* {@link CamelCaseAbbreviatingFieldNamingStrategy} on the {@link MongoMappingContext} instance created. For advanced
* customization needs, consider overriding {@link #mappingMongoConverter()}.
*
* @return
*/
protected boolean abbreviateFieldNames() {
return false;
}
/**
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
*
* @return
* @since 1.5
*/
protected FieldNamingStrategy fieldNamingStrategy() {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
}
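
MongoConfigurationSupport centralizes the mapping-related hooks shared by the blocking and the reactive configuration base classes. A sketch of the overrides a subclass (shown here through AbstractMongoConfiguration) might customize; the package and database names are illustrative:

import java.util.Collection;
import java.util.Collections;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

import com.mongodb.MongoClient;

@Configuration
class CustomizedMongoConfiguration extends AbstractMongoConfiguration {

	@Override
	public MongoClient mongoClient() {
		return new MongoClient(); // defaults to localhost:27017
	}

	@Override
	protected String getDatabaseName() {
		return "database";
	}

	@Override
	protected Collection<String> getMappingBasePackages() {
		// Restrict entity scanning to a dedicated package instead of the configuration class' package.
		return Collections.singleton("com.example.domain");
	}

	@Override
	protected boolean abbreviateFieldNames() {
		// Switches the MongoMappingContext to CamelCaseAbbreviatingFieldNamingStrategy.
		return true;
	}
}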


@@ -1,11 +1,11 @@
/*
* Copyright 2015-2019 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -16,7 +16,6 @@
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
@@ -24,18 +23,15 @@ import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
/**
* Parse a {@link String} to a Collection of {@link MongoCredential}.
*
*
* @author Christoph Strobl
* @author Oliver Gierke
* @author Stephen Tyler Conrad
* @author Mark Paluch
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
@@ -99,20 +95,6 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
verifyDatabasePresent(database);
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if ("SCRAM-SHA-256".equals(authMechanism)) {
Method createScramSha256Credential = ReflectionUtils.findMethod(MongoCredential.class,
"createScramSha256Credential", String.class, String.class, char[].class);
if (createScramSha256Credential == null) {
throw new IllegalArgumentException(
"SCRAM-SHA-256 auth mechanism is available as of MongoDB 4 and MongoDB Java Driver 3.8! Please make sure to use at least those versions.");
}
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.class.cast(ReflectionUtils.invokeMethod(createScramSha256Credential, null,
userNameAndPassword[0], database, userNameAndPassword[1].toCharArray())));
} else {
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
@@ -174,7 +156,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static Properties extractOptions(String text) {
int optionsSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
int dbSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
if (optionsSeperationIndex == -1 || dbSeperationIndex > optionsSeperationIndex) {
return new Properties();
@@ -184,11 +166,6 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
for (String option : text.substring(optionsSeperationIndex + 1).split(OPTION_VALUE_DELIMINATOR)) {
String[] optionArgs = option.split("=");
if (optionArgs.length == 1) {
throw new IllegalArgumentException(String.format("Query parameter '%s' has no value!", optionArgs[0]));
}
properties.put(optionArgs[0], optionArgs[1]);
}
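
For orientation, the editor converts a credential string into MongoCredential instances. The exact string below (user, password, database and the uri.authMechanism query option) is an assumed example of the supported syntax, and reading the result through getValue() follows the standard PropertyEditor contract:

import java.util.List;

import org.springframework.data.mongodb.config.MongoCredentialPropertyEditor;

import com.mongodb.MongoCredential;

class CredentialParsingExample {

	@SuppressWarnings("unchecked")
	public static void main(String[] args) {

		MongoCredentialPropertyEditor editor = new MongoCredentialPropertyEditor();

		// user:password@database plus an optional ?uri.authMechanism=... parameter (assumed format).
		editor.setAsText("jon:warg@snow?uri.authMechanism=PLAIN");

		List<MongoCredential> credentials = (List<MongoCredential>) editor.getValue();
		credentials.forEach(credential -> System.out.println(credential.getUserName()));
	}
}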


@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 by the original author(s).
* Copyright 2011-2017 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -30,9 +30,8 @@ import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.AbstractBeanDefinitionParser;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.data.mongodb.core.MongoClientFactoryBean;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
@@ -99,8 +98,6 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Defaulting
if (StringUtils.hasText(mongoRef)) {
dbFactoryBuilder.addConstructorArgReference(mongoRef);
@@ -109,8 +106,6 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
}
dbFactoryBuilder.addConstructorArgValue(StringUtils.hasText(dbname) ? dbname : "db");
dbFactoryBuilder.addConstructorArgValue(userCredentials);
dbFactoryBuilder.addConstructorArgValue(element.getAttribute("authentication-dbname"));
BeanDefinitionBuilder writeConcernPropertyEditorBuilder = getWriteConcernPropertyEditorBuilder();
@@ -131,35 +126,13 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
*/
private BeanDefinition registerMongoBeanDefinition(Element element, ParserContext parserContext) {
BeanDefinitionBuilder mongoBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoFactoryBean.class);
BeanDefinitionBuilder mongoBuilder = BeanDefinitionBuilder.genericBeanDefinition(MongoClientFactoryBean.class);
setPropertyValue(mongoBuilder, element, "host");
setPropertyValue(mongoBuilder, element, "port");
return getSourceBeanDefinition(mongoBuilder, parserContext, element);
}
/**
* Returns a {@link BeanDefinition} for a {@link UserCredentials} object.
*
* @param element
* @return the {@link BeanDefinition} or {@literal null} if neither username nor password given.
*/
private BeanDefinition getUserCredentialsBeanDefinition(Element element, ParserContext context) {
String username = element.getAttribute("username");
String password = element.getAttribute("password");
if (!StringUtils.hasText(username) && !StringUtils.hasText(password)) {
return null;
}
BeanDefinitionBuilder userCredentialsBuilder = BeanDefinitionBuilder.genericBeanDefinition(UserCredentials.class);
userCredentialsBuilder.addConstructorArgValue(StringUtils.hasText(username) ? username : null);
userCredentialsBuilder.addConstructorArgValue(StringUtils.hasText(password) ? password : null);
return getSourceBeanDefinition(userCredentialsBuilder, context, element);
}
/**
* Creates a {@link BeanDefinition} for a {@link MongoURI} or {@link MongoClientURI} depending on configured
* attributes. <br />
@@ -193,7 +166,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
parserContext.extractSource(element));
}
Class<?> type = hasClientUri ? MongoClientURI.class : MongoURI.class;
Class<?> type = MongoClientURI.class;
String uri = hasClientUri ? element.getAttribute("client-uri") : element.getAttribute("uri");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(type);
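
The parser now registers a MongoClientFactoryBean instead of the old MongoFactoryBean when no mongo-ref is supplied. A hedged JavaConfig equivalent of what that factory bean provides; host and port are placeholders:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoClientFactoryBean;

@Configuration
class MongoClientFactoryBeanConfiguration {

	@Bean
	public MongoClientFactoryBean mongo() {
		// Produces a com.mongodb.MongoClient; the XML parser sets the same host/port properties from attributes.
		MongoClientFactoryBean factoryBean = new MongoClientFactoryBean();
		factoryBean.setHost("localhost");
		factoryBean.setPort(27017);
		return factoryBean;
	}
}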


@@ -1,69 +1,76 @@
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoAdmin;
import org.springframework.data.mongodb.monitor.*;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
public class MongoJmxParser implements BeanDefinitionParser {
public BeanDefinition parse(Element element, ParserContext parserContext) {
String name = element.getAttribute("mongo-ref");
if (!StringUtils.hasText(name)) {
name = "mongo";
}
registerJmxComponents(name, element, parserContext);
return null;
}
protected void registerJmxComponents(String mongoRefName, Element element, ParserContext parserContext) {
Object eleSource = parserContext.extractSource(element);
CompositeComponentDefinition compositeDef = new CompositeComponentDefinition(element.getTagName(), eleSource);
createBeanDefEntry(AssertMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BackgroundFlushingMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BtreeIndexCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ConnectionMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(GlobalLockMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MemoryMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(OperationCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ServerInfo.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MongoAdmin.class, compositeDef, mongoRefName, eleSource, parserContext);
parserContext.registerComponent(compositeDef);
}
protected void createBeanDefEntry(Class<?> clazz, CompositeComponentDefinition compositeDef, String mongoRefName,
Object eleSource, ParserContext parserContext) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(clazz);
builder.getRawBeanDefinition().setSource(eleSource);
builder.addConstructorArgReference(mongoRefName);
BeanDefinition assertDef = builder.getBeanDefinition();
String assertName = parserContext.getReaderContext().registerWithGeneratedName(assertDef);
compositeDef.addNestedComponent(new BeanComponentDefinition(assertDef, assertName));
}
}
/*
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.mongodb.core.MongoAdmin;
import org.springframework.data.mongodb.monitor.*;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* @author Mark Pollack
* @author Thomas Risberg
* @author John Brisbin
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoJmxParser implements BeanDefinitionParser {
public BeanDefinition parse(Element element, ParserContext parserContext) {
String name = element.getAttribute("mongo-ref");
if (!StringUtils.hasText(name)) {
name = BeanNames.MONGO_BEAN_NAME;
}
registerJmxComponents(name, element, parserContext);
return null;
}
protected void registerJmxComponents(String mongoRefName, Element element, ParserContext parserContext) {
Object eleSource = parserContext.extractSource(element);
CompositeComponentDefinition compositeDef = new CompositeComponentDefinition(element.getTagName(), eleSource);
createBeanDefEntry(AssertMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BackgroundFlushingMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BtreeIndexCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ConnectionMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(GlobalLockMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MemoryMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(OperationCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ServerInfo.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MongoAdmin.class, compositeDef, mongoRefName, eleSource, parserContext);
parserContext.registerComponent(compositeDef);
}
protected void createBeanDefEntry(Class<?> clazz, CompositeComponentDefinition compositeDef, String mongoRefName,
Object eleSource, ParserContext parserContext) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(clazz);
builder.getRawBeanDefinition().setSource(eleSource);
builder.addConstructorArgReference(mongoRefName);
BeanDefinition assertDef = builder.getBeanDefinition();
String assertName = parserContext.getReaderContext().registerWithGeneratedName(assertDef);
compositeDef.addNestedComponent(new BeanComponentDefinition(assertDef, assertName));
}
}
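For orientation, here is a minimal sketch of the bean-definition pattern createBeanDefEntry applies, written against a plain BeanDefinitionRegistry instead of the namespace ParserContext; the bean names used ("mongoClient", "serverInfo") are illustrative assumptions.

import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.data.mongodb.monitor.ServerInfo;

class JmxRegistrationSketch {

	// Registers one monitoring bean the same way createBeanDefEntry does: a generic bean
	// definition whose single constructor argument is a reference to the Mongo client bean.
	static void registerServerInfo(BeanDefinitionRegistry registry, String mongoRefName) {
		BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(ServerInfo.class);
		builder.addConstructorArgReference(mongoRefName); // e.g. "mongoClient" (assumed bean name)
		registry.registerBeanDefinition("serverInfo", builder.getBeanDefinition());
	}

	public static void main(String[] args) {
		registerServerInfo(new DefaultListableBeanFactory(), "mongoClient");
	}
}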

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -33,7 +33,6 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
public void init() {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("mongo-client", new MongoClientParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());

View File

@@ -1,76 +0,0 @@
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.MongoFactoryBean;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* Parser for &lt;mongo&gt; definitions.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
String id = element.getAttribute("id");
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoFactoryBean.class);
ParsingUtils.setPropertyValue(builder, element, "port", "port");
ParsingUtils.setPropertyValue(builder, element, "host", "host");
ParsingUtils.setPropertyValue(builder, element, "write-concern", "writeConcern");
MongoParsingUtils.parseMongoOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernPropertyEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(writeConcernPropertyEditor);
parserContext.popAndRegisterContainingComponent();
return mongoComponent.getBeanDefinition();
}
}
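The attributes parsed above end up as properties on a MongoFactoryBean. As a rough, hedged sketch of the equivalent programmatic wiring (the setHost/setPort/setWriteConcern setters are inferred from the property names the parser maps and should be treated as assumptions):

import org.springframework.data.mongodb.core.MongoFactoryBean;

import com.mongodb.Mongo;
import com.mongodb.WriteConcern;

class MongoFactoryBeanSketch {

	static Mongo createMongo() throws Exception {
		MongoFactoryBean factory = new MongoFactoryBean();
		factory.setHost("localhost"); // "host" attribute
		factory.setPort(27017); // "port" attribute
		factory.setWriteConcern(WriteConcern.ACKNOWLEDGED); // "write-concern" attribute
		factory.afterPropertiesSet(); // FactoryBean lifecycle callback
		return factory.getObject();
	}
}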

View File

@@ -1,207 +1,170 @@
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
/**
* Utility methods for {@link BeanDefinitionParser} implementations for MongoDB.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {}
/**
* Parses the mongo replica-set element.
*
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
*/
static void parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
setPropertyValue(mongoBuilder, element, "replica-set", "replicaSetSeeds");
}
/**
* Parses the {@code mongo:options} sub-element. Populates the given attribute factory with the proper attributes.
*
* @return {@literal true} if parsing actually occurred, {@literal false} otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder optionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoOptionsFactoryBean.class);
setPropertyValue(optionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(optionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(optionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(optionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(optionsDefBuilder, optionsElement, "auto-connect-retry", "autoConnectRetry");
setPropertyValue(optionsDefBuilder, optionsElement, "max-auto-connect-retry-time", "maxAutoConnectRetryTime");
setPropertyValue(optionsDefBuilder, optionsElement, "write-number", "writeNumber");
setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
* @since 1.7
*/
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "server-selection-timeout", "serverSelectionTimeout");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
*
* @return
*/
static BeanDefinitionBuilder getWriteConcernPropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.WriteConcern", WriteConcernPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* One would normally register just a single bean definition here, but we want the convenience of using
* AbstractSingleBeanDefinitionParser while also registering a 'default' property editor with the
* container as a side effect.
*/
static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getMongoCredentialPropertyEditor() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.MongoCredential[]", MongoCredentialPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}
/*
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
/**
* Utility methods for {@link BeanDefinitionParser} implementations for MongoDB.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {}
/**
* Parses the mongo replica-set element.
*
* @param element the mongo element
* @param mongoBuilder the bean definition builder to populate
*/
static void parseReplicaSet(Element element, BeanDefinitionBuilder mongoBuilder) {
setPropertyValue(mongoBuilder, element, "replica-set", "replicaSetSeeds");
}
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
* @since 1.7
*/
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "server-selection-timeout", "serverSelectionTimeout");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
*
* @return
*/
static BeanDefinitionBuilder getWriteConcernPropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.WriteConcern", WriteConcernPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* One would normally register just a single bean definition here, but we want the convenience of using
* AbstractSingleBeanDefinitionParser while also registering a 'default' property editor with the
* container as a side effect.
*/
static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getMongoCredentialPropertyEditor() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.MongoCredential[]", MongoCredentialPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}
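The CustomEditorConfigurer definitions returned above are what let plain string attributes (for example write-concern="SAFE") be converted into driver types at bean-wiring time. A small sketch of that conversion using the WriteConcernPropertyEditor directly, assuming its standard PropertyEditor contract:

import org.springframework.data.mongodb.config.WriteConcernPropertyEditor;

import com.mongodb.WriteConcern;

class WriteConcernEditorSketch {

	public static void main(String[] args) {
		WriteConcernPropertyEditor editor = new WriteConcernPropertyEditor();
		editor.setAsText("SAFE"); // the string value as it appears in the XML attribute
		WriteConcern concern = (WriteConcern) editor.getValue();
		System.out.println(concern); // expected to be the named WriteConcern constant
	}
}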

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2019 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2012-2019 the original author or authors.
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,121 +0,0 @@
/*
* Copyright 2018-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.AllArgsConstructor;
import java.util.List;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBList;
import com.mongodb.DBObject;
/**
* Utility methods to map {@link org.springframework.data.mongodb.core.aggregation.Aggregation} pipeline definitions and
* create type-bound {@link AggregationOperationContext}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.10.13
*/
@AllArgsConstructor
class AggregationUtil {
QueryMapper queryMapper;
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Prepare the {@link AggregationOperationContext} for a given aggregation by either returning the given context if it
* is not {@literal null}, creating a {@link TypeBasedAggregationOperationContext} if the aggregation carries type
* information (is a {@link TypedAggregation}), or falling back to the {@link Aggregation#DEFAULT_CONTEXT}.
*
* @param aggregation must not be {@literal null}.
* @param context can be {@literal null}.
* @return the root {@link AggregationOperationContext} to use.
*/
AggregationOperationContext prepareAggregationContext(Aggregation aggregation, AggregationOperationContext context) {
if (context != null) {
return context;
}
if (aggregation instanceof TypedAggregation) {
return new TypeBasedAggregationOperationContext(((TypedAggregation) aggregation).getInputType(), mappingContext,
queryMapper);
}
return Aggregation.DEFAULT_CONTEXT;
}
/**
* Extract and map the aggregation pipeline into a {@link List} of {@link Document}.
*
* @param aggregation
* @param context
* @return
*/
DBObject createPipeline(String collectionName, Aggregation aggregation, AggregationOperationContext context) {
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return aggregation.toDbObject(collectionName, context);
}
DBObject command = aggregation.toDbObject(collectionName, context);
command.put("pipeline", mapAggregationPipeline((List) command.get("pipeline")));
return command;
}
/**
* Extract the command and map the aggregation pipeline.
*
* @param aggregation
* @param context
* @return
*/
DBObject createCommand(String collection, Aggregation aggregation, AggregationOperationContext context) {
DBObject command = aggregation.toDbObject(collection, context);
if (!ObjectUtils.nullSafeEquals(context, Aggregation.DEFAULT_CONTEXT)) {
return command;
}
command.put("pipeline", mapAggregationPipeline((List) command.get("pipeline")));
return command;
}
private BasicDBList mapAggregationPipeline(List<DBObject> pipeline) {
BasicDBList mapped = new BasicDBList();
for (DBObject stage : pipeline) {
mapped.add(queryMapper.getMappedObject(stage, null));
}
return mapped;
}
}
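The context selection above boils down to: an explicit context wins, a TypedAggregation gets a TypeBasedAggregationOperationContext, and everything else falls back to Aggregation.DEFAULT_CONTEXT. A hedged sketch of the two aggregation shapes involved (Person is a placeholder entity):

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;

class AggregationContextSketch {

	static class Person {}

	static void sketch() {
		// Untyped pipeline: prepareAggregationContext(untyped, null) would fall back to Aggregation.DEFAULT_CONTEXT.
		Aggregation untyped = newAggregation(match(where("age").gt(18)));

		// Typed pipeline: the same call would create a TypeBasedAggregationOperationContext for Person.
		TypedAggregation<Person> typed = newAggregation(Person.class, match(where("age").gt(18)));
	}
}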

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2019 the original author or authors.
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -21,14 +21,14 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import com.mongodb.BulkWriteResult;
import com.mongodb.bulk.BulkWriteResult;
/**
* Bulk operations for insert/update/remove actions on a collection. These bulk operations are available since MongoDB
* 2.6 and make use of low-level bulk commands on the protocol level. This interface defines a fluent API to add
* multiple single operations or lists of similar operations in sequence, which can then eventually be executed by
* calling {@link #execute()}.
*
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
@@ -38,7 +38,7 @@ public interface BulkOperations {
/**
* Mode for bulk operation.
**/
public enum BulkMode {
enum BulkMode {
/** Perform bulk operations in sequence. The first error will cancel processing. */
ORDERED,
@@ -49,7 +49,7 @@ public interface BulkOperations {
/**
* Add a single insert to the bulk operation.
*
*
* @param documents the document to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
@@ -57,7 +57,7 @@ public interface BulkOperations {
/**
* Add a list of inserts to the bulk operation.
*
*
* @param documents List of documents to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
@@ -65,7 +65,7 @@ public interface BulkOperations {
/**
* Add a single update to the bulk operation. For the update request, only the first matching document is updated.
*
*
* @param query update criteria, must not be {@literal null}.
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -74,7 +74,7 @@ public interface BulkOperations {
/**
* Add a list of updates to the bulk operation. For each update request, only the first matching document is updated.
*
*
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
@@ -82,7 +82,7 @@ public interface BulkOperations {
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
*
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -91,7 +91,7 @@ public interface BulkOperations {
/**
* Add a list of updates to the bulk operation. For each update request, all matching documents are updated.
*
*
* @param updates Update operations to perform.
* @return The bulk operation.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -101,7 +101,7 @@ public interface BulkOperations {
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return The bulk operation.
@@ -112,7 +112,7 @@ public interface BulkOperations {
/**
* Add a list of upserts to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
*
* @param updates Updates/insert operations to perform.
* @return The bulk operation.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
@@ -121,7 +121,7 @@ public interface BulkOperations {
/**
* Add a single remove operation to the bulk operation.
*
*
* @param remove the {@link Query} to select the documents to be removed, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
@@ -129,7 +129,7 @@ public interface BulkOperations {
/**
* Add a list of remove operations to the bulk operation.
*
*
* @param removes the remove operations to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
@@ -137,9 +137,9 @@ public interface BulkOperations {
/**
* Execute all bulk operations using the default write concern.
*
*
* @return Result of the bulk operation providing counters for inserts/updates etc.
* @throws org.springframework.data.mongodb.BulkOperationException if an error occurred during bulk processing.
* @throws {@link BulkOperationException} if an error occurred during bulk processing.
*/
BulkWriteResult execute();
}
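A minimal usage sketch of the fluent API above, assuming the BulkOperations instance is obtained from MongoTemplate.bulkOps and that "person" is the target collection:

import org.springframework.data.mongodb.core.BulkOperations;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

class BulkOperationsSketch {

	static void bulkChanges(MongoTemplate template) {
		BulkOperations bulk = template.bulkOps(BulkMode.ORDERED, "person");
		bulk.insert(new Person("Joe"));
		bulk.updateOne(Query.query(Criteria.where("firstName").is("Joe")), Update.update("age", 30));
		bulk.remove(Query.query(Criteria.where("firstName").is("Jack")));
		bulk.execute(); // runs all queued operations in insertion order and returns a BulkWriteResult
	}

	static class Person {
		String firstName;
		Person(String firstName) {
			this.firstName = firstName;
		}
	}
}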

View File

@@ -1,26 +1,44 @@
/*
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DBCollection;
import com.mongodb.MongoException;
import org.springframework.dao.DataAccessException;
public interface CollectionCallback<T> {
T doInCollection(DBCollection collection) throws MongoException, DataAccessException;
}
/*
* Copyright 2010-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.bson.Document;
import org.springframework.dao.DataAccessException;
import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
/**
* Callback interface for executing actions against a {@link MongoCollection}.
*
* @author Mark Pollack
* @author Graeme Rocher
* @author Oliver Gierke
* @author John Brisbin
* @author Christoph Strobl
* @since 1.0
*/
public interface CollectionCallback<T> {
/**
* @param collection never {@literal null}.
* @return
* @throws MongoException
* @throws DataAccessException
*/
T doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException;
}
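Since the callback is a single-method interface, it can be passed as a lambda to MongoTemplate.execute; a sketch, assuming a "person" collection and the 3.x driver's count() method:

import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;

class CollectionCallbackSketch {

	static long countPeople(MongoTemplate template) {
		// doInCollection receives the driver MongoCollection<Document> bound to the named collection.
		CollectionCallback<Long> countAll = collection -> collection.count();
		return template.execute("person", countAll);
	}
}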

View File

@@ -1,70 +1,70 @@
/*
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
/**
* Provides a simple wrapper to encapsulate the variety of settings you can use when creating a collection.
*
* @author Thomas Risberg
*/
public class CollectionOptions {
private Integer maxDocuments;
private Integer size;
private Boolean capped;
/**
* Constructs a new <code>CollectionOptions</code> instance.
*
* @param size the collection size in bytes, this data space is preallocated
* @param maxDocuments the maximum number of documents in the collection.
* @param capped true to create a "capped" collection (fixed size with auto-FIFO behavior based on insertion order),
* false otherwise.
*/
public CollectionOptions(Integer size, Integer maxDocuments, Boolean capped) {
super();
this.maxDocuments = maxDocuments;
this.size = size;
this.capped = capped;
}
public Integer getMaxDocuments() {
return maxDocuments;
}
public void setMaxDocuments(Integer maxDocuments) {
this.maxDocuments = maxDocuments;
}
public Integer getSize() {
return size;
}
public void setSize(Integer size) {
this.size = size;
}
public Boolean getCapped() {
return capped;
}
public void setCapped(Boolean capped) {
this.capped = capped;
}
}
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
/**
* Provides a simple wrapper to encapsulate the variety of settings you can use when creating a collection.
*
* @author Thomas Risberg
*/
public class CollectionOptions {
private Integer maxDocuments;
private Integer size;
private Boolean capped;
/**
* Constructs a new <code>CollectionOptions</code> instance.
*
* @param size the collection size in bytes, this data space is preallocated
* @param maxDocuments the maximum number of documents in the collection.
* @param capped true to create a "capped" collection (fixed size with auto-FIFO behavior based on insertion order),
* false otherwise.
*/
public CollectionOptions(Integer size, Integer maxDocuments, Boolean capped) {
super();
this.maxDocuments = maxDocuments;
this.size = size;
this.capped = capped;
}
public Integer getMaxDocuments() {
return maxDocuments;
}
public void setMaxDocuments(Integer maxDocuments) {
this.maxDocuments = maxDocuments;
}
public Integer getSize() {
return size;
}
public void setSize(Integer size) {
this.size = size;
}
public Boolean getCapped() {
return capped;
}
public void setCapped(Boolean capped) {
this.capped = capped;
}
}
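A short sketch of creating a capped collection with these options through MongoTemplate.createCollection; the collection name and sizes are illustrative:

import org.springframework.data.mongodb.core.CollectionOptions;
import org.springframework.data.mongodb.core.MongoTemplate;

class CappedCollectionSketch {

	static void createLogCollection(MongoTemplate template) {
		// 1 MB preallocated, at most 5000 documents, capped (FIFO eviction by insertion order).
		CollectionOptions options = new CollectionOptions(1024 * 1024, 5000, true);
		template.createCollection("logs", options);
	}
}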

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2002-2019 the original author or authors.
* Copyright 2002-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,12 +15,15 @@
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DBCursor;
import org.bson.Document;
import com.mongodb.client.FindIterable;
/**
* Simple callback interface to allow customization of a {@link DBCursor}.
* Simple callback interface to allow customization of a {@link FindIterable}.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
interface CursorPreparer {
@@ -29,5 +32,5 @@ interface CursorPreparer {
*
* @param cursor
*/
DBCursor prepare(DBCursor cursor);
FindIterable<Document> prepare(FindIterable<Document> cursor);
}
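With FindIterable in place of DBCursor, a preparer is just a function that returns the cursor with additional settings applied. A sketch mirroring the contract (tuning values are arbitrary):

import org.bson.Document;

import com.mongodb.client.FindIterable;

class BatchingCursorPreparerSketch {

	// Mirrors CursorPreparer.prepare: take the FindIterable, hand back the configured one.
	static FindIterable<Document> prepare(FindIterable<Document> cursor) {
		return cursor.batchSize(100).noCursorTimeout(true);
	}
}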

View File

@@ -1,25 +1,30 @@
/*
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DB;
import com.mongodb.MongoException;
import org.springframework.dao.DataAccessException;
public interface DbCallback<T> {
T doInDB(DB db) throws MongoException, DataAccessException;
}
/*
* Copyright 2010-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.dao.DataAccessException;
import com.mongodb.MongoException;
import com.mongodb.client.MongoDatabase;
/**
* Callback interface for executing actions against a {@link MongoDatabase}.
*
* @param <T> the return type of the callback.
*/
public interface DbCallback<T> {
T doInDB(MongoDatabase db) throws MongoException, DataAccessException;
}
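Analogous to CollectionCallback, a DbCallback exposes the whole MongoDatabase; a sketch that runs a server command through MongoTemplate.execute (buildInfo is just an example command):

import org.bson.Document;
import org.springframework.data.mongodb.core.DbCallback;
import org.springframework.data.mongodb.core.MongoTemplate;

class DbCallbackSketch {

	static Document buildInfo(MongoTemplate template) {
		DbCallback<Document> callback = db -> db.runCommand(new Document("buildInfo", 1));
		return template.execute(callback);
	}
}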

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2019 the original author or authors.
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,79 +15,80 @@
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.Value;
import java.util.Collections;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.bson.Document;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteOperation;
import com.mongodb.BulkWriteRequestBuilder;
import com.mongodb.BulkWriteResult;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.DeleteManyModel;
import com.mongodb.client.model.InsertOneModel;
import com.mongodb.client.model.UpdateManyModel;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.WriteModel;
/**
* Default implementation for {@link BulkOperations}.
*
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
private final MongoOperations mongoOperations;
private final BulkMode bulkMode;
private final String collectionName;
private final BulkOperationContext bulkOperationContext;
private PersistenceExceptionTranslator exceptionTranslator;
private WriteConcernResolver writeConcernResolver;
private WriteConcern defaultWriteConcern;
private BulkWriteOperation bulk;
private BulkWriteOptions bulkOptions;
List<WriteModel<Document>> models = new ArrayList<WriteModel<Document>>();
/**
* Creates a new {@link DefaultBulkOperations} for the given {@link MongoOperations}, collection name and
* {@link BulkOperationContext}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param bulkOperationContext must not be {@literal null}.
* @since 1.10.5
* Creates a new {@link DefaultBulkOperations} for the given {@link MongoOperations}, {@link BulkMode}, collection
* name and {@link WriteConcern}.
*
* @param mongoOperations The underlying {@link MongoOperations}, must not be {@literal null}.
* @param bulkMode must not be {@literal null}.
* @param collectionName Name of the collection to work on, must not be {@literal null} or empty.
* @param entityType the entity type, can be {@literal null}.
*/
DefaultBulkOperations(MongoOperations mongoOperations, String collectionName,
BulkOperationContext bulkOperationContext) {
DefaultBulkOperations(MongoOperations mongoOperations, BulkMode bulkMode, String collectionName,
Class<?> entityType) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.hasText(collectionName, "CollectionName must not be null nor empty!");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null!");
Assert.notNull(bulkMode, "BulkMode must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.mongoOperations = mongoOperations;
this.bulkMode = bulkMode;
this.collectionName = collectionName;
this.bulkOperationContext = bulkOperationContext;
this.exceptionTranslator = new MongoExceptionTranslator();
this.writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
this.bulk = getBulkWriteOptions(bulkOperationContext.getBulkMode());
this.bulkOptions = initBulkOperation();
}
/**
* Configures the {@link PersistenceExceptionTranslator} to be used. Defaults to {@link MongoExceptionTranslator}.
*
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
@@ -96,7 +97,7 @@ class DefaultBulkOperations implements BulkOperations {
/**
* Configures the {@link WriteConcernResolver} to be used. Defaults to {@link DefaultWriteConcernResolver}.
*
*
* @param writeConcernResolver can be {@literal null}.
*/
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) {
@@ -106,10 +107,10 @@ class DefaultBulkOperations implements BulkOperations {
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
*
* @param defaultWriteConcern can be {@literal null}.
*/
void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
public void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
this.defaultWriteConcern = defaultWriteConcern;
}
@@ -122,15 +123,16 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(document, "Document must not be null!");
if (document instanceof DBObject) {
if (document instanceof Document) {
bulk.insert((DBObject) document);
models.add(new InsertOneModel<>((Document) document));
return this;
}
DBObject sink = new BasicDBObject();
Document sink = new Document();
mongoOperations.getConverter().write(document, sink);
bulk.insert(sink);
models.add(new InsertOneModel<>(sink));
return this;
}
@@ -161,7 +163,7 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateOne(Collections.singletonList(Pair.of(query, update)));
return updateOne(Arrays.asList(Pair.of(query, update)));
}
/*
@@ -191,7 +193,7 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateMulti(Collections.singletonList(Pair.of(query, update)));
return updateMulti(Arrays.asList(Pair.of(query, update)));
}
/*
@@ -242,8 +244,7 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!");
bulk.find(getMappedQuery(query.getQueryObject())).remove();
models.add(new DeleteManyModel<Document>(query.getQueryObject()));
return this;
}
@@ -268,27 +269,30 @@ class DefaultBulkOperations implements BulkOperations {
* @see org.springframework.data.mongodb.core.BulkOperations#executeBulk()
*/
@Override
public BulkWriteResult execute() {
MongoAction action = new MongoAction(defaultWriteConcern, MongoActionOperation.BULK, collectionName,
bulkOperationContext.getEntityType(), null, null);
WriteConcern writeConcern = writeConcernResolver.resolve(action);
public com.mongodb.bulk.BulkWriteResult execute() {
try {
return writeConcern == null ? bulk.execute() : bulk.execute(writeConcern);
MongoCollection<Document> collection = mongoOperations.getCollection(collectionName);
if (defaultWriteConcern != null) {
collection = collection.withWriteConcern(defaultWriteConcern);
}
return collection.bulkWrite(models, bulkOptions);
} catch (BulkWriteException o_O) {
DataAccessException toThrow = exceptionTranslator.translateExceptionIfPossible(o_O);
throw toThrow == null ? o_O : toThrow;
} finally {
this.bulk = getBulkWriteOptions(bulkOperationContext.getBulkMode());
this.bulkOptions = initBulkOperation();
}
}
/**
* Performs update and upsert bulk operations.
*
*
* @param query the {@link Query} to determine documents to update.
* @param update the {@link Update} to perform, must not be {@literal null}.
* @param upsert whether to upsert.
@@ -300,68 +304,26 @@ class DefaultBulkOperations implements BulkOperations {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
BulkWriteRequestBuilder builder = bulk.find(getMappedQuery(query.getQueryObject()));
if (upsert) {
if (multi) {
builder.upsert().update(getMappedUpdate(update.getUpdateObject()));
} else {
builder.upsert().updateOne(getMappedUpdate(update.getUpdateObject()));
}
UpdateOptions options = new UpdateOptions();
options.upsert(upsert);
if (multi) {
models.add(new UpdateManyModel<Document>(query.getQueryObject(), update.getUpdateObject(), options));
} else {
if (multi) {
builder.update(getMappedUpdate(update.getUpdateObject()));
} else {
builder.updateOne(getMappedUpdate(update.getUpdateObject()));
}
models.add(new UpdateOneModel<Document>(query.getQueryObject(), update.getUpdateObject(), options));
}
return this;
}
private DBObject getMappedUpdate(DBObject update) {
return bulkOperationContext.getUpdateMapper().getMappedObject(update, bulkOperationContext.getEntity());
}
private DBObject getMappedQuery(DBObject query) {
return bulkOperationContext.getQueryMapper().getMappedObject(query, bulkOperationContext.getEntity());
}
private BulkWriteOperation getBulkWriteOptions(BulkMode bulkMode) {
DBCollection collection = mongoOperations.getCollection(collectionName);
private final BulkWriteOptions initBulkOperation() {
BulkWriteOptions options = new BulkWriteOptions();
switch (bulkMode) {
case ORDERED:
return collection.initializeOrderedBulkOperation();
return options.ordered(true);
case UNORDERED:
return collection.initializeUnorderedBulkOperation();
return options.ordered(false);
}
throw new IllegalStateException("BulkMode was null!");
}
/**
* {@link BulkOperationContext} holds information about
* {@link org.springframework.data.mongodb.core.BulkOperations.BulkMode} the entity in use as well as references to
* {@link QueryMapper} and {@link UpdateMapper}.
*
* @author Christoph Strobl
* @since 2.0
*/
@Value
static class BulkOperationContext {
@NonNull BulkMode bulkMode;
MongoPersistentEntity<?> entity;
@NonNull QueryMapper queryMapper;
@NonNull UpdateMapper updateMapper;
Class<?> getEntityType() {
return entity != null ? entity.getType() : null;
}
}
}
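The rewritten update path queues UpdateOneModel/UpdateManyModel instances instead of using the legacy BulkWriteOperation builder. A driver-level sketch of what a queued, ordered upsert translates to (the collection is assumed to be obtained elsewhere):

import java.util.ArrayList;
import java.util.List;

import org.bson.Document;

import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.UpdateOneModel;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.WriteModel;

class BulkWriteModelSketch {

	static void run(MongoCollection<Document> collection) {
		List<WriteModel<Document>> models = new ArrayList<>();
		// upsert(true) corresponds to the upsert branch handled in update(query, update, upsert, multi).
		models.add(new UpdateOneModel<>(new Document("firstName", "Joe"),
				new Document("$set", new Document("age", 30)), new UpdateOptions().upsert(true)));
		// ordered(true) corresponds to BulkMode.ORDERED in initBulkOperation().
		collection.bulkWrite(models, new BulkWriteOptions().ordered(true));
	}
}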

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,20 +15,25 @@
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.MongoTemplate.*;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import org.bson.Document;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoCursor;
import com.mongodb.client.model.IndexOptions;
/**
* Default implementation of {@link IndexOperations}.
@@ -37,12 +42,13 @@ import com.mongodb.MongoException;
* @author Oliver Gierke
* @author Komi Innocent
* @author Christoph Strobl
* @author Mark Paluch
*/
public class DefaultIndexOperations implements IndexOperations {
private static final String PARTIAL_FILTER_EXPRESSION_KEY = "partialFilterExpression";
private final MongoOperations mongoOperations;
private final MongoDbFactory mongoDbFactory;
private final String collectionName;
private final QueryMapper mapper;
private final Class<?> type;
@@ -50,29 +56,34 @@ public class DefaultIndexOperations implements IndexOperations {
/**
* Creates a new {@link DefaultIndexOperations}.
*
* @param mongoOperations must not be {@literal null}.
* @param mongoDbFactory must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param queryMapper must not be {@literal null}.
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName) {
this(mongoOperations, collectionName, null);
public DefaultIndexOperations(MongoDbFactory mongoDbFactory, String collectionName, QueryMapper queryMapper) {
this(mongoDbFactory, collectionName, queryMapper, null);
}
/**
* Creates a new {@link DefaultIndexOperations}.
*
* @param mongoOperations must not be {@literal null}.
* @param mongoDbFactory must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param queryMapper must not be {@literal null}.
* @param type Type used for mapping potential partial index filter expression. Can be {@literal null}.
* @since 1.10
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName, Class<?> type) {
public DefaultIndexOperations(MongoDbFactory mongoDbFactory, String collectionName, QueryMapper queryMapper,
Class<?> type) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.notNull(mongoDbFactory, "MongoDbFactory must not be null!");
Assert.notNull(collectionName, "Collection name can not be null!");
Assert.notNull(queryMapper, "QueryMapper must not be null!");
this.mongoOperations = mongoOperations;
this.mongoDbFactory = mongoDbFactory;
this.collectionName = collectionName;
this.mapper = new QueryMapper(mongoOperations.getConverter());
this.mapper = queryMapper;
this.type = type;
}
@@ -80,47 +91,47 @@ public class DefaultIndexOperations implements IndexOperations {
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public void ensureIndex(final IndexDefinition indexDefinition) {
public String ensureIndex(final IndexDefinition indexDefinition) {
mongoOperations.execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
return execute(collection -> {
if (indexOptions != null && indexOptions.containsField(PARTIAL_FILTER_EXPRESSION_KEY)) {
Document indexOptions = indexDefinition.getIndexOptions();
Assert.isInstanceOf(DBObject.class, indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));
if (indexOptions != null) {
indexOptions.put(PARTIAL_FILTER_EXPRESSION_KEY,
mapper.getMappedObject((DBObject) indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY),
lookupPersistentEntity(type, collectionName)));
IndexOptions ops = IndexConverters.indexDefinitionToIndexOptionsConverter().convert(indexDefinition);
if (indexOptions.containsKey(PARTIAL_FILTER_EXPRESSION_KEY)) {
Assert.isInstanceOf(Document.class, indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));
ops.partialFilterExpression( mapper.getMappedObject(
(Document) indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY), lookupPersistentEntity(type, collectionName)));
}
if (indexOptions != null) {
collection.createIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.createIndex(indexDefinition.getIndexKeys());
}
return null;
return collection.createIndex(indexDefinition.getIndexKeys(), ops);
}
return collection.createIndex(indexDefinition.getIndexKeys());
}
private MongoPersistentEntity<?> lookupPersistentEntity(Class<?> entityType, String collection) {
);
}
if (entityType != null) {
return mongoOperations.getConverter().getMappingContext().getPersistentEntity(entityType);
}
private MongoPersistentEntity<?> lookupPersistentEntity(Class<?> entityType, String collection) {
Collection<? extends MongoPersistentEntity<?>> entities = mongoOperations.getConverter().getMappingContext()
.getPersistentEntities();
if (entityType != null) {
return mapper.getMappingContext().getRequiredPersistentEntity(entityType);
}
for (MongoPersistentEntity<?> entity : entities) {
if (entity.getCollection().equals(collection)) {
return entity;
}
}
Collection<? extends MongoPersistentEntity<?>> entities = mapper.getMappingContext().getPersistentEntities();
return null;
for (MongoPersistentEntity<?> entity : entities) {
if (entity.getCollection().equals(collection)) {
return entity;
}
});
}
return null;
}
/*
@@ -128,11 +139,10 @@ public class DefaultIndexOperations implements IndexOperations {
* @see org.springframework.data.mongodb.core.IndexOperations#dropIndex(java.lang.String)
*/
public void dropIndex(final String name) {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.dropIndex(name);
return null;
}
execute(collection -> {
collection.dropIndex(name);
return null;
});
}
@@ -145,45 +155,46 @@ public class DefaultIndexOperations implements IndexOperations {
dropIndex("*");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#resetIndexCache()
*/
@Deprecated
public void resetIndexCache() {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
ReflectiveDBCollectionInvoker.resetIndexCache(collection);
return null;
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#getIndexInfo()
*/
public List<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, new CollectionCallback<List<IndexInfo>>() {
return execute(new CollectionCallback<List<IndexInfo>>() {
public List<IndexInfo> doInCollection(DBCollection collection) throws MongoException, DataAccessException {
public List<IndexInfo> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
List<DBObject> dbObjectList = collection.getIndexInfo();
return getIndexData(dbObjectList);
MongoCursor<Document> cursor = collection.listIndexes(Document.class).iterator();
return getIndexData(cursor);
}
private List<IndexInfo> getIndexData(List<DBObject> dbObjectList) {
private List<IndexInfo> getIndexData(MongoCursor<Document> cursor) {
List<IndexInfo> indexInfoList = new ArrayList<IndexInfo>();
for (DBObject ix : dbObjectList) {
indexInfoList.add(IndexInfo.indexInfoOf(ix));
while (cursor.hasNext()) {
Document ix = cursor.next();
IndexInfo indexInfo = IndexConverters.documentToIndexInfoConverter().convert(ix);
indexInfoList.add(indexInfo);
}
return indexInfoList;
}
});
}
public <T> T execute(CollectionCallback<T> callback) {
Assert.notNull(callback, "CollectionCallback must not be null!");
try {
MongoCollection<Document> collection = mongoDbFactory.getDb().getCollection(collectionName);
return callback.doInCollection(collection);
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e, mongoDbFactory.getExceptionTranslator());
}
}
}
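The hunk above interleaves the removed DBCollection-based callback with its MongoCollection replacement, which makes the new control flow hard to follow. Below is a minimal sketch of the migrated ensureIndex(...), reassembled from the added lines shown in the diff; treat it as a reading aid, not the authoritative source.

public String ensureIndex(final IndexDefinition indexDefinition) {

	return execute(collection -> {

		Document indexOptions = indexDefinition.getIndexOptions();

		if (indexOptions != null) {

			IndexOptions ops = IndexConverters.indexDefinitionToIndexOptionsConverter().convert(indexDefinition);

			if (indexOptions.containsKey(PARTIAL_FILTER_EXPRESSION_KEY)) {

				Assert.isInstanceOf(Document.class, indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));

				ops = ops.partialFilterExpression(mapper.getMappedObject(
						(Document) indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY),
						lookupPersistentEntity(type, collectionName)));
			}

			// createIndex returns the name of the created index
			return collection.createIndex(indexDefinition.getIndexKeys(), ops);
		}

		return collection.createIndex(indexDefinition.getIndexKeys());
	});
}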

View File

@@ -0,0 +1,47 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.QueryMapper;
/**
* {@link IndexOperationsProvider} to obtain {@link IndexOperations} from a given {@link MongoDbFactory}. TODO: Review
* me
*
* @author Mark Paluch
* @since 2.0
*/
class DefaultIndexOperationsProvider implements IndexOperationsProvider {
private final MongoDbFactory mongoDbFactory;
private final QueryMapper mapper;
/**
* @param mongoDbFactory must not be {@literal null}.
*/
DefaultIndexOperationsProvider(MongoDbFactory mongoDbFactory, QueryMapper mapper) {
this.mongoDbFactory = mongoDbFactory;
this.mapper = mapper;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperationsProvider#indexOps(java.lang.String)
*/
@Override
public IndexOperations indexOps(String collectionName) {
return new DefaultIndexOperations(mongoDbFactory, collectionName, mapper);
}
}
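For orientation, a hypothetical (package-internal) wiring of the provider. The factory, converter, collection and index names below are assumptions for illustration and are not part of the diff.

MongoDbFactory factory = new SimpleMongoDbFactory(new MongoClient(), "database");
MappingMongoConverter converter = new MappingMongoConverter(
		new DefaultDbRefResolver(factory), new MongoMappingContext());

IndexOperationsProvider provider = new DefaultIndexOperationsProvider(factory, new QueryMapper(converter));
IndexOperations indexOps = provider.indexOps("people");

indexOps.ensureIndex(new Index("lastname", Sort.Direction.ASC).unique());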

View File

@@ -0,0 +1,102 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.bson.Document;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.util.Assert;
import com.mongodb.reactivestreams.client.ListIndexesPublisher;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
/**
 * Default implementation of {@link ReactiveIndexOperations}.
*
* @author Mark Paluch
* @since 1.11
*/
public class DefaultReactiveIndexOperations implements ReactiveIndexOperations {
private final ReactiveMongoOperations mongoOperations;
private final String collectionName;
/**
* Creates a new {@link DefaultReactiveIndexOperations}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
*/
public DefaultReactiveIndexOperations(ReactiveMongoOperations mongoOperations, String collectionName) {
Assert.notNull(mongoOperations, "ReactiveMongoOperations must not be null!");
Assert.notNull(collectionName, "Collection must not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveIndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public Mono<String> ensureIndex(final IndexDefinition indexDefinition) {
return mongoOperations.execute(collectionName, (ReactiveCollectionCallback<String>) collection -> {
Document indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null) {
return collection.createIndex(indexDefinition.getIndexKeys(),
IndexConverters.indexDefinitionToIndexOptionsConverter().convert(indexDefinition));
}
return collection.createIndex(indexDefinition.getIndexKeys());
}).next();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveIndexOperations#dropIndex(java.lang.String)
*/
public Mono<Void> dropIndex(final String name) {
return mongoOperations.execute(collectionName, collection -> {
return Mono.from(collection.dropIndex(name));
}).flatMap(success -> Mono.<Void>empty()).next();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveIndexOperations#dropAllIndexes()
*/
public Mono<Void> dropAllIndexes() {
return dropIndex("*");
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveIndexOperations#getIndexInfo()
*/
public Flux<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, collection -> {
ListIndexesPublisher<Document> indexesPublisher = collection.listIndexes(Document.class);
return Flux.from(indexesPublisher).map(IndexConverters.documentToIndexInfoConverter()::convert);
});
}
}
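A hedged usage sketch of the reactive variant; the template construction, collection and index names are illustrative assumptions.

ReactiveMongoTemplate template = new ReactiveMongoTemplate(MongoClients.create(), "database");
ReactiveIndexOperations indexOps = new DefaultReactiveIndexOperations(template, "people");

indexOps.ensureIndex(new Index("lastname", Sort.Direction.ASC).unique())
		.doOnNext(indexName -> System.out.println("Created index: " + indexName))
		.subscribe();

indexOps.getIndexInfo()
		.map(IndexInfo::getName)
		.subscribe(System.out::println);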

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2014-2019 the original author or authors.
* Copyright 2014-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -20,11 +20,13 @@ import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import org.bson.Document;
import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
@@ -34,8 +36,9 @@ import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.BasicDBList;
import com.mongodb.MongoException;
import com.mongodb.client.MongoDatabase;
/**
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
@@ -97,8 +100,13 @@ class DefaultScriptOperations implements ScriptOperations {
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(script.getCode(), convertScriptArgs(false, args));
public Object doInDB(MongoDatabase db) throws MongoException, DataAccessException {
Document command = new Document("$eval", script.getCode());
BasicDBList commandArgs = new BasicDBList();
commandArgs.addAll(Arrays.asList(convertScriptArgs(false, args)));
command.append("args", commandArgs);
return db.runCommand(command).get("retval");
}
});
}
@@ -115,8 +123,10 @@ class DefaultScriptOperations implements ScriptOperations {
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(String.format("%s(%s)", scriptName, convertAndJoinScriptArgs(args)));
public Object doInDB(MongoDatabase db) throws MongoException, DataAccessException {
return db.runCommand(new Document("eval", String.format("%s(%s)", scriptName, convertAndJoinScriptArgs(args))))
.get("retval");
}
});
}
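The change above replaces the removed DB.eval(...) with a manually built eval command. A minimal sketch of that command run directly against the new MongoDatabase API; the script code, argument and client variable are placeholders.

MongoDatabase db = mongoClient.getDatabase("database"); // mongoClient assumed in scope

Document command = new Document("$eval", "function(x) { return x; }");
BasicDBList args = new BasicDBList();
args.addAll(Arrays.asList(42));
command.append("args", args);

Object retval = db.runCommand(command).get("retval");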

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2019 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2019 the original author or authors.
* Copyright 2010-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,24 +15,26 @@
*/
package org.springframework.data.mongodb.core;
import org.bson.Document;
import org.springframework.dao.DataAccessException;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
/**
* An interface used by {@link MongoTemplate} for processing documents returned from a MongoDB query on a per-document
 * basis. Implementations of this interface perform the actual work of processing each document but don't need to worry
 * about exception handling. {@link MongoException}s will be caught and translated by the calling
 * MongoTemplate.
 *
 * A DocumentCallbackHandler is typically stateful: it keeps the result state within the object, to be available
 * for later inspection.
 * about exception handling. {@link MongoException}s will be caught and translated by the calling MongoTemplate. A
 * DocumentCallbackHandler is typically stateful: it keeps the result state within the object, to be available for
 * later inspection.
*
* @author Mark Pollack
*
 * @author Graeme Rocher
* @author Oliver Gierke
* @author John Brisbin
* @author Christoph Strobl
* @since 1.0
*/
public interface DocumentCallbackHandler {
void processDocument(DBObject dbObject) throws MongoException, DataAccessException;
void processDocument(Document document) throws MongoException, DataAccessException;
}
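Since the callback now receives org.bson.Document, a handler can be written as a lambda. A hypothetical caller-side example; the template, query and field names are assumptions.

List<String> lastnames = new ArrayList<>();

DocumentCallbackHandler handler = document -> lastnames.add(document.getString("lastname"));

mongoTemplate.executeQuery(new Query(Criteria.where("lastname").exists(true)), "people", handler);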

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2019 the original author or authors.
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DBCursor;
import com.mongodb.reactivestreams.client.FindPublisher;
/**
* Simple callback interface to allow customization of a {@link FindPublisher}.
*
* @author Mark Paluch
*/
interface FindPublisherPreparer {
/**
 * Prepare the given {@link FindPublisher} (apply limits, skips and so on). Returns the prepared publisher.
 *
 * @param findPublisher the {@link FindPublisher} to prepare.
*/
<T> FindPublisher<T> prepare(FindPublisher<T> findPublisher);
}
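prepare(...) is a generic method, so implementations need an anonymous class rather than a lambda. A purely illustrative (package-internal) preparer that applies a fixed sort and limit:

FindPublisherPreparer preparer = new FindPublisherPreparer() {

	@Override
	public <T> FindPublisher<T> prepare(FindPublisher<T> findPublisher) {
		return findPublisher.sort(new Document("lastname", 1)).limit(10);
	}
};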

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2019 the original author or authors.
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,29 +15,29 @@
*/
package org.springframework.data.mongodb.core;
import org.bson.Document;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Value object to mitigate different representations of geo command execution results in MongoDB.
*
* @author Oliver Gierke
* @author Christoph Strobl
* @soundtrack Fruitcake - Jeff Coffin (The Inside of the Outside)
* @since 1.9
*/
class GeoCommandStatistics {
private static final GeoCommandStatistics NONE = new GeoCommandStatistics(new BasicDBObject());
private static final GeoCommandStatistics NONE = new GeoCommandStatistics(new Document());
private final DBObject source;
private final Document source;
/**
* Creates a new {@link GeoCommandStatistics} instance with the given source document.
*
* @param source must not be {@literal null}.
*/
private GeoCommandStatistics(DBObject source) {
private GeoCommandStatistics(Document source) {
Assert.notNull(source, "Source document must not be null!");
this.source = source;
@@ -49,12 +49,12 @@ class GeoCommandStatistics {
* @param commandResult must not be {@literal null}.
* @return
*/
public static GeoCommandStatistics from(DBObject commandResult) {
public static GeoCommandStatistics from(Document commandResult) {
Assert.notNull(commandResult, "Command result must not be null!");
Object stats = commandResult.get("stats");
return stats == null ? NONE : new GeoCommandStatistics((DBObject) stats);
return stats == null ? NONE : new GeoCommandStatistics((Document) stats);
}
/**

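A hedged sketch of how the Document-based factory above is meant to be consumed; the geoNear command, coordinates, template variable and the getAverageDistance() accessor are assumptions not visible in this hunk.

Document commandResult = template.executeCommand(
		new Document("geoNear", "people")
				.append("near", Arrays.asList(-73.99, 40.73))
				.append("spherical", true)); // template assumed in scope

GeoCommandStatistics statistics = GeoCommandStatistics.from(commandResult);
double averageDistance = statistics.getAverageDistance(); // accessor assumed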
View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2019 the original author or authors.
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -0,0 +1,134 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.domain.Sort.Direction.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.concurrent.TimeUnit;
import org.bson.Document;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.IndexOptions;
/**
* {@link Converter Converters} for index-related MongoDB documents/types.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
*/
abstract class IndexConverters {
private static final Converter<IndexDefinition, IndexOptions> DEFINITION_TO_MONGO_INDEX_OPTIONS;
private static final Converter<Document, IndexInfo> DOCUMENT_INDEX_INFO;
private static final Double ONE = Double.valueOf(1);
private static final Double MINUS_ONE = Double.valueOf(-1);
private static final Collection<String> TWO_D_IDENTIFIERS = Arrays.asList("2d", "2dsphere");
static {
DEFINITION_TO_MONGO_INDEX_OPTIONS = getIndexDefinitionIndexOptionsConverter();
DOCUMENT_INDEX_INFO = getDocumentIndexInfoConverter();
}
private IndexConverters() {
}
static Converter<IndexDefinition, IndexOptions> indexDefinitionToIndexOptionsConverter() {
return DEFINITION_TO_MONGO_INDEX_OPTIONS;
}
static Converter<Document, IndexInfo> documentToIndexInfoConverter() {
return DOCUMENT_INDEX_INFO;
}
private static Converter<IndexDefinition, IndexOptions> getIndexDefinitionIndexOptionsConverter() {
return indexDefinition -> {
Document indexOptions = indexDefinition.getIndexOptions();
IndexOptions ops = new IndexOptions();
if (indexOptions.containsKey("name")) {
ops = ops.name(indexOptions.get("name").toString());
}
if (indexOptions.containsKey("unique")) {
ops = ops.unique((Boolean) indexOptions.get("unique"));
}
if (indexOptions.containsKey("sparse")) {
ops = ops.sparse((Boolean) indexOptions.get("sparse"));
}
if (indexOptions.containsKey("background")) {
ops = ops.background((Boolean) indexOptions.get("background"));
}
if (indexOptions.containsKey("expireAfterSeconds")) {
ops = ops.expireAfter((Long) indexOptions.get("expireAfterSeconds"), TimeUnit.SECONDS);
}
if (indexOptions.containsKey("min")) {
ops = ops.min(((Number) indexOptions.get("min")).doubleValue());
}
if (indexOptions.containsKey("max")) {
ops = ops.max(((Number) indexOptions.get("max")).doubleValue());
}
if (indexOptions.containsKey("bits")) {
ops = ops.bits((Integer) indexOptions.get("bits"));
}
if (indexOptions.containsKey("bucketSize")) {
ops = ops.bucketSize(((Number) indexOptions.get("bucketSize")).doubleValue());
}
if (indexOptions.containsKey("default_language")) {
ops = ops.defaultLanguage(indexOptions.get("default_language").toString());
}
if (indexOptions.containsKey("language_override")) {
ops = ops.languageOverride(indexOptions.get("language_override").toString());
}
if (indexOptions.containsKey("weights")) {
ops = ops.weights((org.bson.Document) indexOptions.get("weights"));
}
for (String key : indexOptions.keySet()) {
if (ObjectUtils.nullSafeEquals("2dsphere", indexOptions.get(key))) {
ops = ops.sphereVersion(2);
}
}
if(indexOptions.containsKey("partialFilterExpression")) {
ops = ops.partialFilterExpression((org.bson.Document)indexOptions.get("partialFilterExpression"));
}
return ops;
};
}
private static Converter<Document, IndexInfo> getDocumentIndexInfoConverter() {
return ix -> {
return IndexInfo.indexInfoOf(ix);
};
}
}
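For illustration, converting a Spring Data index definition into the native driver types via the (package-internal) converters above; the index definition itself is an assumption.

Index index = new Index("lastname", Sort.Direction.ASC).unique().sparse();

Document keys = index.getIndexKeys(); // { "lastname" : 1 }
IndexOptions options = IndexConverters.indexDefinitionToIndexOptionsConverter().convert(index);

collection.createIndex(keys, options); // MongoCollection<Document> assumed in scope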

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -35,7 +35,7 @@ public interface IndexOperations {
*
* @param indexDefinition must not be {@literal null}.
*/
void ensureIndex(IndexDefinition indexDefinition);
String ensureIndex(IndexDefinition indexDefinition);
/**
* Drops an index from this collection.
@@ -49,15 +49,6 @@ public interface IndexOperations {
*/
void dropAllIndexes();
/**
* Clears all indices that have not yet been applied to this collection.
*
 * @deprecated since 1.7. The MongoDB Java driver version 3.0 no longer supports resetting the index cache.
* @throws {@link UnsupportedOperationException} when used with MongoDB Java driver version 3.0.
*/
@Deprecated
void resetIndexCache();
/**
* Returns the index information on the collection.
*

View File

@@ -0,0 +1,66 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.util.Assert;
/**
* Adapter for creating synchronous {@link IndexOperations}.
*
* @author Christoph Strobl
* @since 2.0
*/
public interface IndexOperationsAdapter extends IndexOperations {
/**
* Obtain a blocking variant of {@link IndexOperations} wrapping {@link ReactiveIndexOperations}.
*
* @param reactiveIndexOperations must not be {@literal null}.
* @return never {@literal null}
*/
static IndexOperationsAdapter blocking(ReactiveIndexOperations reactiveIndexOperations) {
Assert.notNull(reactiveIndexOperations, "ReactiveIndexOperations must not be null!");
return new IndexOperationsAdapter() {
@Override
public String ensureIndex(IndexDefinition indexDefinition) {
return reactiveIndexOperations.ensureIndex(indexDefinition).block();
}
@Override
public void dropIndex(String name) {
reactiveIndexOperations.dropIndex(name).block();
}
@Override
public void dropAllIndexes() {
reactiveIndexOperations.dropAllIndexes().block();
}
@Override
public List<IndexInfo> getIndexInfo() {
return reactiveIndexOperations.getIndexInfo().collectList().block();
}
};
}
}
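A hypothetical use of the blocking adapter, e.g. for creating indexes during application startup while the rest of the stack stays reactive; the reactive template is assumed to exist.

ReactiveIndexOperations reactiveIndexOps =
		new DefaultReactiveIndexOperations(reactiveMongoTemplate, "people");

IndexOperationsAdapter blockingIndexOps = IndexOperationsAdapter.blocking(reactiveIndexOps);

String indexName = blockingIndexOps.ensureIndex(new Index("lastname", Sort.Direction.ASC));
List<IndexInfo> indexInfo = blockingIndexOps.getIndexInfo();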

View File

@@ -0,0 +1,34 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.mongodb.core.convert.QueryMapper;
/**
* TODO: Revisit for a better pattern.
* @author Mark Paluch
* @since 2.0
*/
public interface IndexOperationsProvider {
/**
 * Returns the operations that can be performed on indexes.
 *
 * @param collectionName name of the collection to run the index operations against.
 * @return index operations on the named collection.
*/
IndexOperations indexOps(String collectionName);
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,9 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import org.bson.Document;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
/**
@@ -31,6 +31,7 @@ import com.mongodb.WriteConcern;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoAction {
@@ -38,8 +39,8 @@ public class MongoAction {
private final WriteConcern defaultWriteConcern;
private final Class<?> entityType;
private final MongoActionOperation mongoActionOperation;
private final DBObject query;
private final DBObject document;
private final Document query;
private final Document document;
/**
* Create an instance of a {@link MongoAction}.
@@ -48,11 +49,11 @@ public class MongoAction {
* @param mongoActionOperation action being taken against the collection
* @param collectionName the collection name, must not be {@literal null} or empty.
* @param entityType the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object
* @param query the converted DBObject from the Spring Query object
* @param document the converted Document from the POJO or Spring Update object
* @param query the converted Document from the Spring Query object
*/
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, Class<?> entityType, DBObject document, DBObject query) {
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation, String collectionName,
Class<?> entityType, Document document, Document query) {
Assert.hasText(collectionName, "Collection name must not be null or empty!");
@@ -72,14 +73,6 @@ public class MongoAction {
return defaultWriteConcern;
}
/**
* @deprecated use {@link #getEntityType()} instead.
*/
@Deprecated
public Class<?> getEntityClass() {
return entityType;
}
public Class<?> getEntityType() {
return entityType;
}
@@ -88,11 +81,11 @@ public class MongoAction {
return mongoActionOperation;
}
public DBObject getQuery() {
public Document getQuery() {
return query;
}
public DBObject getDocument() {
public Document getDocument() {
return document;
}
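Constructing a MongoAction after the DBObject-to-Document migration; every value below is a placeholder and the Person type is hypothetical.

MongoAction action = new MongoAction(WriteConcern.MAJORITY, MongoActionOperation.UPDATE, "people",
		Person.class, // hypothetical domain type
		new Document("$set", new Document("lastname", "Matthews")),
		new Document("lastname", "Beauford"));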

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2019 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,101 +1,77 @@
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Mongo server administration exposed via JMX annotations
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Mark Paluch
*/
@ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations {
private final Mongo mongo;
private String username;
private String password;
private String authenticationDatabaseName;
public MongoAdmin(Mongo mongo) {
Assert.notNull(mongo, "Mongo must not be null!");
this.mongo = mongo;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#dropDatabase(java.lang.String)
*/
@ManagedOperation
public void dropDatabase(String databaseName) {
getDB(databaseName).dropDatabase();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#createDatabase(java.lang.String)
*/
@ManagedOperation
public void createDatabase(String databaseName) {
getDB(databaseName);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#getDatabaseStats(java.lang.String)
*/
@ManagedOperation
public String getDatabaseStats(String databaseName) {
return getDB(databaseName).getStats().toString();
}
/**
* Sets the username to use to connect to the Mongo database
*
* @param username The username to use
*/
public void setUsername(String username) {
this.username = username;
}
/**
* Sets the password to use to authenticate with the Mongo database.
*
* @param password The password to use
*/
public void setPassword(String password) {
this.password = password;
}
/**
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
*
* @param authenticationDatabaseName The authenticationDatabaseName to use.
*/
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
this.authenticationDatabaseName = authenticationDatabaseName;
}
DB getDB(String databaseName) {
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
}
}
/*
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;
import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.util.Assert;
/**
* Mongo server administration exposed via JMX annotations
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Mark Paluch
* @author Christoph Strobl
*/
@ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations {
private final MongoClient mongoClient;
public MongoAdmin(MongoClient mongoClient) {
Assert.notNull(mongoClient, "MongoClient must not be null!");
this.mongoClient = mongoClient;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#dropDatabase(java.lang.String)
*/
@ManagedOperation
public void dropDatabase(String databaseName) {
getDB(databaseName).drop();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#createDatabase(java.lang.String)
*/
@ManagedOperation
public void createDatabase(String databaseName) {
getDB(databaseName);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoAdminOperations#getDatabaseStats(java.lang.String)
*/
@ManagedOperation
public String getDatabaseStats(String databaseName) {
return getDB(databaseName).runCommand(new Document("dbStats", 1).append("scale" , 1024)).toJson();
}
@ManagedOperation
public String getServerStatus() {
return getDB("admin").runCommand(new Document("serverStatus", 1).append("rangeDeleter", 1).append("repl", 1)).toJson();
}
MongoDatabase getDB(String databaseName) {
return mongoClient.getDatabase(databaseName);
}
}
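A hypothetical, JMX-free usage of the MongoClient-based MongoAdmin shown above; host and database names are placeholders.

MongoAdmin admin = new MongoAdmin(new MongoClient("localhost"));

admin.createDatabase("analytics");
String stats = admin.getDatabaseStats("analytics"); // dbStats, scaled to KB
admin.dropDatabase("analytics");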

View File

@@ -1,34 +1,34 @@
/*
* Copyright 2011-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.jmx.export.annotation.ManagedOperation;
/**
* @author Mark Pollack
* @author Oliver Gierke
*/
public interface MongoAdminOperations {
@ManagedOperation
void dropDatabase(String databaseName);
@ManagedOperation
void createDatabase(String databaseName);
@ManagedOperation
String getDatabaseStats(String databaseName);
}
/*
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.jmx.export.annotation.ManagedOperation;
/**
* @author Mark Pollack
* @author Oliver Gierke
*/
public interface MongoAdminOperations {
@ManagedOperation
void dropDatabase(String databaseName);
@ManagedOperation
void createDatabase(String databaseName);
@ManagedOperation
String getDatabaseStats(String databaseName);
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2019 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -26,7 +26,6 @@ import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.MongoCredential;
@@ -38,7 +37,7 @@ import com.mongodb.ServerAddress;
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
public class MongoClientFactoryBean extends AbstractFactoryBean<MongoClient> implements PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
@@ -108,8 +107,8 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implement
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends Mongo> getObjectType() {
return Mongo.class;
public Class<? extends MongoClient> getObjectType() {
return MongoClient.class;
}
/*
@@ -125,7 +124,7 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implement
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected Mongo createInstance() throws Exception {
protected MongoClient createInstance() throws Exception {
if (mongoClientOptions == null) {
mongoClientOptions = MongoClientOptions.builder().build();
@@ -143,7 +142,7 @@ public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implement
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override
protected void destroyInstance(Mongo instance) throws Exception {
protected void destroyInstance(MongoClient instance) throws Exception {
instance.close();
}
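The factory bean is now typed to MongoClient throughout. A minimal programmatic sketch, assuming the existing host/port setters; host and port are placeholders.

MongoClient createClient() throws Exception {

	MongoClientFactoryBean factoryBean = new MongoClientFactoryBean();
	factoryBean.setHost("localhost");
	factoryBean.setPort(27017);
	factoryBean.afterPropertiesSet();

	return factoryBean.getObject();
}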

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2019 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -18,10 +18,8 @@ package org.springframework.data.mongodb.core;
import javax.net.SocketFactory;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.DirectFieldAccessor;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DBDecoderFactory;
import com.mongodb.DBEncoderFactory;
@@ -35,6 +33,7 @@ import com.mongodb.WriteConcern;
*
* @author Christoph Strobl
* @author Oliver Gierke
* @author Mark Paluch
* @since 1.7
*/
public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClientOptions> {
@@ -64,7 +63,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private int heartbeatConnectTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatConnectTimeout();
private int heartbeatSocketTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatSocketTimeout();
private String requiredReplicaSetName = DEFAULT_MONGO_OPTIONS.getRequiredReplicaSetName();
private int serverSelectionTimeout = Integer.MIN_VALUE;
private int serverSelectionTimeout = DEFAULT_MONGO_OPTIONS.getServerSelectionTimeout();
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
@@ -250,7 +249,9 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
* @param sslSocketFactory
*/
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
this.sslSocketFactory = sslSocketFactory;
this.ssl = sslSocketFactory != null;
}
/**
@@ -274,13 +275,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
SocketFactory socketFactoryToUse = ssl
? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault()) : this.socketFactory;
MongoClientOptions.Builder builder = MongoClientOptions.builder();
if (MongoClientVersion.isMongo3Driver() && serverSelectionTimeout != Integer.MIN_VALUE) {
new DirectFieldAccessor(builder).setPropertyValue("serverSelectionTimeout", serverSelectionTimeout);
}
return builder //
return MongoClientOptions.builder() //
.alwaysUseMBeans(this.alwaysUseMBeans) //
.connectionsPerHost(this.connectionsPerHost) //
.connectTimeout(connectTimeout) //
@@ -298,6 +293,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
.minHeartbeatFrequency(minHeartbeatFrequency) //
.readPreference(readPreference) //
.requiredReplicaSetName(requiredReplicaSetName) //
.serverSelectionTimeout(serverSelectionTimeout) //
.socketFactory(socketFactoryToUse) //
.socketKeepAlive(socketKeepAlive) //
.socketTimeout(socketTimeout) //

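With the DirectFieldAccessor workaround gone, the server selection timeout flows straight into the builder. A sketch assuming the corresponding setters exist on the factory bean; the timeout values are placeholders.

MongoClientOptions createOptions() throws Exception {

	MongoClientOptionsFactoryBean optionsFactory = new MongoClientOptionsFactoryBean();
	optionsFactory.setServerSelectionTimeout(5000); // setter assumed to exist
	optionsFactory.setSocketTimeout(2000);
	optionsFactory.afterPropertiesSet();

	return optionsFactory.getObject();
}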
View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2019 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,216 +0,0 @@
/*
* Copyright 2010-2019 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the
* framework.
*
* @author Thomas Risberg
* @author Graeme Rocher
* @author Oliver Gierke
* @author Randy Watler
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
*/
public abstract class MongoDbUtils {
private static final Logger LOGGER = LoggerFactory.getLogger(MongoDbUtils.class);
/**
* Private constructor to prevent instantiation.
*/
private MongoDbUtils() {}
/**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
*
* @param mongo the {@link Mongo} instance, must not be {@literal null}.
* @param databaseName the database name, must not be {@literal null} or empty.
* @return the {@link DB} connection
*/
public static DB getDB(Mongo mongo, String databaseName) {
return doGetDB(mongo, databaseName, UserCredentials.NO_CREDENTIALS, true, databaseName);
}
/**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
*
* @param mongo the {@link Mongo} instance, must not be {@literal null}.
* @param databaseName the database name, must not be {@literal null} or empty.
* @param credentials the credentials to use, must not be {@literal null}.
* @return the {@link DB} connection
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials) {
return getDB(mongo, databaseName, credentials, databaseName);
}
/**
* @param mongo
* @param databaseName
* @param credentials
* @param authenticationDatabaseName
* @return
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
Assert.notNull(mongo, "No Mongo instance specified!");
Assert.hasText(databaseName, "Database name must be given!");
Assert.notNull(credentials, "Credentials must not be null, use UserCredentials.NO_CREDENTIALS!");
Assert.hasText(authenticationDatabaseName, "Authentication database name must not be null or empty!");
return doGetDB(mongo, databaseName, credentials, true, authenticationDatabaseName);
}
private static DB doGetDB(Mongo mongo, String databaseName, UserCredentials credentials, boolean allowCreate,
String authenticationDatabaseName) {
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
// Do we have a populated holder and TX sync active?
if (dbHolder != null && !dbHolder.isEmpty() && TransactionSynchronizationManager.isSynchronizationActive()) {
DB db = dbHolder.getDB(databaseName);
// DB found but not yet synchronized
if (db != null && !dbHolder.isSynchronizedWithTransaction()) {
LOGGER.debug("Registering Spring transaction synchronization for existing MongoDB {}.", databaseName);
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(dbHolder, mongo));
dbHolder.setSynchronizedWithTransaction(true);
}
if (db != null) {
return db;
}
}
// Lookup fresh database instance
LOGGER.debug("Getting Mongo Database name=[{}]", databaseName);
DB db = mongo.getDB(databaseName);
if (!(mongo instanceof MongoClient) && requiresAuthDbAuthentication(credentials)) {
ReflectiveDbInvoker.authenticate(mongo, db, credentials, authenticationDatabaseName);
}
// TX sync active, bind new database to thread
if (TransactionSynchronizationManager.isSynchronizationActive()) {
LOGGER.debug("Registering Spring transaction synchronization for MongoDB instance {}.", databaseName);
DbHolder holderToUse = dbHolder;
if (holderToUse == null) {
holderToUse = new DbHolder(databaseName, db);
} else {
holderToUse.addDB(databaseName, db);
}
// synchronize holder only if not yet synchronized
if (!holderToUse.isSynchronizedWithTransaction()) {
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);
}
if (holderToUse != dbHolder) {
TransactionSynchronizationManager.bindResource(mongo, holderToUse);
}
}
// Check whether we are allowed to return the DB.
if (!allowCreate && !isDBTransactional(db, mongo)) {
throw new IllegalStateException("No Mongo DB bound to thread, "
+ "and configuration does not allow creation of non-transactional one here");
}
return db;
}
/**
* Return whether the given DB instance is transactional, that is, bound to the current thread by Spring's transaction
* facilities.
*
* @param db the DB to check
* @param mongo the Mongo instance that the DB was created with (may be <code>null</code>)
* @return whether the DB is transactional
*/
public static boolean isDBTransactional(DB db, Mongo mongo) {
if (mongo == null) {
return false;
}
DbHolder dbHolder = (DbHolder) TransactionSynchronizationManager.getResource(mongo);
return dbHolder != null && dbHolder.containsDB(db);
}
/**
* Perform actual closing of the Mongo DB object, catching and logging any cleanup exceptions thrown.
*
* @param db the DB to close (may be <code>null</code>)
* @deprecated since 1.7. The main use case for this method is to ensure that applications can read their own
* unacknowledged writes, but this is no longer so prevalent since the MongoDB Java driver version 3
* started defaulting to acknowledged writes.
*/
@Deprecated
public static void closeDB(DB db) {
if (db != null) {
LOGGER.debug("Closing Mongo DB object");
try {
ReflectiveDbInvoker.requestDone(db);
} catch (Throwable ex) {
LOGGER.debug("Unexpected exception on closing Mongo DB object", ex);
}
}
}
/**
* Check if credentials present. In case we're using a mongo-java-driver version 3 or above we do not have the need
* for authentication as the auth data has to be provided within the MongoClient
*
* @param credentials
* @return
*/
private static boolean requiresAuthDbAuthentication(UserCredentials credentials) {
if (credentials == null || !credentials.hasUsername()) {
return false;
}
return !MongoClientVersion.isMongo3Driver();
}
}
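With MongoDbUtils removed, database access goes through MongoDbFactory and the MongoClient directly. A hedged replacement sketch; the host and database names are placeholders.

MongoDbFactory factory = new SimpleMongoDbFactory(new MongoClient("localhost"), "database");

MongoDatabase db = factory.getDb();            // default database
MongoDatabase other = factory.getDb("other");  // explicitly named database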

Some files were not shown because too many files have changed in this diff.