Compare commits

...

642 Commits

Author SHA1 Message Date
Oliver Gierke
42cc6ff37f DATAMONGO-1750 - Release version 1.10.6 (Ingalls SR6). 2017-07-26 23:47:20 +02:00
Oliver Gierke
9ded78b13c DATAMONGO-1750 - Prepare 1.10.6 (Ingalls SR6). 2017-07-26 23:45:52 +02:00
Oliver Gierke
b0842a89fd DATAMONGO-1750 - Updated changelog. 2017-07-26 23:45:43 +02:00
Oliver Gierke
5a9eef7c96 DATAMONGO-1751 - Updated changelog. 2017-07-25 16:15:49 +02:00
Oliver Gierke
77425736e9 DATAMONGO-1717 - Updated changelog. 2017-07-25 10:04:03 +02:00
Oliver Gierke
6aa8f84428 DATAMONGO-1711 - After release cleanups. 2017-07-24 19:25:08 +02:00
Oliver Gierke
b83f2e9198 DATAMONGO-1711 - Prepare next development iteration. 2017-07-24 19:25:06 +02:00
Oliver Gierke
2171c814e8 DATAMONGO-1711 - Release version 1.10.5 (Ingalls SR5). 2017-07-24 18:44:18 +02:00
Oliver Gierke
d0e398a39c DATAMONGO-1711 - Prepare 1.10.5 (Ingalls SR5). 2017-07-24 18:43:23 +02:00
Oliver Gierke
428c60dee0 DATAMONGO-1711 - Updated changelog. 2017-07-24 18:43:16 +02:00
Oliver Gierke
80393b2dc2 DATAMONGO-1720 - Make sure benchmark module is not included by default.
The benchmarks module does not produce a JAR by default, which causes our Maven Central deployment to fail, as a module has to produce one according to their rules. We now only include the benchmark module when the benchmarks profile is active.
2017-07-24 18:39:38 +02:00
Oliver Gierke
c15a542863 DATAMONGO-1744 - Improved setup of default MongoMappingContext instances created.
We now make sure that the SimpleTypeHolder produced by MongoCustomConversions is used to set up default MongoMappingContext instances in (Reactive)MongoTemplate and unit tests.
2017-07-19 15:21:15 +02:00
Mark Paluch
92c6db13dc DATAMONGO-1703 - Polishing.
Use lombok's Value for ObjectPathItem. Make methods accessible in DefaultDbRefResolver before calling. Use class.cast to avoid warnings. Update Javadoc.

Original pull request: #478.
2017-07-14 11:47:59 +02:00
Christoph Strobl
1681bcd15b DATAMONGO-1703 - Convert resolved DBRefs from source if they do not match the requested property type.
We now check whether already resolved DBRefs are assignable to the target property type. If not, we perform conversion again to prevent a ClassCastException when trying to assign non-matching types.

Remove non applicable public modifiers in ObjectPath.

Original pull request: #478.
2017-07-14 11:47:59 +02:00
Mark Paluch
1f2d0da5ed DATAMONGO-1720 - Polishing.
Enhance benchmark statistics with Git/working tree details. Specify byte encoding for JSON to byte encoder.
Add status code check to HttpResultsWriter to verify that the results were accepted. Convert spaces to tabs in pom.xml.

Original pull request: #483.
2017-07-13 15:17:13 +02:00
Christoph Strobl
8009bd2846 DATAMONGO-1720 - Add JMH based benchmarks for MappingMongoConverter.
Run the benchmarks via the Maven profile "benchmarks":

    mvn -P benchmarks clean test

Or run them with customized settings:

    mvn -P benchmarks -DwarmupIterations=2 -DmeasurementIterations=5 -Dforks=1 clean test

Original pull request: #483.
2017-07-13 15:17:09 +02:00
Oliver Gierke
cbd9807f16 DATAMONGO-1725 - Prevent NullPointerException in CloseableIterableCursorAdapter.close(). 2017-07-05 13:15:45 +02:00
Oliver Gierke
f672b17dfc DATAMONGO-1729 - Open projections don't get field restrictions applied.
We now only apply a field restriction if the projection used for a query is closed.
2017-07-03 22:17:38 +02:00
Oliver Gierke
2a018b04ec DATAMONGO-1723 - ConfigurationExtensionUnitTests now need to provide a BeanDefinitionRegistry. 2017-06-26 16:53:39 +02:00
Mark Paluch
8e748ab1c2 DATAMONGO-1678 - Polishing.
Use Lombok's Value annotation for immutable value objects. Use IllegalArgumentException for NonNull validation exceptions. Trim whitespaces, formatting.

Original pull request: #472.
2017-06-26 13:28:28 +02:00
Christoph Strobl
49f9307884 DATAMONGO-1678 - Run bulk update / remove documents through type mappers.
We now make sure to run any query / update object through the Query- / UpdateMapper. This ensures @Field annotations and potential custom conversions get processed correctly for update / remove operations.
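
A rough usage sketch of the effect (Person and its @Field("lname") mapping are made up for illustration; the BulkOperations calls are assumed, not taken from this change):

    // assumes static imports of Criteria.where and Query.query
    BulkOperations bulkOps = mongoTemplate.bulkOps(BulkMode.ORDERED, Person.class);
    // 'lastname' is run through the Query-/UpdateMapper, so @Field("lname") is honored
    bulkOps.updateMulti(query(where("lastname").is("White")), new Update().set("lastname", "Heisenberg"));
    bulkOps.remove(query(where("lastname").is("Schrader")));
    bulkOps.execute();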

Original pull request: #472.
2017-06-26 13:13:54 +02:00
Christoph Strobl
ebd8491642 DATAMONGO-1697 - Update MongoOperations JavaDoc regarding mapping limitations.
We now explicitly mention mapping/support limitations for API variants like count(Query, String) that do not carry domain-type-specific information allowing field-specific mapping.
2017-06-19 10:31:20 +02:00
Christoph Strobl
f581677bf2 DATAMONGO-1718 - Polishing.
Add test and pass Object.class as a placeholder for the required domain type.

Original Pull Request: #469
2017-06-16 13:31:37 +02:00
Borislav Rangelov
a2a172e559 DATAMONGO-1718 - Fix MongoTemplate::findAllAndRemove(Query,String) delegating to wrong overload.
Original Pull Request: #469 (by Borislav Rangelov).
2017-06-16 13:31:26 +02:00
Mark Paluch
61fd09bb43 DATAMONGO-1688 - Updated changelog. 2017-06-14 17:35:01 +02:00
Mark Paluch
4b11d415f1 DATAMONGO-1672 - After release cleanups. 2017-06-08 11:26:20 +02:00
Mark Paluch
beabbc0307 DATAMONGO-1672 - Prepare next development iteration. 2017-06-08 11:26:18 +02:00
Mark Paluch
ad55a8bab7 DATAMONGO-1672 - Release version 1.10.4 (Ingalls SR4). 2017-06-08 10:56:51 +02:00
Mark Paluch
79e0b44b5e DATAMONGO-1672 - Prepare 1.10.4 (Ingalls SR4). 2017-06-08 10:56:03 +02:00
Mark Paluch
b747e226d7 DATAMONGO-1672 - Updated changelog. 2017-06-08 10:55:58 +02:00
Mark Paluch
0bdd4a2eb4 DATAMONGO-1671 - Updated changelog. 2017-06-07 12:23:37 +02:00
Christoph Strobl
dda91f9d48 DATAMONGO-1699 - Upgrade travis-ci build to use MongoDB 3.4 server.
We now do so explicitly as there seems to be almost no movement on getting the alias onto the whitelist.
2017-05-24 13:17:27 +02:00
Mark Paluch
d32afee0e0 DATAMONGO-1664 - Updated changelog. 2017-05-09 11:36:18 +02:00
Mark Paluch
5e4cced3e6 DATAMONGO-1205 - Polishing.
Add author tag. Extend year range in copyright header.

Original pull request: #397.
2017-04-20 08:36:04 +02:00
Martin Macko
375d59f7a2 DATAMONGO-1205 - Log only CyclicPropertyReferenceException message.
We now log CyclicPropertyReferenceException with its message only and removed the stack trace from the log output. The stack trace points to a verifier location and is not particularly useful for finding the offending code. This change creates consistency in how CyclicPropertyReferenceException is logged.

Original pull request: #397.
2017-04-20 08:36:04 +02:00
Oliver Gierke
0fe197d608 DATAMONGO-1670 - Updated changelog. 2017-04-19 21:04:23 +02:00
Oliver Gierke
5325e46aaa DATAMONGO-1669 - After release cleanups. 2017-04-19 20:01:14 +02:00
Oliver Gierke
1fe5793bd7 DATAMONGO-1669 - Prepare next development iteration. 2017-04-19 20:01:12 +02:00
Oliver Gierke
909a0bb2f2 DATAMONGO-1669 - Release version 1.10.3 (Ingalls SR3). 2017-04-19 19:32:38 +02:00
Oliver Gierke
c3983a4a32 DATAMONGO-1669 - Prepare 1.10.3 (Ingalls SR3). 2017-04-19 19:32:01 +02:00
Oliver Gierke
94cd8fd827 DATAMONGO-1669 - Updated changelog. 2017-04-19 19:31:54 +02:00
Oliver Gierke
34ca906b80 DATAMONGO-1634 - Updated changelog. 2017-04-19 13:04:10 +02:00
Oliver Gierke
0a9f7d0e30 DATAMONGO-1633 - After release cleanups. 2017-04-19 11:48:46 +02:00
Oliver Gierke
36af892679 DATAMONGO-1633 - Prepare next development iteration. 2017-04-19 11:48:42 +02:00
Oliver Gierke
12f3dce709 DATAMONGO-1633 - Release version 1.10.2 (Ingalls SR2). 2017-04-19 10:11:28 +02:00
Oliver Gierke
ae3b4be772 DATAMONGO-1633 - Prepare 1.10.2 (Ingalls SR2). 2017-04-19 10:10:47 +02:00
Oliver Gierke
4f813ae7a7 DATAMONGO-1633 - Updated changelog. 2017-04-19 10:10:39 +02:00
Mark Paluch
bd92f67e54 DATAMONGO-1666 - Consider collection type in bulk DBRef fetching.
We now consider the property's collection type after bulk-fetching DBRefs before returning the actual result value. The issue only became visible when bulk fetching is possible and constructor creation is used. Setting the property value through a property accessor works fine because the property accessor checks all values for assignability and potentially converts values to their target type. That's different for constructor creation.

Original Pull Request: #457
2017-04-13 11:27:26 +02:00
Oliver Gierke
63968513d9 DATAMONGO-1535 - Updated changelog. 2017-04-10 13:44:19 +02:00
Michael J. Simons
2b0026931a DATAMONGO-1662 - Fix classname in reference docs about projections in aggregations.
Original pull request: #455.
2017-04-10 09:12:38 +02:00
Mark Paluch
0df5e6ba58 DATAMONGO-1645 - Polishing.
Clean up appender and log level after test run. Suppress log output during tests.

Original pull request: #450.
2017-03-21 10:52:35 +01:00
Christoph Strobl
2a0a880f57 DATAMONGO-1645 - Safely serialize JSON output for log message in LoggingEventListener.
We now make sure to safely serialize JSON output for mapped documents. This prevents the logger from rendering false exception messages to the log appender.

Original pull request: #450.
2017-03-21 10:33:37 +01:00
Mark Paluch
d920cee5e7 DATAMONGO-1620 - Polishing.
Use Integer.MIN_VALUE as placeholder for unset values to allow setting zero (immediately) and -1 (indefinite wait) server selection timeouts. Fix test method name. Add JavaDoc.

Original pull request: #449.
2017-03-13 16:21:12 +01:00
Christoph Strobl
3af2d810bc DATAMONGO-1620 - Add server-selection-timeout to XML MongoClientOptions config.
We now allow the server-selection-timeout attribute on the MongoClientOptions XML configuration for a MongoDB 3.x client.

Original pull request: #449.
2017-03-13 16:21:12 +01:00
Mark Paluch
a4c754467e DATAMONGO-1421 - Polishing.
Remove trailing whitespaces. Construct exception message with String.format(…).

Original pull request: #448.
2017-03-08 08:47:58 +01:00
Christoph Strobl
0ab635d7ee DATAMONGO-1421 - Fix serialization in error message causing error itself.
We now make sure to safely serialize the criteria object used for creating the error message when raising an `InvalidMongoDbApiUsageException` in cases where `addCriteria` is used to add multiple entries for the same property.

Original pull request: #448.
2017-03-08 08:47:58 +01:00
Oliver Gierke
ef46edc941 DATAMONGO-1639 - Polishing.
Formatting in MongoTemplateUnitTests.
2017-03-06 16:24:44 +01:00
Oliver Gierke
93f81f7321 DATAMONGO-1639 - Make sure BeforeConvertEvent sees new version for updates.
The changes for DATAMONGO-1617 subtly changed the behavior of entity updates in terms of the version value they see for entities using optimistic locking. Previously the updates already saw the new version value, whereas after the fix for DATAMONGO-1617 they saw the old one. That left BeforeConvertEvent listeners unable to distinguish between an original insert and the first update.

This change is now rolled back and we introduced a test case that encodes this expectation explicitly.
2017-03-06 16:24:44 +01:00
Oliver Gierke
f42c24a605 DATAMONGO-1597 - Updated changelog. 2017-03-02 16:57:50 +01:00
Oliver Gierke
1ac88e44a1 DATAMONGO-1598 - After release cleanups. 2017-03-02 10:21:53 +01:00
Oliver Gierke
324f656410 DATAMONGO-1598 - Prepare next development iteration. 2017-03-02 10:21:36 +01:00
Oliver Gierke
265ad423e5 DATAMONGO-1598 - Release version 1.10.1 (Ingalls SR1). 2017-03-02 09:41:27 +01:00
Oliver Gierke
0a4310760c DATAMONGO-1598 - Prepare 1.10.1 (Ingalls SR1). 2017-03-02 09:33:27 +01:00
Oliver Gierke
38ce0a24cc DATAMONGO-1598 - Updated changelog. 2017-03-02 09:32:50 +01:00
Mark Paluch
fb835e6a29 DATAMONGO-1605 - Polishing.
Remove additional quoting around JSON serialization because JSON serialization adds quotes to a string. Reformat code.
2017-03-01 15:59:09 +01:00
Christoph Strobl
c77025859b DATAMONGO-1605 - Retain type of SpEL expression result when used in @Query.
Fix issue where any result of a SpEL expression was treated as a quoted String within the resulting MongoDB query.
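
Illustrative sketch only (repository method and fields invented); a numeric SpEL result now renders as a number:

    // renders { "age" : 84 } for age = 42, instead of { "age" : "84" }
    @Query("{ 'age' : ?#{ [0] * 2 } }")
    List<Person> findByDoubledAge(int age);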
2017-03-01 15:59:09 +01:00
Mark Paluch
61cb23db5c DATAMONGO-1600 - Make GraphLookupOperationBuilder public.
Make GraphLookupOperationBuilder public so it can be used in types outside the aggregation package.

Original Pull Request: #437
2017-03-01 12:55:27 +01:00
Mark Paluch
6f138e641b DATAMONGO-1603 - Polishing.
Remove code that became unused. Reformat code. Extend years in copyright header.

Original pull request: #441.
2017-03-01 08:54:57 +01:00
Christoph Strobl
5925b5133c DATAMONGO-1603 - Fix Placeholder not replaced correctly in @Query.
Fix issues when placeholders are followed by other characters, e.g. '?0xyz', or have been reused multiple times within the query. Additional tests and fixes for complex quoted replacements, e.g. in regex queries. Rely on the placeholder's quotation indication instead of the binding's, which might be misleading when a placeholder is used more than once.

Original pull request: #441.
2017-03-01 08:43:31 +01:00
Oliver Gierke
9d4f34a6ff DATAMONGO-1617 - Reinstate version property initialization before BeforeConvertEvent publication.
Related pull request: #443.
2017-02-28 20:05:47 +01:00
Oliver Gierke
85ff5a7254 DATAMONGO-1617 - Polishing.
Some cleanups in MongoTemplateTests. Removed manual ID assignment in the general id handling test to make sure we use the id generation. Removed unnecessary code from the domain type in favor of Lombok.

Original pull request: #443.
2017-02-28 18:07:28 +01:00
Laszlo Csontos
593c5366fa DATAMONGO-1617 - BeforeConvertEvent is now emitted before the updatable identifier assertion.
We now make sure the BeforeConvertEvent is published before we check for identifier types that can potentially be auto-generated. That allows event listeners to populate identifiers. Previously the identifier check kicked in before that, so listeners were not able to populate the property.

Original pull request: #443.
2017-02-28 18:07:28 +01:00
Christoph Strobl
9c06124943 DATAMONGO-1608 - Polishing.
Throw an IllegalArgumentException when trying to create a query using 'null' as an argument for queries resulting in a $regex query operator.

Original Pull Request: #439
2017-02-13 08:07:31 +01:00
Edward Prentice
e342008c4b DATAMONGO-1608 - Add guard against NPE in MongoQueryCreator when using IgnoreCase.
Original Pull Request: #439
2017-02-13 08:07:24 +01:00
Christoph Strobl
a4782ae9ec DATAMONGO-1607 - Polishing.
Move coordinate conversion to a dedicated method. Additionally fixed an issue with assertions applied too late in the chain and added some tests.

Original Pull Request: #438
2017-02-10 14:10:13 +01:00
Thiago Diniz da Silveira
4cc52dc1e1 DATAMONGO-1607 - Fix ClassCastException in Circle, Point and Sphere when coordinates are not Double.
Original Pull Request: #438
2017-02-10 14:10:04 +01:00
Mark Paluch
e41e072217 DATAMONGO-1602 - Remove references to Assert single-arg methods.
Replace references to Assert single-arg methods with references to methods accepting the test object and message.

Related ticket: SPR-15196.
2017-02-01 11:23:39 +01:00
Oliver Gierke
d199a40664 DATAMONGO-1573 - Updated changelog. 2017-01-26 12:08:50 +01:00
Oliver Gierke
25d29a2332 DATAMONGO-1574 - After release cleanups. 2017-01-26 10:57:47 +01:00
Oliver Gierke
b19816490f DATAMONGO-1574 - Prepare next development iteration. 2017-01-26 10:57:44 +01:00
Oliver Gierke
d5fec0989c DATAMONGO-1574 - Release version 1.10 GA (Ingalls). 2017-01-26 10:27:10 +01:00
Oliver Gierke
52b52dba93 DATAMONGO-1574 - Prepare 1.10 GA (Ingalls). 2017-01-26 10:26:35 +01:00
Oliver Gierke
d046f30862 DATAMONGO-1574 - Updated changelog. 2017-01-26 10:26:33 +01:00
Christoph Strobl
7413a031c1 DATAMONGO-1517 - Polishing.
Remove ReflectiveSimpleTypes in favor of MongoSimpleTypes.
Add integration test.
2017-01-25 16:57:20 +01:00
Mark Paluch
8e3d7f96c4 DATAMONGO-1517 - Add support for Decimal128 BSON type.
Support Decimal128 as Mongo simple type if present. Decimal128 is stored as NumberDecimal.

class Person {

  String id;
  Decimal128 decimal128;

  Person(String id, Decimal128 decimal128) {
    this.id = id;
    this.decimal128 = decimal128;
  }
}

mongoTemplate.save(new Person("foo", new Decimal128(new BigDecimal("123.456"))));

is represented as:

{ "_id" : "foo", "decimal128" : NumberDecimal("123.456") }
2017-01-25 16:57:20 +01:00
Mark Paluch
d2d162dee6 DATAMONGO-1596 - Fix typo in JavaDoc.
Use correct @RelatedDocument annotation in MongoDB cross store reference documentation.
2017-01-25 16:53:26 +01:00
Mark Paluch
b9aa410ac1 DATAMONGO-1575 - Polishing.
Extend year range in license headers. Use MongoDB JSON serializer for String escaping. Move unquoting/quote checking to inner QuotedString utility class. Reformat code.
2017-01-25 11:48:14 +01:00
Christoph Strobl
6f278ce838 DATAMONGO-1575 - Escape Strings correctly.
Use regex groups and parameter index values for replacement in string based queries.
2017-01-25 11:48:14 +01:00
Christoph Strobl
3ed72a922a DATAMONGO-1594 - Update "what’s new" section in reference documentation. 2017-01-23 08:16:02 +01:00
Oliver Gierke
98b3c1678f DATAMONGO-1592 - Adapt AuditingEventListenerUnitTests to changes in core auditing.
The core auditing implementation now skips the invocation of auditing in case the candidate aggregate doesn't need any auditing in the first place. We needed to adapt the sample class we use to actually carry the necessary auditing annotations.

Related ticket: DATACMNS-957.
2017-01-20 16:33:59 +01:00
Oliver Gierke
45818f26b3 DATAMONGO-1590 - Polishing.
Removed some compiler warnings. Hid the newly introduced class in package scope and made use of Lombok annotations to avoid boilerplate code.

Original pull request: #436.
2017-01-18 19:41:11 +01:00
Christoph Strobl
822e29524d DATAMONGO-1590 - EntityInformation selection now correctly considers Persistable.
We now wrap the MappingMongoEntityInformation into one that delegates the methods implemented by Persistable to the actual entity in case it implements said interface.

Original pull request: #436.
2017-01-18 19:39:53 +01:00
Mark Paluch
3476c639c2 DATAMONGO-1588 - Polishing.
Remove unused fields. Fix typo in method name. Reformat inner class to align formatting.

Original pull request: #435.
2017-01-16 09:12:33 +01:00
Christoph Strobl
35fa922daa DATAMONGO-1588 - Fix derived finder not accepting subclass of parameter type.
We now allow using subtypes as arguments for derived queries. This makes it possible to use e.g. a GeoJsonPoint for querying while the declared property type in the domain object remains a regular (legacy) Point.
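
For illustration (repository and domain type assumed), a method declared against the legacy Point type can now be invoked with the GeoJSON subtype:

    interface VenueRepository extends Repository<Venue, String> {
      List<Venue> findByLocationNear(Point point, Distance distance);
    }

    // passing the subtype is now accepted
    repository.findByLocationNear(new GeoJsonPoint(-73.99, 40.73), new Distance(5, Metrics.KILOMETERS));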

Original pull request: #435.
2017-01-16 09:12:33 +01:00
Mark Paluch
8f7ea3a2ed DATAMONGO-1589 - Update project documentation with the CLA tool integration. 2017-01-13 11:46:51 +01:00
Mark Paluch
7ab7212edc DATAMONGO-1587 - Polishing.
Convert @see http://… links to valid format using @see <a href=…>…</a>
2017-01-12 17:08:07 +01:00
Mark Paluch
27651ef0be DATAMONGO-1587 - Migrate ticket references in test code to Spring Framework style. 2017-01-12 16:17:39 +01:00
Mark Paluch
174f7f0886 DATAMONGO-1586 - Consider field name in TypeBasedAggregationOperationContext.getReferenceFor(…).
We now consider the provided field name (alias) with which a mapped field is exposed. The alias applies to the exposed field after property path resolution in TypeBasedAggregationOperationContext. Previously, the field reference used the property name, which caused fields to be considered non-aliased, so aggregation projection operations dropped the alias and exposed the field under its leaf property name.

Original Pull Request: #434
2017-01-12 09:51:13 +01:00
Christoph Strobl
437bf89533 DATAMONGO-1585 - Polishing.
Update documentation for better readability in html and pdf format.

Original Pull Request: #433
2017-01-12 09:42:20 +01:00
Mark Paluch
c3e5fca73d DATAMONGO-1585 - Expose synthetic fields in $project aggregation stage.
Field projections now expose their fields as synthetic simple fields. The projection aggregation stage entirely redefines the field set available to later aggregation stages, so projected fields are considered synthetic. A simple synthetic field has no target field, which causes later aggregation stages to pick up the exposed field name instead of the underlying target when rendering aggregation operations to Mongo documents.

The change is motivated by a bug where an aggregation consisting of a projection of an aliased field and a sort caused the sort stage to render with the original field name instead of the alias. The sort did not apply any sorting since the projection redefines the available field set entirely and the original field is no longer accessible.

Original Pull Request: #433
2017-01-12 09:41:26 +01:00
Christoph Strobl
8340c02d9a DATAMONGO-1576 - Update lifecycle event documentation.
Add note on lifecycle event handling for property types.
2017-01-11 13:10:48 +01:00
Mark Paluch
3c6db34870 DATAMONGO-1578 - Polishing.
Add ticket references to test methods. Extend license years in copyright header.

Original pull request: #398.
2017-01-02 11:39:24 +01:00
Martin Macko
9f3319928b DATAMONGO-1578 - Add missing @Test annotation to ProjectionOperationUnitTests.
Original pull request: #398.
2017-01-02 11:38:49 +01:00
Mark Paluch
97a03d824a DATAMONGO-1508 - Improve reference documentation.
Replace Spring Data Document with Spring Data MongoDB. Extend copyright year range. Replace static Spring version leftover with variable. Fix typos.
2017-01-02 11:18:53 +01:00
Lukasz Kryger
8149273df9 DATAMONGO-1577 - Fix wording repetition in MongoRepository JavaDoc.
Original pull request: #407.
2017-01-02 11:18:53 +01:00
Ken Dombeck
e838662536 DATAMONGO-1577 - Fix Reference and JavaDoc spelling issues.
Replaced the invalid class name MongoMappingConverter with the actual class name MappingMongoConverter. Fixed typos.

Original pull request: #425.
2017-01-02 11:18:49 +01:00
Mark Paluch
e784e58a0a DATAMONGO-1508 - Polishing.
Highlight attribute name. Replace tabs with spaces.

Original pull request: #399.
2017-01-02 10:43:49 +01:00
John Lilley
fda72d6eb2 DATAMONGO-1508 - Document authentication-dbname attribute in db-factory.
Original pull request: #399.
2017-01-02 10:43:39 +01:00
Oliver Gierke
a96752da80 DATAMONGO-1522 - Updated changelog. 2016-12-21 19:35:31 +01:00
Oliver Gierke
320a28740a DATAMONGO-1469 - After release cleanups. 2016-12-21 16:33:13 +01:00
Oliver Gierke
e55e748cfd DATAMONGO-1469 - Prepare next development iteration. 2016-12-21 16:33:10 +01:00
Oliver Gierke
737f7b4f30 DATAMONGO-1469 - Release version 1.10 RC1 (Ingalls). 2016-12-21 16:16:51 +01:00
Oliver Gierke
408c5d8684 DATAMONGO-1469 - Prepare 1.10 RC1 (Ingalls). 2016-12-21 16:15:47 +01:00
Oliver Gierke
982adf317e DATAMONGO-1469 - Updated changelog. 2016-12-21 16:15:39 +01:00
Oliver Gierke
7914e8a630 DATAMONGO-1467 - Polishing.
Original pull request: #431.
2016-12-19 19:44:47 +01:00
Christoph Strobl
dc4a30a7f8 DATAMONGO-1467 - Add support for MongoDB 3.2 partialFilterExpression for index creation.
We now support partial filter expressions on indexes via Index.partial(…). This allows creating partial indexes that only index the documents in a collection that meet a specified filter expression.

new Index().named("idx").on("k3y", ASC).partial(filter(where("age").gte(10)))

The filter expression can be set via a plain DBObject or a CriteriaDefinition and is mapped against the associated domain type.

Original pull request: #431.
2016-12-19 19:44:47 +01:00
Oliver Gierke
29e405b800 DATAMONGO-1565 - Polishing.
Formatting in ExpressionEvaluatingParameterBinder and StringBasedMongoQueryUnitTests. Turned Placeholder into value object.
2016-12-19 14:37:26 +01:00
Mark Paluch
1a105333aa DATAMONGO-1565 - Polishing.
Consider quoted/unquoted parameter use with the same parameter reference. Extend date range in license headers.
2016-12-19 14:37:26 +01:00
Christoph Strobl
14e326dc09 DATAMONGO-1565 - Ignore placeholder pattern in replacement values for annotated queries.
We now make sure to escape single and double quotes in the replacement values before actually appending them to the query. We also replace single quotes around parameters in the actual raw annotated query with double quotes to make sure they are treated as a single string parameter.
2016-12-19 14:36:44 +01:00
Mark Paluch
f026ab419d DATAMONGO-1564 - Polishing.
Fix JavaDoc references and a minor import formatting issue.

Original pull request: #429.
2016-12-16 14:11:18 +01:00
Christoph Strobl
75139042e0 DATAMONGO-1564 - Split up AggregationExpressions.
Refactored to multiple smaller Aggregation Operator classes reflecting the grouping (array operators, string operators,…) predefined by MongoDB.

Original pull request: #429.
2016-12-16 14:11:18 +01:00
Mark Paluch
6236384c1d DATAMONGO-1567 - Use newer Java 8 on Travis CI. 2016-12-16 10:47:02 +01:00
Mark Paluch
6993054d6a DATAMONGO-1533 - Polishing.
Enhance JavaDoc. Minor reformatting.

Original pull request: #428.
2016-12-16 09:25:00 +01:00
Christoph Strobl
35bfb92ace DATAMONGO-1533 - Add AggregationExpression derived from SpEL AST.
We added an AggregationExpression that renders a MongoDB Aggregation Framework expression from the AST of a SpEL expression. This allows usage with various stages (e.g. $project, $group) throughout the aggregation support.

  // { $and: [ { $gt: [ "$qty", 100 ] }, { $lt: [ "$qty", 250 ] } ] }
  expressionOf("qty > 100 && qty < 250);

  // { $cond : { if : { $gte : [ "$a", 42 ]}, then : "answer", else : "no-answer" } }
  expressionOf("cond(a >= 42, 'answer', 'no-answer')");

Original pull request: #428.
2016-12-16 09:02:15 +01:00
Oliver Gierke
89a02bb822 DATAMONGO-1566 - Adapt API in MongoRepositoryFactoryBean.
Related tickets: DATACMNS-891.
2016-12-15 14:57:26 +01:00
Christoph Strobl
c9c5fe62ca DATAMONGO-1552 - Polishing.
Updated doc, removed whitespaces, minor method wording changes.

Original Pull Request: #426
2016-12-14 12:05:26 +01:00
Mark Paluch
d250f88c38 DATAMONGO-1552 - Update Documentation.
Original Pull Request: #426
2016-12-14 11:09:46 +01:00
Mark Paluch
bcb63b2732 DATAMONGO-1552 - Add $facet aggregation stage.
Original Pull Request: #426
2016-12-14 11:09:01 +01:00
Mark Paluch
2c4377c9a6 DATAMONGO-1552 - Add $bucketAuto aggregation stage.
Original Pull Request: #426
2016-12-14 11:08:26 +01:00
Mark Paluch
e992d813fb DATAMONGO-1552 - Add $bucket aggregation stage.
Original Pull Request: #426
2016-12-14 11:07:40 +01:00
Mark Paluch
aa1e91c761 DATAMONGO-442 - Polishing.
Reformat code according to Spring Data style. Add test for authenticated use. Add JavaDoc to newly introduced methods. Allow configuration of an authentication database. Update reference documentation.

Original pull request: #419.
2016-12-13 16:50:37 +01:00
Ricardo Espirito Santo
9737464f9a DATAMONGO-442 - Support username/password authentication with MongoLog4jAppender.
Added optional username and password for authentication support on Log4jAppender.

Original pull request: #419.
2016-12-13 15:56:56 +01:00
Christoph Strobl
204a0515c4 DATAMONGO-1551 - Polishing.
Add startWith overload that allows mixing expressions, removed white spaces, updated doc.

Original Pull Request: #424
2016-12-13 15:15:55 +01:00
Mark Paluch
c6dae3c444 DATAMONGO-1551 - Add $graphLookup aggregation stage.
We now support the $graphLookup aggregation pipeline stage via Aggregation to perform a recursive lookup, adding the lookup result as an array to the documents entering $graphLookup.

TypedAggregation<Employee> agg = Aggregation.newAggregation(Employee.class,
		graphLookup("employee")
			.startWith("reportsTo")
			.connectFrom("reportsTo")
			.connectTo("name")
			.depthField("depth")
			.maxDepth(5)
			.as("reportingHierarchy"));

Original Pull Request: #424
2016-12-13 14:23:39 +01:00
Christoph Strobl
c1c7daf0ed DATAMONGO-1550 - Polishing $replaceRoot (aggregation stage).
Original Pull Request: #422.
2016-12-13 08:22:20 +01:00
Mark Paluch
14678ce7a9 DATAMONGO-1550 - Add $replaceRoot aggregation stage.
We now support the $replaceRoot stage in aggregation pipelines. $replaceRoot can reference a field or an aggregation expression, or it can be used to compose a replacement document.

newAggregation(
	replaceRoot().withDocument()
		.andValue("value").as("field")
		.and(MULTIPLY.of(field("total"), field("discounted")))
);

newAggregation(
	replaceRoot("item"));

Original Pull Request: #422
2016-12-13 08:22:20 +01:00
Christoph Strobl
4e56d9c575 DATAMONGO-1549 - Polishing $count (aggregation stage).
Original Pull Request: #422
2016-12-13 08:22:10 +01:00
Mark Paluch
cab35759db DATAMONGO-1549 - Add $count aggregation stage.
We now support the $count stage in aggregation pipelines.

newAggregation(
	match(where("hotelCode").is("0360")),
	count().as("documents"));

Original Pull Request: #422
2016-12-13 07:58:58 +01:00
Christoph Strobl
7b49b120e3 DATAMONGO-1558 - Upgrade MongoDB server version to, and add build profile for, MongoDB 3.4.
Added MongoDB 3.4 profile to pom.xml and upgraded to MongoDB 3.4 on travis-ci.
2016-12-13 07:58:37 +01:00
Mark Paluch
dc57b66adf DATAMONGO-1548 - Polishing.
Enhance JavaDoc. Minor formatting. Fix typos.

Original pull request: #423.
2016-12-12 12:05:32 +01:00
Christoph Strobl
0449719a16 DATAMONGO-1548 - Add support for MongoDB 3.4 aggregation operators.
We now support the following MongoDB 3.4 aggregation operators:

$indexOfBytes, $indexOfCP, $split, $strLenBytes, $strLenCP, $substrCP, $indexOfArray, $range, $reverseArray, $reduce, $zip, $in, $isoDayOfWeek, $isoWeek, $isoWeekYear, $switch and $type.

Original pull request: #423.
2016-12-12 12:05:09 +01:00
Mark Paluch
3d8b6868c7 DATAMONGO-1538 - Polishing.
Use InheritingExposedFieldsAggregationOperationContext instead of anonymous context class for condition mapping. Drop aggregation input collections before tests. Minor reformatting.

Original pull request: #417.
2016-12-07 09:35:08 +01:00
Christoph Strobl
68db0d4cb0 DATAMONGO-1538 - Add support for $let to aggregation.
We now support $let in aggregation $project stage.

ExpressionVariable total = newExpressionVariable("total").forExpression(ADD.of(field("price"), field("tax")));
ExpressionVariable discounted = newExpressionVariable("discounted").forExpression(Cond.when("applyDiscount").then(0.9D).otherwise(1.0D));

newAggregation(Sales.class,
	project()
		.and(define(total, discounted)
			.andApply(MULTIPLY.of(field("total"), field("discounted"))))
		.as("finalTotal"));

Original pull request: #417.
2016-12-07 09:33:58 +01:00
Christoph Strobl
c9dfeea0c7 DATAMONGO-1542 - Polishing.
Added some static entry points for better readability.

Original Pull Request: #421
2016-12-06 10:13:32 +01:00
Mark Paluch
1a11877ae9 DATAMONGO-1542 - Refactor CondOperator and IfNullOperator to children of AggregationExpressions.
Renamed CondOperator to Cond and IfNullOperator to IfNull. Both conditional operations are now available from ConditionalOperators.when and ConditionalOperators.ifNull and accept AggregationExpressions for conditions and values.
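
A usage sketch under the new entry points (field names and values made up):

    // { $cond: { if: { $gte: [ "$qty", 250 ] }, then: 30, else: 20 } }
    ConditionalOperators.when(Criteria.where("qty").gte(250)).then(30).otherwise(20);

    // { $ifNull: [ "$description", "Unspecified" ] }
    ConditionalOperators.ifNull("description").then("Unspecified");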

Original Pull Request: #421
2016-12-06 09:19:28 +01:00
Christoph Strobl
ea4782c421 DATAMONGO-1520 - Add overload for aggregation $match accepting CriteriaDefinition.
We now also accept CriteriaDefinition in addition to Criteria for Aggregation.match. The existing match(Criteria) method remains to preserve binary compatibility.
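
Illustrative only: TextCriteria is a CriteriaDefinition, so it can now be handed to match(…) directly:

    Aggregation.newAggregation(
        Aggregation.match(TextCriteria.forDefaultLanguage().matching("coffee")),
        Aggregation.project("title"));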
2016-12-06 09:05:55 +01:00
Mark Paluch
a0ac3510a0 DATAMONGO-1540 - Polishing.
Reduce Map aggregation expression builder entrypoint. Fix JavaDoc.

Original pull request: #420.
2016-12-05 16:47:20 +01:00
Christoph Strobl
192399413d DATAMONGO-1540 - Add support for $map (aggregation).
We now support $map operator in aggregation.

Original pull request: #420.
2016-12-05 16:47:17 +01:00
Oliver Gierke
a0be890437 DATAMONGO-1547 - Register MongoRepositoryFactory in spring.factories.
This is required for the switch in support for multi-store detection.

Related ticket: DATACMNS-952.
2016-12-05 14:27:24 +01:00
Oliver Gierke
438dbc4b33 DATAMONGO-1546 - Register GeoJsonConfiguration via spring.factories.
Related tickets: DATACMNS-952.
2016-12-05 14:22:09 +01:00
Oliver Gierke
407affb458 DATAMONGO-1141 - Polishing.
Aligned assertion messages for consistency. Fixed imports in UpdateMapperUnitTests.

Original pull request: #405.
2016-12-02 18:13:22 +01:00
Mark Paluch
c6a4e7166c DATAMONGO-1141 - Polishing.
Add property to field name mapping for Sort orders by moving Sort mapping to UpdateMapper. Fix typo. Add JavaDoc. Reformat code. Remove trailing whitespaces.

Original pull request: #405.
2016-12-02 18:13:18 +01:00
Pavel Vodrážka
7f39c42eb7 DATAMONGO-1141 - Add support for $push $sort in Update.
Sorting update modifier added. Supports sorting arrays by document fields and element values.
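
A minimal sketch (field names invented; builder method names as assumed from this change):

    // { $push: { scores: { $each: [ 89, 65 ], $sort: { score: -1 } } } }
    new Update().push("scores")
        .sort(new Sort(Sort.Direction.DESC, "score"))
        .each(89, 65);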

Original pull request: #405.
2016-12-02 18:13:02 +01:00
Oliver Gierke
40da4701de DATAMONGO-1454 - Polishing.
Formatting in test case.

Original pull request: #381.
2016-12-02 16:33:13 +01:00
Mark Paluch
3ae6aebebb DATAMONGO-1454 - Add support for exists projection in repository query methods.
We now support exists projections for query methods, both for derived and string queries.

public interface PersonRepository extends Repository<Person, String> {

  boolean existsByFirstname(String firstname);

  @ExistsQuery(value = "{ 'lastname' : ?0 }")
  boolean someExistQuery(String lastname);

  @Query(value = "{ 'lastname' : ?0 }", exists = true)
  boolean anotherExistQuery(String lastname);
}

Original pull request: #381.
2016-12-02 16:32:58 +01:00
Mark Paluch
bbfa0f7b83 DATAMONGO-1536 - Polishing.
Add JavaDoc. Change visibility of AbstractAggregationExpression.getMongoMethod() to protected.

Original pull request: #418.
2016-12-02 15:32:16 +01:00
Christoph Strobl
63d6234446 DATAMONGO-1536 - Add aggregation operators for array, arithmetic, date and set operations.
We now support the following aggregation framework operators:

- setEquals, setIntersection, setUnion, setDifference, setIsSubset, anyElementTrue, allElementsTrue
- stdDevPop, stdDevSamp
- abs, ceil, exp, floor, ln, log, log10, pow, sqrt, trunc
- arrayElementAt, concatArrays, isArray
- literal
- dayOfYear, dayOfMonth, dayOfWeek, year, month, week, hour, minute, second, millisecond, dateToString

Original pull request: #418.
2016-12-02 15:30:01 +01:00
Oliver Gierke
9ff86feb4f DATAMONGO-1539 - Polishing.
Renamed @Count and @Delete to @CountQuery and @DeleteQuery. Minor polishing in test cases and test repository methods. JavaDoc, formatting.

Original pull request: #416.
2016-12-02 11:55:14 +01:00
Fırat KÜÇÜK
8c838e8350 DATAMONGO-1539 - Introduce @CountQuery and @DeleteQuery.
Introduces dedicated annotations for manually defined count and delete queries to avoid misconfiguration and generally simplify the declaration.
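
A usage sketch (repository, domain type and fields are illustrative):

    interface PersonRepository extends Repository<Person, String> {

      @CountQuery("{ 'lastname' : ?0 }")
      long countByLastname(String lastname);

      @DeleteQuery("{ 'lastname' : ?0 }")
      long removeByLastname(String lastname);
    }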

Original pull request: #416.
2016-12-02 11:55:07 +01:00
Oliver Gierke
a79930145d DATAMONGO-1525 - Improved creation of empty collections, esp. EnumSet.
We now use more type information to create a better empty collection in the first place. The previous algorithm always used an empty HashSet plus a subsequent conversion using the raw collection type. Especially the latter caused problems for EnumSets as the conversion into one requires the presence of component type information.

We now use Spring's collection factory and more available type information to create a proper collection in the first place and only rely on a subsequent conversion for arrays.
2016-12-01 20:10:41 +01:00
Christoph Strobl
9059a77712 DATAMONGO-1534 - Fix bulk operations missing to write type info.
We now correctly convert entities into their MongoDB representation including type information via the _class property.

Original pull request: #415.
2016-11-28 09:06:13 +01:00
Christoph Strobl
a741400e9b DATAMONGO-1530 - Polishing.
Add missing transformations for ConstructorReference, OperatorNot, OpNE, OpEQ, OpGT, OpGE, OpLT, OpLE, OperatorPower, OpOr and OpAnd. This allows usage of the logical operators &&, || and ! as part of the expression, while ConstructorReference allows instantiating e.g. arrays via an expression like `new int[]{4,5,6}`. This can be useful e.g. when comparing arrays using $setEquals.

More complex aggregation operators like $filter can be created by defining the variable references as a string inside the expression, like filter(a, 'num', '$$num' > 10).
Commands like $let require usage of InlineMap to pass in required arguments, e.g. let({low:1, high:'$$low'}, gt('$$low', '$$high')).

Original Pull Request: #410
2016-11-25 15:45:14 +01:00
Sebastien Gerard
b786b8220a DATAMONGO-1530 - Add support for missing MongoDB 3.2 aggregation pipeline operators.
Original Pull Request: #410
2016-11-25 15:43:43 +01:00
Mark Paluch
710770e88d DATAMONGO-784 - Polishing.
Add JavaDoc for compareValue.

Original pull request: #414.
2016-11-24 13:56:27 +01:00
Christoph Strobl
e631e2d7c5 DATAMONGO-784 - Add support for comparison aggregation operators to group & project.
We now directly support comparison aggregation operators ($cmp, $eq, $gt, $gte, $lt, $lte and $ne) on both group and project stages.

Original pull request: #414.
2016-11-24 13:56:27 +01:00
Mark Paluch
3dc1e9355a DATAMONGO-1491 - Polishing.
Remove variable before returning value. Add generics for list creation.

Original pull request: #412.
2016-11-24 12:48:15 +01:00
Christoph Strobl
2985b4ca3d DATAMONGO-1491 - Add support for $filter (aggregation).
We now support $filter in the aggregation pipeline.

Aggregation.newAggregation(Sales.class,
	Aggregation.project()
		.and(filter("items").as("item").by(GTE.of(field("item.price"), 100)))
		.as("items"))

Original pull request: #412.
2016-11-24 12:47:07 +01:00
Christoph Strobl
578441ee9f DATAMONGO-1327 - Polishing.
Just added overloads for stdDevSamp and stdDevPop taking AggregationExpression and updated the doc.
Also replaced String-based MongoDB operation building by using operators directly.

Original Pull Request: #360
2016-11-23 15:39:01 +01:00
gustavodegeus
36838ffe31 DATAMONGO-1327 - Added support for $stdDevSamp and $stdDevPop to aggregation $group stage.
Original Pull Request: #360
CLA: 171720160409030719 (Gustavo de Geus)
2016-11-23 15:38:35 +01:00
Oliver Gierke
5bd0e21173 DATAMONGO-1527 - Updated changelog. 2016-11-23 13:52:43 +01:00
Oliver Gierke
255d32513c DATAMONGO-1502 - Updated changelog. 2016-11-03 18:56:45 +01:00
Oliver Gierke
6a9823fd24 DATAMONGO-1521 - Added Aggregation.skip(…) overload to support longs.
Deprecated the one taking an int.
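
For illustration (assumes static imports of Aggregation.* and Criteria.where):

    newAggregation(
        match(where("status").is("A")),
        skip(100L),
        limit(10));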
2016-11-03 15:00:20 +01:00
Christoph Strobl
2ae75a4ff9 DATAMONGO-1500 - Fix JSON serialization error in derived queries with field spec.
We now make sure not to eagerly attempt to convert given query parameters into a Mongo-specific format by calling toString() on the query object, but rather delegate this to another step later in the chain.

Original pull request: #404.
2016-11-03 09:36:42 +01:00
Christoph Strobl
9c20da3e8f DATAMONGO-1504 - Assert compatibility with MongoDB 3.4.
We now make sure to comply with the API requirements of mongo-java-driver 3.4 (in current beta1) by using empty DBObjects instead of null, ignoring inappropriate replication settings and cleaning up tests after execution.

Original pull request: #394.
2016-11-03 09:32:56 +01:00
Oliver Gierke
f782338581 DATAMONGO-1513 - Fixed identifier population for event-listener-generated, non-ObjectId identifiers on batch inserts.
The methods in MongoTemplate inserting a batch of documents previously only returned database-generated identifiers, more specifically ObjectId ones. This caused non-ObjectId identifiers potentially generated by other parties (e.g. an event listener reacting to a BeforeSaveEvent) not to be considered for source object identifier population.

This commit adds a workaround augmenting the list of database-generated identifiers with the ones actually present in the documents to be inserted. A follow-up ticket, DATAMONGO-1519, was created to track the removal of the workaround in favor of a proper fix that unfortunately requires a change in public API (so a 2.0 candidate only).

Related tickets: DATAMONGO-1519.
2016-11-02 09:52:47 +01:00
Oliver Gierke
cb90bfc6a6 DATAMONGO-1514 - Polishing.
Extended license years in copyright header.

Original pull request: #401.
2016-10-27 14:32:37 +02:00
Martin Macko
189d4dd1b7 DATAMONGO-1514 - SpringDataMongodbQuery needs to be public.
SpringDataMongodbQuery is exposed publicly in QuerydslRepositorySupport, so we have to make it public to make sure calls to the exposed methods from outside the package actually compile.

Original pull request: #401.
2016-10-27 14:31:46 +02:00
Christoph Strobl
10208001f8 DATAMONGO-1480 - Polishing.
Opened up Meta attributes to allow usage of more than one cursor option via a dedicated enum.

new Query().noCursorTimeout();

and

interface PersonRepository extends CrudRepository<Person, String> {

    @Meta(flags = {CursorOptions.NO_TIMEOUT})
    Iterable<Person> findBy();
}

Original Pull Request: #390
2016-10-24 18:44:35 +02:00
Mark Paluch
98dca5a65e DATAMONGO-1480 - Add support for noCursorTimeout in Query.
We now allow setting noCursorTimeout for queries using `Query` and `@Meta`.

Query query = new Query().noCursorTimeout();

and

interface PersonRepository extends CrudRepository<Person, String> {

    @Meta(noCursorTimeout = true)
    Iterable<Person> findBy();

    @Meta(noCursorTimeout = true)
    Stream<Person> streamBy();
}

Original Pull Request: #390
2016-10-24 18:43:40 +02:00
Christoph Strobl
b6bc0ea316 DATAMONGO-1490 - Polishing.
Restore 1.8.xsd and create new 1.10 one reflecting the changes made.

Original Pull Request: #389
2016-10-24 13:26:51 +02:00
Mark Paluch
b67c551e19 DATAMONGO-1490 - Replace space indentation by tabs.
Original Pull Request: #389
2016-10-24 13:26:51 +02:00
Mark Paluch
e0bc1e0f20 DATAMONGO-1490 - Change the XML data type of boolean flags to String.
We now accept String data types for boolean flags in XML configurations. Boolean data types in the XSD don't allow the use of property placeholders even if the resolved value could be converted to boolean. The fields affected by this change are:

* `<mongo:repositories create-query-indexes=… />`
* `<mongo:options ssl=…/>`
* `<mongo:client-options ssl=… />`

Original Pull Request: #389
2016-10-24 13:26:40 +02:00
Oliver Gierke
fcb436dd30 DATAMONGO-1495 - Updated changelog. 2016-09-29 14:42:09 +02:00
Oliver Gierke
0e57dd473c DATAMONGO-1499 - Updated changelog. 2016-09-29 11:42:09 +02:00
Oliver Gierke
8d36e42b1b DATAMONGO-1498 - Removed defaulting of MongoMappingContext for repositories and auditing.
Previously we created a default bean definition for MongoMappingContext if none was present in the application context. That lookup for an existing one unfortunately comes too early, especially with Spring Boot in place. This then caused the MappingContext not to be aware of the custom conversions and simple types registered by Boot.

We now removed the defaulting, relying on a MappingMongoConverter being present in the application context (which is usually the case when AbstractMongoConfiguration or the XML <mongo:mapping-converter /> alternative is used). We use that bean to look up the MappingContext.
2016-09-28 16:22:20 +02:00
Oliver Gierke
b66bfae105 DATAMONGO-1497 - MappingMongoConverter now consistently uses DbObjectAccessor.
We now use DbObjectAccessor also for preliminary inspections of the source DBObject (e.g. whether a value is present at all). Previously we operated on the DBObject directly, which caused issues with properties mapped to nested fields as the keys weren't exploded correctly and thus the check always failed.
2016-09-22 17:55:50 +02:00
Oliver Gierke
eb3d55e0bd DATAMONGO-1494 - Updated changelog. 2016-09-21 08:04:20 +02:00
Oliver Gierke
84dbfdfd5e DATAMONGO-1450 - Updated changelog. 2016-09-21 08:04:14 +02:00
Mark Paluch
1813b1aea0 DATAMONGO-1493 - Fix minor typo in reference documentation.
Related pull request: #391.
2016-09-19 17:03:29 +02:00
Jordan Jennings
69241737b7 DATAMONGO-1493 - Fix minor typo in docs.
Original pull request: #391.
2016-09-19 16:21:12 +02:00
Christoph Strobl
f053bed447 DATAMONGO-1492 - Make o.s.d.m.core.aggregation.AggregationExpression public.
By turning `AggregationExpression` public we allow adding custom expressions without workarounds. It is now possible to create e.g. a `ProjectionOperation` like:

ProjectionOperation agg = Aggregation.project()
      .and(new AggregationExpression() {

        @Override
        public DBObject toDbObject(AggregationOperationContext context) {

          DBObject filterExpression = new BasicDBObject();
          filterExpression.put("input", "$x");
          filterExpression.put("as", "y");
          filterExpression.put("cond", new BasicDBObject("$eq", Arrays.<Object> asList("$$y.z", 2)));

          return new BasicDBObject("$filter", filterExpression);
        }
      }).as("profile");

Original pull request: #392.
2016-09-19 16:02:44 +02:00
Christoph Strobl
1a1cd9ef14 DATAMONGO-1485 - Consider potential custom conversion for enums in Querydsl paths.
We now take potential registered converters for enums into account when serializing path expressions via SpringDataMongodbSerializer.

Original pull request: #388.
2016-09-19 07:00:21 +02:00
Oliver Gierke
7effc0e10f DATAMONGO-1468 - Polishing.
Slightly touched test case.

Original pull request: #387
2016-09-08 10:29:23 +02:00
Christoph Strobl
395bb1faa4 DATAMONGO-1486 - Fix ClassCastException when mapping non-String Map key for updates.
We now make sure to convert Map keys into Strings when mapping update values for Map properties.

Original pull request: #387.
2016-09-08 10:29:22 +02:00
Christoph Strobl
eb1392cc1a DATAMONGO-861 - Polishing.
Favor usage of List over BasicDBList.
Rename ProjectionOperation.transform to applyCondition.
Add missing author and since tags, remove trailing white spaces and fix reference documentation headline clash.

Original Pull Request: #385
2016-08-30 14:40:24 +02:00
Mark Paluch
ace01e4e6d DATAMONGO-861 - Add support for $cond and $ifNull operators in aggregation operations.
We now support $cond and $ifNull operators for projection and grouping operations. ConditionalOperator and IfNullOperator are AggregationExpressions that can be applied to transform or generate values during aggregation.

TypedAggregation<InventoryItem> agg = newAggregation(InventoryItem.class,
  project().and("discount")
    .transform(ConditionalOperator.newBuilder().when(Criteria.where("qty").gte(250))
      .then(30)
      .otherwise(20))
    .and(ifNull("description", "Unspecified")).as("description")
);

corresponds to

{ "$project": { "discount": { "$cond": { "if": { "$gte": [ "$qty", 250 ] },
        "then": 30, "else": 20 } },
    "description": { "$ifNull": [ "$description", "Unspecified"] }
  }
}

Original Pull Request: #385
2016-08-30 14:39:43 +02:00
Mark Paluch
116dda63c2 DATAMONGO-1406 - Propagate PersistentEntity when mapping query criteria for nested keywords.
We now propagate the PersistentEntity when mapping nested keywords so that the criteria mapping chain for nested keywords and properties now has access to the PersistentEntity and can use configured field names.

Previously the plain property names have been used as field names and potential customizations via @Field have been ignored.

Original Pull Request: #384
2016-08-25 13:09:04 +02:00
Mark Paluch
4649872394 DATAMONGO-1465 - Polishing.
Replace boolean flag in convertAndJoinScriptArgs with literal. Joined args are rendered to JavaScript and always require string quotation.

Original pull request: #383.
2016-08-23 14:59:43 +02:00
Christoph Strobl
ecc6f3fc4e DATAMONGO-1465 - Fix String quotation in DefaultScriptOperations.execute().
This change prevents Strings from being quoted prior to sending them as args of a script.

Original pull request: #383.
2016-08-23 14:57:10 +02:00
Mark Paluch
512f68a611 DATAMONGO-1476 - Polishing.
Extend year range in license header. Use the specified collection name in doRemove.

Original pull request: #382.
2016-08-23 10:06:53 +02:00
Niko Schmuck
fbcd4ba367 DATAMONGO-1476 - Consistently use specified collection name in MongoTemplate.stream().
We now use the collection name specified as argument instead of DBCollection.getName() when invoking ReadDbObjectCallback.

Original pull request: #382.
2016-08-23 10:01:59 +02:00
Oliver Gierke
760d7d6a32 DATAMONGO-1471 - Converter only applies identifier values if actually available.
Setting the value for the identifier property is an explicit step in MappingMongoConverter and is always executed if the type to be created has an identifier property. If the source document doesn't contain an _id field (e.g. because it has been excluded explicitly), that previously caused null to be set on the identifier, which in turn caused an exception if the identifier property is a primitive type.

We now explicitly check whether the field backing the identifier property is actually present in the source document and only set the value if so.
2016-08-17 16:54:10 +02:00
Oliver Gierke
29a6688e8c DATAMONGO-1470 - AbstractMongoConfiguration now supports multiple base packages for @Document scanning.
Introduced AbstractMongoConfiguration.getMappingBasePackages() to return multiple packages over the previously existing ….getMappingBasePackage(). The former is now used by the code triggering the scanning, using what the latter returns by default.
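
A configuration sketch (package names and connection setup are placeholders; method signatures as assumed for this version):

    @Configuration
    class MongoConfig extends AbstractMongoConfiguration {

      @Override
      protected String getDatabaseName() {
        return "example";
      }

      @Override
      public Mongo mongo() throws Exception {
        return new MongoClient();
      }

      // scanning for @Document classes now considers both packages
      @Override
      protected Collection<String> getMappingBasePackages() {
        return Arrays.asList("com.example.orders", "com.example.customers");
      }
    }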
2016-07-28 13:10:48 +02:00
Oliver Gierke
b6e7683202 DATAMONGO-1409 - After release cleanups. 2016-07-27 14:32:37 +02:00
Oliver Gierke
286a977575 DATAMONGO-1409 - Prepare next development iteration. 2016-07-27 14:32:35 +02:00
Oliver Gierke
d7f70e219b DATAMONGO-1409 - Release version 1.10 M1 (Ingalls). 2016-07-27 13:52:12 +02:00
Oliver Gierke
728cc390f6 DATAMONGO-1409 - Prepare 1.10 M1 (Ingalls). 2016-07-27 13:51:38 +02:00
Oliver Gierke
ddcc3914ff DATAMONGO-1409 - Updated changelog. 2016-07-27 13:51:32 +02:00
Oliver Gierke
9a385599af DATAMONGO-1394 - Polishing.
Some internal refactorings to avoid deeply nested if-clauses.

Original pull request: #373.
2016-07-27 10:54:21 +02:00
Christoph Strobl
c14c42fb0c DATAMONGO-1394 - Support identifier references on Querydsl expressions for DBRefs.
We now allow direct usage of path.eq(…) on id properties of db-referenced objects. This allows writing the query as person.coworker.id.eq(coworker.getId()) instead of person.coworker.eq(coworker), which helps building the query using just the plain id without having to actually create a new object wrapping it.

Original pull request: #373.
2016-07-27 10:54:21 +02:00
Oliver Gierke
5a8e4f3dae DATAMONGO-1194 - Polishing.
Some missing JavaDoc and slight code polish.

Original pull request: #377.
2016-07-26 15:30:31 +02:00
Christoph Strobl
5d50155d81 DATAMONGO-1194 - Improve DBRef resolution for maps.
We now bulk load maps of referenced objects as long as they are stored in the same collection. This reduces database roundtrips and network traffic.

Original pull request: #377.
2016-07-26 15:30:31 +02:00
Christoph Strobl
babab54ffd DATAMONGO-1194 - Improve DBRef resolution for collections.
We now bulk load collections of referenced objects as long as they are stored in the same collection. This reduces database roundtrips and network traffic.

Original pull request: #377.
2016-07-26 15:30:30 +02:00
Mark Paluch
1ba137b98a DATAMONGO-1464 - Polishing.
Added JavaDoc. Simplified if-check in MongoQueryExecution.isListOfGeoResult(…).

Original pull request: #379.
2016-07-26 14:51:55 +02:00
Mark Paluch
353b836a77 DATAMONGO-1464 - Optimize query execution for pagination queries.
We now execute paged queries in an optimized way. The data is obtained for each paged execution but the count query is deferred. We determine the total from the pageable and the results in cases where we don't hit the page size bounds (i.e. results are less than a full page without offset, or results are greater than 0 and less than a full page with offset). In all other cases we issue an additional count query.

Original pull request: #379.
2016-07-26 14:51:29 +02:00
Oliver Gierke
325bcd11b9 DATAMONGO-1431 - Added MongoOperations.stream(…) with explicit collection. 2016-07-20 15:59:52 +02:00
Mark Paluch
9db2dde19b DATAMONGO-1463 - Upgrade to mongo-java-driver 2.14.3.
Upgrade to mongo-java-driver 2.14.3 and upgrade the mongo33 profile to use 3.3.0 (release).
2016-07-20 10:33:23 +02:00
Mark Paluch
318ba53e2f DATAMONGO-1462 - Integrate version badge from spring.io.
Add version badge from spring.io and replace fixed version numbers with a placeholder.
2016-07-20 08:43:37 +02:00
Mark Paluch
3db30bd4a6 DATAMONGO-1460 - Use placeholder property for JSR-303 API. 2016-07-15 12:48:37 +02:00
Oliver Gierke
dd1fbfeb66 DATAMONGO-1459 - Polishing.
Added missing @Override annotations to the MongoRepository interface and polished non-JavaDoc references.
2016-07-14 15:52:32 +02:00
Oliver Gierke
3fa17272bb DATAMONGO-1459 - Added support for any-match mode in Query-by-example.
MongoExampleMapper now $or-concatenates the predicates derived from the example in case the ExampleMatcher expresses any-match binding to be desired.
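
For illustration (probe type and repository assumed; the repository is expected to extend MongoRepository):

    Person probe = new Person();
    probe.setFirstname("Walter");
    probe.setLastname("White");

    // derived predicates are $or-concatenated instead of $and-concatenated
    Example<Person> example = Example.of(probe, ExampleMatcher.matchingAny());
    List<Person> result = repository.findAll(example);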

Moved integration tests for Query-by-example to the appropriate package and polished the code a little.

Related ticket: DATACMNS-879.
2016-07-14 15:52:32 +02:00
Christoph Strobl
9361fc3c71 DATAMONGO-1455, DATAMONGO-1456 - Polishing.
Use static imports for org.junit.Assert and org.hamcrest.core. Fix spelling.

Original pull request: #375.
2016-07-12 15:25:49 +02:00
Christoph Strobl
ac55f5e77f DATAMONGO-1455, DATAMONGO-1456 - Add support for $caseSensitive and $diacriticSensitive to $text.
We added methods to set values for $caseSensitive and $diacriticSensitive when using TextCriteria. Both operators are optional and will not be used until explicitly set.

    // { "$text" : { "$search" : "coffee" , "$caseSensitive" : true } }
    TextCriteria.forDefaultLanguage().matching("coffee").caseSensitive(true);

    // { "$text" : { "$search" : "coffee" , "$diacriticSensitive" : true } }
    TextCriteria.forDefaultLanguage().matching("coffee").diacriticSensitive(true);

Original pull request: #375.
2016-07-12 15:25:49 +02:00
Christoph Strobl
116baf9a92 DATAMONGO-832 - Polishing
Moved newly introduced types into order. Added missing @since tag and additional test.
Updated reference documentation for update operators and added $slice operator to "what’s new" section.

Original Pull Request: #374
2016-07-11 10:02:22 +02:00
Mark Paluch
026dce2612 DATAMONGO-832 - Add support for $slice in Update.push.
We now support $slice in Update operations via the PushOperatorBuilder.

    new Update().push("key").slice(5).each(Arrays.asList("one", "two", "three"));

Original Pull Request: #374
2016-07-11 09:49:45 +02:00
Mark Paluch
eae32be568 DATAMONGO-1457 - Polishing.
Add missing Javadoc to size operator. Mention Array Aggregation Operators in reference docs. Fix typos in reference docs.

Original pull request: #372.
2016-07-08 10:34:08 +02:00
Christoph Strobl
9d51ea4c01 DATAMONGO-1457 - Add support for $slice in aggregation.
We now support $slice in aggregation projections via the ProjectionOperationBuilder.

    Aggregation.project().and("field").slice(10, 20)

Original pull request: #372.
2016-07-08 10:08:29 +02:00
Mark Paluch
f4a5482005 DATAMONGO-1418 - Polishing.
Added ticket references. Simplified code.

Original pull request: #361.
2016-06-24 15:32:58 +02:00
Nikolai Bogdanov
0db36aff8f DATAMONGO-1418 - Add support for $out operator in aggregations.
We now support the $out operator via Aggregation.out(…) to store aggregation results in a collection. Using the $out operator returns an empty list in AggregationResults.
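
A short sketch (collection and field names invented; assumes static imports of Aggregation.* and Criteria.where):

    // results are written to the "activeUsers" collection; the returned mapped results are empty
    newAggregation(
        match(where("active").is(true)),
        out("activeUsers"));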

Original pull request: #361.
CLA: 172720160413124705 (Nikolai Bogdanov)
2016-06-24 15:28:45 +02:00
Christoph Strobl
ba8ece334a DATAMONGO-1453 - Fix GeoJson conversion when coordinates are Integers.
We now use Number instead of Double for reading "coordinates" from GeoJSON representations.

Original pull request: #369.
2016-06-24 14:04:58 +02:00
Oliver Gierke
a5394074c5 DATAMONGO-1410 - Updated changelog. 2016-06-15 14:31:49 +02:00
Kevin Dosey
fe5bb515b7 DATAMONGO-1449 - Switched to foreach loop in collection handling of MappingMongoConverter.
This should result in a minor to moderate performance improvement for iteration over Collections/Arrays during DBObject-to-object mapping.

Original pull request: #368.
2016-06-11 18:29:32 +02:00
Mark Paluch
c84bfbccf4 DATAMONGO-1437 - Polishing.
Renamed test. Added JavaDoc. Simplify throws declaration.

Original pull request: #367.
2016-06-02 08:55:09 +02:00
Christoph Strobl
d147f80a39 DATAMONGO-1437 - Preserve non-translatable Exception cause when lazily resolving DBRefs.
We now preserve the cause of Exceptions that cannot be translated into DataAccessExceptions when an error occurs during lazy loading of DBRefs.

Original pull request: #367.
2016-06-02 08:52:53 +02:00
Christoph Strobl
7b8dadeb74 DATAMONGO-1271 - Polishing.
Removed non-Java 6 language features, reworked and added a few tests.

Original Pull Request: #322
2016-05-27 07:53:53 +02:00
Jordi Llach Fernandez
d1251c42ca DATAMONGO-1271 - Provide lifecycle events for DBRefs.
We now publish lifecycle events when loading DBRefs.

Original Pull Request: #322
CLA: 121620150519031801 (Jordi Llach Fernandez)
2016-05-27 07:53:45 +02:00
Oliver Gierke
4140dd573f DATAMONGO-1423 - Polishing.
Original pull request: #365.
2016-05-25 17:26:31 +02:00
Christoph Strobl
0e60630393 DATAMONGO-1423 - Map keys now get registered conversions applied for Updates.
We now pipe map keys through the potentially registered conversions when mapping Updates.

Original pull request: #365.
2016-05-25 17:26:30 +02:00
Oliver Gierke
9bc35512fd DATAMONGO-1416 - Polishing.
Just use instanceOf(…) from Hamcrest's Matchers class instead of a dedicated class.

Original pull request: #362.
2016-05-24 15:57:21 +02:00
Christoph Strobl
b626c2f82b DATAMONGO-1416 - Get rid of the warnings for Atomic… type conversions.
We now use explicit converters instead of a ConverterFactory. This reduces noise in the log when registering converters.

Original pull request: #362.
2016-05-24 15:57:21 +02:00
Mark Paluch
c3e894ee8d DATAMONGO-1424 - Polishing.
Remove EndingWith from NotLike. Remove superfluous whitespace. Split combined highlighted keywords into individually highlighted ones.

Original pull request: #364.
2016-05-09 11:06:44 +02:00
Christoph Strobl
2f713bede5 DATAMONGO-1424 - Add support for NOT_LIKE.
We now support `notLike` and `isNotLike` in query derivation.
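
A sketch of a derived query using the new keyword (the repository and the Person domain type are assumed):

    interface PersonRepository extends CrudRepository<Person, String> {

      // expected to render a negated regex criteria on firstname
      List<Person> findByFirstnameNotLike(String firstname);
    }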

Original pull request: #364.
2016-05-09 11:05:44 +02:00
Mark Paluch
e03520d2fb DATAMONGO-1425 - Polishing.
Add NotContaining to documentation. Add integration test for Containing/NotContaining on collection properties.

Original pull request: #363.
2016-05-09 10:42:19 +02:00
Christoph Strobl
3829d58dc2 DATAMONGO-1425 - Fix query derivation for notContaining on String properties.
We now correctly build up the criteria for derived queries using notContaining keyword on String properties.

Original pull request: #363.
2016-05-09 10:42:15 +02:00
Mark Paluch
7b87fa9509 DATAMONGO-1412 - Fix backticks and code element highlighting.
Fixed broken highlighting using backticks followed by chars/single quotes. Convert single quote emphasis of id to backtick code fences. Add missing spaces between words and backticks.

Original Pull Request: #359
2016-05-02 13:20:29 +02:00
Mark Paluch
b2b9f3406a DATAMONGO-1412 - Document mapping rules for Java types to MongoDB representation.
Original Pull Request: #359
Related pull request: #353
Related ticket: DATAMONGO-1404
2016-05-02 13:20:22 +02:00
Mark Paluch
d610761019 DATAMONGO-1404 - Polishing.
Add author and since tags. Update license headers. Reformat code. Replace FQCN with import and simple class name. Remove final keyword in test methods. Add tests for numeric values. Update documentation.

Original pull request: #353.
2016-05-02 13:20:10 +02:00
Alexey Plotnik
8983bd26ce DATAMONGO-1404 - Add support for $min and $max update operators.
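
A sketch of the new operators on Update (field names are illustrative):

    // $min only updates the field if the given value is lower than the current one
    new Update().min("lowestScore", 42);

    // $max only updates the field if the given value is higher than the current one
    new Update().max("highestScore", 98);
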
Original pull request: #353.
CLA: 169820160330091912 (Alexey Plotnik)
2016-05-02 13:19:31 +02:00
Christoph Strobl
5485f2fcd4 DATAMONGO-1391 - Polishing.
Removed whitespace, updated Javadoc and returned early when using non-3.2 $unwind options.

Original Pull Request: #355
2016-05-02 12:56:38 +02:00
Mark Paluch
f8681fec66 DATAMONGO-1391 - Support Mongo 3.2 syntax for $unwind in aggregation.
We now support both the simple {$unwind: path} and the MongoDB 3.2 {$unwind: {…}} syntax.
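
A sketch of both variants, assuming the unwind(…) overloads introduced here (field and index names are illustrative):

    // simple syntax: { $unwind : "$items" }
    Aggregation.unwind("items");

    // MongoDB 3.2 syntax including the array index and preserveNullAndEmptyArrays
    Aggregation.unwind("items", "itemIndex", true);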

Original Pull Request: #355
2016-05-02 12:52:39 +02:00
Mark Paluch
0dd904894d DATAMONGO-1399 - Polishing.
Update since version to 1.10. Remove trailing whitespaces.

Original pull request: #352.
2016-05-02 09:44:37 +02:00
Christoph Strobl
7d70a8677e DATAMONGO-1399 - Allow adding hole to GeoJSON Polygon.
We now allow creation of GeoJsonPolygon having an outer and multiple inner rings.
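
A sketch of building a polygon with a hole, assuming the withInnerRing(…) API introduced here (coordinates are illustrative):

    List<Point> outer = Arrays.asList(
        new Point(0, 0), new Point(0, 10), new Point(10, 10), new Point(10, 0), new Point(0, 0));
    List<Point> hole = Arrays.asList(
        new Point(2, 2), new Point(2, 4), new Point(4, 4), new Point(4, 2), new Point(2, 2));

    // outer ring plus one inner ring describing the hole
    GeoJsonPolygon polygonWithHole = new GeoJsonPolygon(outer).withInnerRing(hole);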

Original pull request: #352.
2016-05-02 09:40:44 +02:00
Mark Paluch
13a52b5ac9 DATAMONGO-1403 - Add maxExecutionTimeMs alias for @Meta(maxExcecutionTime).
We added maxExecutionTimeMs as an alias for maxExcecutionTime which has been deprecated due to spelling issues.
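
A sketch of the alias in use on a repository method (the repository and Person domain type are assumed):

    interface PersonRepository extends CrudRepository<Person, String> {

      @Meta(maxExecutionTimeMs = 100)
      List<Person> findByLastname(String lastname);
    }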

Original pull request: #356.
2016-04-26 12:43:37 +02:00
Mark Paluch
f5cfcda673 DATAMONGO-1411 - Enable build on TravisCI.
We now start the MongoDB server via apt-get instead of relying on the TravisCI-managed 2.4.2 installation.
In doing so, we altered tests to just check the port and not the host part of the URIs.

Additionally, we upgraded build profiles, removed promoted snapshot versions, renamed mongo32-next to mongo32 and added a mongo33-next build profile.

Original pull request: #358
2016-04-26 10:40:29 +02:00
Raja Dilip Kolli
cf44a7105f DATAMONGO-1420 - Update version numbers in Github readme.
Bumped to the latest versions available.

Original pull request: #354
2016-04-15 08:30:32 +02:00
Oliver Gierke
0228255d2b DATAMONGO-1356 - AuditingEventListener now has an explicit order.
AuditingEventListener now has a fixed ordering of 100. This allows other listeners to be registered to be executed before or after it.
2016-04-14 22:27:46 +02:00
Oliver Gierke
50e37355d4 DATAMONGO-1419 - Removed deprecated methods of AbstractMongoEventListener. 2016-04-14 22:27:42 +02:00
Oliver Gierke
a15dababfa DATAMONGO-1408 - Updated changelog. 2016-04-06 23:14:25 +02:00
Oliver Gierke
9942451017 DATAMONGO-1405 - After release cleanups. 2016-04-06 16:37:03 +02:00
Oliver Gierke
e144c29316 DATAMONGO-1405 - Prepare next development iteration. 2016-04-06 16:36:59 +02:00
Oliver Gierke
64d4880983 DATAMONGO-1405 - Release version 1.9 GA (Hopper). 2016-04-06 16:35:59 +02:00
Oliver Gierke
47c348e03a DATAMONGO-1405 - Prepare 1.9 GA (Hopper). 2016-04-06 16:34:45 +02:00
Oliver Gierke
dea86535c1 DATAMONGO-1405 - Updated changelog. 2016-04-06 16:34:39 +02:00
Artur Konczak
eee6b62589 DATAMONGO-1407 - Updated JIRA link to point to the correct project on JIRA.
Original pull request: #357.
2016-04-05 14:13:28 +02:00
Mark Paluch
771ca8d84c DATAMONGO-1407 - Add pull request template. 2016-04-05 09:50:18 +02:00
Christoph Strobl
8f5b334951 DATAMONGO-1398 - Mention QBE and add links.
Original Pull Request: #349
2016-03-31 21:00:27 +02:00
Mark Paluch
0dc6169282 DATAMONGO-1398, DATAMONGO-1395 - Update Lifecycle Events examples in Reference Documentation.
Replace deprecated methods by the supported API.

Original Pull Request: #349
2016-03-31 20:59:53 +02:00
Mark Paluch
abe78f0428 DATAMONGO-1398 - Updated what's new section and general improvements.
Update Spring Framework documentation links to point always to the Spring Framework version specified in the pom, where possible. Mention $lookup in aggregation.

Original Pull Request: #349
2016-03-31 20:59:16 +02:00
Christoph Strobl
9930ec2d19 DATAMONGO-1401 - Fix error when updating entity with both GeoJsonPoint and Version property.
We now ignore property reference exceptions when resolving field values that have already been mapped, e.g. in case of an already mapped update extracted from an actual domain type instance.

Original pull request: #351.
2016-03-31 09:15:29 +02:00
Oliver Gierke
83d7f4477e DATAMONGO-1392 - After release cleanups. 2016-03-18 11:16:07 +01:00
Oliver Gierke
18c3704c2e DATAMONGO-1392 - Prepare next development iteration. 2016-03-18 11:15:51 +01:00
Oliver Gierke
bef581caa5 DATAMONGO-1392 - Release version 1.9 RC1 (Hopper). 2016-03-18 11:15:00 +01:00
Oliver Gierke
2f0abe0604 DATAMONGO-1392 - Prepare 1.9 RC1 (Hopper). 2016-03-18 11:06:57 +01:00
Oliver Gierke
4235b44c47 DATAMONGO-1392 - Updated changelog. 2016-03-18 11:06:52 +01:00
Oliver Gierke
f318185ad0 DATAMONGO-1400 - Adapt to rename of Tuple to Pair in Spring Data Commons.
Related tickets: DATACMNS-818.
2016-03-18 09:59:06 +01:00
Oliver Gierke
43b496287c DATAMONGO-1245 - Final tweaks to Query by Example documentation.
Tweaked section anchor to match conventions. Use level offsets to accommodate changes in Spring Data Commons.
2016-03-18 09:29:46 +01:00
Oliver Gierke
9d0c8ecdc3 DATAMONGO-1245 - Polishing.
Adapt to API changes in Spring Data Commons.

Related tickets: DATACMNS-810.
Original pull request: #341.
2016-03-17 18:39:18 +01:00
Mark Paluch
5a78d99af0 DATAMONGO-1245 - Initial documentation for Query by Example.
Adopt changes from query by example API refactoring.

Related tickets: DATACMNS-810.
Original pull request: #341.
2016-03-17 18:39:18 +01:00
Christoph Strobl
693f5ddf6e DATAMONGO-1245 - Add support for Query By Example.
An explorative approach to QBE trying to find possibilities and limitations. We now support querying documents by providing a sample of the given object holding the values to compare. For the sake of partial matching we flatten out nested structures so we can create different queries for matching like:

{ _id : 1, nested : { value : "conflux" } }
{ _id : 1, nested.value : { "conflux" } }

This is useful when you want to search using an only partially filled nested document. String matching can be configured to wrap strings with $regex, which creates { firstname : { $regex : "^foo", $options: "i" } } when using StringMatchMode.STARTING along with the ignore-case option. DBRefs and geo structures such as Point or GeoJsonPoint are converted to their corresponding structure.

Related tickets: DATACMNS-810.
Original pull request: #341.
2016-03-17 18:39:18 +01:00
Christoph Strobl
ece655f67d DATAMONGO-1387 - Polishing.
Added a few more tests and append values if present on Query.

Original Pull Request: #345
2016-03-17 13:10:51 +01:00
John Willemin
119692c979 DATAMONGO-1387 - Fix BasicQuery getFieldsObject() inconsistency.
We changed BasicQuery to consider its parent getFieldsObject() when not given an explicit fields DBObject.

Original Pull Request: #345
CLA: 165520160303021604 (John Willemin)
2016-03-17 13:09:58 +01:00
Oliver Gierke
6068f3243a DATAMONGO-1397 - Polishing.
Switched to Slf4J-native placeholder replacement in debug logging for MongoTemplate.

Original pull request: #348.
2016-03-16 17:28:04 +01:00
Mark Paluch
a7cda2e793 DATAMONGO-1397 - Log command, entity and collection name in MongoTemplate.geoNear(…).
Original pull request: #348.
2016-03-16 17:28:03 +01:00
Oliver Gierke
2687cb85f0 DATAMONGO-1373 - Polishing.
Added method, field and annotation target to @Field annotation explicitly. Fixed copyright date ranges where needed.

Tweaked formatting in test cases.

Original pull request: #347.
Related ticket: DATACMNS-825.
2016-03-15 15:35:10 +01:00
Mark Paluch
b2ce1700d2 DATAMONGO-1373 - Allow usage of @AliasFor with mapping and indexing annotations.
We now support @AliasFor to build composed annotations with: @Document, @Id, @Field, @Indexed, @CompoundIndex, @GeoSpatialIndexed, @TextIndexed, @Query, and @Meta. Added missing license header to @Field.

Original pull request: #347.
Related tickets: DATACMNS-825.
2016-03-15 15:20:45 +01:00
Christoph Strobl
0b634f8340 DATAMONGO-1373 - Allow usage of @AliasFor for composed @Document annotation.
We now resolve aliased attribute values when reading @Document on entity types. This allows creation of composed annotations like:

@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.TYPE })
@Document
static @interface ComposedDocumentAnnotation {

  @AliasFor(annotation = Document.class, attribute = "collection")
  String name() default "custom-collection-name";
}

Original pull request: #347.
Related issue: DATACMNS-825.
2016-03-15 15:20:22 +01:00
Mark Paluch
9a078b743f DATAMONGO-1326 - Support field inheritance for $lookup aggregation operator.
We now distinguish between aggregation operations that replace fields in the aggregation pipeline and those which inherit fields from previous operations. InheritsFieldsAggregationOperation is a nested interface of FieldsExposingAggregationOperation and serves as a marker to look up fields along the aggregation context chain. Added unit and integration tests. Mention lookup operator in docs.

Original pull request: #344.
2016-03-08 09:00:55 +01:00
Christoph Strobl
65b6576cfc DATAMONGO-1326 - Add Builder, update javadoc and remove additional interface.
Updated javadoc and formatting. Added tests and removed marker interface.

Original Pull Request: #344
2016-03-08 08:59:54 +01:00
Alessio Fachechi
78e99e6df2 DATAMONGO-1326 - Add support for $lookup to aggregation.
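
A sketch of the new operator (collection and field names are illustrative):

    // { $lookup : { from : "orders", localField : "customerId", foreignField : "_id", as : "customerOrders" } }
    Aggregation.newAggregation(
        Aggregation.lookup("orders", "customerId", "_id", "customerOrders"));
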
Original Pull Request: #344
CLA: 164120160221125037 (Alessio Fachechi)
2016-03-07 14:20:14 +01:00
Oliver Gierke
bb0a42733d DATAMONGO-1389 - Fixed test case to verify type predicting bean registration.
Related ticket: DATACMNS-821.
2016-03-01 15:02:18 +01:00
Oliver Gierke
a2ae08e263 DATAMONGO-1381 - Updated changelog. 2016-02-23 14:27:23 +01:00
Oliver Gierke
eaa9d6c7e6 DATAMONGO-1366 - After release cleanups. 2016-02-12 15:43:57 +01:00
Oliver Gierke
8900695153 DATAMONGO-1366 - Prepare next development iteration. 2016-02-12 15:43:39 +01:00
Oliver Gierke
bfe548d573 DATAMONGO-1366 - Release version 1.9 M1 (Hopper). 2016-02-12 15:42:47 +01:00
Oliver Gierke
7ab4002771 DATAMONGO-1366 - Prepare 1.9 M1 (Hopper). 2016-02-12 15:36:19 +01:00
Oliver Gierke
6eace856aa DATAMONGO-1366 - Updated changelog. 2016-02-12 15:36:11 +01:00
Oliver Gierke
f10e5a19c5 DATAMONGO-1345 - Finalized application of projections in query methods.
Refactored the query execution out of AbstractMongoQuery into MongoQueryExecution. Made sure the streaming execution lazily applies the projections, too.

Added a DtoInstantiatingConverter to be able to copy data from created entities into DTOs, as we cannot hand the DTO type into the MongoTemplate execution in the first place since it's currently going to be used for the query mapping.
2016-02-12 14:14:54 +01:00
Uxío Fuentefría
90a4a63776 DATAMONGO-1378 - Update reference documentation: Change Query.sort() to Query.with(Sort sort).
sort() is not a method of Query; to sort a query you have to use with(Sort).

Original pull request: #320.
CLA: 162620160211060822 (Uxío Fuentefría)
2016-02-11 20:22:36 +01:00
Oliver Gierke
0f14e35ba3 DATAMONGO-1288 - Polishing.
Some JavaDoc here and there. Moved converter factory registration into MongoConverters.getConvertersToRegister() for consistency with others.

Original pull request: #331.
2016-02-11 14:08:31 +01:00
Christoph Strobl
ad0c4207d6 DATAMONGO-1288 - Add conversion for AtomicInteger & AtomicLong.
We now convert AtomicInteger and AtomicLong to the required Number target type by calling get() followed by the actual conversion. This allows directly using these types, e.g. as part of an Update: new Update().set("intValue", new AtomicInteger(10));

Original pull request: #331.
2016-02-11 14:08:19 +01:00
Mark Paluch
97da43645a DATAMONGO-1380 - Polishing.
Add credits, use message formatting instead of string concatenation.

Original pull request: #317.
2016-02-11 12:02:09 +01:00
Alex Vengrovsk
42b7c42617 DATAMONGO-1380 - Improve logging in MongoChangeSetPersister.
Added a check whether debug logging is enabled in the getPersistentId method.

Original pull request: #317.
2016-02-11 11:53:15 +01:00
Timo Kockert
bd81e25e6b DATAMONGO-1270 - Update documentation to reflect deprecation of MongoFactoryBean.
Original pull request: #315.
2016-02-10 15:57:15 +01:00
Thomas Dudouet
debe6aa649 DATAMONGO-1377 - Update JavaDoc: Use @EnableMongoRepositories instead of @EnableJpaRepositories.
The JavaDoc description references the EnableJpaRepositories annotation instead of the EnableMongoRepositories annotation.

Original pull request: #340.
2016-02-10 15:13:24 +01:00
Oliver Gierke
6f433902f0 DATAMONGO-1376 - Moved away from SimpleTypeInformationMapper.INSTANCE.
Related tickets: DATACMNS-815.
2016-02-09 14:31:05 +01:00
Martin Macko
ba902e7f8e DATAMONGO-1375 - Fix typo in MongoOperations JavaDoc.
Original pull request: #343.
2016-02-09 11:29:30 +01:00
Oliver Gierke
7e8ec21684 DATAMONGO-1372 - Polishing.
Tiny formattings, collapsed if-clause into ternary operation.

Original pull request: #342.
2016-02-04 15:19:51 +01:00
Christoph Strobl
b7131b7efc DATAMONGO-1372 - Add and register Converters for java.util.Currency.
We now support conversion from currency into ISO 4217 String and back.

Original pull request: #342.
2016-02-04 15:19:48 +01:00
Oliver Gierke
ace99c3464 DATAMONGO-1371 - Added code of conduct.
Moved to Asciidoctor for the CONTRIBUTING file.
2016-02-02 09:42:48 +01:00
Oliver Gierke
83fc5bc113 DATAMONGO-1366 - Declare Artifactory Maven plugin to be able to distribute build artifacts. 2016-01-28 14:51:55 +01:00
Oliver Gierke
160de0adf6 DATAMONGO-1361 - Guard command result statistics evaluation against changes in MongoDB 3.2.
MongoDB 3.2 RC1 decided to remove fields from statistics JSON documents returned in case no result was found for a geo near query. The avgDistance field is unfortunately missing as of that version.

Introduced a value object to encapsulate the mitigation behavior and make client code unaware of that.
2016-01-21 12:45:10 +01:00
Oliver Gierke
b4753f3a83 DATAMONGO-1360 - Query instances contained in a Near Query now get mapped during geoNear(…) execution.
A Query instance which might be part of a NearQuery definition is now passed through the QueryMapper to make sure complex types contained in it or even in more general types that have custom conversions registered are mapped correctly before the near command is actually executed.
2016-01-20 13:10:50 +01:00
Oliver Gierke
bce6e2c78c DATAMONGO-1163 - Polishing.
Fixed indentation changes in IndexingIntegrationTests. Separated test cases from each other.

Original pull request: #325.
2015-12-27 12:05:19 +01:00
Jordi Llach
b5ea0eccd2 DATAMONGO-1163 - Allow usage of @Indexed as meta-annotation.
@Indexed can now be used as meta-annotation so that user annotations can be annotated with it and the index creation facilities still pick up the configuration information.
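
A sketch of a composed annotation carrying @Indexed (the annotation and property names are illustrative):

    @Indexed
    @Retention(RetentionPolicy.RUNTIME)
    @Target({ ElementType.FIELD })
    @interface IndexedKey {
    }

    class Customer {

      @IndexedKey
      String customerNumber;
    }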

Original pull request: #325.
2015-12-27 12:05:17 +01:00
Oliver Gierke
87865b9761 DATAMONGO-1355 - Updated changelog. 2015-12-18 10:55:56 +01:00
Christoph Strobl
13fa4703c0 DATAMONGO-1334 - Map-reduce operations now honor MapReduceOptions.limit.
We now also consider the limit set via MapReduceOptions when executing mapReduce operations via MongoTemplate.mapReduce(…).

MapReduceOptions.limit(…) supersedes a potential limit set via the Query itself. This change also allows defining a limit even when no explicit Query is used.
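
A sketch of passing the limit via the options, assuming the new limit(…) option (collection, script locations and result type are illustrative):

    MapReduceResults<ValueObject> results = mongoTemplate.mapReduce("sourceCollection",
        "classpath:map.js", "classpath:reduce.js",
        MapReduceOptions.options().limit(100), ValueObject.class);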

Original pull request: #338.
2015-12-16 11:57:44 +01:00
Christoph Strobl
5a21e00322 DATAMONGO-1317 - Assert compatibility with mongo-java-driver 3.2.
We now do a defensive check against the actual WObject of WriteConcern to avoid the IllegalStateException raised by the new java-driver in case _w is null or not an Integer. This allows us to run against recent 2.13, 2.14, 3.0, 3.1 and the latest 3.2.0.

Original pull request: #337.
2015-12-16 11:49:01 +01:00
Oliver Gierke
3feed2bc5a DATAMONGO-1289 - Polishing.
Some additional JavaDoc and comment removal.

Original pull request: #333.
2015-12-16 11:38:31 +01:00
Christoph Strobl
501b9501e0 DATAMONGO-1289 - MappingMongoEntityInformation now uses fallback identifier type derived from repository declaration.
We now use RepositoryMetadata.getIdType() to provide a fallback identifier type in case the entity information does not hold an id property, which is perfectly valid for MongoDB.

Original pull request: #333.
2015-12-16 11:37:51 +01:00
Oliver Gierke
727271e68c DATAMONGO-1345 - Added support for projections on repository methods.
Related tickets: DATACMNS-89.
2015-12-14 19:56:42 +01:00
Christoph Strobl
63a619dddf DATAMONGO-1349 - Upgrade to mongo-java-driver 2.14.0. 2015-12-11 10:38:36 +01:00
Oliver Gierke
113566a6ab DATAMONGO-1346 - Update.pullAll(…) now registers multiple invocations correctly.
Previously calling the method multiple times overrode the result of previous calls. We now use addMultiFieldOperation(…) to make sure already existing values are kept.
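
A sketch of the now-supported chaining (field names are illustrative):

    // both invocations now end up in the resulting update document
    Update update = new Update()
        .pullAll("authors", new Object[] { "Jon" })
        .pullAll("reviewers", new Object[] { "Arya" });
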
2015-12-10 15:38:40 +01:00
Oliver Gierke
7862841b48 DATAMONGO-934 - Polishing.
Polished JavaDoc and implementation as well as tests. Extracted Tuple to Spring Data Commons. Moved exception translation into MongoExceptionTranslator.

Changed implementation of DefaultBulkOperations to consider the WriteConcernResolver of the underlying MongoTemplate to avoid exposing the WriteConcern on execution.

Original pull request: #327.
Related tickets: DATACMNS-790.
2015-11-26 17:56:39 +01:00
Tobias Trelle
fe6cbaa03d DATAMONGO-934 - Added support for bulk operations.
Introduced BulkOperations that can be obtained via MongoOperations to register operations to be eventually executed in a bulk.
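
A sketch of the fluent API (the Person type and its constructor are assumed; query(…)/where(…) are the usual static helpers from Query and Criteria):

    BulkWriteResult result = mongoOperations
        .bulkOps(BulkOperations.BulkMode.UNORDERED, Person.class)
        .insert(new Person("Jon", "Snow"))
        .updateOne(query(where("lastname").is("Stark")), new Update().set("region", "North"))
        .execute();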

Original pull request: #327.
2015-11-26 17:56:35 +01:00
Oliver Gierke
9ef1fc7304 DATAMONGO-1337 - Another round of polishes on SonarQube complaints. 2015-11-26 12:27:22 +01:00
Oliver Gierke
cf3a9d3ced DATAMONGO-1337 - Reverted making some of the loggers static.
The logger instance in AbstractMonitor is supposed to pick up the type of the actual implementation class and thus cannot be static.

Related pull request: #336.
2015-11-26 12:00:40 +01:00
Christian Ivan
1d1c80db7b DATAMONGO-1337 - General code quality improvements.
A round of code polish regarding the PMD and Squid rules referred to in the ticket.

Original pull request: #336.
2015-11-26 11:53:06 +01:00
Oliver Gierke
eeb37e9104 DATAMONGO-1342 - Fixed potential NullPointerException in MongoQueryCreator.
MongoQueryCreator.nextAsArray(…) now returns a single element object array in case null is handed to the method. It previously failed with a NullPointerException.
2015-11-25 17:23:15 +01:00
Oliver Gierke
18bf0daee7 DATAMONGO-1335 - DBObjectAccessor now writes all nested fields correctly.
Previously, DBObjectAccessor always reset the in-between values when traversing nested properties. This caused previously written values to be erased when subsequent values were written. We now reuse an already existing BasicDBObject if present.
2015-11-25 16:06:52 +01:00
Oliver Gierke
1e9189aee7 DATAMONGO-1341 - Moved MongoDbErrorCodes into utility package.
This resolves a package cycle introduced by MongoPersistentEntityIndexCreator referring to error codes now.

Updated Sonargraph architecture description along the way.
2015-11-25 15:36:22 +01:00
Oliver Gierke
95f6dfafdd DATAMONGO-1287 - Optimizations in reading associations as constructor arguments.
As per the discussion on the ticket, we now omit looking up the value for an association used as a constructor argument, as the simple check whether the currently handled property is a constructor argument is sufficient to potentially skip handling the value.

Related pull requests: #335, #322.
2015-11-23 11:13:07 +01:00
Christoph Strobl
bedaae8a90 DATAMONGO-1287 - Fix double fetching for lazy DbRefs used in entity constructor.
We now check whether properties that might already have been resolved are used as constructor arguments before setting the actual value. This prevents turning already eagerly fetched DBRefs back into LazyLoadingProxies.

Original pull request: #335.
Related pull request: #322.
2015-11-20 13:39:00 +01:00
Oliver Gierke
7bfa3fe7fd DATAMONGO-1290 - Polishing.
Removed a level of indentation from ExpressionEvaluationParameterBinder.replacePlaceholders(…). Polished JavaDoc.

Original pull request: #332.
2015-11-20 13:20:11 +01:00
Christoph Strobl
143b0b73b9 DATAMONGO-1290 - Move parameter binding for String based queries.
Moved parameter binding for string based queries into separate class.

Original pull request: #332.
2015-11-20 13:20:09 +01:00
Christoph Strobl
cbfc46270e DATAMONGO-1290 - Convert byte[] parameter in @Query to $binary representation.
We now convert non-quoted binary parameters to the $binary format. This allows using them along with the @Query annotation.
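
A sketch of a String-based query with a binary parameter (repository, entity and field names are illustrative):

    interface SampleRepository extends CrudRepository<SampleEntity, String> {

      // the byte[] argument is rendered as a { $binary : …, $type : … } document
      @Query("{ 'payload' : ?0 }")
      List<SampleEntity> findByPayload(byte[] payload);
    }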

Original pull request: #332.
2015-11-20 13:06:22 +01:00
Christoph Strobl
b31efb46ec DATAMONGO-1204 - ObjectPath now uses raw id values to track resolved objects.
We now use the native id within ObjectPath for checking whether a DBRef has already been resolved. This is required as the MongoDB Java driver 3 generation changed ObjectId.equals(…), which now performs a type check.

Original pull request: #334.
Related pull request: #288.
2015-11-20 12:47:52 +01:00
Oliver Gierke
ef3477098f DATAMONGO-1324 - Register ObjectId converters unconditionally to make sure they really get used.
The presence of ObjectToObjectConverter in a DefaultConversionService caused the guard that registers converters for ObjectIds in AbstractMongoConverter not to trigger the registration. This in turn caused ObjectId conversions to be executed via reflection instead of straightforward method calls and thus a drop in performance for such operations.

We now unconditionally register the converters to make sure they really get applied.

Related tickets: SPR-13703.
2015-11-19 12:02:41 +01:00
Oliver Gierke
9dce117555 DATAMONGO-1238 - Upgraded to Querydsl 4. 2015-11-17 13:42:38 +01:00
Oliver Gierke
e66e1e0502 DATAMONGO-1316 - Updated changelog. 2015-11-16 08:31:45 +01:00
Christoph Strobl
19e1e9daeb DATAMONGO-1297 - Allow @Indexed annotation on DBRef.
We now also treat references as a source of a potential index. This enforces index creation for objects like:

@Document
class WithDbRef {

  @Indexed
  @DBRef
  ReferencedObject reference;
}

Combining @TextIndexed or @GeoSpatialIndexed with a DBRef will lead to a MappingException.

Original pull request: #329.
2015-11-13 17:54:42 +01:00
Christoph Strobl
ec8a948f3f DATAMONGO-1302 - Allow ConverterFactory to be registered in CustomConversions.
We now allow registration of ConverterFactory within CustomConversions by inspecting the generic type arguments for determining the conversion source and target types.

Original pull request: #330.
2015-11-10 14:37:02 +01:00
Ilho Ahn
38fc7641a0 DATAMONGO-1314 - Fix typo in Exception message.
Original Pull Request: #265
2015-11-09 20:37:26 +01:00
Christoph Strobl
ddc3925659 DATAMONGO-1291 - Made @Document usable as meta-annotation.
We now use Spring's AnnotationUtils.findAnnotation(…) for @Document lookup which enables the full power of Spring 4.2's composable annotations.

Original pull request: #326.
2015-11-06 14:34:43 +01:00
Christoph Strobl
f8416edf8f DATAMONGO-1293 - Polishing.
Move configuration parsing error into method actually responsible for reading uri/client-uri attributes.

Original Pull Request: #328
2015-10-29 12:47:16 +01:00
Viktor Khoroshko
4f94f37ce8 DATAMONGO-1293 - Allowed id attribute in addition to client-uri attribute in MongoDbFactoryParser.
We now allow write-concern and id to be configured along with the uri or client-uri attribute of <mongo:db-factory />.

Original Pull Request: #328
CLA: 140120150929074128 (Viktor Khoroshko)
2015-10-29 12:47:08 +01:00
Oliver Gierke
528de58418 DATAMONGO-1276 - Fixed potential NullPointerExceptions in MongoTemplate.
Triggering data access exception translation could lead to NullPointerException in cases where PersistenceExceptionTranslator returned null because the original exception couldn't be translated and the result was directly used from a throw clause.

This is now fixed by consistently using the potentiallyConvertRuntimeException(…) method, which was made static to be able to refer to it from nested static classes.

Refactored Scanner usage to actually close the Scanner instance to prevent a resource leak.
2015-10-21 15:04:12 +02:00
Oliver Gierke
e6ea34aed8 DATAMONGO-1304 - Updated changelog. 2015-10-14 13:46:21 +02:00
Oliver Gierke
f171938b00 DATAMONGO-1303 - Added build profiles for MongoDB Java driver 3.1 and 3.2 snapshots.
Added new build profiles mongod31 and mongo32-next to build the project against the latest MongoDB 3.1 driver as well as upcoming snapshots of the 3.2 generation.
2015-10-12 15:41:30 +02:00
Oliver Gierke
7b27368d2d DATAMONGO-1282 - After release cleanups. 2015-09-01 12:11:02 +02:00
Spring Buildmaster
f754df51bc DATAMONGO-1282 - Prepare next development iteration. 2015-09-01 02:12:29 -07:00
Spring Buildmaster
77dce53c7a DATAMONGO-1282 - Release version 1.8.0.RELEASE (Gosling GA). 2015-09-01 02:12:26 -07:00
Oliver Gierke
73f268e7c4 DATAMONGO-1282 - Prepare 1.8.0.RELEASE (Gosling GA). 2015-09-01 09:44:21 +02:00
Oliver Gierke
075d7d8131 DATAMONGO-1282 - Updated changelog. 2015-09-01 09:44:11 +02:00
Christoph Strobl
206337044a DATAMONGO-1280 - Updated "What’s new" section in reference documentation.
Original pull request: #319.
2015-08-31 12:55:30 +02:00
Christoph Strobl
55b44ff7aa DATAMONGO-1275 - Fixed broken links in reference documentation.
Original pull request: #318.
2015-08-22 13:16:49 +02:00
Christoph Strobl
ae48639ae9 DATAMONGO-1275 - Added documentation for optimistic locking.
Original pull request: #318.
2015-08-22 13:16:45 +02:00
Oliver Gierke
6b5e78f810 DATAMONGO-1256 - Polishing.
Minor Javadoc polishing.

Original pull request: #316.
2015-08-07 14:04:49 +02:00
Christoph Strobl
3e485e0a88 DATAMONGO-1256 - MongoMappingEvents now expose the collection name they're issued for.
We now directly expose the collection name via MongoMappingEvent.getCollectionName(). Therefore we added new constructors to all the events, deprecating the previous ones. 

Several overloads have been added to MongoEventListener, deprecating the previous API. We’ll call the deprecated methods from the new ones until their removal.

Original pull request: #316.
2015-08-07 14:04:47 +02:00
Oliver Gierke
335c78f908 DATAMONGO-1269 - Polishing.
Original pull request: #314.
2015-08-06 11:00:36 +02:00
Christoph Strobl
b103e4eaf6 DATAMONGO-1269 - Retain position parameter in property path.
We now retain position parameters in paths used in queries when mapping the field name. This allows mapping "list.1.name" to the name property of the element at index 1 in the list.
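
A sketch of a query keeping the positional segment intact (field names are illustrative):

    // maps to { "grades.1.subject" : "math" } instead of dropping the index segment
    Query query = Query.query(Criteria.where("grades.1.subject").is("math"));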

The change also fixes a glitch in mapping java.util.Map like structures having numeric keys.

Original pull request: #314.
2015-08-06 11:00:36 +02:00
Oliver Gierke
c4a6c63d23 DATAMONGO-1268 - After release cleanups. 2015-08-04 14:09:20 +02:00
Spring Buildmaster
4a4f10f97b DATAMONGO-1268 - Prepare next development iteration. 2015-08-04 04:37:14 -07:00
Spring Buildmaster
a5712daab7 DATAMONGO-1268 - Release version 1.8.0.RC1 (Gosling RC1). 2015-08-04 04:37:12 -07:00
Oliver Gierke
28cb1ef106 DATAMONGO-1268 - Prepare 1.8.0.RC1 (Gosling RC1). 2015-08-04 13:08:49 +02:00
Oliver Gierke
0d99a3e527 DATAMONGO-1268 - Updated changelog. 2015-08-04 13:08:49 +02:00
Christoph Strobl
9da43263ce DATAMONGO-1263 - Index resolver considers generic type argument of collection elements.
We now consider the potential generic type argument of collection elements. 
Prior to this change an index within List<GenericWrapper<ConcreteWithIndex>> would not have been resolved.

Original pull request: #312.
2015-08-04 08:48:57 +02:00
Oliver Gierke
784e199068 DATAMONGO-1266 - Fixed domain type lookup for methods returning primitves.
If a repository query method returned a primitive, that primitive was exposed as the domain type, which e.g. caused deleteBy…(…) methods returning void to fail.

We now shortcut the MongoEntityMetadata lookup in MongoQueryMethod to use the repository's domain type if a primitive or wrapper is returned.
2015-08-03 11:53:10 +02:00
Oliver Gierke
1ffee802c0 DATAMONGO-1261 - Updated changelog. 2015-07-28 16:42:58 +02:00
Christoph Strobl
6f0ac7f0c2 DATAMONGO-1254 - Grouping after projection in aggregation now uses correct aliased field name.
We now push the aliased field name down the aggregation pipeline for projections including operations. This allows referencing them in a later stage. Prior to this change the field reference was potentially resolved to the target field of the operation, which did not result in an error but led to incorrect results.

Original pull request: #311.
2015-07-27 14:15:33 +02:00
Christoph Strobl
941d4d8985 DATAMONGO-1260 - Prevent accidental authentication misconfiguration on SimpleMongoDbFactory.
We now reject configuration using MongoClient along with UserCredentials in SimpleMongoDbFactory. This move favors the native authentication mechanism provided via MongoCredential.

<mongo:mongo-client id="mongo-client-with-credentials" credentials="jon:warg@snow?uri.authMechanism=PLAIN" />

Original pull request: #309.
2015-07-27 14:08:42 +02:00
Oliver Gierke
44c76d8ffb DATAMONGO-1257 - We now hint to credential quoting from the XSD.
The namespace XSD now mentions the capability of quoting more complex credentials in case they validly contain a comma.
2015-07-27 13:47:11 +02:00
Oliver Gierke
df9a9f5fb6 DATAMONGO-1257 - Polishing.
Made internal helper methods in MongoCredentialPropertyEditor static where possible.

Original pull request: #310.
2015-07-24 18:40:57 +02:00
Christoph Strobl
bebd0fa0e6 DATAMONGO-1257 - <mongo:mongo-client /> element now supports usernames with a comma.
We now allow grouping credentials by enclosing them in single quotes like this:

credentials='CN=myName,OU=myOrgUnit,O=myOrg,L=myLocality,ST=myState,C=myCountry?uri.authMechanism=MONGODB-X509'

We also changed the required argument checks to be more authentication-mechanism specific, which means the pattern is now username[:password@database][?options].

Original pull request: #310.
2015-07-24 18:40:56 +02:00
Oliver Gierke
594e90789d DATAMONGO-1244 - Polishing.
Minor reformatting and extracted a method to improve digestibility.

Original pull request: #306.
2015-07-08 10:18:35 +02:00
Thomas Darimont
f2ab42cb80 DATAMONGO-1244 - Improved handling of expression parameters in StringBasedMongoQuery.
Replaced regex-based parsing of dynamic expression-based parameters with custom parsing to make sure we also support complex nested expression objects.
Previously we only supported simple named or positional expressions. Since MongoDB's JSON-based query language uses deeply nested objects to express queries, we needed to improve the handling here.

Manual parsing is tedious and more verbose than regex based parsing but it gives us more control over the whole parsing process.

We also dynamically adjust the quoting so that we only output quoted parameters if necessary.

This enables expressing complex filtering queries that use Spring Security constructs like:
```
@Query("{id: ?#{ hasRole('ROLE_ADMIN') ? {$exists:true} : principal.id}}")
List<User> findAllForCurrentUserById();
```

Original pull request: #306.
2015-07-08 10:18:27 +02:00
Christoph Strobl
3224fa8ce7 DATAMONGO-1251 - Fixed potential NullPointerException in UpdateMapper.
We now explicitly handle the case where the source object for which a type hint needs to be calculated is null.
2015-07-07 09:57:46 +02:00
Oliver Gierke
ce156c1344 DATAMONGO-1250 - Fixed inline code formatting in reference docs. 2015-07-04 19:07:09 +02:00
Oliver Gierke
434e553022 DATAMONGO-1250 - Fixed accidental duplicate invocation of value conversion in UpdateMapper.
UpdateMapper.getMappedObjectForField(…) invoked the very same method of the super class but handed in an already mapped value, so that value conversion was invoked twice.

This was especially problematic in cases where a dedicated converter had been registered for an object that is already a Mongo-storable one (e.g. an enum-to-string converter and back) without indicating which of the two converters is the reading or the writing one. This basically caused the source value to be converted back and forth during the update mapping, creating the impression that the value wasn't converted at all.

This is now fixed by removing the superfluous mapping.
2015-07-04 19:00:21 +02:00
Oliver Gierke
de5b5ee4b0 DATAMONGO-1246 - Updated changelog. 2015-07-01 10:00:16 +02:00
Oliver Gierke
60636bf56d DATAMONGO-1247 - Updated changelog. 2015-07-01 07:48:31 +02:00
Oliver Gierke
1ca71f93e9 DATAMONGO-1248 - Updated changelog. 2015-06-30 13:58:37 +02:00
Oliver Gierke
63ff39bed6 DATAMONGO-1236 - Polishing.
Removed the creation of a BasicMongoPersistentEntity in favor of always handing ClassTypeInformation.OBJECT into the converter in case no entity can be found.

This makes sure type information is written for updates on properties of type Object (which essentially leads to no PersistentEntity being available).

Original pull request: #301.
2015-06-30 09:54:53 +02:00
Christoph Strobl
cb0b9604d4 DATAMONGO-1236 - Updates now include type hints correctly.
We now use property type information when mapping fields affected by an update in case we do not have proper entity information within the context. This allows more precise type resolution required for determining the need to write type hints for a given property.

Original pull request: #301.
2015-06-30 09:54:53 +02:00
Christoph Strobl
1dbe3b62d7 DATAMONGO-1125 - Improve exception message for index creation errors.
We now use MongoExceptionTranslator to potentially convert exceptions during index creation into Spring's DataAccessException hierarchy. In case we encounter an error code indicating a DataIntegrityViolation we try to fetch existing index data and append it to the exception's message.

Original pull request: #302.
2015-06-24 20:28:23 +02:00
Christoph Strobl
5c0707d221 DATAMONGO-1232 - IgnoreCase in criteria now escapes the query.
We now quote the original criteria before actually wrapping it inside a regular expression for case-insensitive search. This applies not only to case-insensitive is, startsWith and endsWith criteria but also to those using like. In that case we quote the part between the leading and trailing wildcard if required.

Original pull request: #301.
2015-06-22 12:50:05 +02:00
Christoph Strobl
c4ffc37dd5 DATAMONGO-1166 - ReadPreference is now used for aggregations.
We now use MongoTemplate.readPreference(…) when executing commands such as geoNear(…) and aggregate(…).

Original pull request: #303.
2015-06-22 08:21:23 +02:00
Christoph Strobl
aaf93b0f6f DATAMONGO-1157 - Throw meaningful exception when @DbRef is used with unsupported types.
We now eagerly check DBRef properties for invalid definitions such as a final class or an array. In that case we throw a MappingException when verify is called.
2015-06-19 15:54:19 +02:00
Thomas Darimont
23eab1e84f DATAMONGO-1242 - Update MongoDB Java driver to 3.0.2 in mongo3 profile.
Update mongo driver.

Original pull request: #304.
2015-06-19 15:37:47 +02:00
Oliver Gierke
218f32e552 DATAMONGO-1229 - Fixed application of ignore case flag on nested properties.
Previously we tried to apply the ignore case settings found in the PartTree to the root PropertyPath we handle in MongoQueryCreator.create(). This is now changed to work on the leaf property of the PropertyPath.
2015-06-05 06:49:03 +02:00
Eddú Meléndez
62fbe4d08c DATAMONGO-1234 - Fix typos in JavaDoc. 2015-06-05 06:37:22 +02:00
Oliver Gierke
41ffd00619 DATAMONGO-1228 - After release cleanups. 2015-06-02 11:58:11 +02:00
Spring Buildmaster
98b9a604cf DATAMONGO-1228 - Prepare next development iteration. 2015-06-02 01:29:04 -07:00
Spring Buildmaster
01468b640a DATAMONGO-1228 - Release version 1.8.0.M1 (Gosling M1). 2015-06-02 01:29:01 -07:00
Oliver Gierke
4d96b036a2 DATAMONGO-1228 - Prepare 1.8.0.M1 (Gosling M1). 2015-06-02 09:29:53 +02:00
Oliver Gierke
2d1ac15e24 DATAMONGO-1228 - Updated changelog. 2015-06-02 08:24:47 +02:00
Oliver Gierke
2c27e8576f DATAMONGO-990 - Polishing.
Removed EvaluationExpressionContext from all AbstractMongoQuery implementations that don't actually need it and from AbstractMongoQuery itself, too. Cleaned up test cases after that.

Moved SpEL related tests into AbstractPersonRepositoryIntegrationTests to make sure they're executed for all sub-types. JavaDoc and assertion polishes.

Original pull request: #285.
2015-06-01 17:27:58 +02:00
Thomas Darimont
67f638d953 DATAMONGO-990 - Add support for SpEL expressions in @Query.
Ported and adapted support for SpEL expressions in @Query annotations from Spring Data JPA. StringBasedMongoQuery can now evaluate SpEL fragments in queries with the help of the given EvaluationContextProvider. Introduced EvaluationContextProvider to AbstractMongoQuery. Exposed access to actual parameter values in MongoParameterAccessor.
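
A sketch of a SpEL fragment referencing a method argument (repository and field names are illustrative):

    interface PersonRepository extends CrudRepository<Person, String> {

      // ?#{[0]} resolves to the first method argument via SpEL
      @Query("{ 'lastname' : ?#{[0]} }")
      List<Person> findByQueryWithExpression(String lastname);
    }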

Original pull request: #285.
2015-06-01 17:27:58 +02:00
Oliver Gierke
ea5bd5f7d3 DATAMONGO-1210 - Polishing.
Moved getTypeHint(…) method to Field class.

Original pull request: #292.
2015-06-01 13:21:07 +02:00
Christoph Strobl
394f695416 DATAMONGO-1210 - Fixed type hints for usage with findAndModify(…).
We now inspect the actual field type during update mapping and provide a type hint accordingly. Simple, non-interface and non-abstract types will no longer be decorated with the _class attribute. We now honor positional parameters when trying to map paths to properties. This allows more precise type mapping since we now have access to the meta model, which allows us to check whether the presence of a type hint (aka _class) is required.

We now add a special type hint indicating nested types to the converter. This allows more fine-grained removal of the _class property without the need to break the contract of MongoWriter.convertToMongoType(…).

Original pull request: #292.
2015-06-01 13:21:07 +02:00
Stefan Ganzer
e4db466ab9 DATAMONGO-1210 - Add breaking test case for findAndModify/addToSet/each.
The problem stems from the inconsistent handling of type hints: MongoTemplate.save(…) does not add a type hint, but findAndModify(…) does. The same values are then treated differently by MongoDB, depending on whether they have a type hint or not. To verify this behavior, you can manually add the (superfluous) type hint to the saved object - findAndModify will then work as expected.

Additional tests demonstrate that findAndModify(…) removes type hints from complex documents in collections that are either nested in another collection or in a document, or doesn't add them in the first place.

Original pull requests: #290, #291.
Related pull request: #292.
CLA: 119820150506013701 (Stefan Ganzer)
2015-06-01 13:21:01 +02:00
Christoph Strobl
ee04c014c9 DATAMONGO-1134 - Add support for $geoIntersects.
We now support $geoIntersects via Criteria.intersects(…) using GeoJSON types.
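
A sketch using a GeoJSON polygon (coordinates and field name are illustrative):

    GeoJsonPolygon area = new GeoJsonPolygon(Arrays.asList(
        new Point(0, 0), new Point(0, 10), new Point(10, 10), new Point(10, 0), new Point(0, 0)));

    // { "location" : { "$geoIntersects" : { "$geometry" : { … } } } }
    Query query = Query.query(Criteria.where("location").intersects(area));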

Original pull request: #295.
2015-06-01 12:36:20 +02:00
Christoph Strobl
ea84f08de8 DATAMONGO-1216 - Skip authentication via AuthDB for MongoClient.
We now skip authentication via an explicit AuthDB when requesting a DB via a MongoClient instance.

Related ticket: DATACMNS-1218
Original pull request: #296.
2015-06-01 12:10:14 +02:00
Christoph Strobl
7d8a2b2d56 DATAMONGO-1218 - Deprecate non-MongoClient related configuration options in XML namespace.
We added deprecation hints to the description sections of elements and attributes within the spring-mongo.xsd of 1.7. Also we’ve added (for 1.8) a configuration attribute to db-factory that allows setting a client-uri, creating a MongoClientURI instead of a MongoURI that will be passed on to MongoDbFactory. Just as with 'uri', 'client-uri' will not allow additional configuration options like username or password next to it.

Original pull request: #296
2015-06-01 12:10:14 +02:00
Christoph Strobl
995d1e5aac DATAMONGO-1202 - Polishing.
Moved and renamed types into test class.
Added collection cleanup and missing author information.

Original pull request: #293.
2015-06-01 09:23:35 +02:00
Thomas Darimont
3b918492ae DATAMONGO-1202 - More robust type inspection for @Indexed properties.
We now use TypeInformation in IndexResolver to look up the root PersistentEntity for resolving @Indexed properties to ensure that we retrieve the same PersistentEntity that was stored. Previously we used the Class to look up the PersistentEntity, which yielded a partially processed result.

Original pull request: #293.
2015-06-01 09:08:31 +02:00
Christoph Strobl
66b419163c DATAMONGO-1193 - Prevent unnecessary database lookups when resolving DBRefs on 2.x driver.
We now check against the used driver version before requesting a db instance from the factory. Potential improvements on the fetch strategy for MongoDB Java Driver 3 will be handled in DATAMONGO-1194.

Related tickets: DATAMONGO-1194.
Original pull request: #286.
2015-06-01 08:09:50 +02:00
Oliver Gierke
52bff39c22 DATAMONGO-1224 - Ensure Spring Framework 4.2 compatibility.
Removed obsolete generics in MongoPersistentEntityIndexCreator to make sure MappingContextEvents are delivered to the listener on Spring 4.2 which applies more strict generics handling to ApplicationEvents.

Tweaked PersonBeforeSaveListener in test code to actually reflect how an ApplicationEventListener for MongoDB would be implemented.

Removed deprecated (and now removed) usage of ConversionServiceFactory in AbstractMongoConverter. Added MongoMappingEventPublisher.publishEvent(Object) as NoOp.
2015-05-25 13:12:47 +02:00
Domenique Tilleuil
d151a13e87 DATAMONGO-1208 - Use QueryCursorPreparer for streaming in MongoTemplate.
We now use the QueryCursorPreparer to honor skip, limit, sort, etc. for streaming.

Original pull request: #297.
Polishing pull request: #298.
2015-05-21 09:00:33 +02:00
Oliver Gierke
5e7e7d3598 DATAMONGO-1221 - Removed <relativePath /> element from parent POM declaration. 2015-05-15 15:07:30 +02:00
Oliver Gierke
356248bd05 DATAMONGO-1213 - Included section on dependency management in reference documentation.
Related ticket: DATACMNS-687.
2015-05-04 14:51:34 +02:00
Oliver Gierke
73a60153f6 DATAMONGO-1211 - Adapt to changes in Spring Data Commons.
Tweaked method signatures in MongoRepositoryFactory after some signature changes in Spring Data Commons. Use the newly introduced getTargetRepositoryViaReflection(…) to obtain the repository instance via the super class.

Added repositoryBaseClass() attribute to @EnableMongoRepositories.

Related tickets: DATACMNS-542.
2015-05-02 14:49:31 +02:00
Oliver Gierke
67cf0e62a7 DATAMONGO-1207 - Fixed potential NPE in MongoTemplate.doInsertAll(…).
If a collection containing null values is handed to MongoTemplate.insertAll(…), a NullPointerException was caused by the unguarded attempt to look up the class of the element. We now explicitly handle this case and skip the element.

Some code cleanups in MongoTemplate.doInsertAll(…).
2015-05-02 14:49:31 +02:00
Oliver Gierke
21fbcc3e67 DATAMONGO-1196 - Upgraded build profiles after MongoDB 3.0 Java driver GA release. 2015-04-01 17:11:55 +02:00
Oliver Gierke
0d63ff92a0 DATAMONGO-1192 - Switched to Spring 4.1's CollectionFactory. 2015-03-31 17:16:44 +02:00
Oliver Gierke
983645e222 DATAMONGO-1189 - After release cleanups. 2015-03-23 14:00:52 +01:00
Spring Buildmaster
d2805bfa47 DATAMONGO-1189 - Prepare next development iteration. 2015-03-23 13:03:26 +01:00
Spring Buildmaster
3f16b30631 DATAMONGO-1189 - Release version 1.7.0.RELEASE (Fowler GA). 2015-03-23 13:03:07 +01:00
Oliver Gierke
8ebcbe3c5c DATAMONGO-1189 - Prepare 1.7.0.RELEASE (Fowler GA). 2015-03-23 12:34:49 +01:00
Oliver Gierke
363bed5c37 DATAMONGO-1189 - Updated changelog. 2015-03-23 12:03:56 +01:00
Christoph Strobl
1547a646dd DATAMONGO-1189 - DATAJPA-692 - Polish reference docs before release.
Add repository query return types to reference doc.
Fall back to locally available Spring Data Commons reference docs as the remote variant doesn't seem to work currently.
2015-03-23 11:17:25 +01:00
Oliver Gierke
1408d51065 DATAMONGO-979 - Polishing.
Minor JavaDoc and code style polishes.

Original pull request: #272.
2015-03-23 09:32:52 +01:00
Thomas Darimont
f5c319f18f DATAMONGO-979 - Add support for $size expression in project and group aggregation pipeline.
Introduced AggregationExpression interface to be able to represent arbitrary MongoDB expressions that can be used in projection and group operations. Supported function expressions are provided via the AggregationFunctionExpressions enum.
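
A sketch of the $size expression in a projection (field names are illustrative):

    // projects the number of elements of the "tags" array as "tagCount"
    Aggregation.project("title")
        .and(AggregationFunctionExpressions.SIZE.of(Fields.field("tags"))).as("tagCount");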

Original pull request: #272.
2015-03-23 09:32:26 +01:00
Christoph Strobl
a3c29054d0 DATAMONGO-1124 - Switch log level for cyclic reference index warnings to INFO.
Reduce log level from warn to info to avoid noise during application startup.

Original pull request: #282.
2015-03-23 09:00:24 +01:00
Oliver Gierke
01533ca34c DATAMONGO-1181 - Register GeoJsonModule with @EnableSpringDataWebSupport.
Added the necessary configuration infrastructure to automatically register the GeoJsonModule as Spring bean when @EnableSpringDataWebSupport is used. This is implemented by exposing a configuration class annotated with @SpringDataWebConfigurationMixin.

Added Spring WebMVC as test dependency to be able to write an integration test. Polished GeoJsonModule to hide the actual serializers.

Original pull request: #283.
Related ticket: DATACMNS-660.
2015-03-17 19:40:57 +01:00
Christoph Strobl
a1f6dc6db4 DATAMONGO-1181 - Add Jackson Module for GeoJSON types.
Added GeoJsonModule providing JsonDeserializers for GeoJsonPoint, GeoJsonMultiPoint, GeoJsonLineString, GeoJsonMultiLineString, GeoJsonPolygon and GeoJsonMultiPolygon.

Original pull request: #283.
2015-03-17 19:40:57 +01:00
Oliver Gierke
37d53d936d DATAMONGO-1179 - Polishing. 2015-03-10 14:29:22 +01:00
Christoph Strobl
bc0a2df653 DATAMONGO-1179 - Update reference documentation.
Added new-features section. Updated links and requirements. Added section for GeoJSON support. Updated Script Operations section. Added return type Stream to repositories section. Updated keyword list.

Original pull request: #281.
2015-03-10 14:29:22 +01:00
Oliver Gierke
7e50fd8273 DATAMONGO-1180 - Polishing.
Fixed copyright ranges in license headers. Added unit test to PartTreeMongoQueryUnitTests to verify the root exception being propagated correctly.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:20:53 +01:00
Thomas Darimont
ba560ffbad DATAMONGO-1180 - Fixed incorrect exception message creation in PartTreeMongoQuery.
The JSONParseException caught in PartTreeMongoQuery is now passed to the IllegalStateException we throw from the method. Previously it was passed to the String.format(…) varargs. Verified by manually throwing a JSONParseException in the debugger.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:20:53 +01:00
Oliver Gierke
50ca32c8b9 DATAMONGO-1173 - After release cleanups. 2015-03-05 19:41:05 +01:00
Spring Buildmaster
bdfe3af505 DATAMONGO-1173 - Prepare next development iteration. 2015-03-05 07:47:13 -08:00
Spring Buildmaster
798b56055d DATAMONGO-1173 - Release version 1.7.0.RC1. 2015-03-05 07:47:11 -08:00
Oliver Gierke
ce68e4a070 DATAMONGO-1173 - Prepare 1.7.0.RC1 (Fowler RC1). 2015-03-05 16:31:00 +01:00
Oliver Gierke
5da3130d26 DATAMONGO-1173 - Updated changelog. 2015-03-05 15:54:23 +01:00
Oliver Gierke
6687cdc101 DATAMONGO-1110 - Polishing.
Moved to newly introduced Range type in Spring Data Commons to more safely bind minimum and maximum distances. Changed internal APIs to always use a Range<Distance> which gets populated based on the method signature's characteristics: if only one Distance parameter is found it's interpreted as a range with upper bound only.

Removed invalid testcase for minDistance on 2D index.

Original pull request: #277.
2015-03-05 15:35:42 +01:00
Christoph Strobl
7e74ec6b62 DATAMONGO-1110 - Add support for $minDistance.
We now support $minDistance for NearQuery and Criteria. Please keep in mind that minDistance is only available for MongoDB 2.6 and better and can only be combined with the $near or $nearSphere operator depending on the defined index type. Usage of $minDistance with NearQuery is only possible when a 2dsphere index is present. We also make sure the $minDistance operator gets correctly nested when using GeoJSON types.

It is now possible to use a Range<Distance> parameter within the repository queries. This allows defining near queries like:

findByLocationNear(Point point, Range<Distance> distances);

The lower bound of the range is treated as the minimum distance while the upper one defines the maximum distance from the given point. In case a Distance parameter is provided it will serve as maxDistance.

Original pull request: #277.
2015-03-05 15:34:45 +01:00
Thomas Darimont
b887fa70a5 DATAMONGO-1133 - Fixed broken tests.
AggregationTests.shouldHonorFieldAliasesForFieldReferences() now correctly sets up 3 different instances of MeterData and correctly calculates the aggregated counter values.

Original pull request: #279.
2015-03-05 15:30:35 +01:00
Oliver Gierke
1c6ab25253 DATAMONGO-1135 - Polishing.
A few polishing changes to the GeoConverters.
2015-03-05 14:28:11 +01:00
Christoph Strobl
1c43a3d1ee DATAMONGO-1135 - Add support for GeoJson.
We’ve added special types representing GeoJson structures. This allows using them within both queries and domain types.

GeoJson types should only be used in combination with a 2dsphere index as a 2d index is not able to handle the structure. Though legacy coordinate pairs and GeoJson types can be mixed inside MongoDB, we currently do not support conversion of legacy coordinates to GeoJson types.
2015-03-05 14:28:11 +01:00
Thomas Darimont
60ca1b3509 DATAMONGO-1133 - Assert that field aliasing is honored in aggregation operations.
Added some tests to show that field aliases are honored during object rendering in aggregation operations.

Original pull request: #279.
2015-03-05 12:21:12 +01:00
Oliver Gierke
39d9312005 DATAMONGO-479 - Polishing.
Removed the ServersideJavaScript abstraction as we still had to resort to instanceof checks and it created more ambiguities than it helped (e.g. in a script with name and code, which of the two gets executed?). We now have an ExecutableMongoScript, which is code only, and a NamedMongoScript, which basically is the former assigned to a name. Execution can be triggered on the former or on a name.

ScriptOperations.exists(…) now returns a primitive boolean to avoid null checks. JavaDoc.

Original pull request: #254.
2015-03-04 15:18:46 +01:00
Christoph Strobl
a0e42f5dfe DATAMONGO-479 - Add support for calling functions.
We added ScriptOperations to MongoTemplate. Those allow storage and execution of JavaScript functions directly on the MongoDB server instance. Having ScriptOperations in place builds the foundation for annotation-driven support in the repository layer.
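
A sketch of executing and calling server-side functions (script contents and names are illustrative):

    ScriptOperations scriptOps = mongoTemplate.scriptOps();

    // execute an ad-hoc function directly on the server
    Object result = scriptOps.execute(new ExecutableMongoScript("function(x) { return x; }"), "echo this");

    // or register it under a name and call it later
    scriptOps.register(new NamedMongoScript("echo", "function(x) { return x; }"));
    scriptOps.call("echo", "echo this");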

Original pull request: #254.
2015-03-04 15:18:40 +01:00
Oliver Gierke
7a3aff12a5 DATAMONGO-1165 - Polishing.
Renamed MongoOperations.executeAsStream(…) to stream(…). Make use of Spring Data Commons StreamUtils in AbstractMongoQuery's StreamExecution. Moved test case from PersonRepositoryIntegrationTests to AbstractPersonRepositoryIntegrationTests to make sure they're executed for all sub-types.

Original pull request: #274.
2015-03-03 22:33:33 +01:00
Thomas Darimont
d4f1ef8704 DATAMONGO-1165 - Add support for Java 8 Stream as return type for repository methods.
Added support for a MongoDB Cursor backed Iterator that allows the usage of a Java 8 Stream at the repository level.
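
A sketch of a streaming repository method (the repository and Person domain type are assumed); since the Stream is backed by an open cursor, it should be closed, e.g. via try-with-resources:

    interface PersonRepository extends CrudRepository<Person, String> {

      Stream<Person> findAllByLastname(String lastname);
    }

    try (Stream<Person> stream = repository.findAllByLastname("Stark")) {
      stream.forEach(System.out::println);
    }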

Original pull request: #274.
2015-03-03 20:56:47 +01:00
Oliver Gierke
a86d704bec DATAMONGO-1158 - Polishing.
MongoFactoryBean, MongoOptionsFactoryBean, MongoClientFactoryBean and MongoClientOptionsFactoryBean now extend AbstractFactoryBean to get a lot of the lifecycle callbacks without further code.

Added non-null assertions to newly introduced methods on MongoOperations/MongoTemplate.

Moved MongoClientVersion into util package. Introduced static imports for ReflectionUtils and MongoClientVersion for all references in the newly introduced Invoker types.

Some formatting, JavaDoc polishes, suppress deprecation warnings. Added build profile for MongoDB Java driver 3.0 as well as the following snapshot.

Original pull request: #273.
2015-03-02 21:50:27 +01:00
Christoph Strobl
57ab27aa5b DATAMONGO-1158 - Add Support for MongoDB Java driver 3.0.
We now support mongo-java-driver version 2.x and 3.0 along with MongoDB Server 2.6.7 and 3.0.0.

Please note that some of the configuration options might no longer be valid when used with version 3 of the MongoDB Java driver. Have a look at the table below to see some of the major differences between version 2.x and 3.0.

                      | 2.x                  | 3.0
----------------------+----------------------+-----------------------------------------------
default WriteConcern  | NONE                 | UNACKNOWLEDGED
----------------------+----------------------+-----------------------------------------------
option for slaveOk    | available            | ignored
----------------------+----------------------+-----------------------------------------------
option for autoConnect| available            | ignored
----------------------+----------------------+-----------------------------------------------
write result checking | available            | ignored (errors are exceptions anyway)
----------------------+----------------------+-----------------------------------------------
reset index cache     | available            | throws UnsupportedOperationException
----------------------+----------------------+-----------------------------------------------
DBRef resolution      | via DBRef.fetch      | via collection.findOne
----------------------+----------------------+-----------------------------------------------
MapReduce Options     | applied              | ignored
----------------------+----------------------+-----------------------------------------------
authentication        | via UserCredentials  | via MongoClient
----------------------+----------------------+-----------------------------------------------
WriteConcernException | not available        | translated to DataIntegrityViolationException
----------------------+----------------------+-----------------------------------------------
executeInSession      | available            | requestStart/requestDone commands ignored.
----------------------+----------------------+-----------------------------------------------
index creation        | via createIndex      | via createIndex
----------------------+----------------------+-----------------------------------------------

We need to soften the exception validation a bit since the message is slightly different when using different storage engines in a MongoDB 3.0 environment.

Added an explicit <mongo-client /> element and <client-options /> to the configuration schema. These elements will replace the existing <mongo /> and <options /> elements in a subsequent release. Added a credentials attribute to <mongo-client /> which allows defining a set of credentials used for setting up the MongoClient correctly using authentication data. We now reject <mongo-options /> configuration when using MongoDB Java driver generation 3.0 and above.

Original pull request: #273.
2015-03-02 20:26:50 +01:00
Thomas Darimont
909cc8b5d3 DATAMONGO-1081 - Improve documentation on field mapping semantics.
Added table with examples to identifier field mapping section.

Original pull request: #276.
2015-03-02 18:15:18 +01:00
Thomas Darimont
b7acbc4347 DATAMONGO-1167 - Added QueryDslPredicateExecutor.findAll(Predicate, Sort).
We now support findAll on QueryDslMongoRepository that accepts a Querydsl Predicate and a Sort and returns a List<T>.

Original pull request: #275.
2015-02-24 09:51:39 +01:00
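A short usage sketch, assuming a repository extending QueryDslPredicateExecutor<Person> and a generated QPerson Querydsl type:

QPerson person = QPerson.person;
List<Person> result = repository.findAll(person.lastname.eq("Matthews"), new Sort(Sort.Direction.ASC, "firstname"));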
Oliver Gierke
d276306ddc DATAMONGO-1162 - Adapt to API changes in Spring Data Commons. 2015-02-06 12:48:26 +01:00
Oliver Gierke
25b98b7ad2 DATAMONGO-1154 - Upgraded to MongoDB Java driver 2.13.0. 2015-01-29 20:54:39 +01:00
Oliver Gierke
819b424142 DATAMONGO-1153 - Fix documentation build.
Moved jconsole.png to the images folder. Extracted MongoDB-specific auditing documentation into a separate file for inclusion after the general auditing docs.
2015-01-29 14:04:16 +01:00
Oliver Gierke
5d0328ba4b DATAMONGO-1144 - Updated changelog. 2015-01-28 20:46:19 +01:00
Oliver Gierke
b219cff29c DATAMONGO-1143 - Updated changelog. 2015-01-28 10:00:59 +01:00
Oliver Gierke
409eeaf962 DATAMONGO-1148 - Favor EclipseLink’s JPA over the Hibernate one. 2015-01-27 21:43:46 +01:00
Oliver Gierke
4e5e8bd026 DATAMONGO-1146 - Polishing.
Added missing @Override annotations to QueryDslMongoRepository methods.

Related tickets: DATACMNS-636.
Original pull request: #270.
2015-01-26 11:52:14 +01:00
Thomas Darimont
b91ec53ae0 DATAMONGO-1146 - Added QueryDslMongoRepository.exists(…) which accepts a Querydsl predicate.
Added explicit test case for QueryDslMongoRepository.

Related tickets: DATACMNS-636.
Original pull request: #270.
2015-01-26 11:52:06 +01:00
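Usage sketch (QPerson is an assumed generated Querydsl type):

boolean exists = repository.exists(QPerson.person.lastname.eq("Matthews"));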
Oliver Gierke
ce0624b8b0 DATAMONGO-712 - Another round of performance improvements.
Refactored CustomConversions to unify locked access to the cached types. Added a cache for raw-write-targets so that they’re cached, too.

DBObjectAccessor now avoids expensive code paths for both reads and writes in case of simple field names.

MappingMongoConverter now eagerly skips conversions of simple types in case the value is already assignable to the target type.

QueryMapper now checks the ConversionService and only triggers a conversion if it’s actually capable of doing so instead of catching a more expensive exception.

CachingMongoPersistentProperty now also caches usePropertyAccess() and isTransient() as they’re used quite frequently.

Related ticket: DATACMNS-637.
2015-01-25 18:57:56 +01:00
alex-on-java
b4de2769cf DATAMONGO-1147 - Remove manual array copy.
Removed the manual array copying by using Arrays.copyOf(values, values.length).

Original pull request: #258.
2015-01-23 17:51:44 +01:00
Thomas Darimont
3f7b0f1eb6 DATAMONGO-1082 - Improved documentation of alias usage in aggregation framework.
Added missing JavaDoc and added short note to the reference documentation.

Original pull request: #268.
2015-01-22 08:47:15 +01:00
Thomas Darimont
4055365c57 DATAMONGO-1127 - Add support for geoNear queries with distance information.
Made unit tests more robust to small differences in distance calculations between MongoDB versions.
2015-01-20 19:04:01 +01:00
Thomas Darimont
db7f782ca6 DATAMONGO-1127 - Add support for geoNear queries with distance information.
We now support geoNear queries in Aggregations. Exposed GeoNearOperation factory method in Aggregation. Introduced new distanceField property to NearQuery since it is required for geoNear queries in Aggregations.

Original pull request: #261.
2015-01-20 18:16:12 +01:00
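A hedged sketch of the new factory method (domain type, collection and field names are made up; 'template' is a MongoTemplate):

NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73)).maxDistance(new Distance(5, Metrics.KILOMETERS));
// "distance" is the distanceField the resulting documents carry the calculated distance in
Aggregation aggregation = Aggregation.newAggregation(Aggregation.geoNear(nearQuery, "distance"));
AggregationResults<Venue> results = template.aggregate(aggregation, "venues", Venue.class);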
Christoph Strobl
cde9d8d23a DATAMONGO-1121 - Fix false positive when checking for potential cycles.
We now only check for cycles on entity types and explicitly exclude simple types.

Original pull request: #267.
2015-01-20 12:17:13 +01:00
Oliver Gierke
3dd9b0a2b6 DATAMONGO-1136 - Polishing.
Polished equals(…) / hashCode() methods in GeoCommand.

Original pull request: #263.
2015-01-12 19:42:50 +01:00
Christoph Strobl
59e54cecd2 DATAMONGO-1136 - Use $geoWithin instead of $within for geo queries.
We now use the $geoWithin operator for geospatial criteria, which requires at least MongoDB 2.4.

Original pull request: #263.
2015-01-12 19:42:15 +01:00
Oliver Gierke
5ed7e8efc2 DATAMONGO-1139 - MongoQueryCreator now only uses $nearSpherical if non-neutral Metric is used.
Fixed the evaluation of the Distance for a near clause handed into a query method. Previously we evaluated against null, which will never result in true as Distance returns Metrics.NEUTRAL by default.
2015-01-12 19:10:10 +01:00
Oliver Gierke
fa85adfe0b DATAMONGO-1123 - Improve JavaDoc of MongoOperations.geoNear(…).
The JavaDoc of the geoNear(…) methods in MongoOperations now contains a hint that MongoDB limits the number of results by default and that an explicit limit on the NearQuery can be used to change that.
2015-01-07 15:02:32 +01:00
Oliver Gierke
a3e4f44a64 DATAMONGO-1118 - Polishing.
Created dedicated prepareMapKey(…) method to chain calls to potentiallyConvertMapKey(…) and potentiallyEscapeMapKey(…) and make sure they always get applied in combination.

Fixed initial map creation for DBRefs to apply the fixed behavior, too.

Original pull request: #260.
2015-01-06 15:45:30 +01:00
Thomas Darimont
4a7a485e62 DATAMONGO-1118 - Simplified potentiallyConvertMapKey in MappingMongoConverter.
Fixed typos in CustomConversions.

Original pull request: #260.
2015-01-06 15:45:30 +01:00
Christoph Strobl
c353e02b3e DATAMONGO-1118 - MappingMongoConverter now uses custom conversions for Map keys, too.
We now allow conversions of map keys using custom Converter implementations if the conversion target type is a String.

Original pull request: #260.
2015-01-06 15:45:26 +01:00
Christophe Fargette
1c2964cab4 DATACMNS-1132 - Fixed keyword translation table in the reference documentation.
Original pull request: #262.
2015-01-06 13:29:37 +01:00
Oliver Gierke
47e083280a DATACMNS-1131 - We now register the ThreeTen back port converters by default.
Related ticket: DATACMNS-628.
2015-01-05 19:11:54 +01:00
Oliver Gierke
7db003100b DATAMONGO-1129 - Upgraded to MongoDB Java driver 2.12.4.
Added Travis build configuration, too.
2014-12-31 14:20:52 +01:00
Oliver Gierke
f814b1ef47 DATAMONGO-1128 - Added test cases to validate Optional mapping.
Added test cases to make sure Optional instances are handled correctly and the converters are actually applied to the nested value.
2014-12-31 13:59:43 +01:00
Oliver Gierke
f3d2ae366e DATAMONGO-1120 - Fix execution of query methods using pagination and field mapping customizations.
Repository queries that used pagination and referred to a field that was customized were failing as the count query executed was not mapped correctly in MongoOperations.

This results from the fix for DATAMONGO-1080 which removed the premature field name translation from AbstractMongoQuery and thus led to unmapped field names being used for the count query.

We now expose the previously existing, but not public count(…) method on MongoOperations that takes both an entity type as well as an explicit collection name to be able to count-query a dedicated collection but still get the query mapping applied for a certain type.

Related ticket: DATAMONGO-1080.
2014-12-18 15:47:54 +01:00
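A sketch of the now-public count variant (entity type, collection and field names are made up; 'template' is a MongoTemplate):

Query query = Query.query(Criteria.where("lastname").is("Matthews"));
// the query mapping for Person is applied, but the count runs against the dedicated "people" collection
long count = template.count(query, Person.class, "people");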
Oliver Gierke
b6ecce3aa2 DATAMONGO-1096 - Polishing.
Fixed formatting for changes introduced with DATAMONGO-1096.
2014-12-17 18:37:33 +01:00
Oliver Gierke
c5235be9a7 DATAMONGO-1106 - After release cleanups. 2014-12-01 13:44:58 +01:00
Spring Buildmaster
23300de9d4 DATAMONGO-1106 - Prepare next development iteration. 2014-12-01 13:36:36 +01:00
Spring Buildmaster
41dc57c84f DATAMONGO-1106 - Release version 1.7.0.M1 (Fowler M1). 2014-12-01 13:36:32 +01:00
Oliver Gierke
85d1fe1ce6 DATAMONGO-1106 - Prepare 1.7.0.M1 (Fowler M1). 2014-12-01 12:26:52 +01:00
Oliver Gierke
ac6067ad53 DATAMONGO-1106 - Updated changelog. 2014-12-01 12:25:53 +01:00
Thomas Darimont
173a62b5ce DATAMONGO-1085 - Fixed sorting with Querydsl in QueryDslMongoRepository.
We now translate QSort's OrderSpecifiers into appropriate sort criteria.
Previously the OrderSpecifiers were not correctly translated to appropriate property path expressions.

We now override findAll(Pageable) and findAll(Sort) in QueryDslMongoRepository to apply special QSort handling.

Original pull request: #236.
2014-12-01 12:09:14 +01:00
Oliver Gierke
cbbafce73d DATAMONGO-1043 - Make sure we dynamically lookup SpEL based collection names for query execution.
Changed SimpleMongoEntityMetadata to keep a reference to the collection entity instead of the eagerly resolved collection name. This is to make sure the name gets re-evaluated for every query execution to support dynamically changing collections defined via SpEL expressions.

Related pull request: #238.
2014-11-28 20:26:23 +01:00
Oliver Gierke
2e74c19995 DATAMONGO-1054 - Polishing.
Tweaked JavaDoc of the APIs to be less specific about implementation internals and rather point to the save(…) methods. Changed SimpleMongoRepository.save(…) methods to inspect the given entity/entities and use the optimized insert(All)-calls if all entities are considered new.

Original pull request: #253.
2014-11-28 18:33:19 +01:00
Thomas Darimont
a212b7566c DATAMONGO-1054 - Add support for fast insertion via MongoRepository.insert(..).
Introduced new insert(..) method variants on MongoRepositories that delegate to MongoTemplate.insert(..). This bypasses ID population, save event generation and version checking and allows for fast insertion of bulk data.

Original pull request: #253.
2014-11-28 18:33:18 +01:00
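Usage sketch (the Person entity and its constructor are made up for illustration):

// bypasses ID population, save event generation and version checking, which makes it suitable for bulk data
List<Person> inserted = repository.insert(Arrays.asList(new Person("Dave"), new Person("Carter")));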
Oliver Gierke
08faa52ef4 DATAMONGO-1108 - Performance improvements in BasicMongoPersistentEntity.
BasicMongoPersistentEntity.getCollection() now avoids repeated SpEL-parsing and evaluating in case no SpEL expression is used. Parsing is happening at most once now. Evaluation is skipped entirely if the configured collection String is not or does not contain an expression.
2014-11-28 16:24:25 +01:00
Oliver Gierke
33bc4fffd9 DATAMONGO-1079 - Updated changelog. 2014-11-28 12:06:05 +01:00
Christoph Strobl
eca2108e15 DATAMONGO-1087 - Fix index resolver detecting cycles for partial match.
We now check for presence of a dot path to verify that we’ve detected a cycle.

Original pull request: #240.
2014-11-28 12:03:38 +01:00
Christoph Strobl
dab6034eb9 DATAMONGO-943 - Add support for $position to Update $push $each.
We now support $position on update.push.

Original pull request: #248.
2014-11-28 11:41:51 +01:00
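A sketch of how the new $position support might be expressed on the push builder (the atPosition(…) method and Position enum shown here are assumptions; the field name is made up):

Update update = new Update().push("items").atPosition(Update.Position.FIRST).each("first", "second");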
Christoph Strobl
461e7d05d7 DATAMONGO-1092 - Ensure compatibility with MongoDB 2.8.0.rc0 and java driver 2.13.0-rc0.
We updated GroupByResults to allow working with the changed data types returned for count and keys and fixed the assertion on the error message for duplicate keys.
Using java-driver 2.12.x when connecting to a 2.8.0-rc0 instance is likely to cause trouble with authentication. This is the intended behavior.

2.8.0-rc0 throws an error when removing elements from a collection that does not yet exist, which is different from what 2.6.x does.

The java-driver 2.13.0-rc0 works perfectly fine with a 2.6.x server instance.
We deprecated Index.Duplicates#DROP since it has been removed in MongoDB 2.8.

Original pull request: #246.
2014-11-28 11:33:12 +01:00
Oliver Gierke
10c37b101d DATAMONGO-1105 - Added implementation of QueryDslPredicateExecutor.findAll(OrderSpecifier<?>... orders).
Renamed QuerydslRepositorySupportUnitTests to QuerydslRepositorySupportTests as it's an integration test.
2014-11-28 10:37:21 +01:00
Christoph Strobl
81f2c910f7 DATAMONGO-1075 - Containing keyword is now correctly translated for collection properties.
We now inspect the property's type when creating criteria for the CONTAINS keyword so that, if the target property is of type String, we use a regular expression, and if the property is collection-like we try to find an exact match within the collection using $in.

Added support for NotContaining along the way.

Original pull request: #241.
2014-11-27 17:06:39 +01:00
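Derived query sketch (entity, property and method names are made up); for a String property the keyword translates to a containment expression, for a collection-like property to an exact $in match:

List<Person> findByLastnameContaining(String fragment);     // String property
List<Person> findByNicknamesContaining(String nickname);    // collection property -> $in
List<Person> findByNicknamesNotContaining(String nickname); // new NotContaining support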
Thomas Darimont
1fd97713c1 DATAMONGO-1093 - Added hashCode() and equals(…) in BasicQuery.
We now have equals(…) and hashCode(…) methods on BasicQuery. Previously we solely relied on Query.hashCode()/equals(…) which didn't consider the fields of BasicQuery.

Introduced equals verifier library to automatically test equals contracts.
Added some additional test cases to BasicQueryUnitTests.

Original pull request: #252.
2014-11-27 16:45:35 +01:00
Oliver Gierke
2d3eeed9ec DATAMONGO-1102 - Added support for Java 8 date/time types.
We're now able to persist and read non-time-zoned JDK 8 date/time types (LocalDate, LocalTime, LocalDateTime) to and from Date instances.
2014-11-27 16:28:36 +01:00
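Illustrative entity sketch (class and field names are made up); the listed JDK 8 types are converted to and from java.util.Date:

@Document
class Meeting {
    LocalDate day;
    LocalTime startsAt;
    LocalDateTime createdAt;
}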
Christoph Strobl
b22eb6f12f DATAMONGO-1101 - Add support for $bit to Update.
We now support bitwise and/or/xor operations for Update.
2014-11-26 11:26:56 +01:00
Mikhail Mikhaylenko
dfb0a2a368 DATAMONGO-1096 - Use null-safe toString representation of query for debug logging.
We now use the null-safe serializeToJsonSafely to avoid potential RuntimeExceptions during debug query printing in MongoTemplate.

Based on original PR: #247.

Original pull request: #251.
2014-11-26 09:40:30 +01:00
Thomas Darimont
03bcc56429 DATAMONGO-1094 - Fixed ambiguous field mapping error message in BasicMongoPersistentEntity.
Original pull request: #245.
2014-11-25 17:32:16 +01:00
Christoph Strobl
457fda3fc3 DATAMONGO-1097 - Add support for $mul to Update.
We now support multiply on Update, allowing the value of the given key to be multiplied by a multiplier.
2014-11-24 20:38:44 +01:00
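Usage sketch (the field name is made up):

Update update = new Update().multiply("price", 2); // issues { $mul : { "price" : 2 } }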
Oliver Gierke
54cee64610 DATAMONGO-1100 - Upgrade to new PersistentPropertyAccessor API. 2014-11-20 15:12:25 +01:00
Christoph Strobl
477499248a DATAMONGO-1086 - Mapping fails for collection with two embedded types that extend a generic abstract type.
We now use the type information of the raw property type to check if we need to include _class.
2014-11-20 15:12:25 +01:00
Oliver Gierke
3b70b6aeee DATAMONGO-1078 - Polishing.
Polished test cases. Simplified equals(…)/hashCode() for sample entity and its identifier type.

Original pull request: #239.
2014-11-10 16:38:03 +01:00
Christoph Strobl
163762e99e DATAMONGO-1078 - @Query annotated repository method fails for complex Id when used with Collection type.
Remove object type hint defaulting.
2014-11-10 16:37:56 +01:00
Oliver Gierke
b99833df75 DATAMONGO-1080 - AbstractMongoQuery now refrains from eagerly post-processing the query execution results.
To properly support general post processing of query execution results (in QueryExecutorMethodInterceptor) we need to remove the eager post-processing of query execution results in AbstractMongoQuery.

Removed the usage of the local ConversionService all together.
2014-10-30 11:35:51 +01:00
Thomas Darimont
4be6231426 DATAMONGO-1076 - Avoid resolving lazy-loading proxy for DBRefs during finalize.
We now handle intercepted finalize method invocations by not resolving the proxy. Previously the LazyLoadingProxy tried to resolve the proxy during finalization which could lead to unnecessary database accesses.

Original pull request: #234.
2014-10-29 10:16:12 +01:00
Christoph Strobl
4673e3d511 DATAMONGO-1077 - Fix Update removing $ operator for DBRef.
We now retain the positional parameter "$" when mapping field names for associations.

Original pull request: #235.
2014-10-28 14:28:22 +01:00
Christoph Strobl
00e48cc424 DATAMONGO-1050 - Explicitly annotated Field should not be considered Id.
We changed the id resolution to skip properties having an explicit name set via @Field unless they are marked with @Id. This means that

@Field(“id”) String id;

will be stored as “id” within MongoDB. Prior to this change the field name would have been changed to “_id”.
Added tests to ensure proper field mapping for various "id" field variants.

Original pull request: #225.
2014-10-23 11:39:17 +02:00
Christoph Strobl
f8453825fb DATAMONGO-1072 - Fix annotated query placeholders not replaced correctly.
We now also check field names for potential placeholder matches to ensure those are registered for binding parameters.

Original pull request: #233.
2014-10-22 13:55:50 +02:00
Christoph Strobl
6cda9ab939 DATAMONGO-1068 - Fix getCriteriaObject returning an empty DBO when no key is defined.
We now check for the presence of a Criteria key.

Original pull request: #232.
2014-10-21 11:36:15 +02:00
Oliver Gierke
831d667896 DATAMONGO-1070 - Fixed a few glitches in DBRef binding for repository query methods.
Fixed the query mapping for derived repository queries pointing to the identifier of the referenced document. We now reduce the query field's key from reference.id to reference so that the generated DBRef is applied correctly and also take care that the ids are potentially converted to ObjectIds. This is mainly achieved by using the AssociationConverter pulled up from UpdateMapper in ObjectMapper.getMappedKey().

MongoQueryCreator now refrains from translating the field keys as that will fail the QueryMapper to correctly detect id properties.

Fixed DBRef handling for StringBasedMongoQuery which previously didn't parse the DBRef instance created after JSON parsing for placeholders.
2014-10-15 10:13:53 +02:00
Christoph Strobl
17c342895a DATAMONGO-1063 - Fix application of Querydsl's any().in() throwing an Exception.
We now only convert paths that point to either a property or variable.

Original pull request: #230.
2014-10-10 11:35:45 +02:00
Christoph Strobl
6ef518e6a0 DATAMONGO-1053 - Type check is now only performed on explicit language properties.
We now only perform a type check on language properties explicitly defined via @Language. Prior to this change non-String properties named language caused errors on entity validation.

Original pull request: #228.
2014-10-10 11:31:30 +02:00
Oliver Gierke
ddee2fbb12 DATAMONGO-1057 - Polishing.
Slightly tweaked the changes in SlicedExecution to simplify the implementation. We now apply the given pageable but tweak the limit the query uses to peek into the next page.

Original pull request: #226.
2014-10-08 07:06:18 +02:00
Christoph Strobl
6512c2cdfb DATAMONGO-1057 - Fix SliceExecution skipping elements.
We now directly set the offset to use instead of reading it from the used pageable. This asserts that every single element is read from the store.
Prior to this change the altered pageSize led to an unintended increase of the number of elements to skip.

Original pull request: #226.
2014-10-08 07:06:14 +02:00
Oliver Gierke
0eee05adaa DATAMONGO-1062 - Polishing.
Removed exploded static imports. Updated copyright header.

Original pull request: #229.
2014-10-07 15:32:18 +02:00
Christoph Strobl
17e0154ff3 DATAMONGO-1058 - DBRef should respect explicit field name.
We now use property.getFieldName() for mapping DbRefs. This assures we also capture explicitly defined names set via @Field.

Original pull request: #227.
2014-10-01 10:06:22 +02:00
Thomas Darimont
2780f60c65 DATAMONGO-1062 - Fix failing test in ServerAddressPropertyEditorUnitTests.
The test rejectsAddressConfigWithoutASingleParsableServerAddress fails because the supposedly non-existing hostname "bar" "now" resolves to a real host-address.

The addresses "gugu.nonexistant.example.org, gaga.nonexistant.example.org" shouldn't be resolvable TM.

Original pull request: #229.
2014-09-30 12:55:24 +02:00
Christoph Strobl
7dd3450362 DATAMONGO-1049 - Check for explicitly declared language field.
We now check for an explicitly declared language field for setting language_override within a text index. Therefore the attribute (even if named with the reserved keyword language) has to be explicitly marked with @Language. Prior to this change having:

@Language String lang;
String language;

would have caused trouble when trying to resolve index structures as one cannot set language override to more than one property.

Original pull request: #224.
2014-09-25 12:40:26 +02:00
Oliver Gierke
ca4b2a61b8 DATAMONGO-1046 - After release cleanups. 2014-09-15 14:30:23 +02:00
Oliver Gierke
d2ecd65ca5 DATAMONGO-1046 - After release cleanups. 2014-09-05 14:27:21 +02:00
Spring Buildmaster
03bd49f6c8 DATAMONGO-1046 - Prepare next development iteration. 2014-09-05 03:12:04 -07:00
Spring Buildmaster
51607c5ed8 DATAMONGO-1046 - Release version 1.6.0.RELEASE (Evans GA). 2014-09-05 03:12:02 -07:00
Oliver Gierke
e2cbd3ee28 DATAMONGO-1046 - Prepare 1.6.0.RELEASE (Evans GA). 2014-09-05 11:43:58 +02:00
Oliver Gierke
5944e6b57e DATAMONGO-1046 - Updated changelog. 2014-09-05 09:23:31 +02:00
Oliver Gierke
efd46498ef DATAMONGO-1033 - Updated changelog. 2014-09-05 07:31:54 +02:00
Christoph Strobl
3d705a737f DATAMONGO-1040 - Derived delete should respect collection name.
Adding collection metadata allows removing entities from specific collections in a fine-grained way using derived delete queries.

Original pull request: #223.
2014-09-04 15:47:13 +02:00
Christoph Strobl
996c57bccf DATAMONGO-1039 - Polish db clean hook implementation.
- Refactored internal structure.
- Updated documentation.
- Added some tests

Original pull request: #222.
2014-09-04 11:21:55 +02:00
Oliver Gierke
a31e72ff06 DATAMONGO-1045 - Tweak AspectJ setup in cross-store module to be able to build against Spring 4.1.
Added an aop.xml to only compile explicitly listed aspects in the cross-store module. This is needed as Spring 4.1 includes a new aspect for JavaEE 7 JCache support that has optional dependencies which we don't have in the classpath. Trying to compile all aspects contained in spring-aspects will result in ClassNotFoundExceptions for the aspects with missing dependencies.
2014-09-04 08:51:31 +02:00
Mark Paluch
f07d8fca8c DATAMONGO-1036 - Improved detection of custom implementations for CDI repositories.
Adapted to API changes in CDI extension.

Related ticket: DATACMNS-565.
2014-09-01 13:51:20 +02:00
Christoph Strobl
69dbdee01f DATAMONGO-1038 - Assert Mongo instances cleaned up properly after test runs.
Add JUnit rule and RunListener taking care of clean up task.

Original pull request: #221.
2014-08-27 11:12:39 +02:00
Oliver Gierke
dedb9f3dc0 DATAMONGO-1034 - Explicitly reject incompatible types in MappingMongoConverter.
Improved the exception message that occurs if the source document contains a BasicDBList but has to be converted into a complex object. We now explicitly hint at using a custom Converter to handle the conversion manually.

Improved toString() method on ObjectPath to create more helpful output.
2014-08-26 20:07:46 +02:00
Oliver Gierke
7d69b840fe DATAMONGO-1030 - Projections now work on single-entity query method executions.
We now correctly forward the domain type collection to the query executing a query for a projection type.
2014-08-26 15:16:18 +02:00
Christoph Strobl
4eaef300cb DATAMONGO-1025 - Fix creation of nested named index.
Deprecated collection attribute for @Indexed, @CompoundIndex, @GeoSpatialIndexed. Removed deprecated attribute `expireAfterSeconds` from @CompoundIndex.

Original pull request: #219.
2014-08-26 14:33:47 +02:00
Christoph Strobl
ec1a6b5edd DATAMONGO-1025 - Fix creation of nested named index.
We now prefix explicitly named indexes on nested types (e.g. for embedded properties) with the path pointing to the property. This avoids errors caused by equally named index definitions on different paths pointing to the same type within one collection.

Along the way we harmonized index naming for geospatial index definitions, where only the property's field name was taken into account instead of the full property path.

Original pull request: #219.
2014-08-26 14:33:47 +02:00
Oliver Gierke
adc5485c09 DATAMONGO-1032 - Polished Asciidoctor documentation. 2014-08-26 14:24:51 +02:00
Oliver Gierke
f622b2916d DATAMONGO-1021 - After release cleanups. 2014-08-13 16:32:42 +02:00
Spring Buildmaster
26be0cf948 DATAMONGO-1021 - Prepare next development iteration. 2014-08-13 07:02:43 -07:00
Spring Buildmaster
e27c01fe5b DATAMONGO-1021 - Release version 1.6.0.RC1 (Evans RC1). 2014-08-13 07:02:41 -07:00
Oliver Gierke
d639e58fb9 DATAMONGO-1021 - Prepare 1.6.0.RC1 (Evans RC1). 2014-08-13 15:37:48 +02:00
Oliver Gierke
0195c2cb48 DATAMONGO-1021 - Updated changelog. 2014-08-13 15:37:44 +02:00
Oliver Gierke
068e2ec49b DATAMONGO-1024 - Upgraded to MongoDB Java driver 2.12.3. 2014-08-13 15:36:08 +02:00
Christoph Strobl
a9306b99ec DATAMONGO-957 - Add support for query modifiers.
Using Meta allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a query. When executed we add the meta information to the cursor in use.

We’ve introduced the @Meta annotation that allows defining $comment, $maxScan, $maxTimeMS and $snapshot on a repository finder method.
Added tests to verify proper invocation of template methods.
Use DBCursor.copy() for CursorPreparer.

Original pull request: #216.
2014-08-13 14:56:10 +02:00
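A sketch of the repository-side annotation; the attribute names shown here are assumptions, the commit only names the underlying operators (method and entity names are made up):

@Meta(comment = "find by lastname", maxExecutionTimeMs = 500)
List<Person> findByLastname(String lastname);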
Thomas Darimont
3597194742 DATAMONGO-1012 - Improved identifier initialization on DBRef proxies.
Identifier initialization is now only triggered if field access is used. Before that, the id initialization would've resolved the proxy eagerly, as the getter access performed by the BeanWrapper would've been intercepted by the proxy and is indistinguishable from a normal method call. This would've rendered the entire use case of creating proxies ad absurdum.

Added test case to check for non-initialization in the property access scenario.
2014-08-13 14:34:38 +02:00
Oliver Gierke
6f06ccec8e DATAMONGO-1012 - Identifier initialization for lazy DBRef proxies with field access.
We now initialize the ID property for proxies created for lazily initialized DBRefs. This will allow the lookup of ID properties for types that use field access without initializing the entire proxy.
2014-08-13 14:34:15 +02:00
Oliver Gierke
6fe7f220f9 DATAMONGO-1007 - Updated changelog. 2014-08-13 10:56:02 +02:00
Christoph Strobl
45e70d493d DATAMONGO-1016 - Remove deprecations in geospatial area.
Removed:
 - Box
 - Circle
 - CustomMetric
 - Distance
 - GeoPage
 - GeoResult
 - GeoResults
 - Metric
 - Metrics
 - Point
 - Polygon
 - Shape

Updated api doc.
Removed deprecation warnings.
2014-08-13 09:52:02 +02:00
Thomas Darimont
ce71ab83f2 DATAMONGO-1020 - LimitOperation is now a public class.
Original pull request: #218.
2014-08-12 12:30:41 +02:00
Oliver Gierke
bf85d8facd DATAMONGO-1005 - Polishing introduction of ObjectPath.
Simplified implementation of ObjectPath to use a static root instance and hand the path further down until final resolution in MappingMongoConverter.readValue(…). This removes a bit of boxing and unboxing code both in ObjectPath and the converter.

Introduced ObjectPath.getPathItem(…) to internalize the iteration to find a potentially already resolved object.

Renamed parameters and fields of type ObjectPath to path consistently. Removed obsolete method in MappingMongoConverter.

Original pull request: #209.
2014-08-12 08:12:31 +02:00
Thomas Darimont
c5ff7cdb2b DATAMONGO-1005 - Improve cycle-detection for DbRef's.
Introduced ObjectPath that collects the target objects while converting a DBObject to a domain object. If we detect that a potentially nested DBRef points to an object that is already under construction we simply return a reference to that object in order to avoid StackOverFlowErrors.

Original pull request: #209.
2014-08-12 08:10:47 +02:00
Mark Paluch
f9ccf4f532 DATAMONGO-1017 - Add support for custom implementations in CDI repositories.
Original pull request: #215.
2014-08-11 07:47:57 +02:00
Greg Turnquist
ab731f40a7 DATAMONGO-1019 - Corrected examples in reference documentation.
Examples were not properly converted. One table got dropped, so I added it back. Fix IMPORTANT notes.

Original pull requests: #214.
2014-08-10 16:04:45 +02:00
Oliver Gierke
d8434fffa8 DATAMONGO-1015 - Fixed link to Spring Data Commons reference docs. 2014-08-10 15:55:13 +02:00
Christoph Strobl
151b1d4510 DATAMONGO-973 - Add support for deriving full-text queries.
Added support to execute full-text queries on repositories. Query methods can now have a parameter of type TextCriteria which triggers a text search clause for the property annotated with @TextScore.

Retrieving document score and sorting by score is only possible if the entity holds a property annotated with @TextScore. If present, any find execution will be enriched so that it asserts loading of the according { $meta : textScore } field. The sort object will only be mapped in case the existing sort property already exists - in that case we replace the existing expression for the property with its $meta representation.

This allows for example the following:

TextCriteria criteria = TextCriteria.forDefaultLanguage().matching("term");

repository.findAllBy(criteria, new Sort("score"));
repository.findAllBy(criteria, new PageRequest(0, 10, Direction.DESC, "score"));
repository.findByFooOrderByScoreDesc("foo", criteria);

For more details and examples see the "Full text search queries" section in the reference manual.
2014-08-06 22:25:38 +02:00
Greg Turnquist
168cf3e1f6 DATAMONGO-1015 - Migrate reference documentation from Docbook to Asciidoctor. 2014-08-06 21:38:46 +02:00
Oliver Gierke
52dab0fa20 DATAMONGO-1008 - Polishing.
Slightly changed the implementation of the 2dsphere check. Minor refactorings in the test case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Christoph Strobl
9257bab06e DATAMONGO-1008 - DefaultIndexOperations now considers 2dsphere, too.
We now also check for 2dsphere when inspecting index keys and create a geo IndexField in that case.

Original pull request: #210.
2014-07-31 17:23:19 +02:00
Oliver Gierke
27f0a6f27a DATAMONGO-1008 - Added repository type based checks to strict matching algorithm.
Repositories extending MongoRepository are now considered strict matches as well.

Related ticket: DATACMNS-526.
2014-07-31 16:20:26 +02:00
Oliver Gierke
5bedbef2f2 DATAMONGO-1009 - Adapt to new multi-store configuration detection.
We now consider repositories managing domain types annotated with @Document MongoDB specific ones.

Related ticket: DATACMNS-526.
2014-07-28 20:15:40 +02:00
Christoph Strobl
51e7be8aa0 DATAMONGO-1001 - Renamed LazyLoadingProxy.initialize() to getTarget().
Original pull request: #208.
2014-07-24 13:29:27 +02:00
Christoph Strobl
6c85bb39a3 DATAMONGO-1001 - Fix saving lazy loaded object.
We now resolve the target type for CGLib-proxied objects and initialize lazy loaded ones before saving. As it turns out CustomConversions already knows how to deal with proxies correctly. We added an explicit test to assert that.

Original pull request: #208.
2014-07-24 13:28:36 +02:00
Oliver Gierke
07f7247707 DATAMONGO-1002 - Update.toString() now uses SerializationUtils.
A simple call of toString() on a DBObject might result in an exception if the DBObject contains objects that are non-native MongoDB types (i.e. types that need to be converted prior to persistence).

We now use SerializationUtils.serializeToJsonSafely(…) to avoid exceptions.
2014-07-23 12:36:15 +02:00
Thomas Darimont
f669711670 DATAMONGO-995 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
Related ticket: DATAMONGO-420.
2014-07-21 20:15:18 +02:00
Oliver Gierke
5f3671f349 DATAMONGO-996 - Fixed boundary detection in pagination.
The fix for DATAMONGO-950 introduced a tiny glitch so that retrieving pages after the first one was broken in the repository query execution. We now correctly use the previously detected number of elements to detect whether the Pageable given is out of scope.

Related ticket: DATAMONGO-950.
2014-07-18 19:01:44 +02:00
Thomas Darimont
1335cb699b DATAMONGO-420 - Improve support of quote handling for custom query parameters.
Introduced ParameterBindingParser which exposes parameter references in query strings as ParameterBindings. This allows us to detect whether a parameter reference in a query string is already quoted avoiding wrongly double-quoting the parameter value.

Original pull request: #185.
2014-07-17 15:27:46 +02:00
Oliver Gierke
84414b87c0 DATAMONGO-987 - Some polishing in MappingMongoConverter.
Let getValueInternal(…) use the provided SpELExpressionEvaluator instead of relying on the MongoDbPropertyValueProvider to create a new one. Removed the obsolete constructor in MongoDbPropertyValueProvider.
2014-07-17 15:18:21 +02:00
Thomas Darimont
a1ecd4a501 DATAMONGO-987 - Avoid creation of lazy-loading proxies for null-values.
We now avoid creating a lazy-loading proxy if we detect that the property-value in the backing DbObject for a @Lazy(true) annotated field is null.

Original pull request: #207.
2014-07-17 15:18:21 +02:00
Thomas Darimont
d7e6f2ee41 DATAMONGO-989 - MatchOperation should accept CriteriaDefinition.
We replaced the constructor that accepted a Criteria with one that accepts a CriteriaDefinition to not force clients to extend Criteria.
Original pull request: #206.
2014-07-17 09:21:29 +02:00
Oliver Gierke
04870fb8b3 DATAMONGO-991 - Adapted to deprecation removals in Spring Data Commons.
Related ticket: DATACMNS-469.
2014-07-16 12:04:10 +02:00
Oliver Gierke
9d196b78f7 DATAMONGO-981 - After release cleanups. 2014-07-10 20:44:20 +02:00
Spring Buildmaster
4229525928 DATAMONGO-981 - Prepare next development iteration. 2014-07-10 10:38:58 -07:00
Spring Buildmaster
d861fecdb8 DATAMONGO-981 - Release version 1.6.0.M1. 2014-07-10 10:38:55 -07:00
Oliver Gierke
f280e23095 DATAMONGO-981 - Prepare 1.6.0.M1 (Evans M1). 2014-07-10 19:28:34 +02:00
Oliver Gierke
ed0e1d92c0 DATAMONGO-981 - Updated changelog. 2014-07-10 17:14:24 +02:00
Christoph Strobl
d82fc22659 DATAMONGO-944 - Add support for $currentDate to Update.
Added currentDate and currentTimestamp to Update.

Original pull request: #200.
2014-07-10 15:13:59 +02:00
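Usage sketch (field names are made up):

Update update = new Update()
    .currentDate("lastModified")      // $currentDate with a date value
    .currentTimestamp("lastTouched"); // $currentDate with a timestamp value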
Thomas Darimont
6616d6788c DATAMONGO-975 - Add support for extracting date/time components from a field projection.
We added some extract-methods to ProjectionOperationBuilder to be able to extract date / time components from projected fields.

Original pull request: #204.
2014-07-10 12:45:17 +02:00
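A sketch of one of the new extract methods on ProjectionOperationBuilder (the field and alias names are made up):

Aggregation aggregation = Aggregation.newAggregation(
    Aggregation.project().and("publishedAt").extractYear().as("year"));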
Christoph Strobl
322a7cf033 DATAMONGO-969 - Fixed nested id handling in SpringDataMongodbSerializer.
SpringDataMongodbSerializer now defensively triggers mapping of the DBObject created by the default serializer. This asserts that ids buried in nested structures like { "_id" : { "$in" : ["x", "y"] } } are converted correctly.

Original pull request: #202.
2014-07-09 21:48:02 +02:00
Christoph Strobl
0f487c10ba DATAMONGO-983 - Remove links to forum.spring.io.
Replace forum links with those to stackoverflow.

Original Pull Request: #205
2014-07-09 21:21:04 +02:00
Christoph Strobl
11417144bd DATAMONGO-980 - Use meta annotations from commons for @Score.
We now use Spring Data Commons' @ReadOnlyProperty to meta-annotate @Score to mark it as read-only property.

Original pull request: #201.
Related tickets: DATACMNS-534.
2014-07-09 14:51:21 +02:00
Christoph Strobl
dafc59b163 DATAMONGO-972 - Querydsl integration now handles references correctly.
SpringDataMongodbSerializer now overrides the necessary methods to create the appropriate DBRef objects when serializing data via Querydsl.

We currently disable the test case as the fix taking effect requires Querydsl 3.4.1 which unfortunately breaks Java 6 compatibility. We include the fix nonetheless to allow users on Java 7 to potentially use the latest Querydsl.

Original pull request: #203.
Related tickets: querydsl/querydsl#803.
2014-07-09 13:47:04 +02:00
Oliver Gierke
566f9a80c4 DATAMONGO-982 - Added build profiles to build against next MongoDB driver versions.
Added build profile for MongoDB Java driver versions 2.12.3-SNAPSHOT and 3.0.0-SNAPSHOT. Added another property to be able to build manifests correctly as the snapshot versions aren't valid OSGi versions.

Adapted MongoExceptionTranslator to convert the new Exceptions being thrown for server timeouts and the deprecated values we currently handle.
2014-07-08 17:36:01 +02:00
Christoph Strobl
89a42c5648 DATAMONGO-976 - Add support for reading $meta projection on textScore into document.
We introduced @TextScore that can be used to mark a property to take { $meta : “textScore” }. In contrast to @Transient the value will be considered when reading documents.
The value can and will only get picked up if the score field is retrieved from the store.

Original pull request: #198.
2014-07-07 11:43:43 +02:00
Christoph Strobl
83ffbb00e8 DATAMONGO-850 - Add support for full text search via $text
Using TextQuery and TextCriteria allows creation of queries using $text $search.

{ $meta : “textScore” } can be included using TextQuery.includeScore. As the fieldname used for textScore must not be fixed to “score” one can use the overload taking the fieldname to override the default.

Original pull request: #198.
2014-07-07 11:39:47 +02:00
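Query sketch (the domain type is made up; the queryText(…), includeScore() and sortByScore() method names are assumptions based on the description above; 'template' is a MongoTemplate):

TextCriteria criteria = TextCriteria.forDefaultLanguage().matching("coffee cake");
TextQuery query = TextQuery.queryText(criteria).includeScore().sortByScore();
List<Recipe> recipes = template.find(query, Recipe.class);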
Christoph Strobl
84913cecab DATAMONGO-937 - Add support for creating text index.
We now support creating text index based on information gathered on domain types.

Using @TextIndexed marks properties to be considered for the full text index. Use the weight attribute to influence document scoring during search operations.

Please note that using @TextIndexed on entity properties forces all properties of any sub document to be considered as part of the text index. Any set weight will in that case be propagated to the siblings taking the most recent weight information into account, which means that the weight attribute can be overridden for properties in sub documents.

Setting the index default language can be done via @Document(language) while @Language can be used to define the language_override field.

As text search is disabled by default for MongoDB 2.4 we use a JUnit ClassRule to restrict integration tests potentially creating a text index (as the entities for testing are found in the classpath) to only be executed when a 2.6+ MongoDB server is present.

For usage hints please see section 6.3.4 (Text Indexes) of reference manual.

Original pull request: #198.
2014-07-07 11:30:08 +02:00
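Illustrative entity sketch (class and field names are made up):

@Document(language = "english")
class Cookbook {

    @TextIndexed(weight = 2) String title;
    @TextIndexed String description;

    @Language String lang; // populates the index's language_override field
}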
Christoph Strobl
998bb09a92 DATAMONGO-978 - Derived delete query should pass on type information.
We now pass on type information for derived delete queries to the according delete operation. This propagates the information correctly to the according Before and After events.

Before this change the type would have been set to null in case of a non-collection-like method return type.

Original pull request: #199.
2014-07-03 14:16:57 +02:00
Oliver Gierke
cd68a8db54 DATAMONGO-977 - Removed reflective detection of Spring 4 in DBRef proxy creation.
After the Spring 4 upgrade we can now directly use the Objenesis infrastructure of it.
2014-07-02 09:23:19 +02:00
Oliver Gierke
df8477d180 DATAMONGO-955 - Updated changelog. 2014-06-30 10:57:12 +02:00
Christoph Strobl
244fbae0ce DATAMONGO-962 - Cycle guard should respect full path.
We now check intersections of the given path with existing ones, verifying not only types and contained property names but also the property's full path, which must not be present in already traversed paths.

Additionally we’ll now catch any CyclicPropertyReferenceExceptions on the root level to prevent cycle detection interfering with application startup.

Original pull request: #197.
2014-06-27 19:25:26 +02:00
Oliver Gierke
19e08a52c0 DATAMONGO-970 - MongoTemplate.remove(…) now correctly builds query for DBObjects.
If a DBObject was handed into MongoTemplate.remove(…) we previously failed to look up the id value to create a by-id-query. This commit adds explicit handling of DBObjects by looking up their _id field to obtain the id value.
2014-06-27 16:12:50 +02:00
Thomas Darimont
6389b1bb73 DATAMONGO-954 - Add support for system variables in aggregation operations.
Add assumes for appropriate MongoDB version.
2014-06-26 14:19:24 +02:00
Thomas Darimont
cadcbf6106 DATAMONGO-954 - Add support for system variables in aggregation operations.
We now support referring to system variables like for instance $$ROOT or $$CURRENT from within aggregation framework pipeline projection and group expressions.

Original pull request: #190.
2014-06-25 15:53:44 +02:00
Christoph Strobl
118f007ca6 DATAMONGO-963 - @CompoundIndex should treat expireAfterSeconds correctly.
We added an additional check on the fields used as key, so that TTL is ignored for CompoundIndex with more than one field (which effectively renders it useless on @CompoundIndex at all).

Prior to this change potentially invalid index structures would have been created for e.g. @CompoundIndex(def = "{'foo': 1, 'bar': 1}", expireAfterSeconds=10) leading to MongoDB not being able to clean up the indexes (logs: "ERROR: key for ttl index can only have 1 field")

This fix is related to https://jira.mongodb.org/browse/SERVER-10075.

Original pull request: #196.
2014-06-25 15:12:32 +02:00
Christoph Strobl
cbb32bd29d DATAMONGO-950 - Add support for limiting the query result in the query derivation mechanism.
When deriving the query from its method name we check for the limit set on the PartTree and pass this on to the created query. PagedExecution now takes the overall limit into account, skips a query execution entirely (if the Pageable is out of scope completely) or alters the query limits accordingly.

Note that there has been significant rework of this compared to the pull request to avoid new API in Query and extensive changes in MongoTemplate's QueryCursorPreparer.

Original pull request: #191.
2014-06-25 14:58:59 +02:00
Thomas Darimont
9858dcd740 DATAMONGO-960 - Allow to pass options to the Aggregation Pipeline.
We introduced AggregationOptions abstraction to conveniently construct option objects that can be handed to an Aggregation via the new Aggregation.withOptions(...) factory method. For more details, see the Builder class' JavaDoc.

Note that we exposed the "rawResults" in AggregationResults and put a null guard in MongoTemplate aggregate in order to support the "explain" option.

Original pull request: #195.
2014-06-25 13:25:57 +02:00
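Usage sketch (the Builder method shown, allowDiskUse(…), is an assumption; group fields are made up):

AggregationOptions options = new AggregationOptions.Builder().allowDiskUse(true).build();
Aggregation aggregation = Aggregation.newAggregation(
    Aggregation.group("lastname").count().as("count")).withOptions(options);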
Thomas Darimont
1fb76d135b DATAMONGO-953 - Add equals(…)/hashCode()/toString() to Update.
We now use the underlying updateObject to implement appropriate equals(…)/hashCode() and toString() methods.

Original pull request: #192.
2014-06-25 12:40:30 +02:00
Oliver Gierke
bb62c8b2f1 DATAMONGO-958 - Switch to FieldNamingStrategy SPI in Spring Data Commons. 2014-06-20 21:27:24 +02:00
Christoph Strobl
2cbe7bf885 DATAMONGO-952 - Derived queries should consider field specification in @Query.
PartTreeMongoQuery now explicitly checks for the presence of a manually defined field spec on the query method and creates a new Query if so.

Original pull request: #188.
2014-06-18 12:53:14 +02:00
Christoph Strobl
6043f6b74d DATAMONGO-949 - CycleGuard should only match properties in word boundaries.
We modified the regular expression used for cycle detection to match on the exact property name within the inspected path using word boundaries. This fix prevents sub sequences of an existing property (like ‘sub’ would have matched ‘substr’) from being matched.

Along the way we fixed the (false) assertion in one of the tests, as we create the +1 cycle reference index before actually breaking the operation.
2014-06-18 08:32:30 +02:00
Christoph Strobl
ef1366592a DATAMONGO-948 - Sort should be taken as is when no type information available.
Object type mapping for the sort is skipped in case no type information is present when executing a query via the MongoTemplate.
2014-06-18 08:27:57 +02:00
Thomas Darimont
01cf9fb8f3 DATAMONGO-938 - Apply QueryMapper in MongoTemplate.mapReduce(…).
Previously MongoTemplate.mapReduce(...) didn't translate nested objects, e.g. GeoCommand, within the given query. That could lead to exceptions during query serialization. We now pass the query and sort object of the given Query through the QueryMapper to avoid such problems.

Original pull request: #184.
2014-06-18 08:27:54 +02:00
Thomas Darimont
285c406d5d DATAMONGO-745 - Added test cases for custom query with $in and pageable parameter.
Added test cases to verify that this works.

Original pull request: #186.
2014-06-18 07:49:06 +02:00
Oliver Gierke
ad29e52a57 DATAMONGO-936 - After release cleanups. 2014-05-20 19:54:03 +02:00
Spring Buildmaster
3cfe207c83 DATAMONGO-936 - Prepare next development iteration. 2014-05-20 09:35:38 -07:00
Spring Buildmaster
c7e65cbc40 DATAMONGO-936 - Release version 1.5.0.RELEASE. 2014-05-20 09:35:35 -07:00
Oliver Gierke
b8e02efb04 DATAMONGO-936 - Prepare 1.5 GA.
Removed obsolete readme.txt.
2014-05-20 17:14:04 +02:00
Oliver Gierke
c7f20fb836 DATAMONGO-936 - Prepare 1.5 GA. 2014-05-20 16:57:34 +02:00
Oliver Gierke
28a0202ef4 DATAMONGO-936 - Updated changelog. 2014-05-20 16:26:20 +02:00
Christoph Strobl
164e947045 DATAMONGO-926 - Avoid StackOverflowError while resolving index structures.
We now guard against cyclic non-transient, non-DBRef property references while inspecting domain types for potential index structures. To do so we check the property's path and owning type to determine potential cycles. If found we log a warning pointing to the path, entity and property potentially causing problems and skip processing for this path.

Original pull request: #180.
2014-05-19 19:09:40 +02:00
Christoph Strobl
7e65c0c87d DATAMONGO-367 - Nested @Indexed should not trigger creation of separate collection.
The issue has been solved along with DATAMONGO-888 (Pull Request: #162). We have created additional tests to explicitly check it has truly been fixed.
2014-05-19 17:19:28 +02:00
Christoph Strobl
9c1f753f17 DATAMONGO-929 - Use property path for keys of Indexed and CompoundIndex.
Index creation failed for @Indexed and @CompoundIndex as the resolved dotPath was not used for creation. We now not only resolve the dotPath but also use it within the key for index definition. In case of a nested compound index the key definition is enhanced by the provided path.

When leaving the key definition empty for a nested compound index we'll create an index for the whole nested document. Trying to create a compound index on root level without providing key information leads to an InvalidDataAccessApiUsageException.

Original pull request: #179.
2014-05-19 17:14:36 +02:00
Oliver Gierke
0f821eb52d DATAMONGO-925, DATAMONGO-928 - Polishing.
Only reject attribute setup if abbreviation is activated and a custom strategy is configured. Additional test cases for the rejection case and a custom, over-configuration (explicitly setting abbreviation to false, which is the default anyway).

Related pull request: #177.
2014-05-19 17:07:50 +02:00
Ryan Tenney
e3aadd63ab DATAMONGO-928 - Removed explicit default value for abbreviate-field-names from namespace XSD.
The default for boolean attributes leaks into the evaluation of XML namespace attributes which causes us being unable to detect whether two attributes have been set in a conflicting way.

Fix the documentation on the field-naming-strategy-ref attribute.

Original pull request: #183.
Related pull request: #177.
Related ticket: DATAMONGO-925.
2014-05-19 16:44:03 +02:00
Christoph Strobl
aa06d520df DATAMONGO-647 - Added test case to show that field names are mapped correctly.
Additional test added to check if the issue has truly been resolved by DATAMONGO-888.

Original pull request: #181.
Related pull Request: #162.
Related ticket: DATAMONGO-888.
2014-05-19 14:38:06 +02:00
Oliver Gierke
4fa1d4ba97 DATAMONGO-919 - After release cleanups. 2014-05-02 15:45:06 +02:00
Spring Buildmaster
ba9f11b345 DATAMONGO-919 - Prepare next development iteration. 2014-05-02 05:58:16 -07:00
Spring Buildmaster
4777dd2e5e DATAMONGO-919 - Release version 1.5.0.RC1. 2014-05-02 05:58:13 -07:00
Oliver Gierke
64b4591b72 DATAMONGO-919 - Prepare 1.5.0.RC1.
Upgraded to Spring Data Parent 1.4.0.RC1 and Spring Data Commons 1.8.0.RC1. Switched to milestone repository.
2014-05-02 14:50:48 +02:00
Oliver Gierke
bac5961fd8 DATAMONGO-919 - Updated changelog. 2014-05-02 14:50:48 +02:00
Thomas Darimont
b3cac862d8 DATAMONGO-924 - Improve aggregation field reference resolving.
Previously we didn't support referring to aliased fields defined in former stages of an aggregation pipeline. We now also consider field aliases during field reference lookup.

Original pull request: #176.
2014-05-02 14:46:13 +02:00
Oliver Gierke
b8ab2ad539 DATAMNOG-919 - Forward port changelog entries from bugfix branch. 2014-05-02 12:48:18 +02:00
Christoph Strobl
618d5bd5e9 DATAMONGO-919 - Prepare Release 1.5.0.RC1
Upgrade AspectJ Maven plugin to recent version which enables building the module on java 8 using AspectJ 1.8.
2014-05-01 21:02:01 +02:00
Oliver Gierke
096c3278b3 DATAMONGO-92 - Upgraded to MongoDB Java driver 2.12.1. 2014-04-30 08:51:18 +02:00
Kim Toms
c9d9976c22 DATAMONGO-920 - Improve debug message for delete events in AbstractMongoEventListener.
Adjusted debug message to reflect the actual operation.

Original pull request: #95.
2014-04-29 15:52:12 +02:00
Thomas Darimont
65497f93d4 DATAMONGO-827 - Allow index creation use mongodb auto generated names.
We now support letting MongoDB generate index names by introducing the attribute "useGeneratedName" to the @Indexed, @GeoSpatialIndex, @CompoundIndex annotations.
2014-04-28 19:58:56 +02:00
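Usage sketch (the property name is made up):

@Indexed(useGeneratedName = true) // let MongoDB generate the index name
String lastname;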
Christoph Strobl
af8a53bef6 DATAMONGO-909 - Compound index on inherited class should be created correctly.
With the overhaul of the index creation done in DATAMONGO-899 the CompoundIndex annotation is no longer just looked up on the concrete type but rather on all its interfaces and super classes. So we just added an additional test to verify this behaviour.
2014-04-28 19:58:56 +02:00
Oliver Gierke
916b856e97 DATAMONGO-899 - Polished API of new index creation abstractions.
Removed the introduction of the IndexDefinition being collection aware again. The collection an index is created in is now held in the IndexDefinitionHolder. This is mostly due to the fact that the IndexDefinition implementations can be used with MongoTemplate and the index operations take a collection alongside the index definition.

Made the IndexResolver API package protected so that we can further change it going forward. We should think about deprecating the collectionName attributes on index annotations as it doesn't make too much sense to manually configure the collection name for the indexes as the collection is predefined through the domain type setting here. This would allow us to remove the entire collection handling code inside the IndexResolver implementation.

Turned IndexDefinitionHolder into a value object.

Original pull request: #168.
2014-04-28 19:58:50 +02:00
Christoph Strobl
7848da63f2 DATAMONGO-899 - Ensure proper creation of index structures for nested entities.
Index creation did not consider the property's path when creating the index. This led to broken index creation when nesting entities that might require index structures.

As of now, index creation traverses the entity's property path for all top level entities (namely those holding the @Document annotation) and creates the index using the full property path.

This is a breaking change as having an entity to carry the @Document annotation has not been required by now.

Original Pull Request: #168
2014-04-28 16:50:26 +02:00
Christoph Strobl
f359a1d31a DATAMONGO-847 - Allow usage of Query within an Update clause.
In case we detect a Query within a value used for an Update we map the query itself to build the expression to use. This allows forming query statements for e.g. $pull using the same API as for the query itself.

Update update = new Update().pull("list", query(where("value").in("foo", "bar")));

Original Pull Request: #172.
2014-04-28 13:30:13 +02:00
Thomas Darimont
72d645feae DATAMONGO-917 - Improve Spring 4.0 framework version detection to avoid NullPointerExceptions.
We now check for the presence of DefaultParameterNameDiscoverer in order to determine if we are running with a Spring version later than 4.0 since this avoids potential NullPointerExceptions in cases where the package version information is not available e.g. in cases where the application was bundled into an uberjar e.g. via the maven-shade-plugin.

Original pull request: #173.
2014-04-28 13:21:09 +02:00
Thomas Darimont
72fe382bba DATAMONGO-913 - Improve DBRef handling for LazyLoadingProxies.
We now use the captured DBRef of a given LazyLoadingProxy in MappingMongoConverter.toDBRef(..) in order to avoid a new DBRef creation that would fail for the proxy.

Original pull request: #174.
2014-04-28 13:07:57 +02:00
Thomas Darimont
d25e840cf5 DATAMONGO-914 - Improve resolving of lazy-loading proxies for classes that override equals(…)/hashCode().
We now properly resolve lazy-loading proxies for @DBRef's when an overridden equals or hashCode method is called with Spring 4. We fall back to our old Objenesis proxy generation in order to circumvent the default handling for overridden hashCode() and equals(…) methods in CglibAopProxies generated by Spring 4.

If we detect that we run with Spring 4 we use the repacked Objenesis that is included in Spring 4. Previously the generated proxy used some generic hashCode() or equals(…) logic that did not trigger a proper lazy loading in such cases.

Original pull request: #171.
2014-04-23 09:29:33 +02:00
Thomas Darimont
df1775572a DATAMONGO-912 - Consider custom conversions in all stages of an aggregation pipeline.
We now consider custom mongo conversions in all stages of an aggregation pipeline. Previously we did this only for the first stage and returned objects basically unmapped in later stages. We now pass the root AggregationOperationContext on to nested ExposedFieldsAggregationOperationContexts so that those can delegate any mongo mapping to the root context.

Original pull request: #170.
2014-04-23 09:00:56 +02:00
Christoph Strobl
86670cd49f DATAMONGO-893 - Converter must not write "_class" information for known types.
We now actively pass on property type information to MetadataBackedField to ensure type hints get picked up correctly when converting a value to the according DBObject.

This has to be done as the fix for DATAMONGO-812 enforced proper writing of _class information for Updates, which caused trouble when querying documents by nested (complex) properties using an 'in' clause.

Original pull request: #169.
2014-04-15 17:36:11 +02:00
Christoph Strobl
31a4bf906e DATAMONGO-892 - Reject nested MappingMongoConverter declarations in XML.
Mapping information is potentially required by multiple instances and thus must not be registered as nested bean. We now actively check for such an invalid scenario and explicitly reject it.

Original pull request: #165.
2014-04-15 09:11:12 +02:00
Christoph Strobl
599291e8b7 DATAMONGO-897 - Fixed potential NullPointerException in QueryMapper.
If an association property points to an interface not containing the id property QueryMapper threw a NullPointerException in isAssociationConversionNecessary(…) as the lookup of the id property fails. 

We now check for the presence of an id property on the target type and check for assignability to indicate the need for conversion (usually the case when developers use raw ids in their update clauses, not the actual target instance).

Original pull request: #164.
2014-04-15 09:11:00 +02:00
Thomas Darimont
f5a04fb9fb DATAMONGO-908 - Support for nested field references in group operations.
We now allow referring to nested field expressions if the root segment of the nested field expression was exposed in earlier stages of the aggregation pipeline.

Original pull request: #167.
2014-04-15 07:55:57 +02:00
Oliver Gierke
88558b67c3 DATAMONGO-866 - Polishing for new field naming strategy configuration support.
Use the camel case split logic from Spring Data Commons (introduced for DATACMNS-486) in a common CamelCaseSplittingFieldNamingStrategy super class.

MappingMongoConverterParser now also rejects the configuration if both abbreviate-field-names and field-naming-strategy-ref are configured.
2014-04-10 18:36:37 +02:00
Ryan Tenney
d52cb255e0 DATAMONGO-866 - Add means to configure field naming strategy. 2014-04-10 18:03:55 +02:00
Oliver Gierke
6c214cbc37 DATAMONGO-910 - Upgraded to latest MongoDB Java driver 2.12.0.
Removed special mongo-osgi property as the driver version is now a valid OSGi version number.
2014-04-10 16:55:22 +02:00
Jeff Yemin
01012c1448 DATAMONGO-895, DATAMONGO-896 - Assert compatibility with latest MongoDB Java driver.
Upgrade next MongoDB driver version to 2.12.0. Strong upgrade coming in a subsequent commit to make sure we can backport the compatibility checks to the bugfix branch without forcing users into a driver upgrade.

Relaxing error message comparison in assertion so that it still matches against the message returned by MongoDB 2.6. When comparing the value of the version field, compare against a Long rather than an Integer, since the version field generated is a Long. This allows the test to pass against the upcoming 2.12.0 release of the Java driver, which has a stricter implementation of BasicDBObject.equals(…).

Original pull requests: #159, #160.
2014-04-10 15:57:45 +02:00
Christoph Strobl
4c7befb910 DATAMONGO-888 - Sorting now considers mapping information.
We now pipe the DBObject containing sorting information for queries through the QueryMapper to make sure potential field mappings are applied.

Original Pull Request: #162.
2014-04-10 15:43:51 +02:00
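A minimal sketch of the behaviour described above, assuming a QueryMapper and the persistent entity are already at hand; the wrapper class and variable names are illustrative.

```java
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.query.Query;

import com.mongodb.DBCursor;
import com.mongodb.DBObject;

class MappedSortExample {

	// The sort document is now piped through the QueryMapper as well, so @Field
	// aliases and other mapping metadata are applied before hitting MongoDB.
	void applySort(DBCursor cursor, Query query, QueryMapper queryMapper, MongoPersistentEntity<?> entity) {
		DBObject mappedSort = queryMapper.getMappedObject(query.getSortObject(), entity);
		cursor.sort(mappedSort);
	}
}
```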
Christoph Strobl
b62669ec8f DATAMONGO-907 - Assert compatibility with MongoDB 2.6.
Fix test to only check parts of the expected error message that are common to both 2.4 and 2.6.

Original Pull Request: #166.
2014-04-10 13:33:34 +02:00
Oliver Gierke
0fb74caf9b DATAMONGO-905 - Removed obsolete dependency to CGLib from cross-store support.
Also, we now optionally depend on the Hibernate JPA API JAR so that using other persistence providers doesn't cause API JAR duplication.
2014-04-09 12:12:29 +02:00
Oliver Gierke
9623dac01f DATAMONGO-901 - Fixed regression of not registering type predicting post processor.
The changes for DATAMONGO-843 introduced a regression by skipping the registration of the RepositoryInterfaceAwareBeanPostProcessor. This can cause the wiring of repository bean definitions to fail depending on the order in which the bean definitions get instantiated.

This change reintroduces the registration and adds an explicit test case for it.
2014-04-02 17:58:36 +02:00
Spring Buildmaster
695b27968c DATAMONGO-859 - Prepare next development iteration. 2014-03-31 17:10:40 +02:00
524 changed files with 66310 additions and 13214 deletions

12
.github/PULL_REQUEST_TEMPLATE.md vendored Normal file
View File

@@ -0,0 +1,12 @@
<!--
Thank you for proposing a pull request. This template will guide you through the essential steps necessary for a pull request.
Make sure that:
-->
- [ ] You have read the [Spring Data contribution guidelines](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc).
- [ ] There is a ticket in the bug tracker for the project in our [JIRA](https://jira.spring.io/browse/DATAMONGO).
- [ ] You use the code formatters provided [here](https://github.com/spring-projects/spring-data-build/tree/master/etc/ide) and have them applied to your changes. Don't submit any formatting-related changes.
- [ ] You submit test cases (unit or integration tests) that back your changes.
- [ ] You added yourself as author in the headers of the classes you touched. Amend the date range in the Apache license header if needed. For new types, add the license header (copy from another file and set the current year only).

42
.travis.yml Normal file
View File

@@ -0,0 +1,42 @@
language: java
jdk:
- oraclejdk8
before_script:
- mongod --version
env:
matrix:
- PROFILE=ci
- PROFILE=mongo-next
- PROFILE=mongo3
- PROFILE=mongo3-next
- PROFILE=mongo31
- PROFILE=mongo32
- PROFILE=mongo33
- PROFILE=mongo34
- PROFILE=mongo34-next
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script
addons:
apt:
sources:
- mongodb-upstart
- sourceline: 'deb [arch=amd64] http://repo.mongodb.org/apt/ubuntu precise/mongodb-org/3.4 multiverse'
key_url: 'https://www.mongodb.org/static/pgp/server-3.4.asc'
packages:
- mongodb-org-server
- mongodb-org-shell
- oracle-java8-installer
sudo: false
cache:
directories:
- $HOME/.m2
install: true
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"

27
CODE_OF_CONDUCT.adoc Normal file
View File

@@ -0,0 +1,27 @@
= Contributor Code of Conduct
As contributors and maintainers of this project, and in the interest of fostering an open and welcoming community, we pledge to respect all people who contribute through reporting issues, posting feature requests, updating documentation, submitting pull requests or patches, and other activities.
We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, or nationality.
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery
* Personal attacks
* Trolling or insulting/derogatory comments
* Public or private harassment
* Publishing other's private information, such as physical or electronic addresses,
without explicit permission
* Other unethical or unprofessional conduct
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
By adopting this Code of Conduct, project maintainers commit themselves to fairly and consistently applying these principles to every aspect of managing this project. Project maintainers who do not follow or enforce the Code of Conduct may be permanently removed from the project team.
This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community.
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting a project maintainer at spring-code-of-conduct@pivotal.io.
All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances.
Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident.
This Code of Conduct is adapted from the http://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at http://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].

View File

@@ -1 +0,0 @@
You find the contribution guidelines for Spring Data projects [here](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.md).

3
CONTRIBUTING.adoc Normal file
View File

@@ -0,0 +1,3 @@
= Spring Data contribution guidelines
You find the contribution guidelines for Spring Data projects https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc[here].

View File

@@ -1,3 +1,6 @@
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/ga.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/snapshot.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
# Spring Data MongoDB
The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
@@ -11,7 +14,7 @@ For a comprehensive treatment of all the Spring Data MongoDB features, please re
* the [User Guide](http://docs.spring.io/spring-data/mongodb/docs/current/reference/html/)
* the [JavaDocs](http://docs.spring.io/spring-data/mongodb/docs/current/api/) have extensive comments in them as well.
* the home page of [Spring Data MongoDB](http://projects.spring.io/spring-data-mongodb) contains links to articles and other resources.
* for more detailed questions, use the [forum](http://forum.spring.io/forum/spring-projects/data/nosql).
* for more detailed questions, use [Spring Data Mongodb on Stackoverflow](http://stackoverflow.com/questions/tagged/spring-data-mongodb).
If you are new to Spring as well as to Spring Data, look for information about [Spring projects](http://projects.spring.io/).
@@ -26,7 +29,7 @@ Add the Maven dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.4.1.RELEASE</version>
<version>${version}.RELEASE</version>
</dependency>
```
@@ -36,7 +39,7 @@ If you'd rather like the latest snapshots of the upcoming major version, use our
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.5.0.BUILD-SNAPSHOT</version>
<version>${version}.BUILD-SNAPSHOT</version>
</dependency>
<repository>
@@ -139,9 +142,9 @@ public class MyService {
Here are some ways for you to get involved in the community:
* Get involved with the Spring community on the Spring Community Forums. Please help out on the [forum](http://forum.spring.io/forum/spring-projects/data/nosql) by responding to questions and joining the debate.
* Create [JIRA](https://jira.springframework.org/browse/DATADOC) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Get involved with the Spring community on Stackoverflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Create [JIRA](https://jira.spring.io/browse/DATAMONGO) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.
Before we accept a non-trivial patch or pull request we will need you to sign the [contributor's agreement](https://support.springsource.com/spring_committer_signup). Signing the contributor's agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. Active contributors might be asked to join the core team, and given the ability to merge pull requests.
Before we accept a non-trivial patch or pull request we will need you to [sign the Contributor License Agreement](https://cla.pivotal.io/sign/spring). Signing the contributors agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. If you forget to do so, you'll be reminded when you submit a pull request. Active contributors might be asked to join the core team, and given the ability to merge pull requests.

2
lombok.config Normal file
View File

@@ -0,0 +1,2 @@
lombok.nonNull.exceptionType = IllegalArgumentException
lombok.log.fieldName = LOG

149
pom.xml
View File

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.10.6.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,8 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.4.0.M1</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
<version>1.9.6.RELEASE</version>
</parent>
<modules>
@@ -29,11 +28,12 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.8.0.M1</springdata.commons>
<mongo>2.11.4</mongo>
<mongo-osgi>${mongo}</mongo-osgi>
<springdata.commons>1.13.6.RELEASE</springdata.commons>
<mongo>2.14.3</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
<jmh.version>1.19</jmh.version>
</properties>
<developers>
<developer>
<id>ogierke</id>
@@ -105,12 +105,129 @@
<profiles>
<profile>
<id>mongo-next</id>
<properties>
<mongo>2.12.0-rc0</mongo>
<mongo-osgi>2.12.0</mongo-osgi>
<mongo>2.15.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo3</id>
<properties>
<mongo>3.0.4</mongo>
</properties>
</profile>
<profile>
<id>mongo3-next</id>
<properties>
<mongo>3.0.5-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo31</id>
<properties>
<mongo>3.1.1</mongo>
</properties>
</profile>
<profile>
<id>mongo32</id>
<properties>
<mongo>3.2.2</mongo>
</properties>
</profile>
<profile>
<id>mongo33</id>
<properties>
<mongo>3.3.0</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo34</id>
<properties>
<mongo>3.4.0</mongo>
</properties>
</profile>
<profile>
<id>mongo34-next</id>
<properties>
<mongo>3.4.1-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.jfrog.buildinfo</groupId>
<artifactId>artifactory-maven-plugin</artifactId>
<inherited>false</inherited>
</plugin>
</plugins>
</build>
</profile>
<profile>
<id>benchmarks</id>
<modules>
<module>spring-data-mongodb</module>
<module>spring-data-mongodb-cross-store</module>
<module>spring-data-mongodb-log4j</module>
<module>spring-data-mongodb-distribution</module>
<module>spring-data-mongodb-benchmarks</module>
</modules>
</profile>
</profiles>
<dependencies>
@@ -121,18 +238,18 @@
<version>${mongo}</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>http://repo.spring.io/libs-milestone/</url>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>http://repo.spring.io/plugins-release</url>
<url>https://repo.spring.io/plugins-release</url>
</pluginRepository>
</pluginRepositories>

View File

@@ -0,0 +1,76 @@
# Benchmarks
Benchmarks are based on [JMH](http://openjdk.java.net/projects/code-tools/jmh/).
# Running Benchmarks
Running benchmarks is disabled by default and can be activated via the `benchmarks` profile.
To run the benchmarks with default settings use:
```bash
mvn -P benchmarks clean test
```
A basic report will be printed to the CLI.
```bash
# Run complete. Total time: 00:00:15
Benchmark Mode Cnt Score Error Units
MappingMongoConverterBenchmark.readObject thrpt 10 1920157,631 ± 64310,809 ops/s
MappingMongoConverterBenchmark.writeObject thrpt 10 782732,857 ± 53804,130 ops/s
```
## Running all Benchmarks of a specific class
To run all Benchmarks of a specific class, just provide its simple class name via the `benchmark` command line argument.
```bash
mvn -P benchmarks clean test -D benchmark=MappingMongoConverterBenchmark
```
## Running a single Benchmark
To run a single Benchmark, provide the simple name of its containing class followed by `#` and the method name via the `benchmark` command line argument.
```bash
mvn -P benchmarks clean test -D benchmark=MappingMongoConverterBenchmark#readObjectWith2Properties
```
# Saving Benchmark Results
A detailed benchmark report is stored in JSON format in the `/target/reports/performance` directory.
To store the report in a different location use the `benchmarkReportDir` command line argument.
## MongoDB
Results can be directly piped to MongoDB by providing a valid [Connection String](https://docs.mongodb.com/manual/reference/connection-string/) via the `publishTo` command line argument.
```bash
mvn -P benchmarks clean test -D publishTo=mongodb://127.0.0.1:27017
```
NOTE: If the URI does not explicitly define a database, the default `spring-data-mongodb-benchmarks` is used.
## HTTP Endpoint
The benchmark report can also be posted as `application/json` to an HTTP endpoint by providing a valid URL via the `publishTo` command line argument.
```bash
mvn -P benchmarks clean test -D publishTo=http://127.0.0.1:8080/capture-benchmarks
```
# Customizing Benchmarks
The following options can be set via the command line.
Option | Default Value
--- | ---
warmupIterations | 10
warmupTime | 1 (seconds)
measurementIterations | 10
measurementTime | 1 (seconds)
forks | 1
benchmarkReportDir | /target/reports/performance (always relative to project root dir)
benchmark | .* (single benchmark via `classname#benchmark`)
publishTo | \[not set\] (mongodb-uri or http-endpoint)

View File

@@ -0,0 +1,112 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.6.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-benchmarks</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB - Microbenchmarks</name>
<properties>
<!-- Skip tests by default; run only if -DskipTests=false is specified or benchmarks profile is activated -->
<skipTests>true</skipTests>
<bundlor.enabled>false</bundlor.enabled>
</properties>
<dependencies>
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.openjdk.jmh</groupId>
<artifactId>jmh-core</artifactId>
<version>${jmh.version}</version>
</dependency>
<dependency>
<groupId>org.openjdk.jmh</groupId>
<artifactId>jmh-generator-annprocess</artifactId>
<version>${jmh.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
<profiles>
<profile>
<id>benchmarks</id>
<properties>
<skipTests>false</skipTests>
</properties>
</profile>
</profiles>
<build>
<plugins>
<plugin>
<groupId>pl.project13.maven</groupId>
<artifactId>git-commit-id-plugin</artifactId>
<version>2.2.2</version>
<executions>
<execution>
<goals>
<goal>revision</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<executions>
<execution>
<id>default-jar</id>
<phase>never</phase>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<testSourceDirectory>${project.build.sourceDirectory}</testSourceDirectory>
<testClassesDirectory>${project.build.outputDirectory}</testClassesDirectory>
<excludes>
<exclude>**/AbstractMicrobenchmark.java</exclude>
<exclude>**/*$*.class</exclude>
<exclude>**/generated/*.class</exclude>
</excludes>
<includes>
<include>**/*Benchmark*</include>
</includes>
<systemPropertyVariables>
<benchmarkReportDir>${project.build.directory}/reports/performance</benchmarkReportDir>
<project.version>${project.version}</project.version>
<git.dirty>${git.dirty}</git.dirty>
<git.commit.id>${git.commit.id}</git.commit.id>
<git.branch>${git.branch}</git.branch>
</systemPropertyVariables>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -0,0 +1,111 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import lombok.Data;
import java.util.ArrayList;
import java.util.List;
import org.bson.types.ObjectId;
import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.TearDown;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.microbenchmark.AbstractMicrobenchmark;
import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
/**
* @author Christoph Strobl
*/
@State(Scope.Benchmark)
public class DbRefMappingBenchmark extends AbstractMicrobenchmark {
private static final String DB_NAME = "dbref-loading-benchmark";
private MongoClient client;
private MongoTemplate template;
private Query queryObjectWithDBRef;
private Query queryObjectWithDBRefList;
@Setup
public void setUp() throws Exception {
client = new MongoClient(new ServerAddress());
template = new MongoTemplate(client, DB_NAME);
List<RefObject> refObjects = new ArrayList<RefObject>();
for (int i = 0; i < 1; i++) {
RefObject o = new RefObject();
template.save(o);
refObjects.add(o);
}
ObjectWithDBRef singleDBRef = new ObjectWithDBRef();
singleDBRef.ref = refObjects.iterator().next();
template.save(singleDBRef);
ObjectWithDBRef multipleDBRefs = new ObjectWithDBRef();
multipleDBRefs.refList = refObjects;
template.save(multipleDBRefs);
queryObjectWithDBRef = query(where("id").is(singleDBRef.id));
queryObjectWithDBRefList = query(where("id").is(multipleDBRefs.id));
}
@TearDown
public void tearDown() {
client.dropDatabase(DB_NAME);
client.close();
}
@Benchmark // DATAMONGO-1720
public ObjectWithDBRef readSingleDbRef() {
return template.findOne(queryObjectWithDBRef, ObjectWithDBRef.class);
}
@Benchmark // DATAMONGO-1720
public ObjectWithDBRef readMultipleDbRefs() {
return template.findOne(queryObjectWithDBRefList, ObjectWithDBRef.class);
}
@Data
static class ObjectWithDBRef {
private @Id ObjectId id;
private @DBRef RefObject ref;
private @DBRef List<RefObject> refList;
}
@Data
static class RefObject {
private @Id String id;
private String someValue;
}
}

View File

@@ -0,0 +1,182 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.UUID;
import org.bson.types.ObjectId;
import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.TearDown;
import org.springframework.data.annotation.Id;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.microbenchmark.AbstractMicrobenchmark;
import com.mongodb.BasicDBObject;
import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
import com.mongodb.util.JSON;
/**
* @author Christoph Strobl
*/
@State(Scope.Benchmark)
public class MappingMongoConverterBenchmark extends AbstractMicrobenchmark {
private static final String DB_NAME = "mapping-mongo-converter-benchmark";
private MongoClient client;
private MongoMappingContext mappingContext;
private MappingMongoConverter converter;
private BasicDBObject documentWith2Properties, documentWith2PropertiesAnd1Nested;
private Customer objectWith2PropertiesAnd1Nested;
private BasicDBObject documentWithFlatAndComplexPropertiesPlusListAndMap;
private SlightlyMoreComplexObject objectWithFlatAndComplexPropertiesPlusListAndMap;
@Setup
public void setUp() throws Exception {
client = new MongoClient(new ServerAddress());
this.mappingContext = new MongoMappingContext();
this.mappingContext.setInitialEntitySet(Collections.singleton(Customer.class));
this.mappingContext.afterPropertiesSet();
DbRefResolver dbRefResolver = new DefaultDbRefResolver(new SimpleMongoDbFactory(client, DB_NAME));
this.converter = new MappingMongoConverter(dbRefResolver, mappingContext);
this.converter.setCustomConversions(new CustomConversions(Collections.emptyList()));
this.converter.afterPropertiesSet();
// just a flat document
this.documentWith2Properties = new BasicDBObject("firstname", "Dave").append("lastname", "Matthews");
// document with a nested one
BasicDBObject address = new BasicDBObject("zipCode", "ABCDE").append("city", "Some Place");
this.documentWith2PropertiesAnd1Nested = new BasicDBObject("firstname", "Dave").//
append("lastname", "Matthews").//
append("address", address);
// object equivalent of documentWith2PropertiesAnd1Nested
this.objectWith2PropertiesAnd1Nested = new Customer("Dave", "Matthews", new Address("zipCode", "City"));
// a bit more challenging object with list & map conversion.
objectWithFlatAndComplexPropertiesPlusListAndMap = new SlightlyMoreComplexObject();
objectWithFlatAndComplexPropertiesPlusListAndMap.id = UUID.randomUUID().toString();
objectWithFlatAndComplexPropertiesPlusListAndMap.addressList = Arrays.asList(new Address("zip-1", "city-1"),
new Address("zip-2", "city-2"));
objectWithFlatAndComplexPropertiesPlusListAndMap.customer = objectWith2PropertiesAnd1Nested;
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap = new LinkedHashMap<String, Customer>();
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap.put("dave", objectWith2PropertiesAnd1Nested);
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap.put("deborah",
new Customer("Deborah Anne", "Dyer", new Address("?", "london")));
objectWithFlatAndComplexPropertiesPlusListAndMap.customerMap.put("eddie",
new Customer("Eddie", "Vedder", new Address("??", "Seattle")));
objectWithFlatAndComplexPropertiesPlusListAndMap.intOne = Integer.MIN_VALUE;
objectWithFlatAndComplexPropertiesPlusListAndMap.intTwo = Integer.MAX_VALUE;
objectWithFlatAndComplexPropertiesPlusListAndMap.location = new Point(-33.865143, 151.209900);
objectWithFlatAndComplexPropertiesPlusListAndMap.renamedField = "supercalifragilisticexpialidocious";
objectWithFlatAndComplexPropertiesPlusListAndMap.stringOne = "¯\\_(ツ)_/¯";
objectWithFlatAndComplexPropertiesPlusListAndMap.stringTwo = " (╯°□°)╯︵ ┻━┻";
// JSON equivalent of objectWithFlatAndComplexPropertiesPlusListAndMap
documentWithFlatAndComplexPropertiesPlusListAndMap = (BasicDBObject) JSON.parse(
"{ \"_id\" : \"517f6aee-e9e0-44f0-88ed-f3694a019f27\", \"intOne\" : -2147483648, \"intTwo\" : 2147483647, \"stringOne\" : \"¯\\\\_(ツ)_/¯\", \"stringTwo\" : \" (╯°□°)╯︵ ┻━┻\", \"explicit-field-name\" : \"supercalifragilisticexpialidocious\", \"location\" : { \"x\" : -33.865143, \"y\" : 151.2099 }, \"objectWith2PropertiesAnd1Nested\" : { \"firstname\" : \"Dave\", \"lastname\" : \"Matthews\", \"address\" : { \"zipCode\" : \"zipCode\", \"city\" : \"City\" } }, \"addressList\" : [{ \"zipCode\" : \"zip-1\", \"city\" : \"city-1\" }, { \"zipCode\" : \"zip-2\", \"city\" : \"city-2\" }], \"customerMap\" : { \"dave\" : { \"firstname\" : \"Dave\", \"lastname\" : \"Matthews\", \"address\" : { \"zipCode\" : \"zipCode\", \"city\" : \"City\" } }, \"deborah\" : { \"firstname\" : \"Deborah Anne\", \"lastname\" : \"Dyer\", \"address\" : { \"zipCode\" : \"?\", \"city\" : \"london\" } }, \"eddie\" : { \"firstname\" : \"Eddie\", \"lastname\" : \"Vedder\", \"address\" : { \"zipCode\" : \"??\", \"city\" : \"Seattle\" } } }, \"_class\" : \"org.springframework.data.mongodb.core.convert.MappingMongoConverterBenchmark$SlightlyMoreComplexObject\" }");
}
@TearDown
public void tearDown() {
client.dropDatabase(DB_NAME);
client.close();
}
@Benchmark // DATAMONGO-1720
public Customer readObjectWith2Properties() {
return converter.read(Customer.class, documentWith2Properties);
}
@Benchmark // DATAMONGO-1720
public Customer readObjectWith2PropertiesAnd1NestedObject() {
return converter.read(Customer.class, documentWith2PropertiesAnd1Nested);
}
@Benchmark // DATAMONGO-1720
public BasicDBObject writeObjectWith2PropertiesAnd1NestedObject() {
BasicDBObject sink = new BasicDBObject();
converter.write(objectWith2PropertiesAnd1Nested, sink);
return sink;
}
@Benchmark // DATAMONGO-1720
public Object readObjectWithListAndMapsOfComplexType() {
return converter.read(SlightlyMoreComplexObject.class, documentWithFlatAndComplexPropertiesPlusListAndMap);
}
@Benchmark // DATAMONGO-1720
public Object writeObjectWithListAndMapsOfComplexType() {
BasicDBObject sink = new BasicDBObject();
converter.write(objectWithFlatAndComplexPropertiesPlusListAndMap, sink);
return sink;
}
@Getter
@RequiredArgsConstructor
static class Customer {
private @Id ObjectId id;
private final String firstname, lastname;
private final Address address;
}
@Getter
@AllArgsConstructor
static class Address {
private String zipCode, city;
}
@Data
static class SlightlyMoreComplexObject {
@Id String id;
int intOne, intTwo;
String stringOne, stringTwo;
@Field("explicit-field-name") String renamedField;
Point location;
Customer customer;
List<Address> addressList;
Map<String, Customer> customerMap;
}
}

View File

@@ -0,0 +1,329 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Collection;
import java.util.Date;
import org.junit.Test;
import org.openjdk.jmh.annotations.Fork;
import org.openjdk.jmh.annotations.Measurement;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.Warmup;
import org.openjdk.jmh.results.RunResult;
import org.openjdk.jmh.results.format.ResultFormatType;
import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.options.ChainedOptionsBuilder;
import org.openjdk.jmh.runner.options.OptionsBuilder;
import org.openjdk.jmh.runner.options.TimeValue;
import org.springframework.core.env.StandardEnvironment;
import org.springframework.data.mongodb.microbenchmark.ResultsWriter.Utils;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
/**
* @author Christoph Strobl
*/
@Warmup(iterations = AbstractMicrobenchmark.WARMUP_ITERATIONS)
@Measurement(iterations = AbstractMicrobenchmark.MEASUREMENT_ITERATIONS)
@Fork(AbstractMicrobenchmark.FORKS)
@State(Scope.Thread)
public class AbstractMicrobenchmark {
static final int WARMUP_ITERATIONS = 5;
static final int MEASUREMENT_ITERATIONS = 10;
static final int FORKS = 1;
static final String[] JVM_ARGS = { "-server", "-XX:+HeapDumpOnOutOfMemoryError", "-Xms1024m", "-Xmx1024m",
"-XX:MaxDirectMemorySize=1024m" };
private final StandardEnvironment environment = new StandardEnvironment();
/**
* Run matching {@link org.openjdk.jmh.annotations.Benchmark} methods with options collected from
* {@link org.springframework.core.env.Environment}.
*
* @throws Exception
* @see #options(String)
*/
@Test
public void run() throws Exception {
String includes = includes();
if (!includes.contains(org.springframework.util.ClassUtils.getShortName(getClass()))) {
return;
}
publishResults(new Runner(options(includes).build()).run());
}
/**
* Get the regex for all benchmarks to be included in the run. By default every benchmark within classes matching the
* current ones short name. <br />
* The {@literal benchmark} command line argument allows overriding the defaults using {@code #} as class / method
* name separator.
*
* @return never {@literal null}.
* @see org.springframework.util.ClassUtils#getShortName(Class)
*/
protected String includes() {
String tests = environment.getProperty("benchmark", String.class);
if (!StringUtils.hasText(tests)) {
return ".*" + org.springframework.util.ClassUtils.getShortName(getClass()) + ".*";
}
if (!tests.contains("#")) {
return ".*" + tests + ".*";
}
String[] args = tests.split("#");
return ".*" + args[0] + "." + args[1];
}
/**
* Collect all options for the {@link Runner}.
*
* @param includes regex for matching benchmarks to be included in the run.
* @return never {@literal null}.
* @throws Exception
*/
protected ChainedOptionsBuilder options(String includes) throws Exception {
ChainedOptionsBuilder optionsBuilder = new OptionsBuilder().include(includes).jvmArgs(jvmArgs());
optionsBuilder = warmup(optionsBuilder);
optionsBuilder = measure(optionsBuilder);
optionsBuilder = forks(optionsBuilder);
optionsBuilder = report(optionsBuilder);
return optionsBuilder;
}
/**
* JVM args to apply to {@link Runner} via its {@link org.openjdk.jmh.runner.options.Options}.
*
* @return {@link #JVM_ARGS} by default.
*/
protected String[] jvmArgs() {
String[] args = new String[JVM_ARGS.length];
System.arraycopy(JVM_ARGS, 0, args, 0, JVM_ARGS.length);
return args;
}
/**
* Read {@code warmupIterations} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected int getWarmupIterations() {
return environment.getProperty("warmupIterations", Integer.class, -1);
}
/**
* Read {@code measurementIterations} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected int getMeasurementIterations() {
return environment.getProperty("measurementIterations", Integer.class, -1);
}
/**
* Read {@code forks} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected int getForksCount() {
return environment.getProperty("forks", Integer.class, -1);
}
/**
* Read {@code benchmarkReportDir} property from {@link org.springframework.core.env.Environment}.
*
* @return {@literal null} if not set.
*/
protected String getReportDirectory() {
return environment.getProperty("benchmarkReportDir");
}
/**
* Read {@code measurementTime} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected long getMeasurementTime() {
return environment.getProperty("measurementTime", Long.class, -1L);
}
/**
* Read {@code warmupTime} property from {@link org.springframework.core.env.Environment}.
*
* @return -1 if not set.
*/
protected long getWarmupTime() {
return environment.getProperty("warmupTime", Long.class, -1L);
}
/**
* {@code project.version_yyyy-MM-dd_ClassName.json} eg.
* {@literal 1.11.0.BUILD-SNAPSHOT_2017-03-07_MappingMongoConverterBenchmark.json}
*
* @return
*/
protected String reportFilename() {
StringBuilder sb = new StringBuilder();
if (environment.containsProperty("project.version")) {
sb.append(environment.getProperty("project.version"));
sb.append("_");
}
sb.append(new SimpleDateFormat("yyyy-MM-dd").format(new Date()));
sb.append("_");
sb.append(org.springframework.util.ClassUtils.getShortName(getClass()));
sb.append(".json");
return sb.toString();
}
/**
* Apply measurement options to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @see #getMeasurementIterations()
* @see #getMeasurementTime()
*/
private ChainedOptionsBuilder measure(ChainedOptionsBuilder optionsBuilder) {
int measurementIterations = getMeasurementIterations();
long measurementTime = getMeasurementTime();
if (measurementIterations > 0) {
optionsBuilder = optionsBuilder.measurementIterations(measurementIterations);
}
if (measurementTime > 0) {
optionsBuilder = optionsBuilder.measurementTime(TimeValue.seconds(measurementTime));
}
return optionsBuilder;
}
/**
* Apply warmup options to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @see #getWarmupIterations()
* @see #getWarmupTime()
*/
private ChainedOptionsBuilder warmup(ChainedOptionsBuilder optionsBuilder) {
int warmupIterations = getWarmupIterations();
long warmupTime = getWarmupTime();
if (warmupIterations > 0) {
optionsBuilder = optionsBuilder.warmupIterations(warmupIterations);
}
if (warmupTime > 0) {
optionsBuilder = optionsBuilder.warmupTime(TimeValue.seconds(warmupTime));
}
return optionsBuilder;
}
/**
* Apply forks option to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @see #getForksCount()
*/
private ChainedOptionsBuilder forks(ChainedOptionsBuilder optionsBuilder) {
int forks = getForksCount();
if (forks <= 0) {
return optionsBuilder;
}
return optionsBuilder.forks(forks);
}
/**
* Apply report option to {@link ChainedOptionsBuilder}.
*
* @param optionsBuilder must not be {@literal null}.
* @return {@link ChainedOptionsBuilder} with options applied.
* @throws IOException if report file cannot be created.
* @see #getReportDirectory()
*/
private ChainedOptionsBuilder report(ChainedOptionsBuilder optionsBuilder) throws IOException {
String reportDir = getReportDirectory();
if (!StringUtils.hasText(reportDir)) {
return optionsBuilder;
}
String reportFilePath = reportDir + (reportDir.endsWith(File.separator) ? "" : File.separator) + reportFilename();
File file = ResourceUtils.getFile(reportFilePath);
if (file.exists()) {
file.delete();
} else {
file.getParentFile().mkdirs();
file.createNewFile();
}
optionsBuilder.resultFormat(ResultFormatType.JSON);
optionsBuilder.result(reportFilePath);
return optionsBuilder;
}
/**
* Publish results to an external system.
*
* @param results must not be {@literal null}.
*/
private void publishResults(Collection<RunResult> results) {
if (CollectionUtils.isEmpty(results) || !environment.containsProperty("publishTo")) {
return;
}
String uri = environment.getProperty("publishTo");
try {
Utils.forUri(uri).write(results);
} catch (Exception e) {
System.err.println(String.format("Cannot save benchmark results to '%s'. Error was %s.", uri, e));
}
}
}

View File

@@ -0,0 +1,86 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import lombok.SneakyThrows;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.Charset;
import java.util.Collection;
import org.openjdk.jmh.results.RunResult;
import org.springframework.core.env.StandardEnvironment;
import org.springframework.util.CollectionUtils;
/**
* {@link ResultsWriter} implementation based on {@link URLConnection}.
*
* @since 2.0
*/
class HttpResultsWriter implements ResultsWriter {
private final String url;
HttpResultsWriter(String url) {
this.url = url;
}
@Override
@SneakyThrows
public void write(Collection<RunResult> results) {
if (CollectionUtils.isEmpty(results)) {
return;
}
StandardEnvironment env = new StandardEnvironment();
String projectVersion = env.getProperty("project.version", "unknown");
String gitBranch = env.getProperty("git.branch", "unknown");
String gitDirty = env.getProperty("git.dirty", "no");
String gitCommitId = env.getProperty("git.commit.id", "unknown");
HttpURLConnection connection = (HttpURLConnection) new URL(url).openConnection();
connection.setConnectTimeout(1000);
connection.setReadTimeout(1000);
connection.setDoOutput(true);
connection.setRequestMethod("POST");
connection.setRequestProperty("Content-Type", "application/json");
connection.addRequestProperty("X-Project-Version", projectVersion);
connection.addRequestProperty("X-Git-Branch", gitBranch);
connection.addRequestProperty("X-Git-Dirty", gitDirty);
connection.addRequestProperty("X-Git-Commit-Id", gitCommitId);
OutputStream output = null;
try {
output = connection.getOutputStream();
output.write(ResultsWriter.Utils.jsonifyResults(results).getBytes(Charset.forName("UTF-8")));
} finally {
if (output != null) {
output.close();
}
}
if (connection.getResponseCode() >= 400) {
throw new IllegalStateException(
String.format("Status %d %s", connection.getResponseCode(), connection.getResponseMessage()));
}
}
}

View File

@@ -0,0 +1,135 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import java.net.UnknownHostException;
import java.util.Collection;
import java.util.Date;
import java.util.List;
import org.openjdk.jmh.results.RunResult;
import org.springframework.core.env.StandardEnvironment;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.util.JSON;
/**
* MongoDB specific {@link ResultsWriter} implementation.
*
* @author Christoph Strobl
* @since 2.0
*/
class MongoResultsWriter implements ResultsWriter {
private final String uri;
MongoResultsWriter(String uri) {
this.uri = uri;
}
@Override
public void write(Collection<RunResult> results) {
Date now = new Date();
StandardEnvironment env = new StandardEnvironment();
String projectVersion = env.getProperty("project.version", "unknown");
String gitBranch = env.getProperty("git.branch", "unknown");
String gitDirty = env.getProperty("git.dirty", "no");
String gitCommitId = env.getProperty("git.commit.id", "unknown");
MongoClientURI uri = new MongoClientURI(this.uri);
MongoClient client = null;
try {
client = new MongoClient(uri);
} catch (UnknownHostException e) {
throw new RuntimeException(e);
}
String dbName = StringUtils.hasText(uri.getDatabase()) ? uri.getDatabase() : "spring-data-mongodb-benchmarks";
DB db = client.getDB(dbName);
for (BasicDBObject dbo : (List<BasicDBObject>) JSON.parse(Utils.jsonifyResults(results))) {
String collectionName = extractClass(dbo.get("benchmark").toString());
BasicDBObject sink = new BasicDBObject();
sink.append("_version", projectVersion);
sink.append("_branch", gitBranch);
sink.append("_commit", gitCommitId);
sink.append("_dirty", gitDirty);
sink.append("_method", extractBenchmarkName(dbo.get("benchmark").toString()));
sink.append("_date", now);
sink.append("_snapshot", projectVersion.toLowerCase().contains("snapshot"));
sink.putAll(dbo.toMap());
db.getCollection(collectionName).insert(fixDocumentKeys(sink));
}
client.close();
}
/**
* Replace {@code .} by {@code ,}.
*
* @param doc
* @return
*/
private BasicDBObject fixDocumentKeys(BasicDBObject doc) {
BasicDBObject sanitized = new BasicDBObject();
for (Object key : doc.keySet()) {
Object value = doc.get(key);
if (value instanceof BasicDBObject) {
value = fixDocumentKeys((BasicDBObject) value);
}
if (key instanceof String) {
String newKey = (String) key;
if (newKey.contains(".")) {
newKey = newKey.replace('.', ',');
}
sanitized.put(newKey, value);
} else {
sanitized.put(ObjectUtils.nullSafeToString(key).replace('.', ','), value);
}
}
return sanitized;
}
private static String extractClass(String source) {
String tmp = source.substring(0, source.lastIndexOf('.'));
return tmp.substring(tmp.lastIndexOf(".") + 1);
}
private static String extractBenchmarkName(String source) {
return source.substring(source.lastIndexOf(".") + 1);
}
}

View File

@@ -0,0 +1,71 @@
/*
* Copyright 2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.microbenchmark;
import lombok.SneakyThrows;
import java.io.ByteArrayOutputStream;
import java.io.PrintStream;
import java.nio.charset.Charset;
import java.util.Collection;
import org.openjdk.jmh.results.RunResult;
import org.openjdk.jmh.results.format.ResultFormatFactory;
import org.openjdk.jmh.results.format.ResultFormatType;
/**
* @author Christoph Strobl
* @since 2.0
*/
interface ResultsWriter {
/**
* Write the {@link RunResult}s.
*
* @param results can be {@literal null}.
*/
void write(Collection<RunResult> results);
/* non Java8 hack */
class Utils {
/**
* Get the uri specific {@link ResultsWriter}.
*
* @param uri must not be {@literal null}.
* @return
*/
static ResultsWriter forUri(String uri) {
return uri.startsWith("mongodb:") ? new MongoResultsWriter(uri) : new HttpResultsWriter(uri);
}
/**
* Convert {@link RunResult}s to JMH Json representation.
*
* @param results
* @return json string representation of results.
* @see org.openjdk.jmh.results.format.JSONResultFormat
*/
@SneakyThrows
static String jsonifyResults(Collection<RunResult> results) {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ResultFormatFactory.getInstance(ResultFormatType.JSON, new PrintStream(baos, true, "UTF-8")).writeOut(results);
return new String(baos.toByteArray(), Charset.forName("UTF-8"));
}
}
}

View File

@@ -0,0 +1,14 @@
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d %5p %40.40c:%4L - %m%n</pattern>
</encoder>
</appender>
<root level="error">
<appender-ref ref="console" />
</root>
</configuration>

View File

@@ -0,0 +1,7 @@
<?xml version="1.0" encoding="UTF-8"?>
<aspectj>
<aspects>
<aspect name="org.springframework.beans.factory.aspectj.AnnotationBeanConfigurerAspect" />
<aspect name="org.springframework.data.mongodb.crossstore.MongoDocumentBacking" />
</aspects>
</aspectj>

View File

@@ -2,22 +2,22 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.10.6.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<name>Spring Data MongoDB - Cross-Store Support</name>
<properties>
<jpa>1.0.0.Final</jpa>
<jpa>2.0.0</jpa>
<hibernate>3.6.10.Final</hibernate>
</properties>
<dependencies>
<!-- Spring -->
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.5.0.M1</version>
<version>1.10.6.RELEASE</version>
</dependency>
<dependency>
@@ -56,17 +56,13 @@
<artifactId>aspectjrt</artifactId>
<version>${aspectj}</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2</version>
</dependency>
<!-- JPA -->
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.0-api</artifactId>
<groupId>org.eclipse.persistence</groupId>
<artifactId>javax.persistence</artifactId>
<version>${jpa}</version>
<optional>true</optional>
</dependency>
<!-- For Tests -->
@@ -85,7 +81,7 @@
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.0.0.GA</version>
<version>${validation}</version>
<scope>test</scope>
</dependency>
<dependency>
@@ -102,7 +98,7 @@
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.4</version>
<version>1.6</version>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
@@ -131,8 +127,10 @@
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
</aspectLibraries>
<complianceLevel>${source.level}</complianceLevel>
<source>${source.level}</source>
<target>${source.level}</target>
<xmlConfigured>aop.xml</xmlConfigured>
</configuration>
</plugin>
</plugins>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -37,6 +37,8 @@ import com.mongodb.MongoException;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
*/
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
@@ -45,7 +47,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Logger log = LoggerFactory.getLogger(getClass());
private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
@@ -76,25 +78,25 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for " + dbk);
log.debug("Loading MongoDB data for {}", dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: " + key);
log.debug("Processing key: {}", key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException("Unble to convert property " + key + ": Invalid metadata, "
+ ENTITY_FIELD_CLASS + " not available");
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: " + key);
log.debug("Adding to ChangeSet: {}", key);
}
changeSet.set(key, value);
}
@@ -109,9 +111,9 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
@@ -130,7 +132,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: " + cs.getValues());
log.debug("Flush: changeset: {}", cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
@@ -152,7 +154,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: " + dbQuery);
log.debug("Flush: removing: {}", dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
@@ -164,7 +166,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: " + dbQuery);
log.debug("Flush: saving: {}", dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());

View File

@@ -2,12 +2,12 @@ Bundle-SymbolicName: org.springframework.data.mongodb.crossstore
Bundle-Name: Spring Data MongoDB Cross Store Support
Bundle-Vendor: Pivotal Software, Inc.
Bundle-ManifestVersion: 2
Import-Package:
Import-Package:
sun.reflect;version="0";resolution:=optional
Export-Template:
org.springframework.data.mongodb.crossstore.*;version="${project.version}"
Import-Template:
com.mongodb.*;version="${mongo-osgi:[=.=.=,+1.0.0)}",
Import-Template:
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
javax.persistence.*;version="${jpa:[=.=.=,+1.0.0)}",
org.aspectj.*;version="${aspectj:[1.0.0, 2.0.0)}",
org.bson.*;version="0",

View File

@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb-distribution</artifactId>
<packaging>pom</packaging>
@@ -13,10 +13,10 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.10.6.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<project.root>${basedir}/..</project.root>
<dist.key>SDMONGO</dist.key>
@@ -32,6 +32,10 @@
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
</plugin>
</plugins>
</build>

View File

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.10.6.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,8 @@ package org.springframework.data.mongodb.log4j;
import java.net.UnknownHostException;
import java.util.Arrays;
import java.util.Calendar;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import org.apache.log4j.AppenderSkeleton;
@@ -30,13 +32,18 @@ import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import com.mongodb.WriteConcern;
/**
* Log4j appender writing log entries into a MongoDB instance.
*
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Christoph Strobl
* @author Ricardo Espirito Santo
*/
public class MongoLog4jAppender extends AppenderSkeleton {
@@ -54,17 +61,19 @@ public class MongoLog4jAppender extends AppenderSkeleton {
protected String host = "localhost";
protected int port = 27017;
protected String username;
protected String password;
protected String authenticationDatabase;
protected String database = "logs";
protected String collectionPattern = "%c";
protected PatternLayout collectionLayout = new PatternLayout(collectionPattern);
protected String applicationId = System.getProperty("APPLICATION_ID", null);
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.SAFE;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.NORMAL;
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.ACKNOWLEDGED;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.UNACKNOWLEDGED;
protected Mongo mongo;
protected DB db;
public MongoLog4jAppender() {
}
public MongoLog4jAppender() {}
public MongoLog4jAppender(boolean isActive) {
super(isActive);
@@ -86,6 +95,53 @@ public class MongoLog4jAppender extends AppenderSkeleton {
this.port = port;
}
/**
* @return
* @since 1.10
*/
public String getUsername() {
return username;
}
/**
* @param username may be {@literal null} for unauthenticated access.
* @since 1.10
*/
public void setUsername(String username) {
this.username = username;
}
/**
* @return
* @since 1.10
*/
public String getPassword() {
return password;
}
/**
* @param password may be {@literal null} for unauthenticated access.
* @since 1.10
*/
public void setPassword(String password) {
this.password = password;
}
/**
* @return
*/
public String getAuthenticationDatabase() {
return authenticationDatabase;
}
/**
* @param authenticationDatabase may be {@literal null} to use {@link #getDatabase()} as authentication database.
* @since 1.10
*/
public void setAuthenticationDatabase(String authenticationDatabase) {
this.authenticationDatabase = authenticationDatabase;
}
public String getDatabase() {
return database;
}
@@ -111,14 +167,14 @@ public class MongoLog4jAppender extends AppenderSkeleton {
this.applicationId = applicationId;
}
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public String getWarnOrHigherWriteConcern() {
return warnOrHigherWriteConcern.toString();
}
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public String getInfoOrLowerWriteConcern() {
return infoOrLowerWriteConcern.toString();
}
@@ -128,10 +184,26 @@ public class MongoLog4jAppender extends AppenderSkeleton {
}
protected void connectToMongo() throws UnknownHostException {
this.mongo = new Mongo(host, port);
this.mongo = createMongoClient();
this.db = mongo.getDB(database);
}
private MongoClient createMongoClient() throws UnknownHostException {
ServerAddress serverAddress = new ServerAddress(host, port);
if (null == password || null == username) {
return new MongoClient(serverAddress);
}
String authenticationDatabaseToUse = authenticationDatabase == null ? this.database : authenticationDatabase;
MongoCredential mongoCredential = MongoCredential.createCredential(username,
authenticationDatabaseToUse, password.toCharArray());
List<MongoCredential> credentials = Collections.singletonList(mongoCredential);
return new MongoClient(serverAddress, credentials);
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#append(org.apache.log4j.spi.LoggingEvent)
@@ -160,7 +232,7 @@ public class MongoLog4jAppender extends AppenderSkeleton {
// Copy properties into document
Map<Object, Object> props = event.getProperties();
if (null != props && props.size() > 0) {
if (null != props && !props.isEmpty()) {
BasicDBObject propsDbo = new BasicDBObject();
for (Map.Entry<Object, Object> entry : props.entrySet()) {
propsDbo.put(entry.getKey().toString(), entry.getValue().toString());
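With this change the appender builds an authenticated MongoClient when username and password are set, defaulting the authentication database to the log database. A hedged sketch of that connection logic in isolation; host, port and credential values are placeholders, not taken from the diff:

    import java.util.Collections;

    import com.mongodb.MongoClient;
    import com.mongodb.MongoCredential;
    import com.mongodb.ServerAddress;

    class AuthenticatedClientSketch {

        static MongoClient connect(String host, int port, String username, String password,
                String authenticationDatabase, String database) {

            ServerAddress serverAddress = new ServerAddress(host, port);

            // No credentials configured: plain client, as before the change.
            if (username == null || password == null) {
                return new MongoClient(serverAddress);
            }

            // Default the authentication database to the log database, mirroring the appender.
            String authDb = authenticationDatabase != null ? authenticationDatabase : database;
            MongoCredential credential = MongoCredential.createCredential(username, authDb, password.toCharArray());

            return new MongoClient(serverAddress, Collections.singletonList(credential));
        }
    }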

View File

@@ -0,0 +1,114 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.Calendar;
import java.util.Collections;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.apache.log4j.PropertyConfigurator;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
/**
* Integration tests for {@link MongoLog4jAppender} using authentication.
*
* @author Mark Paluch
*/
public class MongoLog4jAppenderAuthenticationIntegrationTests {
private final static String username = "admin";
private final static String password = "test";
private final static String authenticationDatabase = "logs";
MongoClient mongo;
DB db;
String collection;
ServerAddress serverLocation;
Logger log;
@Before
public void setUp() throws Exception {
serverLocation = new ServerAddress("localhost", 27017);
mongo = new MongoClient(serverLocation);
db = mongo.getDB("logs");
BasicDBList roles = new BasicDBList();
roles.add("dbOwner");
db.command(new BasicDBObjectBuilder().add("createUser", username).add("pwd", password).add("roles", roles).get());
mongo.close();
mongo = new MongoClient(serverLocation, Collections
.singletonList(MongoCredential.createCredential(username, authenticationDatabase, password.toCharArray())));
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
LogManager.resetConfiguration();
PropertyConfigurator.configure(getClass().getResource("/log4j-with-authentication.properties"));
log = Logger.getLogger(MongoLog4jAppenderIntegrationTests.class.getName());
}
@After
public void tearDown() {
if (db != null) {
db.getCollection(collection).remove(new BasicDBObject());
db.command(new BasicDBObject("dropUser", username));
}
LogManager.resetConfiguration();
PropertyConfigurator.configure(getClass().getResource("/log4j.properties"));
}
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,39 +20,51 @@ import static org.junit.Assert.*;
import java.util.Calendar;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.apache.log4j.PropertyConfigurator;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
/**
* Integration tests for {@link MongoLog4jAppender}.
*
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoLog4jAppenderIntegrationTests {
static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName();
Logger log = Logger.getLogger(NAME);
Mongo mongo;
MongoClient mongo;
DB db;
String collection;
ServerAddress serverLocation;
Logger log;
@Before
public void setUp() throws Exception {
serverLocation = new ServerAddress("localhost", 27017);
mongo = new Mongo("localhost", 27017);
mongo = new MongoClient(serverLocation);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
log = Logger.getLogger(MongoLog4jAppenderIntegrationTests.class.getName());
}
@After
public void tearDown() {
db.getCollection(collection).remove(new BasicDBObject());
}
@Test
@@ -64,13 +76,11 @@ public class MongoLog4jAppenderIntegrationTests {
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,10 +24,7 @@ import org.junit.Test;
*/
public class MongoLog4jAppenderUnitTests {
/**
* @see DATAMONGO-641
*/
@Test
@Test // DATAMONGO-641
public void closesWithoutMongoInstancePresent() {
new MongoLog4jAppender().close();
}

View File

@@ -0,0 +1,16 @@
log4j.rootCategory=INFO, mongo
log4j.appender.mongo=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.mongo.layout=org.apache.log4j.PatternLayout
log4j.appender.mongo.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.mongo.host = localhost
log4j.appender.mongo.port = 27017
log4j.appender.mongo.database = logs
log4j.appender.mongo.username = admin
log4j.appender.mongo.password = test
log4j.appender.mongo.authenticationDatabase = logs
log4j.appender.mongo.collectionPattern = %X{year}%X{month}
log4j.appender.mongo.applicationId = my.application
log4j.appender.mongo.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.springframework.data.mongodb=DEBUG

View File

@@ -1,13 +1,13 @@
log4j.rootCategory=INFO, stdout
log4j.rootCategory=INFO, mongo
log4j.appender.stdout=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.host = localhost
log4j.appender.stdout.port = 27017
log4j.appender.stdout.database = logs
log4j.appender.stdout.collectionPattern = %X{year}%X{month}
log4j.appender.stdout.applicationId = my.application
log4j.appender.stdout.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.appender.mongo=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.mongo.layout=org.apache.log4j.PatternLayout
log4j.appender.mongo.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.mongo.host = localhost
log4j.appender.mongo.port = 27017
log4j.appender.mongo.database = logs
log4j.appender.mongo.collectionPattern = %X{year}%X{month}
log4j.appender.mongo.applicationId = my.application
log4j.appender.mongo.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.springframework.data.mongodb=DEBUG

View File

@@ -5,5 +5,5 @@ Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
com.mongodb.*;version="${mongo-osgi:[=.=.=,+1.0.0)}",
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
org.apache.log4j.*;version="${log4j:[=.=.=,+1.0.0)}"

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.1.9.205">
<context version="7.2.2.230">
<scope type="Project" name="spring-data-mongodb">
<element type="TypeFilterReferenceOverridden" name="Filter">
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
@@ -32,8 +32,15 @@
<element type="IncludeTypePattern" name="**.config.**"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="CDI">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.cdi.**"/>
</element>
<stereotype name="Unrestricted"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
</element>
@@ -75,6 +82,11 @@
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Script">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.script.**"/>
</element>
</element>
<element type="Subsystem" name="Conversion">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.convert.**"/>
@@ -82,6 +94,7 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Script" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="SpEL">
<element type="TypeFilter" name="Assignment">
@@ -104,6 +117,11 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="MapReduce">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.mapreduce.**"/>
</element>
</element>
<element type="Subsystem" name="Core">
<element type="TypeFilter" name="Assignment">
<element type="WeakTypePattern" name="**.core.**"/>
@@ -112,8 +130,10 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|MapReduce" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Script" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Util">
<element type="TypeFilter" name="Assignment">
@@ -168,7 +188,32 @@
</element>
<element type="Subsystem" name="Querydsl">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.mysema.query.**"/>
<element type="IncludeTypePattern" name="com.querydsl.**"/>
</element>
</element>
<element type="Subsystem" name="Slf4j">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.slf4j.**"/>
</element>
</element>
<element type="Subsystem" name="Jackson">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.fasterxml.jackson.**"/>
</element>
</element>
<element type="Subsystem" name="DOM">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.w3c.dom.**"/>
</element>
</element>
<element type="Subsystem" name="AOP Alliance">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.aopalliance.**"/>
</element>
</element>
<element type="Subsystem" name="Guava">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.google.common.**"/>
</element>
</element>
</architecture>

View File

@@ -2,7 +2,7 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>spring-data-mongodb</artifactId>
<name>Spring Data MongoDB - Core</name>
@@ -11,18 +11,18 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.5.0.M1</version>
<version>1.10.6.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<validation>1.0.0.GA</validation>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
</properties>
<dependencies>
<!-- Spring -->
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
@@ -50,7 +50,7 @@
<artifactId>spring-expression</artifactId>
</dependency>
<!-- Spring Data -->
<!-- Spring Data -->
<dependency>
<groupId>${project.groupId}</groupId>
<artifactId>spring-data-commons</artifactId>
@@ -58,14 +58,14 @@
</dependency>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-mongodb</artifactId>
<version>${querydsl}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
<scope>provided</scope>
@@ -77,7 +77,7 @@
<version>1.0</version>
<optional>true</optional>
</dependency>
<!-- CDI -->
<dependency>
<groupId>javax.enterprise</groupId>
@@ -86,21 +86,21 @@
<scope>provided</scope>
<optional>true</optional>
</dependency>
<dependency>
<groupId>javax.el</groupId>
<artifactId>el-api</artifactId>
<version>${cdi}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.openwebbeans.test</groupId>
<artifactId>cditest-owb</artifactId>
<version>${webbeans}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
@@ -115,7 +115,7 @@
<version>${validation}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.objenesis</groupId>
<artifactId>objenesis</artifactId>
@@ -129,23 +129,50 @@
<version>4.2.0.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>${jodatime}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>${threetenbp}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>nl.jqno.equalsverifier</groupId>
<artifactId>equalsverifier</artifactId>
<version>${equalsverifier}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
@@ -155,7 +182,7 @@
<version>${apt}</version>
<dependencies>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
</dependency>
@@ -189,9 +216,14 @@
<systemPropertyVariables>
<java.util.logging.config.file>src/test/resources/logging.properties</java.util.logging.config.file>
</systemPropertyVariables>
<properties>
<property>
<name>listener</name>
<value>org.springframework.data.mongodb.test.util.CleanMongoDBJunitRunListener</value>
</property>
</properties>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -0,0 +1,61 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.util.List;
import org.springframework.dao.DataAccessException;
import com.mongodb.BulkWriteError;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteResult;
/**
* Is thrown when errors occur during bulk operations.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
*/
public class BulkOperationException extends DataAccessException {
private static final long serialVersionUID = 73929601661154421L;
private final List<BulkWriteError> errors;
private final BulkWriteResult result;
/**
* Creates a new {@link BulkOperationException} with the given message and source {@link BulkWriteException}.
*
* @param message must not be {@literal null}.
* @param source must not be {@literal null}.
*/
public BulkOperationException(String message, BulkWriteException source) {
super(message, source);
this.errors = source.getWriteErrors();
this.result = source.getWriteResult();
}
public List<BulkWriteError> getErrors() {
return errors;
}
public BulkWriteResult getResult() {
return result;
}
}
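BulkOperationException wraps the driver's BulkWriteException so callers stay inside Spring's DataAccessException hierarchy while keeping access to the per-write errors and the partial result. A sketch of how a caller might translate the driver exception; the surrounding bulk-operation code is hypothetical, only the exception type comes from the diff:

    import com.mongodb.BulkWriteException;
    import com.mongodb.BulkWriteOperation;

    import org.springframework.data.mongodb.BulkOperationException;

    class BulkErrorTranslationSketch {

        /** Executes the given bulk operation, translating driver errors into Spring's hierarchy. */
        static void execute(BulkWriteOperation bulk) {
            try {
                bulk.execute();
            } catch (BulkWriteException e) {
                // Callers can still inspect getErrors() and getResult() on the wrapped exception.
                throw new BulkOperationException("Bulk operation did not complete", e);
            }
        }
    }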

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.config;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
@@ -28,6 +29,9 @@ import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
@@ -35,7 +39,6 @@ import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.support.CachingIsNewStrategyFactory;
@@ -44,6 +47,7 @@ import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig.
@@ -51,6 +55,8 @@ import com.mongodb.Mongo;
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Ryan Tenney
* @author Christoph Strobl
*/
@Configuration
public abstract class AbstractMongoConfiguration {
@@ -67,7 +73,10 @@ public abstract class AbstractMongoConfiguration {
* returned by {@link #getDatabaseName()} later on effectively.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected String getAuthenticationDatabaseName() {
return null;
}
@@ -110,23 +119,42 @@ public abstract class AbstractMongoConfiguration {
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration} the base package will be considered {@code com.acme} unless the method is
* overriden to implement alternate behaviour.
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Returns the base packages to scan for MongoDB mapped entities at startup. Will return the package name of the
* configuration class' (the concrete class, not this one here) by default. So if you have a
* {@code com.acme.AppConfig} extending {@link AbstractMongoConfiguration} the base package will be considered
* {@code com.acme} unless the method is overridden to implement alternate behavior.
*
* @return the base packages to scan for mapped {@link Document} classes or an empty collection to not enable scanning
* for entities.
* @since 1.10
*/
protected Collection<String> getMappingBasePackages() {
return Collections.singleton(getMappingBasePackage());
}
/**
* Return {@link UserCredentials} to be used when connecting to the MongoDB instance or {@literal null} if none shall
* be used.
*
* @return
* @deprecated since 1.7. {@link MongoClient} should hold authentication data within
* {@link MongoClient#getCredentialsList()}
*/
@Deprecated
protected UserCredentials getUserCredentials() {
return null;
}
@@ -144,10 +172,7 @@ public abstract class AbstractMongoConfiguration {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(getInitialEntitySet());
mappingContext.setSimpleTypeHolder(customConversions().getSimpleTypeHolder());
if (abbreviateFieldNames()) {
mappingContext.setFieldNamingStrategy(new CamelCaseAbbreviatingFieldNamingStrategy());
}
mappingContext.setFieldNamingStrategy(fieldNamingStrategy());
return mappingContext;
}
@@ -196,26 +221,52 @@ public abstract class AbstractMongoConfiguration {
}
/**
* Scans the mapping base package for classes annotated with {@link Document}.
* Scans the mapping base package for classes annotated with {@link Document}. By default, it scans for entities in
* all packages returned by {@link #getMappingBasePackages()}.
*
* @see #getMappingBasePackage()
* @see #getMappingBasePackages()
* @return
* @throws ClassNotFoundException
*/
protected Set<Class<?>> getInitialEntitySet() throws ClassNotFoundException {
String basePackage = getMappingBasePackage();
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
for (String basePackage : getMappingBasePackages()) {
initialEntitySet.addAll(scanForEntities(basePackage));
}
return initialEntitySet;
}
/**
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document} and
* {@link Persistent}.
*
* @param basePackage must not be {@literal null}.
* @return
* @throws ClassNotFoundException
* @since 1.10
*/
protected Set<Class<?>> scanForEntities(String basePackage) throws ClassNotFoundException {
if (!StringUtils.hasText(basePackage)) {
return Collections.emptySet();
}
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
if (StringUtils.hasText(basePackage)) {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
initialEntitySet.add(ClassUtils.forName(candidate.getBeanClassName(),
AbstractMongoConfiguration.class.getClassLoader()));
initialEntitySet
.add(ClassUtils.forName(candidate.getBeanClassName(), AbstractMongoConfiguration.class.getClassLoader()));
}
}
@@ -232,4 +283,15 @@ public abstract class AbstractMongoConfiguration {
protected boolean abbreviateFieldNames() {
return false;
}
/**
* Configures a {@link FieldNamingStrategy} on the {@link MongoMappingContext} instance created.
*
* @return
* @since 1.5
*/
protected FieldNamingStrategy fieldNamingStrategy() {
return abbreviateFieldNames() ? new CamelCaseAbbreviatingFieldNamingStrategy()
: PropertyNameFieldNamingStrategy.INSTANCE;
}
}
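The new getMappingBasePackages() and fieldNamingStrategy() hooks replace the single-package and abbreviate-field-names defaults. A sketch of a user configuration exercising them; package names, the database name and the class name are made up for illustration:

    import java.util.Arrays;
    import java.util.Collection;

    import org.springframework.context.annotation.Configuration;
    import org.springframework.data.mapping.model.FieldNamingStrategy;
    import org.springframework.data.mapping.model.PropertyNameFieldNamingStrategy;
    import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

    import com.mongodb.Mongo;
    import com.mongodb.MongoClient;

    @Configuration
    class AppConfig extends AbstractMongoConfiguration {

        @Override
        protected String getDatabaseName() {
            return "example";
        }

        @Override
        public Mongo mongo() throws Exception {
            return new MongoClient();
        }

        @Override
        protected Collection<String> getMappingBasePackages() {
            // Scan several packages instead of the single getMappingBasePackage() default.
            return Arrays.asList("com.acme.orders", "com.acme.customers");
        }

        @Override
        protected FieldNamingStrategy fieldNamingStrategy() {
            // Full strategy hook instead of the boolean abbreviateFieldNames() switch.
            return PropertyNameFieldNamingStrategy.INSTANCE;
        }
    }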

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,6 +30,7 @@ import org.springframework.beans.factory.config.BeanDefinitionHolder;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.parsing.ReaderContext;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
@@ -51,10 +52,10 @@ import org.springframework.core.type.filter.TypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.mapping.context.MappingContextIsNewStrategyFactory;
import org.springframework.data.mapping.model.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener;
@@ -71,6 +72,7 @@ import org.w3c.dom.Element;
* @author Oliver Gierke
* @author Maciej Walkowiak
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MappingMongoConverterParser implements BeanDefinitionParser {
@@ -83,8 +85,11 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
BeanDefinitionRegistry registry = parserContext.getRegistry();
if (parserContext.isNested()) {
parserContext.getReaderContext().error("Mongo Converter must not be defined as nested bean.", element);
}
BeanDefinitionRegistry registry = parserContext.getRegistry();
String id = element.getAttribute(AbstractBeanDefinitionParser.ID_ATTRIBUTE);
id = StringUtils.hasText(id) ? id : DEFAULT_CONVERTER_BEAN_NAME;
@@ -209,11 +214,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
mappingContextBuilder.addPropertyValue("simpleTypeHolder", simpleTypesDefinition);
}
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
if ("true".equals(abbreviateFieldNames)) {
mappingContextBuilder.addPropertyValue("fieldNamingStrategy", new RootBeanDefinition(
CamelCaseAbbreviatingFieldNamingStrategy.class));
}
parseFieldNamingStrategy(element, parserContext.getReaderContext(), mappingContextBuilder);
ctxRef = converterId == null || DEFAULT_CONVERTER_BEAN_NAME.equals(converterId) ? MAPPING_CONTEXT_BEAN_NAME
: converterId + "." + MAPPING_CONTEXT_BEAN_NAME;
@@ -222,6 +223,34 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
return ctxRef;
}
private static void parseFieldNamingStrategy(Element element, ReaderContext context, BeanDefinitionBuilder builder) {
String abbreviateFieldNames = element.getAttribute("abbreviate-field-names");
String fieldNamingStrategy = element.getAttribute("field-naming-strategy-ref");
boolean fieldNamingStrategyReferenced = StringUtils.hasText(fieldNamingStrategy);
boolean abbreviationActivated = StringUtils.hasText(abbreviateFieldNames)
&& Boolean.parseBoolean(abbreviateFieldNames);
if (fieldNamingStrategyReferenced && abbreviationActivated) {
context.error("Field name abbreviation cannot be activated if a field-naming-strategy-ref is configured!",
element);
return;
}
Object value = null;
if ("true".equals(abbreviateFieldNames)) {
value = new RootBeanDefinition(CamelCaseAbbreviatingFieldNamingStrategy.class);
} else if (fieldNamingStrategyReferenced) {
value = new RuntimeBeanReference(fieldNamingStrategy);
}
if (value != null) {
builder.addPropertyValue("fieldNamingStrategy", value);
}
}
private BeanDefinition getCustomConversions(Element element, ParserContext parserContext) {
List<Element> customConvertersElements = DomUtils.getChildElementsByTagName(element, "custom-converters");
@@ -332,7 +361,9 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
* @param filters
*/
public NegatingFilter(TypeFilter... filters) {
Assert.notNull(filters);
Assert.notNull(filters, "TypeFilters must not be null");
this.delegates = new HashSet<TypeFilter>(Arrays.asList(filters));
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,24 +15,24 @@
*/
package org.springframework.data.mongodb.config;
import static org.springframework.beans.factory.config.BeanDefinition.*;
import static org.springframework.data.mongodb.config.BeanNames.*;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.Assert;
/**
@@ -71,7 +71,6 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
defaultDependenciesIfNecessary(registry, annotationMetadata);
super.registerBeanDefinitions(annotationMetadata, registry);
}
@@ -85,7 +84,11 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
builder.addConstructorArgReference(MAPPING_CONTEXT_BEAN_NAME);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(MongoMappingContextLookup.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
@@ -102,29 +105,58 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
BeanDefinitionBuilder listenerBeanDefinitionBuilder = BeanDefinitionBuilder
.rootBeanDefinition(AuditingEventListener.class);
listenerBeanDefinitionBuilder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(
getAuditingHandlerBeanName(), registry));
listenerBeanDefinitionBuilder
.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
registerInfrastructureBeanWithId(listenerBeanDefinitionBuilder.getBeanDefinition(),
AuditingEventListener.class.getName(), registry);
}
/**
* Register default bean definitions for a {@link MongoMappingContext} and an {@link IsNewStrategyFactory} in case we
* don't find beans with the assumed names in the registry.
*
* @param registry the {@link BeanDefinitionRegistry} to use to register the components into.
* @param source the source which the registered components shall be registered with
* Simple helper to be able to wire the {@link MappingContext} from a {@link MappingMongoConverter} bean available in
* the application context.
*
* @author Oliver Gierke
*/
private void defaultDependenciesIfNecessary(BeanDefinitionRegistry registry, Object source) {
static class MongoMappingContextLookup
implements FactoryBean<MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty>> {
if (!registry.containsBeanDefinition(MAPPING_CONTEXT_BEAN_NAME)) {
private final MappingMongoConverter converter;
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);
definition.setRole(ROLE_INFRASTRUCTURE);
definition.setSource(source);
/**
* Creates a new {@link MongoMappingContextLookup} for the given {@link MappingMongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public MongoMappingContextLookup(MappingMongoConverter converter) {
this.converter = converter;
}
registry.registerBeanDefinition(MAPPING_CONTEXT_BEAN_NAME, definition);
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getObject() throws Exception {
return converter.getMappingContext();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return MappingContext.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
@Override
public boolean isSingleton() {
return true;
}
}
}

View File

@@ -0,0 +1,85 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.MongoClientFactoryBean;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
/**
* Parser for {@code mongo-client} definitions.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientParser implements BeanDefinitionParser {
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.BeanDefinitionParser#parse(org.w3c.dom.Element, org.springframework.beans.factory.xml.ParserContext)
*/
public BeanDefinition parse(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
String id = element.getAttribute("id");
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoClientFactoryBean.class);
ParsingUtils.setPropertyValue(builder, element, "port", "port");
ParsingUtils.setPropertyValue(builder, element, "host", "host");
ParsingUtils.setPropertyValue(builder, element, "credentials", "credentials");
MongoParsingUtils.parseMongoClientOptions(element, builder);
MongoParsingUtils.parseReplicaSet(element, builder);
String defaultedId = StringUtils.hasText(id) ? id : BeanNames.MONGO_BEAN_NAME;
parserContext.pushContainingComponent(new CompositeComponentDefinition("Mongo", source));
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
parserContext.registerBeanComponent(writeConcernEditor);
BeanComponentDefinition readPreferenceEditor = helper.getComponent(MongoParsingUtils
.getReadPreferencePropertyEditorBuilder());
parserContext.registerBeanComponent(readPreferenceEditor);
BeanComponentDefinition credentialsEditor = helper.getComponent(MongoParsingUtils
.getMongoCredentialPropertyEditor());
parserContext.registerBeanComponent(credentialsEditor);
parserContext.popAndRegisterContainingComponent();
return mongoComponent.getBeanDefinition();
}
}

View File

@@ -0,0 +1,198 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.util.StringUtils;
import com.mongodb.MongoCredential;
/**
* Parse a {@link String} to a Collection of {@link MongoCredential}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final Pattern GROUP_PATTERN = Pattern.compile("(\\\\?')(.*?)\\1");
private static final String AUTH_MECHANISM_KEY = "uri.authMechanism";
private static final String USERNAME_PASSWORD_DELIMINATOR = ":";
private static final String DATABASE_DELIMINATOR = "@";
private static final String OPTIONS_DELIMINATOR = "?";
private static final String OPTION_VALUE_DELIMINATOR = "&";
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String text) throws IllegalArgumentException {
if (!StringUtils.hasText(text)) {
return;
}
List<MongoCredential> credentials = new ArrayList<MongoCredential>();
for (String credentialString : extractCredentialsString(text)) {
String[] userNameAndPassword = extractUserNameAndPassword(credentialString);
String database = extractDB(credentialString);
Properties options = extractOptions(credentialString);
if (!options.isEmpty()) {
if (options.containsKey(AUTH_MECHANISM_KEY)) {
String authMechanism = options.getProperty(AUTH_MECHANISM_KEY);
if (MongoCredential.GSSAPI_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createGSSAPICredential(userNameAndPassword[0]));
} else if (MongoCredential.MONGODB_CR_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createMongoCRCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.MONGODB_X509_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createMongoX509Credential(userNameAndPassword[0]));
} else if (MongoCredential.PLAIN_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createPlainCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.SCRAM_SHA_1_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else {
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
}
}
} else {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(
MongoCredential.createCredential(userNameAndPassword[0], database, userNameAndPassword[1].toCharArray()));
}
}
setValue(credentials);
}
private List<String> extractCredentialsString(String source) {
Matcher matcher = GROUP_PATTERN.matcher(source);
List<String> list = new ArrayList<String>();
while (matcher.find()) {
String value = StringUtils.trimLeadingCharacter(matcher.group(), '\'');
list.add(StringUtils.trimTrailingCharacter(value, '\''));
}
if (!list.isEmpty()) {
return list;
}
return Arrays.asList(source.split(","));
}
private static String[] extractUserNameAndPassword(String text) {
int index = text.lastIndexOf(DATABASE_DELIMINATOR);
index = index != -1 ? index : text.lastIndexOf(OPTIONS_DELIMINATOR);
return index == -1 ? new String[] {} : text.substring(0, index).split(USERNAME_PASSWORD_DELIMINATOR);
}
private static String extractDB(String text) {
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
if (dbSeperationIndex == -1) {
return "";
}
String tmp = text.substring(dbSeperationIndex + 1);
int optionsSeperationIndex = tmp.lastIndexOf(OPTIONS_DELIMINATOR);
return optionsSeperationIndex > -1 ? tmp.substring(0, optionsSeperationIndex) : tmp;
}
private static Properties extractOptions(String text) {
int optionsSeperationIndex = text.lastIndexOf(OPTIONS_DELIMINATOR);
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
if (optionsSeperationIndex == -1 || dbSeperationIndex > optionsSeperationIndex) {
return new Properties();
}
Properties properties = new Properties();
for (String option : text.substring(optionsSeperationIndex + 1).split(OPTION_VALUE_DELIMINATOR)) {
String[] optionArgs = option.split("=");
properties.put(optionArgs[0], optionArgs[1]);
}
return properties;
}
private static void verifyUsernameAndPasswordPresent(String[] source) {
verifyUserNamePresent(source);
if (source.length != 2) {
throw new IllegalArgumentException(
"Credentials need to specify username and password like in 'username:password@database'!");
}
}
private static void verifyDatabasePresent(String source) {
if (!StringUtils.hasText(source)) {
throw new IllegalArgumentException("Credentials need to specify database like in 'username:password@database'!");
}
}
private static void verifyUserNamePresent(String[] source) {
if (source.length == 0 || !StringUtils.hasText(source[0])) {
throw new IllegalArgumentException("Credentials need to specify username!");
}
}
}
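The editor parses credential strings of the form username:password@database, optionally followed by ?uri.authMechanism=... for GSSAPI, MONGODB-CR, MONGODB-X509, PLAIN or SCRAM-SHA-1. A small usage sketch; user name, password and database are placeholders:

    import java.util.List;

    import org.springframework.data.mongodb.config.MongoCredentialPropertyEditor;

    import com.mongodb.MongoCredential;

    class CredentialParsingSketch {

        public static void main(String[] args) {
            MongoCredentialPropertyEditor editor = new MongoCredentialPropertyEditor();
            editor.setAsText("jon:secret@admin?uri.authMechanism=SCRAM-SHA-1");

            @SuppressWarnings("unchecked")
            List<MongoCredential> credentials = (List<MongoCredential>) editor.getValue();
            System.out.println(credentials);
        }
    }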

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 by the original author(s).
* Copyright 2011-2015 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,10 @@ package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
@@ -34,6 +38,7 @@ import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
import com.mongodb.Mongo;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoURI;
/**
@@ -42,9 +47,22 @@ import com.mongodb.MongoURI;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author Viktor Khoroshko
*/
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
private static final Set<String> MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES;
static {
Set<String> mongoUriAllowedAdditionalAttributes = new HashSet<String>();
mongoUriAllowedAdditionalAttributes.add("id");
mongoUriAllowedAdditionalAttributes.add("write-concern");
MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES = Collections.unmodifiableSet(mongoUriAllowedAdditionalAttributes);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
@@ -64,29 +82,25 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String uri = element.getAttribute("uri");
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
setPropertyValue(dbFactoryBuilder, element, "write-concern", "writeConcern");
if (StringUtils.hasText(uri)) {
if (StringUtils.hasText(mongoRef) || StringUtils.hasText(dbname) || userCredentials != null) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!", source);
}
BeanDefinition mongoUri = getMongoUri(element, parserContext);
dbFactoryBuilder.addConstructorArgValue(getMongoUri(uri));
if (mongoUri != null) {
dbFactoryBuilder.addConstructorArgValue(mongoUri);
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Defaulting
if (StringUtils.hasText(mongoRef)) {
dbFactoryBuilder.addConstructorArgReference(mongoRef);
@@ -147,14 +161,42 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
}
/**
* Creates a {@link BeanDefinition} for a {@link MongoURI}.
* Creates a {@link BeanDefinition} for a {@link MongoURI} or {@link MongoClientURI} depending on configured
* attributes. <br />
* Errors when configured element contains {@literal uri} or {@literal client-uri} along with other attributes except
* {@literal write-concern} and/or {@literal id}.
*
* @param uri
* @return
* @param element must not be {@literal null}.
* @param parserContext
* @return {@literal null} in case no client-/uri defined.
*/
private BeanDefinition getMongoUri(String uri) {
private BeanDefinition getMongoUri(Element element, ParserContext parserContext) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoURI.class);
boolean hasClientUri = element.hasAttribute("client-uri");
if (!hasClientUri && !element.hasAttribute("uri")) {
return null;
}
int allowedAttributesCount = 1;
for (String attribute : MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES) {
if (element.hasAttribute(attribute)) {
allowedAttributesCount++;
}
}
if (element.getAttributes().getLength() > allowedAttributesCount) {
parserContext.getReaderContext().error(
"Configure either " + (hasClientUri ? "Mongo Client URI" : "Mongo URI") + " or details individually!",
parserContext.extractSource(element));
}
Class<?> type = hasClientUri ? MongoClientURI.class : MongoURI.class;
String uri = hasClientUri ? element.getAttribute("client-uri") : element.getAttribute("uri");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(type);
builder.addConstructorArgValue(uri);
return builder.getBeanDefinition();
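The parser now distinguishes the legacy uri attribute (MongoURI) from client-uri (MongoClientURI) and rejects other attributes besides id and write-concern alongside either. A sketch of what the client-uri attribute resolves to at runtime, assuming the SimpleMongoDbFactory(MongoClientURI) constructor; the connection string is a placeholder:

    import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

    import com.mongodb.MongoClientURI;

    class ClientUriFactorySketch {

        public static void main(String[] args) throws Exception {
            MongoClientURI uri = new MongoClientURI("mongodb://jon:secret@localhost:27017/database");
            SimpleMongoDbFactory factory = new SimpleMongoDbFactory(uri);

            System.out.println(factory.getDb().getName()); // prints "database"
        }
    }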

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,7 @@ import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
*
* @author Oliver Gierke
* @author Martin Baumgartner
* @author Christoph Strobl
*/
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
@@ -33,6 +34,7 @@ public class MongoNamespaceHandler extends NamespaceHandlerSupport {
registerBeanDefinitionParser("mapping-converter", new MappingMongoConverterParser());
registerBeanDefinitionParser("mongo", new MongoParser());
registerBeanDefinitionParser("mongo-client", new MongoClientParser());
registerBeanDefinitionParser("db-factory", new MongoDbFactoryParser());
registerBeanDefinitionParser("jmx", new MongoJmxParser());
registerBeanDefinitionParser("auditing", new MongoAuditingBeanDefinitionParser());

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,14 +15,10 @@
*/
package org.springframework.data.mongodb.config;
import java.util.Map;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.config.BeanComponentDefinitionBuilder;
@@ -36,6 +32,7 @@ import org.w3c.dom.Element;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoParser implements BeanDefinitionParser {
@@ -64,7 +61,8 @@ public class MongoParser implements BeanDefinitionParser {
BeanComponentDefinition mongoComponent = helper.getComponent(builder, defaultedId);
parserContext.registerBeanComponent(mongoComponent);
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(registerServerAddressPropertyEditor());
BeanComponentDefinition serverAddressPropertyEditor = helper.getComponent(MongoParsingUtils
.getServerAddressPropertyEditorBuilder());
parserContext.registerBeanComponent(serverAddressPropertyEditor);
BeanComponentDefinition writeConcernPropertyEditor = helper.getComponent(MongoParsingUtils
.getWriteConcernPropertyEditorBuilder());
@@ -75,19 +73,4 @@ public class MongoParser implements BeanDefinitionParser {
return mongoComponent.getBeanDefinition();
}
/**
* A parser should only register one bean definition, but we want to keep the convenience of using
* AbstractSingleBeanDefinitionParser while still registering a 'default' property editor with the
* container as a side effect.
*/
private BeanDefinitionBuilder registerServerAddressPropertyEditor() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,6 +24,7 @@ import org.springframework.beans.factory.config.CustomEditorConfigurer;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedMap;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;
import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;
@@ -33,13 +34,13 @@ import org.w3c.dom.Element;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Thomas Darimont
* @author Christoph Strobl
*/
@SuppressWarnings("deprecation")
abstract class MongoParsingUtils {
private MongoParsingUtils() {
}
private MongoParsingUtils() {}
/**
* Parses the mongo replica-set element.
@@ -54,12 +55,14 @@ abstract class MongoParsingUtils {
}
/**
* Parses the mongo:options sub-element. Populates the given attribute factory with the proper attributes.
* Parses the {@code mongo:options} sub-element. Populates the given attribute factory with the proper attributes.
*
* @return true if parsing actually occurred, false otherwise
* @return true if parsing actually occurred, {@literal false} otherwise
*/
static boolean parseMongoOptions(Element element, BeanDefinitionBuilder mongoBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "options");
if (optionsElement == null) {
return false;
}
@@ -80,13 +83,59 @@ abstract class MongoParsingUtils {
setPropertyValue(optionsDefBuilder, optionsElement, "write-timeout", "writeTimeout");
setPropertyValue(optionsDefBuilder, optionsElement, "write-fsync", "writeFsync");
setPropertyValue(optionsDefBuilder, optionsElement, "slave-ok", "slaveOk");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyValue(optionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(optionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
mongoBuilder.addPropertyValue("mongoOptions", optionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Parses the {@code mongo:client-options} sub-element. Populates the given attribute factory with the proper
* attributes.
*
* @param element must not be {@literal null}.
* @param mongoClientBuilder must not be {@literal null}.
* @return
* @since 1.7
*/
public static boolean parseMongoClientOptions(Element element, BeanDefinitionBuilder mongoClientBuilder) {
Element optionsElement = DomUtils.getChildElementByTagName(element, "client-options");
if (optionsElement == null) {
return false;
}
BeanDefinitionBuilder clientOptionsDefBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MongoClientOptionsFactoryBean.class);
setPropertyValue(clientOptionsDefBuilder, optionsElement, "description", "description");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-connections-per-host", "minConnectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connections-per-host", "connectionsPerHost");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "threads-allowed-to-block-for-connection-multiplier",
"threadsAllowedToBlockForConnectionMultiplier");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-wait-time", "maxWaitTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-idle-time", "maxConnectionIdleTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "max-connection-life-time", "maxConnectionLifeTime");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "connect-timeout", "connectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-timeout", "socketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "read-preference", "readPreference");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "write-concern", "writeConcern");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-frequency", "heartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "min-heartbeat-frequency", "minHeartbeatFrequency");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-connect-timeout", "heartbeatConnectTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "server-selection-timeout", "serverSelectionTimeout");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());
return true;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link WriteConcernPropertyEditor}.
@@ -103,4 +152,56 @@ abstract class MongoParsingUtils {
return builder;
}
/**
* A parser should only register one bean definition, but we want to keep the convenience of using
* AbstractSingleBeanDefinitionParser while still registering a 'default' property editor with the
* container as a side effect.
*/
static BeanDefinitionBuilder getServerAddressPropertyEditorBuilder() {
Map<String, String> customEditors = new ManagedMap<String, String>();
customEditors.put("com.mongodb.ServerAddress[]",
"org.springframework.data.mongodb.config.ServerAddressPropertyEditor");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link ReadPreferencePropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getReadPreferencePropertyEditorBuilder() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.ReadPreference", ReadPreferencePropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
/**
* Returns the {@link BeanDefinitionBuilder} to build a {@link BeanDefinition} for a
* {@link MongoCredentialPropertyEditor}.
*
* @return
* @since 1.7
*/
static BeanDefinitionBuilder getMongoCredentialPropertyEditor() {
Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
customEditors.put("com.mongodb.MongoCredential[]", MongoCredentialPropertyEditor.class);
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
builder.addPropertyValue("customEditors", customEditors);
return builder;
}
}
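
All three builder methods above follow the same pattern: a CustomEditorConfigurer bean definition whose customEditors map binds a target type to its PropertyEditor. A hedged generalization of that pattern; the helper name below is made up for illustration:

    import java.util.Map;

    import org.springframework.beans.factory.config.CustomEditorConfigurer;
    import org.springframework.beans.factory.support.BeanDefinitionBuilder;
    import org.springframework.beans.factory.support.ManagedMap;

    class PropertyEditorRegistrationSketch {

        // Same shape as getServerAddressPropertyEditorBuilder() and friends:
        // one CustomEditorConfigurer definition per editor, keyed by the type it converts.
        static BeanDefinitionBuilder customEditorFor(String targetTypeName, Class<?> editorType) {

            Map<String, Class<?>> customEditors = new ManagedMap<String, Class<?>>();
            customEditors.put(targetTypeName, editorType);

            BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(CustomEditorConfigurer.class);
            builder.addPropertyValue("customEditors", customEditors);

            return builder;
        }
    }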

View File

@@ -0,0 +1,66 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import com.mongodb.ReadPreference;
/**
* Parse a {@link String} to a {@link ReadPreference}.
*
* @author Christoph Strobl
* @since 1.7
*/
public class ReadPreferencePropertyEditor extends PropertyEditorSupport {
/*
* (non-Javadoc)
* @see java.beans.PropertyEditorSupport#setAsText(java.lang.String)
*/
@Override
public void setAsText(String readPreferenceString) throws IllegalArgumentException {
if (readPreferenceString == null) {
return;
}
ReadPreference preference = null;
try {
preference = ReadPreference.valueOf(readPreferenceString);
} catch (IllegalArgumentException ex) {
// ignore this one and try to map it differently
}
if (preference != null) {
setValue(preference);
} else if ("PRIMARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primary());
} else if ("PRIMARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.primaryPreferred());
} else if ("SECONDARY".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondary());
} else if ("SECONDARY_PREFERRED".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.secondaryPreferred());
} else if ("NEAREST".equalsIgnoreCase(readPreferenceString)) {
setValue(ReadPreference.nearest());
} else {
throw new IllegalArgumentException(String.format("Cannot find matching ReadPreference for %s",
readPreferenceString));
}
}
}
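
A quick usage sketch of the editor above; it only assumes the class as shown, and that a non-matching constant falls back to the explicit branches after ReadPreference.valueOf(..) fails:

    import com.mongodb.ReadPreference;

    import org.springframework.data.mongodb.config.ReadPreferencePropertyEditor;

    class ReadPreferenceEditorExample {

        public static void main(String[] args) {

            ReadPreferencePropertyEditor editor = new ReadPreferencePropertyEditor();

            editor.setAsText("NEAREST"); // resolved via ReadPreference.valueOf(..) or the explicit fallback branch
            ReadPreference nearest = (ReadPreference) editor.getValue();

            editor.setAsText("secondaryPreferred");
            ReadPreference secondaryPreferred = (ReadPreference) editor.getValue();

            System.out.println(nearest + " / " + secondaryPreferred);
        }
    }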

View File

@@ -0,0 +1,145 @@
/*
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import com.mongodb.BulkWriteResult;
/**
* Bulk operations for insert/update/remove actions on a collection. These bulk operations are available since MongoDB
* 2.6 and make use of low-level bulk commands on the protocol level. This interface defines a fluent API to add
* multiple single operations or lists of similar operations in sequence, which can then eventually be executed by calling
* {@link #execute()}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
*/
public interface BulkOperations {
/**
* Mode for bulk operation.
**/
public enum BulkMode {
/** Perform bulk operations in sequence. The first error will cancel processing. */
ORDERED,
/** Perform bulk operations in parallel. Processing will continue on errors. */
UNORDERED
};
/**
* Add a single insert to the bulk operation.
*
* @param document the document to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
BulkOperations insert(Object document);
/**
* Add a list of inserts to the bulk operation.
*
* @param documents List of documents to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
BulkOperations insert(List<? extends Object> documents);
/**
* Add a single update to the bulk operation. For the update request, only the first matching document is updated.
*
* @param query update criteria, must not be {@literal null}.
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(Query query, Update update);
/**
* Add a list of updates to the bulk operation. For each update request, only the first matching document is updated.
*
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(List<Pair<Query, Update>> updates);
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(Query query, Update update);
/**
* Add a list of updates to the bulk operation. For each update request, all matching documents are updated.
*
* @param updates Update operations to perform.
* @return The bulk operation.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(List<Pair<Query, Update>> updates);
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return The bulk operation.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(Query query, Update update);
/**
* Add a list of upserts to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param updates Updates/insert operations to perform.
* @return The bulk operation.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(List<Pair<Query, Update>> updates);
/**
* Add a single remove operation to the bulk operation.
*
* @param remove the {@link Query} to select the documents to be removed, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
BulkOperations remove(Query remove);
/**
* Add a list of remove operations to the bulk operation.
*
* @param removes the remove operations to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
BulkOperations remove(List<Query> removes);
/**
* Execute all bulk operations using the default write concern.
*
* @return Result of the bulk operation providing counters for inserts/updates etc.
* @throws org.springframework.data.mongodb.BulkOperationException if an error occurred during bulk processing.
*/
BulkWriteResult execute();
}
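
For readers new to the API, a hedged usage sketch: it assumes a configured MongoOperations exposing a bulkOps(..) factory method that hands out this interface (present in the 1.9/1.10 line), plus a trivial Person document class used only for illustration:

    import java.util.Arrays;

    import org.springframework.data.mongodb.core.BulkOperations;
    import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
    import org.springframework.data.mongodb.core.MongoOperations;
    import org.springframework.data.mongodb.core.query.Criteria;
    import org.springframework.data.mongodb.core.query.Query;
    import org.springframework.data.mongodb.core.query.Update;

    import com.mongodb.BulkWriteResult;

    class BulkOperationsExample {

        // Sketch only: collection name and Person type are placeholders.
        static BulkWriteResult insertAndRename(MongoOperations mongoOperations) {

            BulkOperations bulk = mongoOperations.bulkOps(BulkMode.ORDERED, "person");

            bulk.insert(Arrays.asList(new Person("Ada"), new Person("Grace")));
            bulk.updateOne(Query.query(Criteria.where("name").is("Ada")), new Update().set("name", "Ada Lovelace"));
            bulk.remove(Query.query(Criteria.where("name").is("unknown")));

            return bulk.execute(); // runs all registered operations in order
        }

        static class Person {
            String name;
            Person(String name) { this.name = name; }
        }
    }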

View File

@@ -25,7 +25,7 @@ import com.mongodb.DBCursor;
interface CursorPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns th eprepared cursor.
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
*
* @param cursor
*/

View File

@@ -0,0 +1,367 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.Value;
import java.util.Collections;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteOperation;
import com.mongodb.BulkWriteRequestBuilder;
import com.mongodb.BulkWriteResult;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
/**
* Default implementation for {@link BulkOperations}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
private final MongoOperations mongoOperations;
private final String collectionName;
private final BulkOperationContext bulkOperationContext;
private PersistenceExceptionTranslator exceptionTranslator;
private WriteConcernResolver writeConcernResolver;
private WriteConcern defaultWriteConcern;
private BulkWriteOperation bulk;
/**
* Creates a new {@link DefaultBulkOperations} for the given {@link MongoOperations}, collection name and
* {@link BulkOperationContext}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param bulkOperationContext must not be {@literal null}.
* @since 1.10.5
*/
DefaultBulkOperations(MongoOperations mongoOperations, String collectionName,
BulkOperationContext bulkOperationContext) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.hasText(collectionName, "CollectionName must not be null nor empty!");
Assert.notNull(bulkOperationContext, "BulkOperationContext must not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.bulkOperationContext = bulkOperationContext;
this.exceptionTranslator = new MongoExceptionTranslator();
this.writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
this.bulk = getBulkWriteOptions(bulkOperationContext.getBulkMode());
}
/**
* Configures the {@link PersistenceExceptionTranslator} to be used. Defaults to {@link MongoExceptionTranslator}.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? new MongoExceptionTranslator() : exceptionTranslator;
}
/**
* Configures the {@link WriteConcernResolver} to be used. Defaults to {@link DefaultWriteConcernResolver}.
*
* @param writeConcernResolver can be {@literal null}.
*/
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) {
this.writeConcernResolver = writeConcernResolver == null ? DefaultWriteConcernResolver.INSTANCE
: writeConcernResolver;
}
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
* @param defaultWriteConcern can be {@literal null}.
*/
void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
this.defaultWriteConcern = defaultWriteConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.lang.Object)
*/
@Override
public BulkOperations insert(Object document) {
Assert.notNull(document, "Document must not be null!");
if (document instanceof DBObject) {
bulk.insert((DBObject) document);
return this;
}
DBObject sink = new BasicDBObject();
mongoOperations.getConverter().write(document, sink);
bulk.insert(sink);
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.util.List)
*/
@Override
public BulkOperations insert(List<? extends Object> documents) {
Assert.notNull(documents, "Documents must not be null!");
for (Object document : documents) {
insert(document);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateOne(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateOne(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(java.util.List)
*/
@Override
public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, false);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateMulti(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateMulti(Collections.singletonList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(java.util.List)
*/
@Override
public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, true);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
public BulkOperations upsert(Query query, Update update) {
return update(query, update, true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(java.util.List)
*/
@Override
public BulkOperations upsert(List<Pair<Query, Update>> updates) {
for (Pair<Query, Update> update : updates) {
upsert(update.getFirst(), update.getSecond());
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public BulkOperations remove(Query query) {
Assert.notNull(query, "Query must not be null!");
bulk.find(getMappedQuery(query.getQueryObject())).remove();
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(java.util.List)
*/
@Override
public BulkOperations remove(List<Query> removes) {
Assert.notNull(removes, "Removals must not be null!");
for (Query query : removes) {
remove(query);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#executeBulk()
*/
@Override
public BulkWriteResult execute() {
MongoAction action = new MongoAction(defaultWriteConcern, MongoActionOperation.BULK, collectionName,
bulkOperationContext.getEntityType(), null, null);
WriteConcern writeConcern = writeConcernResolver.resolve(action);
try {
return writeConcern == null ? bulk.execute() : bulk.execute(writeConcern);
} catch (BulkWriteException o_O) {
DataAccessException toThrow = exceptionTranslator.translateExceptionIfPossible(o_O);
throw toThrow == null ? o_O : toThrow;
} finally {
this.bulk = getBulkWriteOptions(bulkOperationContext.getBulkMode());
}
}
/**
* Performs update and upsert bulk operations.
*
* @param query the {@link Query} to determine documents to update.
* @param update the {@link Update} to perform, must not be {@literal null}.
* @param upsert whether to upsert.
* @param multi whether to issue a multi-update.
* @return the {@link BulkOperations} with the update registered.
*/
private BulkOperations update(Query query, Update update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
BulkWriteRequestBuilder builder = bulk.find(getMappedQuery(query.getQueryObject()));
if (upsert) {
if (multi) {
builder.upsert().update(getMappedUpdate(update.getUpdateObject()));
} else {
builder.upsert().updateOne(getMappedUpdate(update.getUpdateObject()));
}
} else {
if (multi) {
builder.update(getMappedUpdate(update.getUpdateObject()));
} else {
builder.updateOne(getMappedUpdate(update.getUpdateObject()));
}
}
return this;
}
private DBObject getMappedUpdate(DBObject update) {
return bulkOperationContext.getUpdateMapper().getMappedObject(update, bulkOperationContext.getEntity());
}
private DBObject getMappedQuery(DBObject query) {
return bulkOperationContext.getQueryMapper().getMappedObject(query, bulkOperationContext.getEntity());
}
private BulkWriteOperation getBulkWriteOptions(BulkMode bulkMode) {
DBCollection collection = mongoOperations.getCollection(collectionName);
switch (bulkMode) {
case ORDERED:
return collection.initializeOrderedBulkOperation();
case UNORDERED:
return collection.initializeUnorderedBulkOperation();
}
throw new IllegalStateException("BulkMode was null!");
}
/**
* {@link BulkOperationContext} holds information about
* {@link org.springframework.data.mongodb.core.BulkOperations.BulkMode} the entity in use as well as references to
* {@link QueryMapper} and {@link UpdateMapper}.
*
* @author Christoph Strobl
* @since 2.0
*/
@Value
static class BulkOperationContext {
@NonNull BulkMode bulkMode;
MongoPersistentEntity<?> entity;
@NonNull QueryMapper queryMapper;
@NonNull UpdateMapper updateMapper;
Class<?> getEntityType() {
return entity != null ? entity.getType() : null;
}
}
}
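
A sketch of how the surrounding template code might assemble the BulkOperationContext above; the wiring (same package as DefaultBulkOperations, a MongoTemplate, mappers built from the template's converter) is assumed, not taken from this diff:

    import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
    import org.springframework.data.mongodb.core.MongoTemplate;
    import org.springframework.data.mongodb.core.convert.QueryMapper;
    import org.springframework.data.mongodb.core.convert.UpdateMapper;

    class DefaultBulkOperationsWiring {

        // Must live in org.springframework.data.mongodb.core since both types are package-private.
        static DefaultBulkOperations bulkOpsFor(MongoTemplate template, String collectionName) {

            QueryMapper queryMapper = new QueryMapper(template.getConverter());
            UpdateMapper updateMapper = new UpdateMapper(template.getConverter());

            // null entity: untyped collection access, see BulkOperationContext.getEntityType()
            DefaultBulkOperations.BulkOperationContext context = new DefaultBulkOperations.BulkOperationContext(
                    BulkMode.ORDERED, null, queryMapper, updateMapper);

            return new DefaultBulkOperations(template, collectionName, context);
        }
    }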

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,15 +15,15 @@
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.domain.Sort.Direction.*;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
@@ -36,14 +36,16 @@ import com.mongodb.MongoException;
* @author Mark Pollack
* @author Oliver Gierke
* @author Komi Innocent
* @author Christoph Strobl
*/
public class DefaultIndexOperations implements IndexOperations {
private static final Double ONE = Double.valueOf(1);
private static final Double MINUS_ONE = Double.valueOf(-1);
private static final String PARTIAL_FILTER_EXPRESSION_KEY = "partialFilterExpression";
private final MongoOperations mongoOperations;
private final String collectionName;
private final QueryMapper mapper;
private final Class<?> type;
/**
* Creates a new {@link DefaultIndexOperations}.
@@ -52,12 +54,26 @@ public class DefaultIndexOperations implements IndexOperations {
* @param collectionName must not be {@literal null}.
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName) {
this(mongoOperations, collectionName, null);
}
/**
* Creates a new {@link DefaultIndexOperations}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param type Type used for mapping potential partial index filter expression. Can be {@literal null}.
* @since 1.10
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName, Class<?> type) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.notNull(collectionName, "Collection name can not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.mapper = new QueryMapper(mongoOperations.getConverter());
this.type = type;
}
/*
@@ -65,14 +81,43 @@ public class DefaultIndexOperations implements IndexOperations {
* @see org.springframework.data.mongodb.core.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public void ensureIndex(final IndexDefinition indexDefinition) {
mongoOperations.execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null) {
collection.ensureIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.ensureIndex(indexDefinition.getIndexKeys());
if (indexOptions != null && indexOptions.containsField(PARTIAL_FILTER_EXPRESSION_KEY)) {
Assert.isInstanceOf(DBObject.class, indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));
indexOptions.put(PARTIAL_FILTER_EXPRESSION_KEY,
mapper.getMappedObject((DBObject) indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY),
lookupPersistentEntity(type, collectionName)));
}
if (indexOptions != null) {
collection.createIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
collection.createIndex(indexDefinition.getIndexKeys());
}
return null;
}
private MongoPersistentEntity<?> lookupPersistentEntity(Class<?> entityType, String collection) {
if (entityType != null) {
return mongoOperations.getConverter().getMappingContext().getPersistentEntity(entityType);
}
Collection<? extends MongoPersistentEntity<?>> entities = mongoOperations.getConverter().getMappingContext()
.getPersistentEntities();
for (MongoPersistentEntity<?> entity : entities) {
if (entity.getCollection().equals(collection)) {
return entity;
}
}
return null;
}
});
@@ -104,10 +149,12 @@ public class DefaultIndexOperations implements IndexOperations {
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.IndexOperations#resetIndexCache()
*/
@Deprecated
public void resetIndexCache() {
mongoOperations.execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.resetIndexCache();
ReflectiveDBCollectionInvoker.resetIndexCache(collection);
return null;
}
});
@@ -120,7 +167,9 @@ public class DefaultIndexOperations implements IndexOperations {
public List<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, new CollectionCallback<List<IndexInfo>>() {
public List<IndexInfo> doInCollection(DBCollection collection) throws MongoException, DataAccessException {
List<DBObject> dbObjectList = collection.getIndexInfo();
return getIndexData(dbObjectList);
}
@@ -130,37 +179,7 @@ public class DefaultIndexOperations implements IndexOperations {
List<IndexInfo> indexInfoList = new ArrayList<IndexInfo>();
for (DBObject ix : dbObjectList) {
DBObject keyDbObject = (DBObject) ix.get("key");
int numberOfElements = keyDbObject.keySet().size();
List<IndexField> indexFields = new ArrayList<IndexField>(numberOfElements);
for (String key : keyDbObject.keySet()) {
Object value = keyDbObject.get(key);
if ("2d".equals(value)) {
indexFields.add(IndexField.geo(key));
} else {
Double keyValue = new Double(value.toString());
if (ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, ASC));
} else if (MINUS_ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, DESC));
}
}
}
String name = ix.get("name").toString();
boolean unique = ix.containsField("unique") ? (Boolean) ix.get("unique") : false;
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse));
indexInfoList.add(IndexInfo.indexInfoOf(ix));
}
return indexInfoList;
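
Since the new partialFilterExpression handling only kicks in when the index options carry that key, here is a hedged example of an IndexDefinition whose options include it; field names in the filter are then run through the QueryMapper when a type was supplied to the constructor:

    import org.springframework.data.mongodb.core.index.IndexDefinition;

    import com.mongodb.BasicDBObject;
    import com.mongodb.DBObject;

    class PartialIndexExample {

        static void createPartialIndex(DefaultIndexOperations indexOps) {

            indexOps.ensureIndex(new IndexDefinition() {

                public DBObject getIndexKeys() {
                    return new BasicDBObject("lastName", 1);
                }

                public DBObject getIndexOptions() {
                    // mapped against the configured type (if any) before createIndex(..) is called
                    return new BasicDBObject("partialFilterExpression",
                            new BasicDBObject("age", new BasicDBObject("$gt", 18)));
                }
            });
        }
    }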

View File

@@ -0,0 +1,188 @@
/*
* Copyright 2014-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static java.util.UUID.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import org.bson.types.ObjectId;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.MongoException;
/**
* Default implementation of {@link ScriptOperations} capable of saving and executing {@link ServerSideJavaScript}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class DefaultScriptOperations implements ScriptOperations {
private static final String SCRIPT_COLLECTION_NAME = "system.js";
private static final String SCRIPT_NAME_PREFIX = "func_";
private final MongoOperations mongoOperations;
/**
* Creates new {@link DefaultScriptOperations} using given {@link MongoOperations}.
*
* @param mongoOperations must not be {@literal null}.
*/
public DefaultScriptOperations(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
this.mongoOperations = mongoOperations;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.ExecutableMongoScript)
*/
@Override
public NamedMongoScript register(ExecutableMongoScript script) {
return register(new NamedMongoScript(generateScriptName(), script));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#register(org.springframework.data.mongodb.core.script.NamedMongoScript)
*/
@Override
public NamedMongoScript register(NamedMongoScript script) {
Assert.notNull(script, "Script must not be null!");
mongoOperations.save(script, SCRIPT_COLLECTION_NAME);
return script;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#execute(org.springframework.data.mongodb.core.script.ExecutableMongoScript, java.lang.Object[])
*/
@Override
public Object execute(final ExecutableMongoScript script, final Object... args) {
Assert.notNull(script, "Script must not be null!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(script.getCode(), convertScriptArgs(false, args));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#call(java.lang.String, java.lang.Object[])
*/
@Override
public Object call(final String scriptName, final Object... args) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.execute(new DbCallback<Object>() {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(String.format("%s(%s)", scriptName, convertAndJoinScriptArgs(args)));
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#exists(java.lang.String)
*/
@Override
public boolean exists(String scriptName) {
Assert.hasText(scriptName, "ScriptName must not be null or empty!");
return mongoOperations.exists(query(where("name").is(scriptName)), NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ScriptOperations#getScriptNames()
*/
@Override
public Set<String> getScriptNames() {
List<NamedMongoScript> scripts = mongoOperations.findAll(NamedMongoScript.class, SCRIPT_COLLECTION_NAME);
if (CollectionUtils.isEmpty(scripts)) {
return Collections.emptySet();
}
Set<String> scriptNames = new HashSet<String>();
for (NamedMongoScript script : scripts) {
scriptNames.add(script.getName());
}
return scriptNames;
}
private Object[] convertScriptArgs(boolean quote, Object... args) {
if (ObjectUtils.isEmpty(args)) {
return args;
}
List<Object> convertedValues = new ArrayList<Object>(args.length);
for (Object arg : args) {
convertedValues.add(arg instanceof String && quote ? String.format("'%s'", arg)
: this.mongoOperations.getConverter().convertToMongoType(arg));
}
return convertedValues.toArray();
}
private String convertAndJoinScriptArgs(Object... args) {
return ObjectUtils.isEmpty(args) ? "" : StringUtils.arrayToCommaDelimitedString(convertScriptArgs(true, args));
}
/**
* Generate a valid name for the {@literal JavaScript}. MongoDB requires an id of type String for scripts. Calling
* scripts having {@link ObjectId} as id fails. Therefore we create a random UUID without {@code -} (as this won't
* work) and prefix the result with {@link #SCRIPT_NAME_PREFIX}.
*
* @return
*/
private static String generateScriptName() {
return SCRIPT_NAME_PREFIX + randomUUID().toString().replaceAll("-", "");
}
}
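
A hedged usage sketch, assuming a configured MongoOperations that exposes scriptOps() (introduced alongside this class in 1.7):

    import org.springframework.data.mongodb.core.MongoOperations;
    import org.springframework.data.mongodb.core.ScriptOperations;
    import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
    import org.springframework.data.mongodb.core.script.NamedMongoScript;

    class ScriptOperationsExample {

        static Object registerAndCall(MongoOperations mongoOperations) {

            ScriptOperations scriptOps = mongoOperations.scriptOps();

            // stored under a generated "func_…" name, see generateScriptName() above
            NamedMongoScript echo = scriptOps.register(new ExecutableMongoScript("function(x) { return x; }"));

            return scriptOps.call(echo.getName(), "echo me!"); // executes the stored function server-side
        }
    }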

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,15 +13,20 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
package org.springframework.data.mongodb.core;
import com.mongodb.WriteConcern;
/**
* Interface for {@link Metric}s that can be applied to a base scale.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* Default {@link WriteConcernResolver} resolving the {@link WriteConcern} from the given {@link MongoAction}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
@Deprecated
public interface Metric extends org.springframework.data.geo.Metric {}
enum DefaultWriteConcernResolver implements WriteConcernResolver {
INSTANCE;
public WriteConcern resolve(MongoAction action) {
return action.getDefaultWriteConcern();
}
}
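
The enum above simply falls back to the action's default. A sketch of a custom resolver that routes by collection name instead; it assumes MongoAction exposes the collection name passed to its constructor shown later in this diff:

    import com.mongodb.WriteConcern;

    class CollectionAwareWriteConcernResolver implements WriteConcernResolver {

        // Audit-style collections get journaled writes, everything else keeps the configured default.
        public WriteConcern resolve(MongoAction action) {

            if ("auditLog".equals(action.getCollectionName())) {
                return WriteConcern.JOURNALED;
            }

            return action.getDefaultWriteConcern();
        }
    }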

View File

@@ -0,0 +1,72 @@
/*
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Value object to mitigate different representations of geo command execution results in MongoDB.
*
* @author Oliver Gierke
* @soundtrack Fruitcake - Jeff Coffin (The Inside of the Outside)
*/
class GeoCommandStatistics {
private static final GeoCommandStatistics NONE = new GeoCommandStatistics(new BasicDBObject());
private final DBObject source;
/**
* Creates a new {@link GeoCommandStatistics} instance with the given source document.
*
* @param source must not be {@literal null}.
*/
private GeoCommandStatistics(DBObject source) {
Assert.notNull(source, "Source document must not be null!");
this.source = source;
}
/**
* Creates a new {@link GeoCommandStatistics} from the given command result extracting the statistics.
*
* @param commandResult must not be {@literal null}.
* @return
*/
public static GeoCommandStatistics from(DBObject commandResult) {
Assert.notNull(commandResult, "Command result must not be null!");
Object stats = commandResult.get("stats");
return stats == null ? NONE : new GeoCommandStatistics((DBObject) stats);
}
/**
* Returns the average distance reported by the command result, mitigating the removal of the field (introduced in
* MongoDB 3.2 RC1) in case the command didn't return any results.
*
* @return
* @see <a href="https://jira.mongodb.org/browse/SERVER-21024">MongoDB Jira SERVER-21024</a>
*/
public double getAverageDistance() {
Object averageDistance = source.get("avgDistance");
return averageDistance == null ? Double.NaN : (Double) averageDistance;
}
}
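
A small usage sketch of the statistics wrapper above (the class is package-private, so treat this as a same-package test snippet):

    import com.mongodb.BasicDBObject;
    import com.mongodb.DBObject;

    class GeoCommandStatisticsExample {

        public static void main(String[] args) {

            // geoNear-style command result carrying a stats sub-document
            DBObject withStats = new BasicDBObject("stats", new BasicDBObject("avgDistance", 12.5d));
            System.out.println(GeoCommandStatistics.from(withStats).getAverageDistance()); // 12.5

            // no stats at all (empty result, MongoDB 3.2 RC1 behaviour) -> NaN instead of an exception
            System.out.println(GeoCommandStatistics.from(new BasicDBObject()).getAverageDistance()); // NaN
        }
    }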

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,24 +13,21 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapping;
package org.springframework.data.mongodb.core;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.geo.GeoJsonModule;
import org.springframework.data.web.config.SpringDataJacksonModules;
/**
* SPI interface to determine how to name document fields in cases the field name is not manually defined.
* Configuration class to expose {@link GeoJsonModule} as a Spring bean.
*
* @see DocumentField
* @see PropertyNameFieldNamingStrategy
* @see CamelCaseAbbreviatingFieldNamingStrategy
* @since 1.3
* @author Oliver Gierke
*/
public interface FieldNamingStrategy {
public class GeoJsonConfiguration implements SpringDataJacksonModules {
/**
* Returns the field name to be used for the given {@link MongoPersistentProperty}.
*
* @param property must not be {@literal null} or empty;
* @return
*/
String getFieldName(MongoPersistentProperty property);
@Bean
public GeoJsonModule geoJsonModule() {
return new GeoJsonModule();
}
}
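
The configuration class above only exposes the module as a bean; registering it with a plain ObjectMapper achieves the same outside a Spring context. GeoJsonPoint and the GeoJSON wire format used here are assumptions based on what the module is intended to (de)serialize:

    import com.fasterxml.jackson.databind.ObjectMapper;

    import org.springframework.data.mongodb.core.geo.GeoJsonModule;
    import org.springframework.data.mongodb.core.geo.GeoJsonPoint;

    class GeoJsonModuleExample {

        public static void main(String[] args) throws Exception {

            ObjectMapper mapper = new ObjectMapper();
            mapper.registerModule(new GeoJsonModule()); // what the @Bean above contributes via SpringDataJacksonModules

            GeoJsonPoint point = mapper.readValue("{\"type\":\"Point\",\"coordinates\":[10.0,20.0]}", GeoJsonPoint.class);
            System.out.println(point);
        }
    }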

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,12 +20,12 @@ import java.util.List;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexInfo;
/**
* Index operations on a collection.
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public interface IndexOperations {
@@ -51,7 +51,11 @@ public interface IndexOperations {
/**
* Clears all indices that have not yet been applied to this collection.
*
* @deprecated since 1.7. The MongoDB Java driver version 3.0 no longer supports resetting the index cache.
* @throws UnsupportedOperationException when used with MongoDB Java driver version 3.0.
*/
@Deprecated
void resetIndexCache();
/**

View File

@@ -49,7 +49,7 @@ public class MongoAction {
* @param collectionName the collection name, must not be {@literal null} or empty.
* @param entityType the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object
* @param query the converted DBOjbect from the Spring Query object
* @param query the converted DBObject from the Spring Query object
*/
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, Class<?> entityType, DBObject document, DBObject query) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,5 +25,5 @@ package org.springframework.data.mongodb.core;
*/
public enum MongoActionOperation {
REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE
REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE, BULK;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,7 +27,8 @@ import com.mongodb.Mongo;
* Mongo server administration exposed via JMX annotations
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Thomas Darimont
* @author Mark Paluch
*/
@ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations {
@@ -35,10 +36,11 @@ public class MongoAdmin implements MongoAdminOperations {
private final Mongo mongo;
private String username;
private String password;
private String authenticationDatabaseName;
private String authenticationDatabaseName;
public MongoAdmin(Mongo mongo) {
Assert.notNull(mongo);
Assert.notNull(mongo, "Mongo must not be null!");
this.mongo = mongo;
}
@@ -84,16 +86,16 @@ public class MongoAdmin implements MongoAdminOperations {
this.password = password;
}
/**
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
*
* @param authenticationDatabaseName The authenticationDatabaseName to use.
*/
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
this.authenticationDatabaseName = authenticationDatabaseName;
}
/**
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
*
* @param authenticationDatabaseName The authenticationDatabaseName to use.
*/
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
this.authenticationDatabaseName = authenticationDatabaseName;
}
DB getDB(String databaseName) {
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
}
}

View File

@@ -0,0 +1,189 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.net.UnknownHostException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
/**
* Convenient factory for configuring MongoDB.
*
* @author Christoph Strobl
* @since 1.7
*/
public class MongoClientFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private MongoClientOptions mongoClientOptions;
private String host;
private Integer port;
private List<ServerAddress> replicaSetSeeds;
private List<MongoCredential> credentials;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
/**
* Set the {@link MongoClientOptions} to be used when creating {@link MongoClient}.
*
* @param mongoClientOptions
*/
public void setMongoClientOptions(MongoClientOptions mongoClientOptions) {
this.mongoClientOptions = mongoClientOptions;
}
/**
* Set the list of credentials to be used when creating {@link MongoClient}.
*
* @param credentials can be {@literal null}.
*/
public void setCredentials(MongoCredential[] credentials) {
this.credentials = filterNonNullElementsAsList(credentials);
}
/**
* Set the list of {@link ServerAddress} to build up a replica set for.
*
* @param replicaSetSeeds can be {@literal null}.
*/
public void setReplicaSetSeeds(ServerAddress[] replicaSetSeeds) {
this.replicaSetSeeds = filterNonNullElementsAsList(replicaSetSeeds);
}
/**
* Configures the host to connect to.
*
* @param host
*/
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<? extends Mongo> getObjectType() {
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
*/
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return exceptionTranslator.translateExceptionIfPossible(ex);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected Mongo createInstance() throws Exception {
if (mongoClientOptions == null) {
mongoClientOptions = MongoClientOptions.builder().build();
}
if (credentials == null) {
credentials = Collections.emptyList();
}
return createMongoClient();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
@Override
protected void destroyInstance(Mongo instance) throws Exception {
instance.close();
}
private MongoClient createMongoClient() throws UnknownHostException {
if (!CollectionUtils.isEmpty(replicaSetSeeds)) {
return new MongoClient(replicaSetSeeds, credentials, mongoClientOptions);
}
return new MongoClient(createConfiguredOrDefaultServerAddress(), credentials, mongoClientOptions);
}
private ServerAddress createConfiguredOrDefaultServerAddress() throws UnknownHostException {
ServerAddress defaultAddress = new ServerAddress();
return new ServerAddress(StringUtils.hasText(host) ? host : defaultAddress.getHost(),
port != null ? port.intValue() : defaultAddress.getPort());
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
* @param elements the elements to filter, can be {@literal null}.
* @return a new unmodifiable {@link List} of the given elements without {@literal null}s.
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
}
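
A typical JavaConfig-style wiring of the factory bean above; host, port, credentials and pool size are placeholder values:

    import com.mongodb.MongoClientOptions;
    import com.mongodb.MongoCredential;

    class MongoClientFactoryBeanExample {

        static MongoClientFactoryBean mongoClient() {

            MongoClientFactoryBean factory = new MongoClientFactoryBean();
            factory.setHost("localhost");
            factory.setPort(27017);
            factory.setCredentials(new MongoCredential[] {
                    MongoCredential.createCredential("user", "admin", "secret".toCharArray()) });
            factory.setMongoClientOptions(MongoClientOptions.builder().connectionsPerHost(50).build());

            return factory;
        }
    }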

View File

@@ -0,0 +1,315 @@
/*
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import javax.net.SocketFactory;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.DirectFieldAccessor;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DBDecoderFactory;
import com.mongodb.DBEncoderFactory;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
/**
* A factory bean for construction of a {@link MongoClientOptions} instance.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClientOptions> {
private static final MongoClientOptions DEFAULT_MONGO_OPTIONS = MongoClientOptions.builder().build();
private String description = DEFAULT_MONGO_OPTIONS.getDescription();
private int minConnectionsPerHost = DEFAULT_MONGO_OPTIONS.getMinConnectionsPerHost();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int maxConnectionIdleTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionIdleTime();
private int maxConnectionLifeTime = DEFAULT_MONGO_OPTIONS.getMaxConnectionLifeTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private ReadPreference readPreference = DEFAULT_MONGO_OPTIONS.getReadPreference();
private DBDecoderFactory dbDecoderFactory = DEFAULT_MONGO_OPTIONS.getDbDecoderFactory();
private DBEncoderFactory dbEncoderFactory = DEFAULT_MONGO_OPTIONS.getDbEncoderFactory();
private WriteConcern writeConcern = DEFAULT_MONGO_OPTIONS.getWriteConcern();
private SocketFactory socketFactory = DEFAULT_MONGO_OPTIONS.getSocketFactory();
private boolean cursorFinalizerEnabled = DEFAULT_MONGO_OPTIONS.isCursorFinalizerEnabled();
private boolean alwaysUseMBeans = DEFAULT_MONGO_OPTIONS.isAlwaysUseMBeans();
private int heartbeatFrequency = DEFAULT_MONGO_OPTIONS.getHeartbeatFrequency();
private int minHeartbeatFrequency = DEFAULT_MONGO_OPTIONS.getMinHeartbeatFrequency();
private int heartbeatConnectTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatConnectTimeout();
private int heartbeatSocketTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatSocketTimeout();
private String requiredReplicaSetName = DEFAULT_MONGO_OPTIONS.getRequiredReplicaSetName();
private int serverSelectionTimeout = Integer.MIN_VALUE;
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
/**
* Set the {@link MongoClient} description.
*
* @param description
*/
public void setDescription(String description) {
this.description = description;
}
/**
* Set the minimum number of connections per host.
*
* @param minConnectionsPerHost
*/
public void setMinConnectionsPerHost(int minConnectionsPerHost) {
this.minConnectionsPerHost = minConnectionsPerHost;
}
/**
* Set the number of connections allowed per host. Requests will block once the pool is exhausted. Default is 10. The
* system property {@code MONGO.POOLSIZE} can override this value.
*
* @param connectionsPerHost
*/
public void setConnectionsPerHost(int connectionsPerHost) {
this.connectionsPerHost = connectionsPerHost;
}
/**
* Set the multiplier for connectionsPerHost that determines the number of threads allowed to block waiting for a
* connection. Default is 5. If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5, then up
* to 50 threads can block; any more than that and an exception will be thrown.
*
* @param threadsAllowedToBlockForConnectionMultiplier
*/
public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
this.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
}
/**
* Set the max wait time in milliseconds that a thread may block waiting for a connection. Default is 120,000 ms (2 minutes).
*
* @param maxWaitTime
*/
public void setMaxWaitTime(int maxWaitTime) {
this.maxWaitTime = maxWaitTime;
}
/**
* The maximum idle time for a pooled connection.
*
* @param maxConnectionIdleTime
*/
public void setMaxConnectionIdleTime(int maxConnectionIdleTime) {
this.maxConnectionIdleTime = maxConnectionIdleTime;
}
/**
* Set the maximum life time for a pooled connection.
*
* @param maxConnectionLifeTime
*/
public void setMaxConnectionLifeTime(int maxConnectionLifeTime) {
this.maxConnectionLifeTime = maxConnectionLifeTime;
}
/**
* Set the connect timeout in milliseconds. A value of 0 (the default) means an infinite timeout.
*
* @param connectTimeout
*/
public void setConnectTimeout(int connectTimeout) {
this.connectTimeout = connectTimeout;
}
/**
* Set the socket timeout. A value of 0 (the default) means an infinite timeout.
*
* @param socketTimeout
*/
public void setSocketTimeout(int socketTimeout) {
this.socketTimeout = socketTimeout;
}
/**
* Set the keep-alive flag that controls whether socket keep-alive is enabled. Defaults to false.
*
* @param socketKeepAlive
*/
public void setSocketKeepAlive(boolean socketKeepAlive) {
this.socketKeepAlive = socketKeepAlive;
}
/**
* Set the {@link ReadPreference}.
*
* @param readPreference
*/
public void setReadPreference(ReadPreference readPreference) {
this.readPreference = readPreference;
}
/**
* Set the {@link WriteConcern} that will be the default value used when asking the {@link MongoDbFactory} for a DB
* object.
*
* @param writeConcern
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/**
* @param socketFactory
*/
public void setSocketFactory(SocketFactory socketFactory) {
this.socketFactory = socketFactory;
}
/**
* Set the frequency at which the driver will attempt to determine the current state of each server in the cluster.
*
* @param heartbeatFrequency
*/
public void setHeartbeatFrequency(int heartbeatFrequency) {
this.heartbeatFrequency = heartbeatFrequency;
}
/**
* In the event that the driver has to frequently re-check a server's availability, it will wait at least this long
* since the previous check to avoid wasted effort.
*
* @param minHeartbeatFrequency
*/
public void setMinHeartbeatFrequency(int minHeartbeatFrequency) {
this.minHeartbeatFrequency = minHeartbeatFrequency;
}
/**
* Set the connect timeout for connections used for the cluster heartbeat.
*
* @param heartbeatConnectTimeout
*/
public void setHeartbeatConnectTimeout(int heartbeatConnectTimeout) {
this.heartbeatConnectTimeout = heartbeatConnectTimeout;
}
/**
* Set the socket timeout for connections used for the cluster heartbeat.
*
* @param heartbeatSocketTimeout
*/
public void setHeartbeatSocketTimeout(int heartbeatSocketTimeout) {
this.heartbeatSocketTimeout = heartbeatSocketTimeout;
}
/**
* Configures the name of the replica set.
*
* @param requiredReplicaSetName
*/
public void setRequiredReplicaSetName(String requiredReplicaSetName) {
this.requiredReplicaSetName = requiredReplicaSetName;
}
/**
* Controls whether the driver should use an SSL connection. Defaults to {@literal false}.
*
* @param ssl
*/
public void setSsl(boolean ssl) {
this.ssl = ssl;
}
/**
* Set the {@link SSLSocketFactory} to use for the {@literal SSL} connection. If none is configured here,
* {@link SSLSocketFactory#getDefault()} will be used.
*
* @param sslSocketFactory
*/
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
this.sslSocketFactory = sslSocketFactory;
}
/**
* Set the {@literal server selection timeout} in milliseconds for a 3.x MongoDB Java driver. If not set, the default
* value of 30 seconds will be used. A value of 0 means that the driver times out immediately if no server is
* available. A negative value means to wait indefinitely.
*
* @param serverSelectionTimeout in msec.
*/
public void setServerSelectionTimeout(int serverSelectionTimeout) {
this.serverSelectionTimeout = serverSelectionTimeout;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoClientOptions createInstance() throws Exception {
SocketFactory socketFactoryToUse = ssl
? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault()) : this.socketFactory;
MongoClientOptions.Builder builder = MongoClientOptions.builder();
if (MongoClientVersion.isMongo3Driver() && serverSelectionTimeout != Integer.MIN_VALUE) {
new DirectFieldAccessor(builder).setPropertyValue("serverSelectionTimeout", serverSelectionTimeout);
}
return builder //
.alwaysUseMBeans(this.alwaysUseMBeans) //
.connectionsPerHost(this.connectionsPerHost) //
.connectTimeout(connectTimeout) //
.cursorFinalizerEnabled(cursorFinalizerEnabled) //
.dbDecoderFactory(dbDecoderFactory) //
.dbEncoderFactory(dbEncoderFactory) //
.description(description) //
.heartbeatConnectTimeout(heartbeatConnectTimeout) //
.heartbeatFrequency(heartbeatFrequency) //
.heartbeatSocketTimeout(heartbeatSocketTimeout) //
.maxConnectionIdleTime(maxConnectionIdleTime) //
.maxConnectionLifeTime(maxConnectionLifeTime) //
.maxWaitTime(maxWaitTime) //
.minConnectionsPerHost(minConnectionsPerHost) //
.minHeartbeatFrequency(minHeartbeatFrequency) //
.readPreference(readPreference) //
.requiredReplicaSetName(requiredReplicaSetName) //
.socketFactory(socketFactoryToUse) //
.socketKeepAlive(socketKeepAlive) //
.socketTimeout(socketTimeout) //
.threadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier) //
.writeConcern(writeConcern).build();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
public Class<?> getObjectType() {
return MongoClientOptions.class;
}
}
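
Outside a container, the options factory bean follows the usual AbstractFactoryBean life cycle: configure, call afterPropertiesSet(), then fetch the product. A hedged sketch with placeholder values:

    import com.mongodb.MongoClientOptions;
    import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;

    class MongoClientOptionsExample {

        public static void main(String[] args) throws Exception {
            MongoClientOptionsFactoryBean optionsFactory = new MongoClientOptionsFactoryBean();
            optionsFactory.setConnectionsPerHost(50);  // placeholder pool size
            optionsFactory.setConnectTimeout(2000);    // placeholder timeout in ms
            optionsFactory.setSocketKeepAlive(true);
            optionsFactory.afterPropertiesSet();       // triggers createInstance()

            MongoClientOptions options = optionsFactory.getObject();
            System.out.println(options.getConnectionsPerHost());
        }
    }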

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,12 +18,13 @@ package org.springframework.data.mongodb.core;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Helper class featuring helper methods for internal MongoDb classes. Mainly intended for internal use within the
@@ -34,6 +35,7 @@ import com.mongodb.Mongo;
* @author Oliver Gierke
* @author Randy Watler
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
*/
public abstract class MongoDbUtils {
@@ -43,9 +45,7 @@ public abstract class MongoDbUtils {
/**
* Private constructor to prevent instantiation.
*/
private MongoDbUtils() {
}
private MongoDbUtils() {}
/**
* Obtains a {@link DB} connection for the given {@link Mongo} instance and database name
@@ -65,11 +65,24 @@ public abstract class MongoDbUtils {
* @param databaseName the database name, must not be {@literal null} or empty.
* @param credentials the credentials to use, must not be {@literal null}.
* @return the {@link DB} connection
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials) {
return getDB(mongo, databaseName, credentials, databaseName);
}
/**
* @param mongo
* @param databaseName
* @param credentials
* @param authenticationDatabaseName
* @return
* @deprecated since 1.7. The {@link MongoClient} itself should hold credentials within
* {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public static DB getDB(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
@@ -109,22 +122,9 @@ public abstract class MongoDbUtils {
LOGGER.debug("Getting Mongo Database name=[{}]", databaseName);
DB db = mongo.getDB(databaseName);
boolean credentialsGiven = credentials.hasUsername() && credentials.hasPassword();
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
synchronized (authDb) {
if (credentialsGiven && !authDb.isAuthenticated()) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
if (!authDb.authenticate(username, password == null ? null : password.toCharArray())) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
if (!(mongo instanceof MongoClient) && requiresAuthDbAuthentication(credentials)) {
ReflectiveDbInvoker.authenticate(mongo, db, credentials, authenticationDatabaseName);
}
// TX sync active, bind new database to thread
@@ -181,16 +181,36 @@ public abstract class MongoDbUtils {
* Perform actual closing of the Mongo DB object, catching and logging any cleanup exceptions thrown.
*
* @param db the DB to close (may be <code>null</code>)
* @deprecated since 1.7. The main use case for this method is to ensure that applications can read their own
* unacknowledged writes, but this is no longer so prevalent since the MongoDB Java driver version 3
* started defaulting to acknowledged writes.
*/
@Deprecated
public static void closeDB(DB db) {
if (db != null) {
LOGGER.debug("Closing Mongo DB object");
try {
db.requestDone();
ReflectiveDbInvoker.requestDone(db);
} catch (Throwable ex) {
LOGGER.debug("Unexpected exception on closing Mongo DB object", ex);
}
}
}
/**
* Check whether credentials are present. With mongo-java-driver version 3 or above there is no need for explicit
* authentication, as the auth data has to be provided within the MongoClient itself.
*
* @param credentials
* @return
*/
private static boolean requiresAuthDbAuthentication(UserCredentials credentials) {
if (credentials == null || !credentials.hasUsername()) {
return false;
}
return !MongoClientVersion.isMongo3Driver();
}
}
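
The deprecation notes above point to the driver-3.x style of attaching credentials to the MongoClient rather than authenticating a DB after the fact. A sketch of that replacement; host, database, and user values are placeholders:

    import java.util.Collections;

    import com.mongodb.DB;
    import com.mongodb.MongoClient;
    import com.mongodb.MongoClientOptions;
    import com.mongodb.MongoCredential;
    import com.mongodb.ServerAddress;

    class CredentialMigrationExample {

        public static void main(String[] args) throws Exception {
            // Credentials travel with the client; no per-DB authenticate(...) call is needed.
            MongoCredential credential = MongoCredential.createCredential("user", "admin", "secret".toCharArray());
            MongoClient client = new MongoClient(new ServerAddress("localhost", 27017),
                    Collections.singletonList(credential), MongoClientOptions.builder().build());

            DB db = client.getDB("mydb"); // obtained without any MongoDbUtils involvement
            System.out.println(db.getName());
            client.close();
        }
    }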

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,19 +15,25 @@
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.PermissionDeniedDataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.data.mongodb.util.MongoDbErrorCodes;
import org.springframework.util.ClassUtils;
import com.mongodb.BulkWriteException;
import com.mongodb.MongoException;
import com.mongodb.MongoException.CursorNotFound;
import com.mongodb.MongoException.DuplicateKey;
import com.mongodb.MongoException.Network;
import com.mongodb.MongoInternalException;
/**
* Simple {@link PersistenceExceptionTranslator} for Mongo. Convert the given runtime exception to an appropriate
@@ -36,9 +42,23 @@ import com.mongodb.MongoInternalException;
*
* @author Oliver Gierke
* @author Michal Vich
* @author Christoph Strobl
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoInternalException"));
private static final Set<String> DATA_INTEGRETY_EXCEPTIONS = new HashSet<String>(
Arrays.asList("WriteConcernException"));
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
@@ -47,38 +67,42 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
// Check for well-known MongoException subclasses.
// All other MongoExceptions
if (ex instanceof DuplicateKey) {
String exception = ClassUtils.getShortName(ClassUtils.getUserClass(ex.getClass()));
if (DULICATE_KEY_EXCEPTIONS.contains(exception)) {
return new DuplicateKeyException(ex.getMessage(), ex);
}
if (ex instanceof Network) {
if (RESOURCE_FAILURE_EXCEPTIONS.contains(exception)) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof CursorNotFound) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
// Driver 2.12 throws this to indicate connection problems. String comparison to avoid hard dependency
if (ex.getClass().getName().equals("com.mongodb.MongoServerSelectionException")) {
return new DataAccessResourceFailureException(ex.getMessage(), ex);
}
if (ex instanceof MongoInternalException) {
if (RESOURCE_USAGE_EXCEPTIONS.contains(exception)) {
return new InvalidDataAccessResourceUsageException(ex.getMessage(), ex);
}
if (DATA_INTEGRETY_EXCEPTIONS.contains(exception)) {
return new DataIntegrityViolationException(ex.getMessage(), ex);
}
if (ex instanceof BulkWriteException) {
return new BulkOperationException(ex.getMessage(), (BulkWriteException) ex);
}
// All other MongoExceptions
if (ex instanceof MongoException) {
int code = ((MongoException) ex).getCode();
if (code == 11000 || code == 11001) {
if (MongoDbErrorCodes.isDuplicateKeyCode(code)) {
throw new DuplicateKeyException(ex.getMessage(), ex);
} else if (code == 12000 || code == 13440) {
} else if (MongoDbErrorCodes.isDataAccessResourceFailureCode(code)) {
throw new DataAccessResourceFailureException(ex.getMessage(), ex);
} else if (code == 10003 || code == 12001 || code == 12010 || code == 12011 || code == 12012) {
} else if (MongoDbErrorCodes.isInvalidDataAccessApiUsageCode(code) || code == 10003 || code == 12001
|| code == 12010 || code == 12011 || code == 12012) {
throw new InvalidDataAccessApiUsageException(ex.getMessage(), ex);
} else if (MongoDbErrorCodes.isPermissionDeniedCode(code)) {
throw new PermissionDeniedDataAccessException(ex.getMessage(), ex);
}
return new UncategorizedMongoDbException(ex.getMessage(), ex);
}
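
As a rough usage sketch, the translator can be invoked directly to map a driver exception onto Spring's hierarchy; by the lookup tables above a MongoInternalException maps to InvalidDataAccessResourceUsageException, while exceptions the translator does not recognize are left untranslated per the PersistenceExceptionTranslator contract:

    import org.springframework.dao.DataAccessException;
    import org.springframework.data.mongodb.core.MongoExceptionTranslator;

    import com.mongodb.MongoInternalException;

    class ExceptionTranslationExample {

        public static void main(String[] args) {
            MongoExceptionTranslator translator = new MongoExceptionTranslator();

            // Matched via its simple class name in RESOURCE_USAGE_EXCEPTIONS.
            DataAccessException translated =
                    translator.translateExceptionIfPossible(new MongoInternalException("boom"));
            System.out.println(translated.getClass().getSimpleName());

            // A plain RuntimeException is not a MongoDB exception and stays untranslated (null).
            System.out.println(translator.translateExceptionIfPossible(new IllegalStateException("n/a")));
        }
    }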

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,9 +20,7 @@ import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
@@ -40,12 +38,14 @@ import com.mongodb.WriteConcern;
* @author Graeme Rocher
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @since 1.0
* @deprecated since 1.7. Please use {@link MongoClientFactoryBean} instead.
*/
public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, DisposableBean,
PersistenceExceptionTranslator {
@Deprecated
public class MongoFactoryBean extends AbstractFactoryBean<Mongo> implements PersistenceExceptionTranslator {
private Mongo mongo;
private static final PersistenceExceptionTranslator DEFAULT_EXCEPTION_TRANSLATOR = new MongoExceptionTranslator();
private MongoOptions mongoOptions;
private String host;
@@ -53,9 +53,11 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
private WriteConcern writeConcern;
private List<ServerAddress> replicaSetSeeds;
private List<ServerAddress> replicaPair;
private PersistenceExceptionTranslator exceptionTranslator = DEFAULT_EXCEPTION_TRANSLATOR;
private PersistenceExceptionTranslator exceptionTranslator = new MongoExceptionTranslator();
/**
* @param mongoOptions
*/
public void setMongoOptions(MongoOptions mongoOptions) {
this.mongoOptions = mongoOptions;
}
@@ -66,7 +68,6 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
/**
* @deprecated use {@link #setReplicaSetSeeds(ServerAddress[])} instead
*
* @param replicaPair
*/
@Deprecated
@@ -75,30 +76,19 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
}
/**
* @param elements the elements to filter <T>
* @return a new unmodifiable {@link List#} from the given elements without nulls
* Configures the host to connect to.
*
* @param host
*/
private <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
public void setHost(String host) {
this.host = host;
}
/**
* Configures the port to connect to.
*
* @param port
*/
public void setPort(int port) {
this.port = port;
}
@@ -112,12 +102,13 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
this.writeConcern = writeConcern;
}
/**
* Configures the {@link PersistenceExceptionTranslator} to use.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator;
}
public Mongo getObject() throws Exception {
return mongo;
this.exceptionTranslator = exceptionTranslator == null ? DEFAULT_EXCEPTION_TRANSLATOR : exceptionTranslator;
}
/*
@@ -128,14 +119,6 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
return Mongo.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
public boolean isSingleton() {
return true;
}
/*
* (non-Javadoc)
* @see org.springframework.dao.support.PersistenceExceptionTranslator#translateExceptionIfPossible(java.lang.RuntimeException)
@@ -146,10 +129,10 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@SuppressWarnings("deprecation")
public void afterPropertiesSet() throws Exception {
@Override
protected Mongo createInstance() throws Exception {
Mongo mongo;
ServerAddress defaultOptions = new ServerAddress();
@@ -175,18 +158,42 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, InitializingBean, D
mongo.setWriteConcern(writeConcern);
}
this.mongo = mongo;
}
private boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
return mongo;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()
* @see org.springframework.beans.factory.config.AbstractFactoryBean#destroyInstance(java.lang.Object)
*/
public void destroy() throws Exception {
this.mongo.close();
@Override
protected void destroyInstance(Mongo mongo) throws Exception {
mongo.close();
}
private static boolean isNullOrEmpty(Collection<?> elements) {
return elements == null || elements.isEmpty();
}
/**
* Returns the given array as {@link List} with all {@literal null} elements removed.
*
* @param elements the elements to filter
* @return a new unmodifiable {@link List} of the given elements without nulls
*/
private static <T> List<T> filterNonNullElementsAsList(T[] elements) {
if (elements == null) {
return Collections.emptyList();
}
List<T> candidateElements = new ArrayList<T>();
for (T element : elements) {
if (element != null) {
candidateElements.add(element);
}
}
return Collections.unmodifiableList(candidateElements);
}
}
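
For completeness, a sketch of driving the now-deprecated factory bean through the AbstractFactoryBean callbacks it inherits after this change; the seed addresses are placeholders, and new code should move to MongoClientFactoryBean as the @deprecated note advises:

    import org.springframework.data.mongodb.core.MongoFactoryBean;

    import com.mongodb.Mongo;
    import com.mongodb.ServerAddress;

    class LegacyMongoFactoryBeanExample {

        @SuppressWarnings("deprecation")
        public static void main(String[] args) throws Exception {
            MongoFactoryBean factoryBean = new MongoFactoryBean();
            factoryBean.setReplicaSetSeeds(new ServerAddress[] { // null entries are filtered out
                    new ServerAddress("rs1.example.org", 27017),
                    new ServerAddress("rs2.example.org", 27017) });

            factoryBean.afterPropertiesSet();      // AbstractFactoryBean -> createInstance()
            Mongo mongo = factoryBean.getObject(); // the managed Mongo instance
            System.out.println(mongo.getClass().getSimpleName());
            factoryBean.destroy();                 // -> destroyInstance(), closes the connection
        }
    }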

View File

@@ -1,6 +1,6 @@
/*
* Copyright 2011-2014 the original author or authors.
*
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
@@ -19,11 +19,12 @@ import java.util.Collection;
import java.util.List;
import java.util.Set;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.geo.GeoResults;
import org.springframework.data.mongodb.core.mapreduce.GroupBy;
import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -33,17 +34,21 @@ import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.CloseableIterator;
import com.mongodb.CommandResult;
import com.mongodb.Cursor;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.ReadPreference;
import com.mongodb.WriteResult;
/**
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but
* a useful option for extensibility and testability (as it can be easily mocked, stubbed, or be the target of a JDK
* proxy).
*
*
* @author Thomas Risberg
* @author Mark Pollack
* @author Oliver Gierke
@@ -52,12 +57,11 @@ import com.mongodb.WriteResult;
* @author Christoph Strobl
* @author Thomas Darimont
*/
@SuppressWarnings("deprecation")
public interface MongoOperations {
/**
* The collection name used for the specified class by this template.
*
*
* @param entityClass must not be {@literal null}.
* @return
*/
@@ -67,7 +71,7 @@ public interface MongoOperations {
* Execute a MongoDB command expressed as a JSON string. This will call the method JSON.parse that is part of the
* MongoDB driver to convert the JSON string to a DBObject. Any errors that result from executing this command will be
* converted into Spring's DAO exception hierarchy.
*
*
* @param jsonCommand a MongoDB command expressed as a JSON string.
*/
CommandResult executeCommand(String jsonCommand);
@@ -75,7 +79,7 @@ public interface MongoOperations {
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy.
*
*
* @param command a MongoDB command
*/
CommandResult executeCommand(DBObject command);
@@ -83,15 +87,29 @@ public interface MongoOperations {
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's DAO
* exception hierarchy.
*
*
* @param command a MongoDB command
* @param options query options to use
* @deprecated since 1.7. Please use {@link #executeCommand(DBObject, ReadPreference)}, as the MongoDB Java driver
* version 3 no longer supports this operation.
*/
@Deprecated
CommandResult executeCommand(DBObject command, int options);
/**
* Execute a MongoDB command. Any errors that result from executing this command will be converted into Spring's data
* access exception hierarchy.
*
* @param command a MongoDB command, must not be {@literal null}.
* @param readPreference read preferences to use, can be {@literal null}.
* @return
* @since 1.7
*/
CommandResult executeCommand(DBObject command, ReadPreference readPreference);
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param collectionName name of the collection to retrieve the objects from
@@ -103,7 +121,7 @@ public interface MongoOperations {
* Executes a {@link DbCallback} translating any exceptions as necessary.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param <T> return type
* @param action callback object that specifies the MongoDB actions to perform on the passed in DB instance.
* @return a result object returned by the action or <tt>null</tt>
@@ -114,7 +132,7 @@ public interface MongoOperations {
* Executes the given {@link CollectionCallback} on the entity collection of the specified class.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param entityClass class that determines the collection to use
* @param <T> return type
* @param action callback object that specifies the MongoDB action
@@ -126,7 +144,7 @@ public interface MongoOperations {
* Executes the given {@link CollectionCallback} on the collection of the given name.
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param <T> return type
* @param collectionName the name of the collection that specifies which DBCollection instance will be passed into
* @param action callback object that specifies the MongoDB action the callback action.
@@ -140,24 +158,56 @@ public interface MongoOperations {
* href=http://www.mongodb.org/display/DOCS/Java+Driver+Concurrency>Java Driver Concurrency</a>}
* <p/>
* Allows for returning a result object, that is a domain object or a collection of domain objects.
*
*
* @param <T> return type
* @param action callback that specified the MongoDB actions to perform on the DB instance
* @return a result object returned by the action or <tt>null</tt>
* @deprecated since 1.7 as the MongoDB Java driver version 3 no longer supports request boundaries via
* {@link DB#requestStart()} and {@link DB#requestDone()}.
*/
@Deprecated
<T> T executeInSession(DbCallback<T> action);
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} backed by a Mongo DB
* {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps a MongoDB {@link Cursor} which needs to be closed.
*
* @param <T> element return type
* @param query must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @return will never be {@literal null}.
* @since 1.7
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType);
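// Hedged usage sketch (not part of this interface): consuming the cursor-backed stream and closing it
// when done. CloseableIterator extends Closeable, so try-with-resources applies; "template" and
// "Person" are illustrative placeholders only.
//
//   try (CloseableIterator<Person> people = template.stream(new Query(), Person.class)) {
//       while (people.hasNext()) {
//           Person person = people.next();
//           // process person ...
//       }
//   }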
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} and collection backed
* by a Mongo DB {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps a MongoDB {@link Cursor} which needs to be closed.
*
* @param <T> element return type
* @param query must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return will never be {@literal null}.
* @since 1.10
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType, String collectionName);
/**
* Create an uncapped collection with a name based on the provided entity class.
*
*
* @param entityClass class that determines the collection to create
* @return the created collection
*/
<T> DBCollection createCollection(Class<T> entityClass);
/**
* Create a collect with a name based on the provided entity class using the options.
*
* Create a collection with a name based on the provided entity class using the options.
*
* @param entityClass class that determines the collection to create
* @param collectionOptions options to use when creating the collection.
* @return the created collection
@@ -166,15 +216,15 @@ public interface MongoOperations {
/**
* Create an uncapped collection with the provided name.
*
*
* @param collectionName name of the collection
* @return the created collection
*/
DBCollection createCollection(String collectionName);
/**
* Create a collect with the provided name and options.
*
* Create a collection with the provided name and options.
*
* @param collectionName name of the collection
* @param collectionOptions options to use when creating the collection.
* @return the created collection
@@ -183,7 +233,7 @@ public interface MongoOperations {
/**
* A set of collection names.
*
*
* @return list of collection names
*/
Set<String> getCollectionNames();
@@ -192,7 +242,7 @@ public interface MongoOperations {
* Get a collection by name, creating it if it doesn't exist.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param collectionName name of the collection
* @return an existing collection or a newly created one.
*/
@@ -202,7 +252,7 @@ public interface MongoOperations {
* Check to see if a collection with a name indicated by the entity class exists.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param entityClass class that determines the name of the collection
* @return true if a collection with the given name is found, false otherwise.
*/
@@ -212,7 +262,7 @@ public interface MongoOperations {
* Check to see if a collection with a given name exists.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param collectionName name of the collection
* @return true if a collection with the given name is found, false otherwise.
*/
@@ -222,7 +272,7 @@ public interface MongoOperations {
* Drop the collection with the name indicated by the entity class.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param entityClass class that determines the collection to drop/delete.
*/
<T> void dropCollection(Class<T> entityClass);
@@ -231,25 +281,64 @@ public interface MongoOperations {
* Drop the collection with the given name.
* <p/>
* Translate any exceptions as necessary.
*
*
* @param collectionName name of the collection to drop/delete.
*/
void dropCollection(String collectionName);
/**
* Returns the operations that can be performed on indexes
*
*
* @return index operations on the named collection
*/
IndexOperations indexOps(String collectionName);
/**
* Returns the operations that can be performed on indexes
*
*
* @return index operations on the named collection associated with the given entity class
*/
IndexOperations indexOps(Class<?> entityClass);
/**
* Returns the {@link ScriptOperations} that can be performed on {@link com.mongodb.DB} level.
*
* @return
* @since 1.7
*/
ScriptOperations scriptOps();
/**
* Returns a new {@link BulkOperations} for the given collection. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, etc. is not available for {@literal update} or
* {@literal remove} operations in bulk mode due to the lack of domain type information. Use
* {@link #bulkOps(BulkMode, Class, String)} to get full type specific support.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
* @return {@link BulkOperations} on the named collection
*/
BulkOperations bulkOps(BulkMode mode, String collectionName);
/**
* Returns a new {@link BulkOperations} for the given entity type.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityType the name of the entity class, must not be {@literal null}.
* @return {@link BulkOperations} on the named collection associated of the given entity class.
*/
BulkOperations bulkOps(BulkMode mode, Class<?> entityType);
/**
* Returns a new {@link BulkOperations} for the given entity type and collection name.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityClass the name of the entity class, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
* @return {@link BulkOperations} on the named collection associated with the given entity class.
*/
BulkOperations bulkOps(BulkMode mode, Class<?> entityType, String collectionName);
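// Hedged usage sketch: batching writes through the new BulkOperations API and executing them in a
// single round trip. "template" and "Person" are placeholders, not part of this change set.
//
//   BulkOperations bulk = template.bulkOps(BulkMode.UNORDERED, Person.class);
//   bulk.insert(new Person("Alice"));
//   bulk.insert(new Person("Bob"));
//   com.mongodb.BulkWriteResult result = bulk.execute();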
/**
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
@@ -258,7 +347,7 @@ public interface MongoOperations {
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
*
* @param entityClass the parameterized type of the returned list
* @return the converted collection
*/
@@ -272,7 +361,7 @@ public interface MongoOperations {
* <p/>
* If your collection does not contain a homogeneous collection of types, this operation will not be an efficient way
* to map objects since the test for class type is done in the client and not on the server.
*
*
* @param entityClass the parameterized type of the returned list.
* @param collectionName name of the collection to retrieve the objects from
* @return the converted collection
@@ -282,7 +371,7 @@ public interface MongoOperations {
/**
* Execute a group operation over the entire collection. The group operation entity class should match the 'shape' of
* the returned object that takes into account the initial document structure as well as any finalize functions.
*
*
* @param criteria The criteria that restricts the rows that are considered for grouping. If not specified, all rows are
* considered.
* @param inputCollectionName the collection where the group operation will read from
@@ -297,7 +386,7 @@ public interface MongoOperations {
* Execute a group operation restricting the rows to those which match the provided Criteria. The group operation
* entity class should match the 'shape' of the returned object that takes into account the initial document structure
* as well as any finalize functions.
*
*
* @param criteria The criteria that restricts the rows that are considered for grouping. If not specified, all rows are
* considered.
* @param inputCollectionName the collection where the group operation will read from
@@ -311,7 +400,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
* inputCollection is derived from the inputType of the aggregation.
*
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName The name of the input collection to use for the aggregation.
@@ -324,7 +413,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class. The name of the
* inputCollection is derived from the inputType of the aggregation.
*
*
* @param aggregation The {@link TypedAggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param outputType The parameterized type of the returned list, must not be {@literal null}.
@@ -335,7 +424,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
*
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param inputType the inputType where the aggregation operation will read from, must not be {@literal null} or
@@ -348,7 +437,7 @@ public interface MongoOperations {
/**
* Execute an aggregation operation. The raw results will be mapped to the given entity class.
*
*
* @param aggregation The {@link Aggregation} specification holding the aggregation operations, must not be
* {@literal null}.
* @param collectionName the collection where the aggregation operation will read from, must not be {@literal null} or
@@ -361,7 +450,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation. The map-reduce operation will be formed with an output type of INLINE
*
*
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
* @param reduceFunction The JavaScript reduce function
@@ -374,7 +463,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation that takes additional map-reduce options.
*
*
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
* @param reduceFunction The JavaScript reduce function
@@ -388,7 +477,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation that takes a query. The map-reduce operation will be formed with an output type of
* INLINE
*
*
* @param query The query to use to select the data for the map phase
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
@@ -402,7 +491,7 @@ public interface MongoOperations {
/**
* Execute a map-reduce operation that takes a query and additional map-reduce options
*
*
* @param query The query to use to select the data for the map phase
* @param inputCollectionName the collection where the map-reduce will read from
* @param mapFunction The JavaScript map function
@@ -416,8 +505,10 @@ public interface MongoOperations {
/**
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Will consider entity mapping
* information to determine the collection the query is ran against.
*
* information to determine the collection the query is run against. Note that MongoDB limits the number of results
* by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a particular number of
* results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
* @return
@@ -425,8 +516,10 @@ public interface MongoOperations {
<T> GeoResults<T> geoNear(NearQuery near, Class<T> entityClass);
/**
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}.
*
* Returns {@link GeoResults} for all entities matching the given {@link NearQuery}. Note that MongoDB limits the
* number of results by default. Make sure to add an explicit limit to the {@link NearQuery} if you expect a
* particular number of results.
*
* @param near must not be {@literal null}.
* @param entityClass must not be {@literal null}.
* @param collectionName the collection to trigger the query against. If no collection name is given the entity class
@@ -444,7 +537,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -461,7 +554,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -471,8 +564,10 @@ public interface MongoOperations {
<T> T findOne(Query query, Class<T> entityClass, String collectionName);
/**
* Determine result of given {@link Query} contains at least one element.
*
* Determine whether the result of the given {@link Query} contains at least one element. <br />
* <strong>NOTE:</strong> Any additional support for query/field mapping, etc. is not available due to the lack of
* domain type information. Use {@link #exists(Query, Class, String)} to get full type specific support.
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param collectionName name of the collection to check for objects.
* @return
@@ -481,7 +576,7 @@ public interface MongoOperations {
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parameterized type.
* @return
@@ -490,7 +585,7 @@ public interface MongoOperations {
/**
* Determine whether the result of the given {@link Query} contains at least one element.
*
*
* @param query the {@link Query} class that specifies the criteria used to find a record.
* @param entityClass the parameterized type.
* @param collectionName name of the collection to check for objects.
@@ -506,7 +601,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -522,7 +617,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -534,7 +629,7 @@ public interface MongoOperations {
/**
* Returns a document with the given id mapped onto the given class. The collection the query is run against will be
* derived from the given target class as well.
*
*
* @param <T>
* @param id the id of the document to return.
* @param entityClass the type the document shall be converted into.
@@ -544,7 +639,7 @@ public interface MongoOperations {
/**
* Returns the document with the given id from the given collection mapped onto the given target class.
*
*
* @param id the id of the document to return
* @param entityClass the type to convert the document to
* @param collectionName the collection to query for the document
@@ -554,9 +649,9 @@ public interface MongoOperations {
<T> T findById(Object id, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -566,9 +661,9 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -579,10 +674,10 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -593,10 +688,10 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
* @param update the {@link Update} to apply on matching documents.
@@ -617,7 +712,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -634,7 +729,7 @@ public interface MongoOperations {
* <p/>
* The query is specified as a {@link Query} which can be created either using the {@link BasicQuery} or the more
* feature rich {@link Query}.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param entityClass the parameterized type of the returned list.
@@ -645,7 +740,7 @@ public interface MongoOperations {
/**
* Returns the number of documents for the given {@link Query} by querying the collection of the given entity class.
*
*
* @param query
* @param entityClass must not be {@literal null}.
* @return
@@ -653,14 +748,28 @@ public interface MongoOperations {
long count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection.
*
* Returns the number of documents for the given {@link Query} querying the given collection. The given {@link Query}
* must solely consist of document field references as we lack type information to map potential property references
* onto document fields. Use {@link #count(Query, Class, String)} to get full type specific support.
*
* @param query
* @param collectionName must not be {@literal null} or empty.
* @return
* @see #count(Query, Class, String)
*/
long count(Query query, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}.
*
* @param query
* @param entityClass must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return
*/
long count(Query query, Class<?> entityClass, String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>
@@ -668,13 +777,13 @@ public interface MongoOperations {
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" >
* Spring's Type Conversion"</a> for more details.
* <p/>
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
*
* @param objectToSave the object to store in the collection.
*/
void insert(Object objectToSave);
@@ -686,7 +795,7 @@ public interface MongoOperations {
* configured otherwise, an instance of MappingMongoConverter will be used.
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
*
*
* @param objectToSave the object to store in the collection
* @param collectionName name of the collection to store the object in
*/
@@ -694,7 +803,7 @@ public interface MongoOperations {
/**
* Insert a Collection of objects into a collection in a single batch write to the database.
*
*
* @param batchToSave the list of objects to save.
* @param entityClass class that determines the collection to use
*/
@@ -702,7 +811,7 @@ public interface MongoOperations {
/**
* Insert a list of objects into the specified collection in a single batch write to the database.
*
*
* @param batchToSave the list of objects to save.
* @param collectionName name of the collection to store the object in
*/
@@ -711,7 +820,7 @@ public interface MongoOperations {
/**
* Insert a mixed Collection of objects into a database collection determining the collection name to use based on the
* class.
*
*
* @param collectionToSave the list of objects to save.
*/
void insertAll(Collection<? extends Object> objectsToSave);
@@ -725,10 +834,10 @@ public interface MongoOperations {
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion"</a> for more details.
*
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" >
* Spring's Type Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection
*/
void save(Object objectToSave);
@@ -745,7 +854,7 @@ public interface MongoOperations {
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert">Spring's
* Type Conversion"</a> for more details.
*
*
* @param objectToSave the object to store in the collection
* @param collectionName name of the collection to store the object in
*/
@@ -754,7 +863,7 @@ public interface MongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class that determines the collection to use
@@ -764,8 +873,10 @@ public interface MongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
* combining the query document and the update document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #upsert(Query, Update, Class, String)} to get full type specific support.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -777,7 +888,7 @@ public interface MongoOperations {
/**
* Performs an upsert. If no document is found that matches the query, a new document is created and inserted by
* combining the query document and the update document.
*
*
* @param query the query document that specifies the criteria used to select a record to be upserted
* @param update the update document that contains the updated object or $ operators to manipulate the existing object
* @param entityClass class of the pojo to be operated on
@@ -789,7 +900,7 @@ public interface MongoOperations {
/**
* Updates the first object that is found in the collection of the entity class that matches the query document with
* the provided update document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -800,8 +911,10 @@ public interface MongoOperations {
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document.
*
* the provided updated document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateFirst(Query, Update, Class, String)} to get full type specific support.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -813,7 +926,7 @@ public interface MongoOperations {
/**
* Updates the first object that is found in the specified collection that matches the query document criteria with
* the provided updated document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -826,7 +939,7 @@ public interface MongoOperations {
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -837,8 +950,10 @@ public interface MongoOperations {
/**
* Updates all objects that are found in the specified collection that matches the query document criteria with the
* provided updated document.
*
* provided updated document. <br />
* <strong>NOTE:</strong> Any additional support for field mapping, versions, etc. is not available due to the lack of
* domain type information. Use {@link #updateMulti(Query, Update, Class, String)} to get full type specific support.
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -850,7 +965,7 @@ public interface MongoOperations {
/**
* Updates all objects that are found in the collection for the entity class that matches the query document criteria
* with the provided updated document.
*
*
* @param query the query document that specifies the criteria used to select a record to be updated
* @param update the update document that contains the updated object or $ operators to manipulate the existing
* object.
@@ -862,14 +977,14 @@ public interface MongoOperations {
/**
* Remove the given object from the collection by id.
*
*
* @param object
*/
WriteResult remove(Object object);
/**
* Removes the given object from the given collection.
*
*
* @param object
* @param collection must not be {@literal null} or empty.
*/
@@ -878,7 +993,7 @@ public interface MongoOperations {
/**
 * Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
*
* @param query
* @param entityClass
*/
@@ -887,7 +1002,7 @@ public interface MongoOperations {
/**
 * Remove all documents that match the provided query document criteria from the collection used to store the
* entityClass. The Class parameter is also used to help convert the Id of the object if it is present in the query.
*
*
* @param query
* @param entityClass
* @param collectionName
@@ -896,18 +1011,22 @@ public interface MongoOperations {
/**
* Remove all documents from the specified collection that match the provided query document criteria. There is no
* conversion/mapping done for any criteria using the id field.
*
* conversion/mapping done for any criteria using the id field. <br />
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #remove(Query, Class, String)} to get full type specific support.
*
* @param query the query document that specifies the criteria used to remove a record
 * @param collectionName name of the collection where the objects will be removed
*/
WriteResult remove(Query query, String collectionName);
/**
 * Returns and removes all documents from the specified collection that match the provided query.
*
* @param query
* @param collectionName
 * Returns and removes all documents from the specified collection that match the provided query. <br />
* <strong>NOTE:</strong> Any additional support for field mapping is not available due to the lack of domain type
* information. Use {@link #findAllAndRemove(Query, Class, String)} to get full type specific support.
*
* @param query must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @return
* @since 1.5
*/
@@ -915,7 +1034,7 @@ public interface MongoOperations {
/**
 * Returns and removes all documents matching the given query from the collection used to store the entityClass.
*
*
* @param query
* @param entityClass
* @return
@@ -927,7 +1046,7 @@ public interface MongoOperations {
 * Returns and removes all documents that match the provided query document criteria from the collection used to
* store the entityClass. The Class parameter is also used to help convert the Id of the object if it is present in
* the query.
*
*
* @param query
* @param entityClass
* @param collectionName
@@ -938,7 +1057,7 @@ public interface MongoOperations {
/**
* Returns the underlying {@link MongoConverter}.
*
*
* @return
*/
MongoConverter getConverter();
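
A short usage sketch for the operations documented above, assuming a MongoTemplate-backed MongoOperations instance; the Person type and the field names are illustrative, not part of the interface:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

class UpdateAndRemoveExamples {

	static void examples(MongoOperations operations) {

		// Typed overload: "name" and "age" are mapped against the Person metamodel.
		operations.upsert(Query.query(Criteria.where("name").is("Dave")), Update.update("age", 42), Person.class);

		// Collection-name overload: no domain type information, so no field mapping
		// or version handling is applied (see the NOTE above).
		operations.updateFirst(Query.query(Criteria.where("name").is("Dave")), Update.update("age", 43), "person");

		// Remove all matching documents from the collection backing Person.
		operations.remove(Query.query(Criteria.where("age").lt(18)), Person.class);
	}

	// Illustrative domain type.
	static class Person {
		String id;
		String name;
		int age;
	}
}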

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,41 +17,48 @@ package org.springframework.data.mongodb.core;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.MongoOptions;
/**
* A factory bean for construction of a {@link MongoOptions} instance.
*
 * A factory bean for construction of a {@link MongoOptions} instance. When used with MongoDB Java driver version 3,
 * properties not supported by the driver will be ignored.
*
* @author Graeme Rocher
* @author Mark Pollack
* @author Mike Saavedra
* @author Thomas Darimont
* @author Christoph Strobl
* @deprecated since 1.7. Please use {@link MongoClientOptionsFactoryBean} instead.
*/
@SuppressWarnings("deprecation")
public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, InitializingBean {
@Deprecated
public class MongoOptionsFactoryBean extends AbstractFactoryBean<MongoOptions> {
private static final MongoOptions DEFAULT_MONGO_OPTIONS = new MongoOptions();
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.connectionsPerHost;
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS.threadsAllowedToBlockForConnectionMultiplier;
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.maxWaitTime;
private int connectTimeout = DEFAULT_MONGO_OPTIONS.connectTimeout;
private int socketTimeout = DEFAULT_MONGO_OPTIONS.socketTimeout;
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.socketKeepAlive;
private boolean autoConnectRetry = DEFAULT_MONGO_OPTIONS.autoConnectRetry;
private long maxAutoConnectRetryTime = DEFAULT_MONGO_OPTIONS.maxAutoConnectRetryTime;
private int writeNumber = DEFAULT_MONGO_OPTIONS.w;
private int writeTimeout = DEFAULT_MONGO_OPTIONS.wtimeout;
private boolean writeFsync = DEFAULT_MONGO_OPTIONS.fsync;
private boolean slaveOk = DEFAULT_MONGO_OPTIONS.slaveOk;
private int connectionsPerHost = DEFAULT_MONGO_OPTIONS.getConnectionsPerHost();
private int threadsAllowedToBlockForConnectionMultiplier = DEFAULT_MONGO_OPTIONS
.getThreadsAllowedToBlockForConnectionMultiplier();
private int maxWaitTime = DEFAULT_MONGO_OPTIONS.getMaxWaitTime();
private int connectTimeout = DEFAULT_MONGO_OPTIONS.getConnectTimeout();
private int socketTimeout = DEFAULT_MONGO_OPTIONS.getSocketTimeout();
private boolean socketKeepAlive = DEFAULT_MONGO_OPTIONS.isSocketKeepAlive();
private int writeNumber = DEFAULT_MONGO_OPTIONS.getW();
private int writeTimeout = DEFAULT_MONGO_OPTIONS.getWtimeout();
private boolean writeFsync = DEFAULT_MONGO_OPTIONS.isFsync();
private boolean autoConnectRetry = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getAutoConnectRetry(DEFAULT_MONGO_OPTIONS) : false;
private long maxAutoConnectRetryTime = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getMaxAutoConnectRetryTime(DEFAULT_MONGO_OPTIONS) : -1;
private boolean slaveOk = !MongoClientVersion.isMongo3Driver() ? ReflectiveMongoOptionsInvoker
.getSlaveOk(DEFAULT_MONGO_OPTIONS) : false;
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
private MongoOptions options;
/**
* Configures the maximum number of connections allowed per host until we will block.
*
@@ -144,7 +151,10 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
/**
* Configures whether or not the system retries automatically on a failed connect. This defaults to {@literal false}.
*
* @deprecated since 1.7.
*/
@Deprecated
public void setAutoConnectRetry(boolean autoConnectRetry) {
this.autoConnectRetry = autoConnectRetry;
}
@@ -154,7 +164,9 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
* defaults to {@literal 0}, which means to use the default {@literal 15s} if {@link #autoConnectRetry} is on.
*
* @param maxAutoConnectRetryTime the maxAutoConnectRetryTime to set
* @deprecated since 1.7
*/
@Deprecated
public void setMaxAutoConnectRetryTime(long maxAutoConnectRetryTime) {
this.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
}
@@ -163,7 +175,9 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
* Specifies if the driver is allowed to read from secondaries or slaves. Defaults to {@literal false}.
*
* @param slaveOk true if the driver should read from secondaries or slaves.
* @deprecated since 1.7
*/
@Deprecated
public void setSlaveOk(boolean slaveOk) {
this.slaveOk = slaveOk;
}
@@ -194,40 +208,41 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
this.sslSocketFactory = sslSocketFactory;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
public void afterPropertiesSet() {
@Override
protected MongoOptions createInstance() throws Exception {
if (MongoClientVersion.isMongo3Driver()) {
throw new IllegalArgumentException(
String
.format("Usage of 'mongo-options' is no longer supported for MongoDB Java driver version 3 and above. Please use 'mongo-client-options' and refer to chapter 'MongoDB 3.0 Support' for details."));
}
MongoOptions options = new MongoOptions();
options.connectionsPerHost = connectionsPerHost;
options.threadsAllowedToBlockForConnectionMultiplier = threadsAllowedToBlockForConnectionMultiplier;
options.maxWaitTime = maxWaitTime;
options.connectTimeout = connectTimeout;
options.socketTimeout = socketTimeout;
options.socketKeepAlive = socketKeepAlive;
options.autoConnectRetry = autoConnectRetry;
options.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
options.slaveOk = slaveOk;
options.w = writeNumber;
options.wtimeout = writeTimeout;
options.fsync = writeFsync;
options.setConnectionsPerHost(connectionsPerHost);
options.setThreadsAllowedToBlockForConnectionMultiplier(threadsAllowedToBlockForConnectionMultiplier);
options.setMaxWaitTime(maxWaitTime);
options.setConnectTimeout(connectTimeout);
options.setSocketTimeout(socketTimeout);
options.setSocketKeepAlive(socketKeepAlive);
options.setW(writeNumber);
options.setWtimeout(writeTimeout);
options.setFsync(writeFsync);
if (ssl) {
options.setSocketFactory(sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault());
}
this.options = options;
}
ReflectiveMongoOptionsInvoker.setAutoConnectRetry(options, autoConnectRetry);
ReflectiveMongoOptionsInvoker.setMaxAutoConnectRetryTime(options, maxAutoConnectRetryTime);
ReflectiveMongoOptionsInvoker.setSlaveOk(options, slaveOk);
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
public MongoOptions getObject() {
return this.options;
return options;
}
/*
@@ -237,12 +252,4 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
public Class<?> getObjectType() {
return MongoOptions.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
public boolean isSingleton() {
return true;
}
}
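
A minimal sketch of using the (deprecated) factory bean programmatically; in practice it is declared as a Spring bean, e.g. via the mongo-options XML element. The setter names below mirror the fields above and are assumed to exist:

import org.springframework.data.mongodb.core.MongoOptionsFactoryBean;
import com.mongodb.MongoOptions;

class MongoOptionsFactoryBeanExample {

	@SuppressWarnings("deprecation")
	static MongoOptions createOptions() throws Exception {

		MongoOptionsFactoryBean factoryBean = new MongoOptionsFactoryBean();
		factoryBean.setConnectionsPerHost(50);
		factoryBean.setConnectTimeout(2000);
		factoryBean.setSocketTimeout(10000);

		// AbstractFactoryBean requires initialization before the product is available.
		factoryBean.afterPropertiesSet();
		return factoryBean.getObject();
	}
}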

View File

@@ -0,0 +1,109 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
/**
* {@link ReflectiveDBCollectionInvoker} provides reflective access to {@link DBCollection} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveDBCollectionInvoker {
private static final Method GEN_INDEX_NAME_METHOD;
private static final Method RESET_INDEX_CHACHE_METHOD;
static {
GEN_INDEX_NAME_METHOD = findMethod(DBCollection.class, "genIndexName", DBObject.class);
RESET_INDEX_CHACHE_METHOD = findMethod(DBCollection.class, "resetIndexCache");
}
private ReflectiveDBCollectionInvoker() {}
/**
 * Convenience method to generate an index name from the set of fields it is over. Falls back to a MongoDB Java driver
 * version 2 compatible way of generating the index name when {@link MongoClientVersion#isMongo3Driver()} applies.
*
* @param keys the names of the fields used in this index
* @return
*/
public static String generateIndexName(DBObject keys) {
if (isMongo3Driver()) {
return genIndexName(keys);
}
return (String) invokeMethod(GEN_INDEX_NAME_METHOD, null, keys);
}
/**
* In case of MongoDB Java driver version 2 all indices that have not yet been applied to this collection will be
* cleared. Since this method is not available for the MongoDB Java driver version 3 the operation will throw
* {@link UnsupportedOperationException}.
*
* @param dbCollection
* @throws UnsupportedOperationException
*/
public static void resetIndexCache(DBCollection dbCollection) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException("The MongoDB Java driver 3 does no longer support resetIndexCache!");
}
invokeMethod(RESET_INDEX_CHACHE_METHOD, dbCollection);
}
/**
* Borrowed from MongoDB Java driver version 2. See <a
* href="http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754"
* >http://github.com/mongodb/mongo-java-driver/blob/r2.13.0/src/main/com/mongodb/DBCollection.java#L754</a>
*
* @param keys
* @return
*/
private static String genIndexName(DBObject keys) {
StringBuilder name = new StringBuilder();
for (String s : keys.keySet()) {
if (name.length() > 0) {
name.append('_');
}
name.append(s).append('_');
Object val = keys.get(s);
if (val instanceof Number || val instanceof String) {
name.append(val.toString().replace(' ', '_'));
}
}
return name.toString();
}
}
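
A small sketch of the naming convention implemented above: each key is appended as "<field>_<direction>", joined by underscores. The caller is assumed to live in the same package, since the invoker is package-private:

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class IndexNameExample {

	static void example() {

		DBObject keys = new BasicDBObject("lastname", 1).append("firstname", -1);

		// Expected to yield "lastname_1_firstname_-1" regardless of the driver version in use.
		String indexName = ReflectiveDBCollectionInvoker.generateIndexName(keys);
		System.out.println(indexName);
	}
}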

View File

@@ -0,0 +1,134 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* {@link ReflectiveDbInvoker} provides reflective access to {@link DB} API that is not consistently available for
* various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveDbInvoker {
private static final Method DB_IS_AUTHENTICATED_METHOD;
private static final Method DB_AUTHENTICATE_METHOD;
private static final Method DB_REQUEST_DONE_METHOD;
private static final Method DB_ADD_USER_METHOD;
private static final Method DB_REQUEST_START_METHOD;
static {
DB_IS_AUTHENTICATED_METHOD = findMethod(DB.class, "isAuthenticated");
DB_AUTHENTICATE_METHOD = findMethod(DB.class, "authenticate", String.class, char[].class);
DB_REQUEST_DONE_METHOD = findMethod(DB.class, "requestDone");
DB_ADD_USER_METHOD = findMethod(DB.class, "addUser", String.class, char[].class);
DB_REQUEST_START_METHOD = findMethod(DB.class, "requestStart");
}
private ReflectiveDbInvoker() {}
/**
* Authenticate against database using provided credentials in case of a MongoDB Java driver version 2.
*
* @param mongo must not be {@literal null}.
* @param db must not be {@literal null}.
* @param credentials must not be {@literal null}.
* @param authenticationDatabaseName
*/
public static void authenticate(Mongo mongo, DB db, UserCredentials credentials, String authenticationDatabaseName) {
String databaseName = db.getName();
DB authDb = databaseName.equals(authenticationDatabaseName) ? db : mongo.getDB(authenticationDatabaseName);
synchronized (authDb) {
Boolean isAuthenticated = (Boolean) invokeMethod(DB_IS_AUTHENTICATED_METHOD, authDb);
if (!isAuthenticated) {
String username = credentials.getUsername();
String password = credentials.hasPassword() ? credentials.getPassword() : null;
Boolean authenticated = (Boolean) invokeMethod(DB_AUTHENTICATE_METHOD, authDb, username,
password == null ? null : password.toCharArray());
if (!authenticated) {
throw new CannotGetMongoDbConnectionException("Failed to authenticate to database [" + databaseName + "], "
+ credentials.toString(), databaseName, credentials);
}
}
}
}
/**
* Starts a new 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java driver
* version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestStart(DB db) {
if (isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_START_METHOD, db);
}
/**
 * Ends the current 'consistent request' in case of MongoDB Java driver version 2. Will do nothing for MongoDB Java
 * driver version 3 since the operation is no longer available.
*
* @param db
*/
public static void requestDone(DB db) {
if (MongoClientVersion.isMongo3Driver()) {
return;
}
invokeMethod(DB_REQUEST_DONE_METHOD, db);
}
/**
* @param db
* @param username
* @param password
* @throws UnsupportedOperationException
*/
public static void addUser(DB db, String username, char[] password) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Please use DB.command(…) to call either the createUser or updateUser command!");
}
invokeMethod(DB_ADD_USER_METHOD, db, username, password);
}
}
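
A hedged sketch (same package assumed) of the guard pattern the invoker enables: wrapping work in a 'consistent request' that silently degrades to a no-op on the 3.x driver:

import com.mongodb.DB;

class ConsistentRequestExample {

	static void withConsistentRequest(DB db, Runnable work) {

		ReflectiveDbInvoker.requestStart(db);
		try {
			work.run();
		} finally {
			ReflectiveDbInvoker.requestDone(db);
		}
	}
}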

View File

@@ -0,0 +1,62 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.util.Assert;
import com.mongodb.MapReduceCommand;
/**
* {@link ReflectiveMapReduceInvoker} provides reflective access to {@link MapReduceCommand} API that is not
* consistently available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveMapReduceInvoker {
private static final Method ADD_EXTRA_OPTION_METHOD;
static {
ADD_EXTRA_OPTION_METHOD = findMethod(MapReduceCommand.class, "addExtraOption", String.class, Object.class);
}
private ReflectiveMapReduceInvoker() {}
/**
 * Sets the extra option for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3.
 *
 * @param cmd can be {@literal null} for MongoDB Java driver version 3.
* @param key
* @param value
*/
public static void addExtraOption(MapReduceCommand cmd, String key, Object value) {
if (isMongo3Driver()) {
return;
}
Assert.notNull(cmd, "MapReduceCommand must not be null!");
invokeMethod(ADD_EXTRA_OPTION_METHOD, cmd, key, value);
}
}
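
A minimal sketch (same package assumed); the "verbose" option key is purely illustrative:

import com.mongodb.MapReduceCommand;

class MapReduceOptionExample {

	static void configure(MapReduceCommand command) {
		// Only takes effect with the 2.x driver; ignored with driver version 3.
		ReflectiveMapReduceInvoker.addExtraOption(command, "verbose", true);
	}
}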

View File

@@ -0,0 +1,158 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.beans.DirectFieldAccessor;
import org.springframework.util.ReflectionUtils;
import com.mongodb.MongoOptions;
/**
* {@link ReflectiveMongoOptionsInvoker} provides reflective access to {@link MongoOptions} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
@SuppressWarnings("deprecation")
class ReflectiveMongoOptionsInvoker {
private static final Method GET_AUTO_CONNECT_RETRY_METHOD;
private static final Method SET_AUTO_CONNECT_RETRY_METHOD;
private static final Method GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
private static final Method SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD;
static {
SET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils
.findMethod(MongoOptions.class, "setAutoConnectRetry", boolean.class);
GET_AUTO_CONNECT_RETRY_METHOD = ReflectionUtils.findMethod(MongoOptions.class, "isAutoConnectRetry");
SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"setMaxAutoConnectRetryTime", long.class);
GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD = ReflectionUtils.findMethod(MongoOptions.class,
"getMaxAutoConnectRetryTime");
}
private ReflectiveMongoOptionsInvoker() {}
/**
* Sets the retry connection flag for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param autoConnectRetry
*/
public static void setAutoConnectRetry(MongoOptions options, boolean autoConnectRetry) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_AUTO_CONNECT_RETRY_METHOD, options, autoConnectRetry);
}
/**
* Sets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java
* driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param maxAutoConnectRetryTime
*/
public static void setMaxAutoConnectRetryTime(MongoOptions options, long maxAutoConnectRetryTime) {
if (isMongo3Driver()) {
return;
}
invokeMethod(SET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options, maxAutoConnectRetryTime);
}
/**
* Sets the slaveOk attribute for MongoDB Java driver version 2. Will do nothing for MongoDB Java driver version 3
* since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @param slaveOk
*/
public static void setSlaveOk(MongoOptions options, boolean slaveOk) {
if (isMongo3Driver()) {
return;
}
new DirectFieldAccessor(options).setPropertyValue("slaveOk", slaveOk);
}
/**
* Gets the slaveOk attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException} for
* MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getSlaveOk(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for slaveOk which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) new DirectFieldAccessor(options).getPropertyValue("slaveOk")).booleanValue();
}
/**
* Gets the autoConnectRetry attribute for MongoDB Java driver version 2. Throws {@link UnsupportedOperationException}
* for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static boolean getAutoConnectRetry(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for autoConnectRetry which has been removed in MongoDB Java driver version 3.");
}
return ((Boolean) invokeMethod(GET_AUTO_CONNECT_RETRY_METHOD, options)).booleanValue();
}
/**
* Gets the maxAutoConnectRetryTime attribute for MongoDB Java driver version 2. Throws
* {@link UnsupportedOperationException} for MongoDB Java driver version 3 since the method has been removed.
*
* @param options can be {@literal null} for MongoDB Java driver version 3.
* @return
* @throws UnsupportedOperationException
*/
public static long getMaxAutoConnectRetryTime(MongoOptions options) {
if (isMongo3Driver()) {
throw new UnsupportedOperationException(
"Cannot get value for maxAutoConnectRetryTime which has been removed in MongoDB Java driver version 3.");
}
return ((Long) invokeMethod(GET_MAX_AUTO_CONNECT_RETRY_TIME_METHOD, options)).longValue();
}
}

View File

@@ -0,0 +1,48 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import org.springframework.beans.DirectFieldAccessor;
import com.mongodb.WriteConcern;
/**
* {@link ReflectiveWriteConcernInvoker} provides reflective access to {@link WriteConcern} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
class ReflectiveWriteConcernInvoker {
private static final WriteConcern NONE_OR_UNACKNOWLEDGED;
static {
NONE_OR_UNACKNOWLEDGED = isMongo3Driver() ? WriteConcern.UNACKNOWLEDGED : (WriteConcern) new DirectFieldAccessor(
new WriteConcern()).getPropertyValue("NONE");
}
/**
* @return {@link WriteConcern#NONE} for MongoDB Java driver version 2, otherwise {@link WriteConcern#UNACKNOWLEDGED}.
*/
public static WriteConcern noneOrUnacknowledged() {
return NONE_OR_UNACKNOWLEDGED;
}
}

View File

@@ -0,0 +1,67 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import com.mongodb.MongoException;
import com.mongodb.WriteResult;
/**
* {@link ReflectiveWriteResultInvoker} provides reflective access to {@link WriteResult} API that is not consistently
* available for various driver versions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
final class ReflectiveWriteResultInvoker {
private static final Method GET_ERROR_METHOD;
private static final Method WAS_ACKNOWLEDGED_METHOD;
private ReflectiveWriteResultInvoker() {}
static {
GET_ERROR_METHOD = findMethod(WriteResult.class, "getError");
WAS_ACKNOWLEDGED_METHOD = findMethod(WriteResult.class, "wasAcknowledged");
}
/**
* @param writeResult can be {@literal null} for MongoDB Java driver version 3.
* @return null in case of MongoDB Java driver version 3 since errors are thrown as {@link MongoException}.
*/
public static String getError(WriteResult writeResult) {
if (isMongo3Driver()) {
return null;
}
return (String) invokeMethod(GET_ERROR_METHOD, writeResult);
}
/**
* @param writeResult
 * @return {@literal true} in case of MongoDB Java driver version 2, otherwise the acknowledgement reported by the driver.
*/
public static boolean wasAcknowledged(WriteResult writeResult) {
return isMongo3Driver() ? ((Boolean) invokeMethod(WAS_ACKNOWLEDGED_METHOD, writeResult)).booleanValue() : true;
}
}
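
A short sketch (same package assumed) of the error-handling pattern this enables: with the 2.x driver errors are reported via getError(), while with the 3.x driver they surface as MongoExceptions, so getError() returns null:

import com.mongodb.WriteResult;

class WriteResultCheckExample {

	static void verify(WriteResult writeResult) {

		if (!ReflectiveWriteResultInvoker.wasAcknowledged(writeResult)) {
			return; // nothing to verify for unacknowledged writes
		}

		String error = ReflectiveWriteResultInvoker.getError(writeResult);
		if (error != null) {
			throw new IllegalStateException("Write failed: " + error);
		}
	}
}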

View File

@@ -0,0 +1,84 @@
/*
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Set;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import com.mongodb.DB;
/**
* Script operations on {@link com.mongodb.DB} level. Allows interaction with server side JavaScript functions.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public interface ScriptOperations {
/**
 * Stores the given {@link ExecutableMongoScript}, generating a synthetic name so that the script can be called by
 * that name subsequently.
*
* @param script must not be {@literal null}.
* @return {@link NamedMongoScript} with name under which the {@code JavaScript} function can be called.
*/
NamedMongoScript register(ExecutableMongoScript script);
/**
* Registers the given {@link NamedMongoScript} in the database.
*
* @param script the {@link NamedMongoScript} to be registered.
* @return
*/
NamedMongoScript register(NamedMongoScript script);
/**
* Executes the {@literal script} by either calling it via its {@literal name} or directly sending it.
*
* @param script must not be {@literal null}.
* @param args arguments to pass on for script execution.
* @return the script evaluation result.
* @throws org.springframework.dao.DataAccessException
*/
Object execute(ExecutableMongoScript script, Object... args);
/**
* Call the {@literal JavaScript} by its name.
*
* @param scriptName must not be {@literal null} or empty.
* @param args
* @return
*/
Object call(String scriptName, Object... args);
/**
* Checks {@link DB} for existence of {@link ServerSideJavaScript} with given name.
*
* @param scriptName must not be {@literal null} or empty.
* @return false if no {@link ServerSideJavaScript} with given name exists.
*/
boolean exists(String scriptName);
/**
* Returns names of {@literal JavaScript} functions that can be called.
*
* @return empty {@link Set} if no scripts found.
*/
Set<String> getScriptNames();
}
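
A brief usage sketch for the interface above; the JavaScript bodies and the script name are illustrative:

import org.springframework.data.mongodb.core.ScriptOperations;
import org.springframework.data.mongodb.core.script.ExecutableMongoScript;
import org.springframework.data.mongodb.core.script.NamedMongoScript;

class ScriptOperationsExample {

	static void example(ScriptOperations scriptOps) {

		// Execute an ad-hoc script directly.
		Object sum = scriptOps.execute(new ExecutableMongoScript("function(x, y) { return x + y; }"), 1, 2);

		// Register a script under a name and call it by that name later on.
		NamedMongoScript echo = scriptOps.register(new NamedMongoScript("echo", "function(x) { return x; }"));
		Object echoed = scriptOps.call(echo.getName(), "hello");
	}
}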

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,6 +19,7 @@ import java.net.UnknownHostException;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
@@ -27,6 +28,8 @@ import org.springframework.util.StringUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoException;
import com.mongodb.MongoURI;
import com.mongodb.WriteConcern;
@@ -37,6 +40,7 @@ import com.mongodb.WriteConcern;
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
@@ -54,7 +58,9 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
*
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName database name, not be {@literal null} or empty.
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClient, String)}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName) {
this(mongo, databaseName, null);
}
@@ -65,7 +71,9 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @param mongo Mongo instance, must not be {@literal null}.
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials) {
this(mongo, databaseName, credentials, false, null);
}
@@ -77,7 +85,9 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @param databaseName Database name, must not be {@literal null} or empty.
* @param credentials username and password.
* @param authenticationDatabaseName the database name to use for authentication
* @deprecated since 1.7. The credentials used should be provided by {@link MongoClient#getCredentialsList()}.
*/
@Deprecated
public SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
String authenticationDatabaseName) {
this(mongo, databaseName, credentials, false, authenticationDatabaseName);
@@ -90,16 +100,44 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @throws MongoException
* @throws UnknownHostException
* @see MongoURI
* @deprecated since 1.7. Please use {@link #SimpleMongoDbFactory(MongoClientURI)} instead.
*/
@SuppressWarnings("deprecation")
@Deprecated
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())),
true, uri.getDatabase());
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), true,
uri.getDatabase());
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}.
*
* @param uri must not be {@literal null}.
* @throws UnknownHostException
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClientURI uri) throws UnknownHostException {
this(new MongoClient(uri), uri.getDatabase(), true);
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null}.
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false);
}
private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
boolean mongoInstanceCreated, String authenticationDatabaseName) {
if (mongo instanceof MongoClient && (credentials != null && !UserCredentials.NO_CREDENTIALS.equals(credentials))) {
throw new InvalidDataAccessApiUsageException(
"Usage of 'UserCredentials' with 'MongoClient' is no longer supported. Please use 'MongoCredential' for 'MongoClient' or just 'Mongo'.");
}
Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(databaseName, "Database name must not be empty");
Assert.isTrue(databaseName.matches("[\\w-]+"),
@@ -117,6 +155,25 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
"Authentication database name must only contain letters, numbers, underscores and dashes!");
}
/**
* @param client
* @param databaseName
* @param mongoInstanceCreated
* @since 1.7
*/
private SimpleMongoDbFactory(MongoClient client, String databaseName, boolean mongoInstanceCreated) {
Assert.notNull(client, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
this.mongo = client;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.exceptionTranslator = new MongoExceptionTranslator();
this.credentials = UserCredentials.NO_CREDENTIALS;
this.authenticationDatabaseName = databaseName;
}
/**
* Configures the {@link WriteConcern} to be used on the {@link DB} instance being created.
*
@@ -138,6 +195,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/
@SuppressWarnings("deprecation")
public DB getDb(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty.");
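
A short sketch of the non-deprecated construction paths introduced above; host, port and database name are placeholders:

import java.net.UnknownHostException;

import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;

class MongoDbFactorySetupExample {

	static MongoDbFactory fromUri() throws UnknownHostException {
		return new SimpleMongoDbFactory(new MongoClientURI("mongodb://localhost:27017/database"));
	}

	static MongoDbFactory fromClient() throws UnknownHostException {
		return new SimpleMongoDbFactory(new MongoClient("localhost", 27017), "database");
	}
}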

View File

@@ -0,0 +1,153 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Christoph Strobl
* @since 1.10
*/
abstract class AbstractAggregationExpression implements AggregationExpression {
private final Object value;
protected AbstractAggregationExpression(Object value) {
this.value = value;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return toDbObject(this.value, context);
}
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
Object valueToUse;
if (value instanceof List) {
List<Object> arguments = (List<Object>) value;
List<Object> args = new ArrayList<Object>(arguments.size());
for (Object val : arguments) {
args.add(unpack(val, context));
}
valueToUse = args;
} else if (value instanceof java.util.Map) {
DBObject dbo = new BasicDBObject();
for (java.util.Map.Entry<String, Object> entry : ((java.util.Map<String, Object>) value).entrySet()) {
dbo.put(entry.getKey(), unpack(entry.getValue(), context));
}
valueToUse = dbo;
} else {
valueToUse = unpack(value, context);
}
return new BasicDBObject(getMongoMethod(), valueToUse);
}
protected static List<Field> asFields(String... fieldRefs) {
if (ObjectUtils.isEmpty(fieldRefs)) {
return Collections.emptyList();
}
return Fields.fields(fieldRefs).asList();
}
@SuppressWarnings("unchecked")
private Object unpack(Object value, AggregationOperationContext context) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
if (value instanceof Field) {
return context.getReference((Field) value).toString();
}
if (value instanceof List) {
List<Object> sourceList = (List<Object>) value;
List<Object> mappedList = new ArrayList<Object>(sourceList.size());
for (Object item : sourceList) {
mappedList.add(unpack(item, context));
}
return mappedList;
}
return value;
}
protected List<Object> append(Object value) {
if (this.value instanceof List) {
List<Object> clone = new ArrayList<Object>((List) this.value);
if (value instanceof List) {
for (Object val : (List) value) {
clone.add(val);
}
} else {
clone.add(value);
}
return clone;
}
return Arrays.asList(this.value, value);
}
@SuppressWarnings("unchecked")
protected java.util.Map<String, Object> append(String key, Object value) {
if (!(this.value instanceof java.util.Map)) {
throw new IllegalArgumentException("o_O");
}
java.util.Map<String, Object> clone = new LinkedHashMap<String, Object>((java.util.Map<String, Object>) this.value);
clone.put(key, value);
return clone;
}
protected List<Object> values() {
if (value instanceof List) {
return new ArrayList<Object>((List) value);
}
if (value instanceof java.util.Map) {
return new ArrayList<Object>(((java.util.Map) value).values());
}
return new ArrayList<Object>(Collections.singletonList(value));
}
protected abstract String getMongoMethod();
}
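
A hypothetical subclass (same package assumed, since the base class is package-private) illustrating the contract: the value handed to the constructor becomes the argument of the operator named by getMongoMethod(), so the expression below would render to { "$abs" : "$amount" }. The $abs operator and the field name are illustrative and not part of this change:

class Abs extends AbstractAggregationExpression {

	private Abs(Object value) {
		super(value);
	}

	static Abs absoluteValueOf(String fieldReference) {
		return new Abs(Fields.field(fieldReference));
	}

	@Override
	protected String getMongoMethod() {
		return "$abs";
	}
}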

View File

@@ -0,0 +1,648 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Gateway to {@literal accumulator} aggregation operations.
*
* @author Christoph Strobl
* @since 1.10
* @soundtrack Rage Against The Machine - Killing In The Name
*/
public class AccumulatorOperators {
/**
* Take the numeric value referenced by given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static AccumulatorOperatorFactory valueOf(String fieldReference) {
return new AccumulatorOperatorFactory(fieldReference);
}
/**
* Take the numeric value referenced resulting from given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static AccumulatorOperatorFactory valueOf(AggregationExpression expression) {
return new AccumulatorOperatorFactory(expression);
}
/**
* @author Christoph Strobl
*/
public static class AccumulatorOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
/**
* Creates new {@link AccumulatorOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public AccumulatorOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
}
/**
* Creates new {@link AccumulatorOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public AccumulatorOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and calculates and
* returns the sum.
*
* @return
*/
public Sum sum() {
return usesFieldRef() ? Sum.sumOf(fieldReference) : Sum.sumOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and returns the
* average value.
*
* @return
*/
public Avg avg() {
return usesFieldRef() ? Avg.avgOf(fieldReference) : Avg.avgOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and returns the
* maximum value.
*
* @return
*/
public Max max() {
return usesFieldRef() ? Max.maxOf(fieldReference) : Max.maxOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and returns the
* minimum value.
*
* @return
*/
public Min min() {
return usesFieldRef() ? Min.minOf(fieldReference) : Min.minOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and calculates the
* population standard deviation of the input values.
*
* @return
*/
public StdDevPop stdDevPop() {
return usesFieldRef() ? StdDevPop.stdDevPopOf(fieldReference) : StdDevPop.stdDevPopOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and calculates the
* sample standard deviation of the input values.
*
* @return
*/
public StdDevSamp stdDevSamp() {
return usesFieldRef() ? StdDevSamp.stdDevSampOf(fieldReference) : StdDevSamp.stdDevSampOf(expression);
}
private boolean usesFieldRef() {
return fieldReference != null;
}
}
/**
* {@link AggregationExpression} for {@code $sum}.
*
* @author Christoph Strobl
*/
public static class Sum extends AbstractAggregationExpression {
private Sum(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$sum";
}
/**
* Creates new {@link Sum}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Sum sumOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Sum(asFields(fieldReference));
}
/**
* Creates new {@link Sum}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Sum sumOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Sum(Collections.singletonList(expression));
}
/**
* Creates new {@link Sum} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Sum and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Sum(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Sum} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Sum and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Sum(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
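
A brief usage sketch for Sum; "netPrice" and "tax" are illustrative field names:

// Renders to { "$sum" : [ "$netPrice", "$tax" ] } when used within a $project stage.
AggregationExpression total = AccumulatorOperators.valueOf("netPrice").sum().and("tax");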
/**
* {@link AggregationExpression} for {@code $avg}.
*
* @author Christoph Strobl
*/
public static class Avg extends AbstractAggregationExpression {
private Avg(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$avg";
}
/**
* Creates new {@link Avg}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Avg avgOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Avg(asFields(fieldReference));
}
/**
* Creates new {@link Avg}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Avg avgOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Avg(Collections.singletonList(expression));
}
/**
* Creates new {@link Avg} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Avg and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Avg(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Avg} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Avg and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Avg(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $max}.
*
* @author Christoph Strobl
*/
public static class Max extends AbstractAggregationExpression {
private Max(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$max";
}
/**
* Creates new {@link Max}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Max maxOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Max(asFields(fieldReference));
}
/**
* Creates new {@link Max}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Max maxOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Max(Collections.singletonList(expression));
}
/**
* Creates new {@link Max} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Max and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Max(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Max} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Max and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Max(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $min}.
*
* @author Christoph Strobl
*/
public static class Min extends AbstractAggregationExpression {
private Min(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$min";
}
/**
* Creates new {@link Min}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Min minOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Min(asFields(fieldReference));
}
/**
* Creates new {@link Min}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Min minOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Min(Collections.singletonList(expression));
}
/**
* Creates new {@link Min} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Min and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Min(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Min} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Min and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Min(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $stdDevPop}.
*
* @author Christoph Strobl
*/
public static class StdDevPop extends AbstractAggregationExpression {
private StdDevPop(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$stdDevPop";
}
/**
* Creates new {@link StdDevPop}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static StdDevPop stdDevPopOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevPop(asFields(fieldReference));
}
/**
* Creates new {@link StdDevPop}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static StdDevPop stdDevPopOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevPop(Collections.singletonList(expression));
}
/**
* Creates new {@link StdDevPop} with all previously added arguments appending the given one. <br/>
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public StdDevPop and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevPop(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link StdDevPop} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public StdDevPop and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevPop(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $stdDevSamp}.
*
* @author Christoph Strobl
*/
public static class StdDevSamp extends AbstractAggregationExpression {
private StdDevSamp(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$stdDevSamp";
}
/**
* Creates new {@link StdDevSamp}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static StdDevSamp stdDevSampOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevSamp(asFields(fieldReference));
}
/**
* Creates new {@link StdDevSamp}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static StdDevSamp stdDevSampOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevSamp(Collections.singletonList(expression));
}
/**
* Creates new {@link StdDevSamp} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public StdDevSamp and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevSamp(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link StdDevSamp} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public StdDevSamp and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevSamp(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
}
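A minimal usage sketch (not part of the changeset): combining the accumulator operators above inside a {@code $project} stage. The hypothetical AccumulatorOperatorsSample class and the ProjectionOperation.and(AggregationExpression)/.as(...) chaining are assumptions based on the surrounding 1.10 API.

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.Max;
import org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.StdDevSamp;
import org.springframework.data.mongodb.core.aggregation.Aggregation;

class AccumulatorOperatorsSample {

    Aggregation scoreStatistics() {
        // Renders roughly to:
        // { $project: { maxScore: { $max: [ "$quizScore", "$examScore" ] },
        //               scoreStdDev: { $stdDevSamp: "$examScore" } } }
        return newAggregation(
                project("student")
                        .and(Max.maxOf("quizScore").and("examScore")).as("maxScore")
                        .and(StdDevSamp.stdDevSampOf("examScore")).as("scoreStdDev"));
    }
}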

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,10 +23,19 @@ import java.util.List;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.CountOperation.CountOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.aggregation.FacetOperation.FacetOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.StartWithBuilder;
import org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.ReplaceRootDocumentOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.ReplaceRootOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.Fields.*;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
@@ -36,21 +45,39 @@ import com.mongodb.DBObject;
/**
* An {@code Aggregation} is a representation of a list of aggregation steps to be performed by the MongoDB Aggregation
* Framework.
*
*
* @author Tobias Trelle
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @author Alessio Fachechi
* @author Christoph Strobl
* @author Nikolay Bogdanov
* @since 1.3
*/
public class Aggregation {
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
/**
* References the root document, i.e. the top-level document, currently being processed in the aggregation pipeline
* stage.
*/
public static final String ROOT = SystemVariable.ROOT.toString();
private final List<AggregationOperation> operations;
/**
* References the start of the field path being processed in the aggregation pipeline stage. Unless documented
* otherwise, all stages start with CURRENT the same as ROOT.
*/
public static final String CURRENT = SystemVariable.CURRENT.toString();
public static final AggregationOperationContext DEFAULT_CONTEXT = AggregationOperationRenderer.DEFAULT_CONTEXT;
public static final AggregationOptions DEFAULT_OPTIONS = newAggregationOptions().build();
protected final List<AggregationOperation> operations;
private final AggregationOptions options;
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param operations must not be {@literal null} or empty.
*/
public static Aggregation newAggregation(List<? extends AggregationOperation> operations) {
@@ -59,16 +86,30 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param operations must not be {@literal null} or empty.
*/
public static Aggregation newAggregation(AggregationOperation... operations) {
return new Aggregation(operations);
}
/**
* Returns a copy of this {@link Aggregation} with the given {@link AggregationOptions} set. Note that options are
* supported in MongoDB version 2.6+.
*
* @param options must not be {@literal null}.
* @return
* @since 1.6
*/
public Aggregation withOptions(AggregationOptions options) {
Assert.notNull(options, "AggregationOptions must not be null.");
return new Aggregation(this.operations, options);
}
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
*
* @param type must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
@@ -78,7 +119,7 @@ public class Aggregation {
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
*
* @param type must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
@@ -88,20 +129,63 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(AggregationOperation... aggregationOperations) {
this(asAggregationList(aggregationOperations));
}
/**
* @param aggregationOperations must not be {@literal null} or empty.
* @return
*/
protected static List<AggregationOperation> asAggregationList(AggregationOperation... aggregationOperations) {
Assert.notEmpty(aggregationOperations, "AggregationOperations must not be null or empty!");
return Arrays.asList(aggregationOperations);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations) {
this(aggregationOperations, DEFAULT_OPTIONS);
}
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
* @param aggregationOperations must not be {@literal null} or empty.
* @param options must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations, AggregationOptions options) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
Assert.isTrue(aggregationOperations.length > 0, "At least one AggregationOperation has to be provided");
Assert.isTrue(!aggregationOperations.isEmpty(), "At least one AggregationOperation has to be provided");
Assert.notNull(options, "AggregationOptions must not be null!");
this.operations = Arrays.asList(aggregationOperations);
// check $out is the last operation if it exists
for (AggregationOperation aggregationOperation : aggregationOperations) {
if (aggregationOperation instanceof OutOperation && !isLast(aggregationOperation, aggregationOperations)) {
throw new IllegalArgumentException("The $out operator must be the last stage in the pipeline.");
}
}
this.operations = aggregationOperations;
this.options = options;
}
private boolean isLast(AggregationOperation aggregationOperation, List<AggregationOperation> aggregationOperations) {
return aggregationOperations.indexOf(aggregationOperation) == aggregationOperations.size() - 1;
}
/**
* A pointer to the previous {@link AggregationOperation}.
*
*
* @return
*/
public static String previousOperation() {
@@ -110,7 +194,7 @@ public class Aggregation {
/**
* Creates a new {@link ProjectionOperation} including the given fields.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -119,8 +203,8 @@ public class Aggregation {
}
/**
* Creates a new {@link ProjectionOperation} includeing the given {@link Fields}.
*
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -130,17 +214,95 @@ public class Aggregation {
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name.
*
* @param fieldName must not be {@literal null} or empty.
*
* @param field must not be {@literal null} or empty.
* @return
*/
public static UnwindOperation unwind(String field) {
return new UnwindOperation(field(field));
}
/**
* Factory method to create a new {@link ReplaceRootOperation} for the field with the given name.
*
* @param fieldName must not be {@literal null} or empty.
* @return
* @since 1.10
*/
public static ReplaceRootOperation replaceRoot(String fieldName) {
return ReplaceRootOperation.builder().withValueOf(fieldName);
}
/**
* Factory method to create a new {@link ReplaceRootOperation} for the field with the given
* {@link AggregationExpression}.
*
* @param aggregationExpression must not be {@literal null}.
* @return
* @since 1.10
*/
public static ReplaceRootOperation replaceRoot(AggregationExpression aggregationExpression) {
return ReplaceRootOperation.builder().withValueOf(aggregationExpression);
}
/**
* Factory method to create a new {@link ReplaceRootDocumentOperationBuilder} to configure a
* {@link ReplaceRootOperation}.
*
* @return the {@literal ReplaceRootDocumentOperationBuilder}.
* @since 1.10
*/
public static ReplaceRootOperationBuilder replaceRoot() {
return ReplaceRootOperation.builder();
}
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name and
* {@code preserveNullAndEmptyArrays}. Note that extended unwind is supported in MongoDB version 3.2+.
*
* @param field must not be {@literal null} or empty.
* @param preserveNullAndEmptyArrays {@literal true} to output the document if path is {@literal null}, missing or
* array is empty.
* @return new {@link UnwindOperation}
* @since 1.10
*/
public static UnwindOperation unwind(String field, boolean preserveNullAndEmptyArrays) {
return new UnwindOperation(field(field), preserveNullAndEmptyArrays);
}
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name including the name of a
* new field to hold the array index of the element as {@code arrayIndex}. Note that extended unwind is supported in
* MongoDB version 3.2+.
*
* @param field must not be {@literal null} or empty.
* @param arrayIndex must not be {@literal null} or empty.
* @return new {@link UnwindOperation}
* @since 1.10
*/
public static UnwindOperation unwind(String field, String arrayIndex) {
return new UnwindOperation(field(field), field(arrayIndex), false);
}
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name including the name of a new
* field to hold the array index of the element as {@code arrayIndex} using {@code preserveNullAndEmptyArrays}. Note
* that extended unwind is supported in MongoDB version 3.2+.
*
* @param field must not be {@literal null} or empty.
* @param arrayIndex must not be {@literal null} or empty.
* @param preserveNullAndEmptyArrays {@literal true} to output the document if path is {@literal null}, missing or
* array is empty.
* @return new {@link UnwindOperation}
* @since 1.10
*/
public static UnwindOperation unwind(String field, String arrayIndex, boolean preserveNullAndEmptyArrays) {
return new UnwindOperation(field(field), field(arrayIndex), preserveNullAndEmptyArrays);
}
/**
* Creates a new {@link GroupOperation} for the given fields.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -150,7 +312,7 @@ public class Aggregation {
/**
* Creates a new {@link GroupOperation} for the given {@link Fields}.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -158,9 +320,21 @@ public class Aggregation {
return new GroupOperation(fields);
}
/**
* Creates a new {@link GraphLookupOperation.GraphLookupOperationFromBuilder} to construct a
* {@link GraphLookupOperation} given {@literal fromCollection}.
*
* @param fromCollection must not be {@literal null} or empty.
* @return
* @since 1.10
*/
public static StartWithBuilder graphLookup(String fromCollection) {
return GraphLookupOperation.builder().from(fromCollection);
}
/**
* Factory method to create a new {@link SortOperation} for the given {@link Sort}.
*
*
* @param sort must not be {@literal null}.
* @return
*/
@@ -170,7 +344,7 @@ public class Aggregation {
/**
* Factory method to create a new {@link SortOperation} for the given sort {@link Direction} and {@code fields}.
*
*
* @param direction must not be {@literal null}.
* @param fields must not be {@literal null}.
* @return
@@ -181,17 +355,28 @@ public class Aggregation {
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
*
* @param elementsToSkip must not be less than zero.
* @return
* @deprecated prepare to get this one removed in favor of {@link #skip(long)}.
*/
public static SkipOperation skip(int elementsToSkip) {
return new SkipOperation(elementsToSkip);
}
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
* @param elementsToSkip must not be less than zero.
* @return
*/
public static SkipOperation skip(long elementsToSkip) {
return new SkipOperation(elementsToSkip);
}
/**
* Creates a new {@link LimitOperation} limiting the result to the given number of elements.
*
*
* @param maxElements must not be less than zero.
* @return
*/
@@ -201,7 +386,7 @@ public class Aggregation {
/**
* Creates a new {@link MatchOperation} using the given {@link Criteria}.
*
*
* @param criteria must not be {@literal null}.
* @return
*/
@@ -209,12 +394,142 @@ public class Aggregation {
return new MatchOperation(criteria);
}
/**
* Creates a new {@link MatchOperation} using the given {@link CriteriaDefinition}.
*
* @param criteria must not be {@literal null}.
* @return
* @since 1.10
*/
public static MatchOperation match(CriteriaDefinition criteria) {
return new MatchOperation(criteria);
}
/**
* Creates a new {@link OutOperation} using the given collection name. This operation must be the last operation in
* the pipeline.
*
* @param outCollectionName collection name to export aggregation results. The {@link OutOperation} creates a new
* collection in the current database if one does not already exist. The collection is not visible until the
* aggregation completes. If the aggregation fails, MongoDB does not create the collection. Must not be
* {@literal null}.
* @return
*/
public static OutOperation out(String outCollectionName) {
return new OutOperation(outCollectionName);
}
/**
* Creates a new {@link BucketOperation} given {@literal groupByField}.
*
* @param groupByField must not be {@literal null} or empty.
* @return
* @since 1.10
*/
public static BucketOperation bucket(String groupByField) {
return new BucketOperation(field(groupByField));
}
/**
* Creates a new {@link BucketOperation} given {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
* @return
* @since 1.10
*/
public static BucketOperation bucket(AggregationExpression groupByExpression) {
return new BucketOperation(groupByExpression);
}
/**
* Creates a new {@link BucketAutoOperation} given {@literal groupByField}.
*
* @param groupByField must not be {@literal null} or empty.
* @param buckets number of buckets, must be a positive integer.
* @return
* @since 1.10
*/
public static BucketAutoOperation bucketAuto(String groupByField, int buckets) {
return new BucketAutoOperation(field(groupByField), buckets);
}
/**
* Creates a new {@link BucketAutoOperation} given {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
* @param buckets number of buckets, must be a positive integer.
* @return
* @since 1.10
*/
public static BucketAutoOperation bucketAuto(AggregationExpression groupByExpression, int buckets) {
return new BucketAutoOperation(groupByExpression, buckets);
}
/**
* Creates a new {@link FacetOperation}.
*
* @return
* @since 1.10
*/
public static FacetOperation facet() {
return FacetOperation.EMPTY;
}
/**
* Creates a new {@link FacetOperationBuilder} given {@link Aggregation}.
*
* @param aggregationOperations the sub-pipeline, must not be {@literal null}.
* @return
* @since 1.10
*/
public static FacetOperationBuilder facet(AggregationOperation... aggregationOperations) {
return facet().and(aggregationOperations);
}
/**
* Creates a new {@link LookupOperation}.
*
* @param from must not be {@literal null}.
* @param localField must not be {@literal null}.
* @param foreignField must not be {@literal null}.
* @param as must not be {@literal null}.
* @return never {@literal null}.
* @since 1.9
*/
public static LookupOperation lookup(String from, String localField, String foreignField, String as) {
return lookup(field(from), field(localField), field(foreignField), field(as));
}
/**
* Creates a new {@link LookupOperation} for the given {@link Fields}.
*
* @param from must not be {@literal null}.
* @param localField must not be {@literal null}.
* @param foreignField must not be {@literal null}.
* @param as must not be {@literal null}.
* @return never {@literal null}.
* @since 1.9
*/
public static LookupOperation lookup(Field from, Field localField, Field foreignField, Field as) {
return new LookupOperation(from, localField, foreignField, as);
}
/**
* Creates a new {@link CountOperationBuilder}.
*
* @return never {@literal null}.
* @since 1.10
*/
public static CountOperationBuilder count() {
return new CountOperationBuilder();
}
/**
* Creates a new {@link Fields} instance for the given field names.
*
* @see Fields#fields(String...)
*
* @param fields must not be {@literal null}.
* @return
* @see Fields#fields(String...)
*/
public static Fields fields(String... fields) {
return Fields.fields(fields);
@@ -222,7 +537,7 @@ public class Aggregation {
/**
* Creates a new {@link Fields} instance from the given field name and target reference.
*
*
* @param name must not be {@literal null} or empty.
* @param target must not be {@literal null} or empty.
* @return
@@ -231,30 +546,44 @@ public class Aggregation {
return Fields.from(field(name, target));
}
/**
* Creates a new {@link GeoNearOperation} instance from the given {@link NearQuery} and the {@code distanceField}. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null} or empty.
* @return
* @since 1.7
*/
public static GeoNearOperation geoNear(NearQuery query, String distanceField) {
return new GeoNearOperation(query, distanceField);
}
/**
* Returns a new {@link AggregationOptions.Builder}.
*
* @return
* @since 1.6
*/
public static AggregationOptions.Builder newAggregationOptions() {
return new AggregationOptions.Builder();
}
/**
* Converts this {@link Aggregation} specification to a {@link DBObject}.
*
*
* @param inputCollectionName the name of the input collection
* @return the {@code DBObject} representing this aggregation
*/
public DBObject toDbObject(String inputCollectionName, AggregationOperationContext rootContext) {
AggregationOperationContext context = rootContext;
List<DBObject> operationDocuments = new ArrayList<DBObject>(operations.size());
for (AggregationOperation operation : operations) {
operationDocuments.add(operation.toDBObject(context));
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields());
}
}
List<DBObject> operationDocuments = AggregationOperationRenderer.toDBObject(operations, rootContext);
DBObject command = new BasicDBObject("aggregate", inputCollectionName);
command.put("pipeline", operationDocuments);
command = options.applyAndReturnPotentiallyChangedCommand(command);
return command;
}
@@ -264,42 +593,53 @@ public class Aggregation {
*/
@Override
public String toString() {
return SerializationUtils
.serializeToJsonSafely(toDbObject("__collection__", new NoOpAggregationOperationContext()));
return SerializationUtils.serializeToJsonSafely(toDbObject("__collection__", DEFAULT_CONTEXT));
}
/**
* Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
*
* @author Oliver Gierke
* Describes the system variables available in MongoDB aggregation framework pipeline expressions.
*
* @author Thomas Darimont
* @see <a href="https://docs.mongodb.org/manual/reference/aggregation-variables">Aggregation Variables</a>
*/
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
enum SystemVariable {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
ROOT, CURRENT;
private static final String PREFIX = "$$";
/**
* Return {@literal true} if the given {@code fieldRef} denotes a well-known system variable, {@literal false}
* otherwise.
*
* @param fieldRef may be {@literal null}.
* @return
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
public static boolean isReferingToSystemVariable(String fieldRef) {
if (fieldRef == null || !fieldRef.startsWith(PREFIX) || fieldRef.length() <= 2) {
return false;
}
int indexOfFirstDot = fieldRef.indexOf('.');
String candidate = fieldRef.substring(2, indexOfFirstDot == -1 ? fieldRef.length() : indexOfFirstDot);
for (SystemVariable value : values()) {
if (value.name().equals(candidate)) {
return true;
}
}
return false;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
* @see java.lang.Enum#toString()
*/
@Override
public FieldReference getReference(Field field) {
return new FieldReference(new ExposedField(field, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return new FieldReference(new ExposedField(new AggregationField(name), true));
public String toString() {
return PREFIX.concat(name());
}
}
}
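An illustrative sketch (not part of the diff) chaining the factory methods above into a pipeline and attaching options via withOptions(...); the field and collection names are made up.

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Criteria;

class AggregationPipelineSample {

    Aggregation activeItemReport() {
        return newAggregation(
                match(Criteria.where("status").is("ACTIVE")),
                unwind("items", true),   // $unwind with preserveNullAndEmptyArrays (MongoDB 3.2+)
                skip(20L),               // the new long-based overload
                out("activeItemReport")) // $out has to remain the last stage
            .withOptions(newAggregationOptions().allowDiskUse(true).build());
    }
}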

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,25 +13,26 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.DBObject;
/**
* Value object to create custom {@link Metric}s on the fly.
*
* @deprecated As of release 1.5, replaced by {@link org.springframework.data.geo.Metric}. This class is scheduled to be
* removed in the next major release.
* @author Oliver Gierke
* An {@link AggregationExpression} can be used with field expressions in aggregation pipeline stages like
* {@code project} and {@code group}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
*/
@Deprecated
public class CustomMetric extends org.springframework.data.geo.CustomMetric implements Metric {
public interface AggregationExpression {
/**
* Creates a custom {@link Metric} using the given multiplier.
*
* @param multiplier
* Turns the {@link AggregationExpression} into a {@link DBObject} within the given
* {@link AggregationOperationContext}.
*
* @param context
* @return
*/
public CustomMetric(double multiplier) {
super(multiplier);
}
DBObject toDbObject(AggregationOperationContext context);
}
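A sketch (an assumption, not part of the changeset) of a custom implementation of the interface above, rendering a constant via {@code $literal}; the class is assumed to live in the same package so that AggregationOperationContext is visible.

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

// hypothetical expression; package org.springframework.data.mongodb.core.aggregation assumed
class LiteralExpression implements AggregationExpression {

    private final Object value;

    LiteralExpression(Object value) {
        this.value = value;
    }

    @Override
    public DBObject toDbObject(AggregationOperationContext context) {
        // { $literal: <value> } - the context is not needed for a constant value
        return new BasicDBObject("$literal", value);
    }
}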

View File

@@ -0,0 +1,109 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* An enum of supported {@link AggregationExpression}s in aggregation pipeline stages.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.7
* @deprecated since 1.10. Please use {@link ArithmeticOperators} and {@link ComparisonOperators} instead.
*/
@Deprecated
public enum AggregationFunctionExpressions {
SIZE, CMP, EQ, GT, GTE, LT, LTE, NE, SUBTRACT, ADD, MULTIPLY;
/**
* Returns an {@link AggregationExpression} build from the current {@link Enum} name and the given parameters.
*
* @param parameters must not be {@literal null}
* @return
*/
public AggregationExpression of(Object... parameters) {
Assert.notNull(parameters, "Parameters must not be null!");
return new FunctionExpression(name().toLowerCase(), parameters);
}
/**
* An {@link AggregationExpression} representing a function call.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.7
*/
static class FunctionExpression implements AggregationExpression {
private final String name;
private final List<Object> values;
/**
* Creates a new {@link FunctionExpression} for the given name and values.
*
* @param name must not be {@literal null} or empty.
* @param values must not be {@literal null}.
*/
public FunctionExpression(String name, Object[] values) {
Assert.hasText(name, "Name must not be null!");
Assert.notNull(values, "Values must not be null!");
this.name = name;
this.values = Arrays.asList(values);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Expression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
List<Object> args = new ArrayList<Object>(values.size());
for (Object value : values) {
args.add(unpack(value, context));
}
return new BasicDBObject("$" + name, args);
}
private static Object unpack(Object value, AggregationOperationContext context) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
if (value instanceof Field) {
return context.getReference((Field) value).toString();
}
return value;
}
}
}
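Usage sketch for the deprecated enum above (illustrative only); Fields.field(...) and Aggregation.DEFAULT_CONTEXT are taken from this package.

import com.mongodb.DBObject;

import org.springframework.data.mongodb.core.aggregation.*;

class FunctionExpressionSample {

    DBObject sizeOfTags() {
        // Renders to { "$size" : [ "$tags" ] }
        AggregationExpression size = AggregationFunctionExpressions.SIZE.of(Fields.field("tags"));
        return size.toDbObject(Aggregation.DEFAULT_CONTEXT);
    }
}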

View File

@@ -0,0 +1,109 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import com.mongodb.DBObject;
/**
* Rendering support for {@link AggregationOperation} into a {@link List} of {@link com.mongodb.DBObject}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
*/
class AggregationOperationRenderer {
static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
/**
* Render a {@link List} of {@link AggregationOperation} given {@link AggregationOperationContext} into their
* {@link DBObject} representation.
*
* @param operations must not be {@literal null}.
* @param rootContext must not be {@literal null}.
* @return the {@link List} of {@link DBObject}.
*/
static List<DBObject> toDBObject(List<AggregationOperation> operations, AggregationOperationContext rootContext) {
List<DBObject> operationDocuments = new ArrayList<DBObject>(operations.size());
AggregationOperationContext contextToUse = rootContext;
for (AggregationOperation operation : operations) {
operationDocuments.add(operation.toDBObject(contextToUse));
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
ExposedFields fields = exposedFieldsOperation.getFields();
if (operation instanceof InheritsFieldsAggregationOperation) {
contextToUse = new InheritingExposedFieldsAggregationOperationContext(fields, contextToUse);
} else {
contextToUse = fields.exposesNoFields() ? DEFAULT_CONTEXT
: new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), contextToUse);
}
}
}
return operationDocuments;
}
/**
* Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
*
* @author Oliver Gierke
*/
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
return new DirectFieldReference(new ExposedField(field, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return new DirectFieldReference(new ExposedField(new AggregationField(name), true));
}
}
}
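Since the renderer is package-private, it is only reachable from within the aggregation package; a sketch (hypothetical RendererSample class) of the delegation now performed by Aggregation#toDbObject earlier in this diff:

package org.springframework.data.mongodb.core.aggregation;

import java.util.List;

import com.mongodb.DBObject;

class RendererSample {

    List<DBObject> render(List<AggregationOperation> operations) {
        // the same call Aggregation#toDbObject uses to turn the pipeline into DBObjects
        return AggregationOperationRenderer.toDBObject(operations, AggregationOperationRenderer.DEFAULT_CONTEXT);
    }
}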

View File

@@ -0,0 +1,189 @@
/*
* Copyright 2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Holds a set of configurable aggregation options that can be used within an aggregation pipeline. A list of supported
* aggregation options can be found in the MongoDB reference documentation
* https://docs.mongodb.org/manual/reference/command/aggregate/#aggregate
*
* @author Thomas Darimont
* @author Oliver Gierke
* @see Aggregation#withOptions(AggregationOptions)
* @see TypedAggregation#withOptions(AggregationOptions)
* @since 1.6
*/
public class AggregationOptions {
private static final String CURSOR = "cursor";
private static final String EXPLAIN = "explain";
private static final String ALLOW_DISK_USE = "allowDiskUse";
private final boolean allowDiskUse;
private final boolean explain;
private final DBObject cursor;
/**
* Creates a new {@link AggregationOptions}.
*
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options to the aggregation.
*/
public AggregationOptions(boolean allowDiskUse, boolean explain, DBObject cursor) {
this.allowDiskUse = allowDiskUse;
this.explain = explain;
this.cursor = cursor;
}
/**
* Enables writing to temporary files. When set to true, aggregation stages can write data to the _tmp subdirectory in
* the dbPath directory.
*
* @return
*/
public boolean isAllowDiskUse() {
return allowDiskUse;
}
/**
* Specifies to return the information on the processing of the pipeline.
*
* @return
*/
public boolean isExplain() {
return explain;
}
/**
* Specify a document that contains options that control the creation of the cursor object.
*
* @return
*/
public DBObject getCursor() {
return cursor;
}
/**
* Returns a new potentially adjusted copy for the given {@code aggregationCommandObject} with the configuration
* applied.
*
* @param command the aggregation command.
* @return
*/
DBObject applyAndReturnPotentiallyChangedCommand(DBObject command) {
DBObject result = new BasicDBObject(command.toMap());
if (allowDiskUse && !result.containsField(ALLOW_DISK_USE)) {
result.put(ALLOW_DISK_USE, allowDiskUse);
}
if (explain && !result.containsField(EXPLAIN)) {
result.put(EXPLAIN, explain);
}
if (cursor != null && !result.containsField(CURSOR)) {
result.put("cursor", cursor);
}
return result;
}
/**
* Returns a {@link DBObject} representation of this {@link AggregationOptions}.
*
* @return
*/
public DBObject toDbObject() {
DBObject dbo = new BasicDBObject();
dbo.put(ALLOW_DISK_USE, allowDiskUse);
dbo.put(EXPLAIN, explain);
dbo.put(CURSOR, cursor);
return dbo;
}
/* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return toDbObject().toString();
}
/**
* A Builder for {@link AggregationOptions}.
*
* @author Thomas Darimont
*/
public static class Builder {
private boolean allowDiskUse;
private boolean explain;
private DBObject cursor;
/**
* Defines whether to off-load intensive sort-operations to disk.
*
* @param allowDiskUse
* @return
*/
public Builder allowDiskUse(boolean allowDiskUse) {
this.allowDiskUse = allowDiskUse;
return this;
}
/**
* Defines whether to get the execution plan for the aggregation instead of the actual results.
*
* @param explain
* @return
*/
public Builder explain(boolean explain) {
this.explain = explain;
return this;
}
/**
* Additional options to the aggregation.
*
* @param cursor
* @return
*/
public Builder cursor(DBObject cursor) {
this.cursor = cursor;
return this;
}
/**
* Returns a new {@link AggregationOptions} instance with the given configuration.
*
* @return
*/
public AggregationOptions build() {
return new AggregationOptions(allowDiskUse, explain, cursor);
}
}
}
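A usage sketch for the Builder above; the batchSize cursor document is an arbitrary example value.

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

import org.springframework.data.mongodb.core.aggregation.AggregationOptions;

class AggregationOptionsSample {

    DBObject renderOptions() {
        AggregationOptions options = new AggregationOptions.Builder()
                .allowDiskUse(true)
                .explain(false)
                .cursor(new BasicDBObject("batchSize", 100))
                .build();
        // { "allowDiskUse" : true , "explain" : false , "cursor" : { "batchSize" : 100 } }
        return options.toDbObject();
    }
}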

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,6 +28,8 @@ import com.mongodb.DBObject;
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Thomas Darimont
* @author Mark Paluch
* @param <T> the type to which the results are mapped.
* @since 1.3
*/
@@ -45,8 +47,8 @@ public class AggregationResults<T> implements Iterable<T> {
*/
public AggregationResults(List<T> mappedResults, DBObject rawResults) {
Assert.notNull(mappedResults);
Assert.notNull(rawResults);
Assert.notNull(mappedResults, "List of mapped results must not be null!");
Assert.notNull(rawResults, "Raw results must not be null!");
this.mappedResults = Collections.unmodifiableList(mappedResults);
this.rawResults = rawResults;
@@ -90,6 +92,16 @@ public class AggregationResults<T> implements Iterable<T> {
return serverUsed;
}
/**
* Returns the raw result that was returned by the server.
*
* @return
* @since 1.6
*/
public DBObject getRawResults() {
return rawResults;
}
private String parseServerUsed() {
Object object = rawResults.get("serverUsed");

View File

@@ -0,0 +1,74 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* An {@link AggregationExpression} that renders a MongoDB Aggregation Framework expression from the AST of a
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/expressions.html">SpEL
* expression</a>. <br />
* <br />
* <strong>Samples:</strong> <br />
* <code>
* <pre>
* // { $and: [ { $gt: [ "$qty", 100 ] }, { $lt: [ "$qty", 250 ] } ] }
* expressionOf("qty > 100 && qty < 250);
*
* // { $cond : { if : { $gte : [ "$a", 42 ]}, then : "answer", else : "no-answer" } }
* expressionOf("cond(a >= 42, 'answer', 'no-answer')");
* </pre>
* </code>
*
* @author Christoph Strobl
* @see SpelExpressionTransformer
* @since 1.10
*/
public class AggregationSpELExpression implements AggregationExpression {
private static final SpelExpressionTransformer TRANSFORMER = new SpelExpressionTransformer();
private final String rawExpression;
private final Object[] parameters;
private AggregationSpELExpression(String rawExpression, Object[] parameters) {
this.rawExpression = rawExpression;
this.parameters = parameters;
}
/**
* Creates new {@link AggregationSpELExpression} for the given {@literal expressionString} and {@literal parameters}.
*
* @param expressionString must not be {@literal null}.
* @param parameters can be empty.
* @return
*/
public static AggregationSpELExpression expressionOf(String expressionString, Object... parameters) {
Assert.notNull(expressionString, "ExpressionString must not be null!");
return new AggregationSpELExpression(expressionString, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return (DBObject) TRANSFORMER.transform(rawExpression, context, parameters);
}
}
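A sketch rendering the SpEL-backed expression above against the default context (the {@code qty} field name is taken from the Javadoc sample):

import com.mongodb.DBObject;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationSpELExpression;

class SpELExpressionSample {

    DBObject inStockPredicate() {
        // { $and: [ { $gt: [ "$qty", 100 ] }, { $lt: [ "$qty", 250 ] } ] }
        return AggregationSpELExpression.expressionOf("qty > 100 && qty < 250")
                .toDbObject(Aggregation.DEFAULT_CONTEXT);
    }
}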

View File

@@ -0,0 +1,353 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
/**
* Gateway to {@literal boolean expressions} that evaluate their argument expressions as booleans and return a boolean
* as the result.
*
* @author Christoph Strobl
* @since 1.10
*/
public class BooleanOperators {
/**
* Take the value referenced by the given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static BooleanOperatorFactory valueOf(String fieldReference) {
return new BooleanOperatorFactory(fieldReference);
}
/**
* Take the value resulting of the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static BooleanOperatorFactory valueOf(AggregationExpression expression) {
return new BooleanOperatorFactory(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates the boolean value of the referenced field and returns the
* opposite boolean value.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Not not(String fieldReference) {
return Not.not(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that evaluates the boolean value of {@link AggregationExpression} result
* and returns the opposite boolean value.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Not not(AggregationExpression expression) {
return Not.not(expression);
}
/**
* @author Christoph Strobl
*/
public static class BooleanOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
/**
* Creates new {@link BooleanOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public BooleanOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
}
/**
* Creates new {@link BooleanOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public BooleanOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
}
/**
* Creates new {@link AggregationExpression} that evaluates one or more expressions and returns {@literal true} if
* all of the expressions are {@literal true}.
*
* @param expression must not be {@literal null}.
* @return
*/
public And and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return createAnd().andExpression(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates one or more expressions and returns {@literal true} if
* all of the expressions are {@literal true}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public And and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return createAnd().andField(fieldReference);
}
private And createAnd() {
return usesFieldRef() ? And.and(Fields.field(fieldReference)) : And.and(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates one or more expressions and returns {@literal true} if
* any of the expressions are {@literal true}.
*
* @param expression must not be {@literal null}.
* @return
*/
public Or or(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return createOr().orExpression(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates one or more expressions and returns {@literal true} if
* any of the expressions are {@literal true}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Or or(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return createOr().orField(fieldReference);
}
private Or createOr() {
return usesFieldRef() ? Or.or(Fields.field(fieldReference)) : Or.or(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates a boolean and returns the opposite boolean value.
*
* @return
*/
public Not not() {
return usesFieldRef() ? Not.not(fieldReference) : Not.not(expression);
}
private boolean usesFieldRef() {
return this.fieldReference != null;
}
}
/**
* {@link AggregationExpression} for {@code $and}.
*
* @author Christoph Strobl
*/
public static class And extends AbstractAggregationExpression {
private And(List<?> values) {
super(values);
}
@Override
protected String getMongoMethod() {
return "$and";
}
/**
* Creates new {@link And} that evaluates one or more expressions and returns {@literal true} if all of the
* expressions are {@literal true}.
*
* @param expressions
* @return
*/
public static And and(Object... expressions) {
return new And(Arrays.asList(expressions));
}
/**
* Creates new {@link And} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public And andExpression(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new And(append(expression));
}
/**
* Creates new {@link And} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public And andField(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new And(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link And} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public And andValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new And(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $or}.
*
* @author Christoph Strobl
*/
public static class Or extends AbstractAggregationExpression {
private Or(List<?> values) {
super(values);
}
@Override
protected String getMongoMethod() {
return "$or";
}
/**
* Creates new {@link Or} that evaluates one or more expressions and returns {@literal true} if any of the
* expressions are {@literal true}.
*
* @param expressions must not be {@literal null}.
* @return
*/
public static Or or(Object... expressions) {
Assert.notNull(expressions, "Expressions must not be null!");
return new Or(Arrays.asList(expressions));
}
/**
* Creates new {@link Or} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Or orExpression(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Or(append(expression));
}
/**
* Creates new {@link Or} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Or orField(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Or(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Or} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Or orValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Or(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $not}.
*
* @author Christoph Strobl
*/
public static class Not extends AbstractAggregationExpression {
private Not(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$not";
}
/**
* Creates new {@link Not} that evaluates the boolean value of the referenced field and returns the opposite boolean
* value.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Not not(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Not(asFields(fieldReference));
}
/**
* Creates new {@link Not} that evaluates the resulting boolean value of the given {@link AggregationExpression} and
* returns the opposite boolean value.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Not not(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Not(Collections.singletonList(expression));
}
}
}
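A composition sketch for the factories above; the field names are made up.

import org.springframework.data.mongodb.core.aggregation.AggregationExpression;
import org.springframework.data.mongodb.core.aggregation.BooleanOperators;

class BooleanOperatorsSample {

    AggregationExpression activeAccount() {
        // { $and: [ "$hasAccount", "$isActive" ] }
        return BooleanOperators.valueOf("hasAccount").and("isActive");
    }

    AggregationExpression inactive() {
        // { $not: [ "$isActive" ] }
        return BooleanOperators.not("isActive");
    }
}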

View File

@@ -0,0 +1,275 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.BucketAutoOperation.BucketAutoOperationOutputBuilder;
import org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $bucketAuto}-operation. <br />
* Bucket stage is typically used with {@link Aggregation} and {@code $facet}. Categorizes incoming documents into a
* specific number of groups, called buckets, based on a specified expression. Bucket boundaries are automatically
* determined in an attempt to evenly distribute the documents into the specified number of buckets. <br />
* We recommend using the static factory method {@link Aggregation#bucketAuto(String, int)} instead of creating
* instances of this class directly.
*
* @see <a href=
* "https://docs.mongodb.org/manual/reference/aggregation/bucketAuto/">https://docs.mongodb.org/manual/reference/aggregation/bucketAuto/</a>
* @see BucketOperationSupport
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
*/
public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperation, BucketAutoOperationOutputBuilder>
implements FieldsExposingAggregationOperation {
private final int buckets;
private final String granularity;
/**
* Creates a new {@link BucketAutoOperation} given a {@link Field group-by field}.
*
* @param groupByField must not be {@literal null}.
* @param buckets number of buckets, must be a positive integer.
*/
public BucketAutoOperation(Field groupByField, int buckets) {
super(groupByField);
Assert.isTrue(buckets > 0, "Number of buckets must be greater than 0!");
this.buckets = buckets;
this.granularity = null;
}
/**
* Creates a new {@link BucketAutoOperation} given a {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
* @param buckets number of buckets, must be a positive integer.
*/
public BucketAutoOperation(AggregationExpression groupByExpression, int buckets) {
super(groupByExpression);
Assert.isTrue(buckets > 0, "Number of buckets must be greater than 0!");
this.buckets = buckets;
this.granularity = null;
}
private BucketAutoOperation(BucketAutoOperation bucketOperation, Outputs outputs) {
super(bucketOperation, outputs);
this.buckets = bucketOperation.buckets;
this.granularity = bucketOperation.granularity;
}
private BucketAutoOperation(BucketAutoOperation bucketOperation, int buckets, String granularity) {
super(bucketOperation);
this.buckets = buckets;
this.granularity = granularity;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject options = new BasicDBObject();
options.put("buckets", buckets);
options.putAll(super.toDBObject(context));
if (granularity != null) {
options.put("granularity", granularity);
}
return new BasicDBObject("$bucketAuto", options);
}
/**
* Configures the number of {@literal buckets} and returns a new {@link BucketAutoOperation}.
*
* @param buckets must be a positive number.
* @return
*/
public BucketAutoOperation withBuckets(int buckets) {
Assert.isTrue(buckets > 0, "Number of buckets must be greater than 0!");
return new BucketAutoOperation(this, buckets, granularity);
}
/**
* Configures the {@link Granularity granularity} that specifies the preferred number series to use, ensuring that the
* calculated boundary edges end on preferred round numbers or their powers of 10, and returns a new
* {@link BucketAutoOperation}. <br />
* Use either a predefined {@link Granularities} value or provide your own.
*
* @param granularity must not be {@literal null}.
* @return
*/
public BucketAutoOperation withGranularity(Granularity granularity) {
Assert.notNull(granularity, "Granularity must not be null!");
return new BucketAutoOperation(this, buckets, granularity.getMongoRepresentation());
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#newBucketOperation(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Outputs)
*/
@Override
protected BucketAutoOperation newBucketOperation(Outputs outputs) {
return new BucketAutoOperation(this, outputs);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutputExpression(java.lang.String, java.lang.Object[])
*/
@Override
public ExpressionBucketAutoOperationBuilder andOutputExpression(String expression, Object... params) {
return new ExpressionBucketAutoOperationBuilder(expression, this, params);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public BucketAutoOperationOutputBuilder andOutput(AggregationExpression expression) {
return new BucketAutoOperationOutputBuilder(expression, this);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(java.lang.String)
*/
@Override
public BucketAutoOperationOutputBuilder andOutput(String fieldName) {
return new BucketAutoOperationOutputBuilder(Fields.field(fieldName), this);
}
/**
* {@link OutputBuilder} implementation for {@link BucketAutoOperation}.
*/
public static class BucketAutoOperationOutputBuilder
extends OutputBuilder<BucketAutoOperationOutputBuilder, BucketAutoOperation> {
/**
* Creates a new {@link BucketAutoOperationOutputBuilder} for the given value and {@link BucketAutoOperation}.
*
* @param value must not be {@literal null}.
* @param operation must not be {@literal null}.
*/
protected BucketAutoOperationOutputBuilder(Object value, BucketAutoOperation operation) {
super(value, operation);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketAutoOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketAutoOperationOutputBuilder(operationOutput, this.operation);
}
}
/**
* {@link ExpressionBucketOperationBuilderSupport} implementation for {@link BucketAutoOperation} using SpEL
* expression based {@link Output}.
*
* @author Mark Paluch
*/
public static class ExpressionBucketAutoOperationBuilder
extends ExpressionBucketOperationBuilderSupport<BucketAutoOperationOutputBuilder, BucketAutoOperation> {
/**
* Creates a new {@link ExpressionBucketAutoOperationBuilder} for the given value, {@link BucketAutoOperation} and
* parameters.
*
* @param expression must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
*/
protected ExpressionBucketAutoOperationBuilder(String expression, BucketAutoOperation operation,
Object[] parameters) {
super(expression, operation, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketAutoOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketAutoOperationOutputBuilder(operationOutput, this.operation);
}
}
/**
* @author Mark Paluch
*/
public interface Granularity {
/**
* @return a String that represents a MongoDB granularity to be used with {@link BucketAutoOperation}. Never
* {@literal null}.
*/
String getMongoRepresentation();
}
/**
* Supported MongoDB granularities.
*
* @see <a
* href="https://docs.mongodb.com/manual/reference/operator/aggregation/bucketAuto/#granularity">https://docs.mongodb.com/manual/reference/operator/aggregation/bucketAuto/#granularity</a>
* @author Mark Paluch
*/
public enum Granularities implements Granularity {
R5, R10, R20, R40, R80, //
SERIES_1_2_5("1-2-5"), //
E6, E12, E24, E48, E96, E192, //
POWERSOF2;
private final String granularity;
Granularities() {
this.granularity = name();
}
Granularities(String granularity) {
this.granularity = granularity;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketAutoOperation.Granularity#getMongoRepresentation()
*/
@Override
public String getMongoRepresentation() {
return granularity;
}
}
}
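
A minimal usage sketch of the $bucketAuto stage built through this class (the "price" field, the "products" collection and the mongoTemplate instance are illustrative assumptions, not part of the source above):

    import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

    // Distribute documents into 5 automatically determined buckets by "price",
    // snap boundaries to the Renard R5 preferred number series and emit a
    // document count plus the average price per bucket.
    Aggregation aggregation = newAggregation(
        bucketAuto("price", 5)
            .withGranularity(BucketAutoOperation.Granularities.R5)
            .andOutputCount().as("count")
            .andOutput("price").avg().as("avgPrice"));

    AggregationResults<DBObject> results =
        mongoTemplate.aggregate(aggregation, "products", DBObject.class);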


@@ -0,0 +1,227 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.BucketOperation.BucketOperationOutputBuilder;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $bucket}-operation. <br />
*
* Bucket stage is typically used with {@link Aggregation} and {@code $facet}. Categorizes incoming documents into
* groups, called buckets, based on a specified expression and bucket boundaries. <br />
*
* We recommend using the static factory method {@link Aggregation#bucket(String)} instead of creating instances of
* this class directly.
*
* @see <a href="https://docs.mongodb.org/manual/reference/aggregation/bucket/">https://docs.mongodb.org/manual/reference/aggregation/bucket/</a>
* @see BucketOperationSupport
* @author Mark Paluch
* @since 1.10
*/
public class BucketOperation extends BucketOperationSupport<BucketOperation, BucketOperationOutputBuilder>
implements FieldsExposingAggregationOperation {
private final List<Object> boundaries;
private final Object defaultBucket;
/**
* Creates a new {@link BucketOperation} given a {@link Field group-by field}.
*
* @param groupByField must not be {@literal null}.
*/
public BucketOperation(Field groupByField) {
super(groupByField);
this.boundaries = Collections.emptyList();
this.defaultBucket = null;
}
/**
* Creates a new {@link BucketOperation} given a {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
*/
public BucketOperation(AggregationExpression groupByExpression) {
super(groupByExpression);
this.boundaries = Collections.emptyList();
this.defaultBucket = null;
}
private BucketOperation(BucketOperation bucketOperation, Outputs outputs) {
super(bucketOperation, outputs);
this.boundaries = bucketOperation.boundaries;
this.defaultBucket = bucketOperation.defaultBucket;
}
private BucketOperation(BucketOperation bucketOperation, List<Object> boundaries, Object defaultBucket) {
super(bucketOperation);
this.boundaries = new ArrayList<Object>(boundaries);
this.defaultBucket = defaultBucket;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject options = new BasicDBObject();
options.put("boundaries", context.getMappedObject(new BasicDBObject("$set", boundaries)).get("$set"));
if (defaultBucket != null) {
options.put("default", context.getMappedObject(new BasicDBObject("$set", defaultBucket)).get("$set"));
}
options.putAll(super.toDBObject(context));
return new BasicDBObject("$bucket", options);
}
/**
* Configures the default bucket {@literal literal} and returns a new {@link BucketOperation}.
*
* @param literal must not be {@literal null}.
* @return
*/
public BucketOperation withDefaultBucket(Object literal) {
Assert.notNull(literal, "Default bucket literal must not be null!");
return new BucketOperation(this, boundaries, literal);
}
/**
* Configures {@literal boundaries} and returns a new {@link BucketOperation}. Existing {@literal boundaries} are
* preserved and the new {@literal boundaries} are appended.
*
* @param boundaries must not be {@literal null}.
* @return
*/
public BucketOperation withBoundaries(Object... boundaries) {
Assert.notNull(boundaries, "Boundaries must not be null!");
Assert.noNullElements(boundaries, "Boundaries must not contain null values!");
List<Object> newBoundaries = new ArrayList<Object>(this.boundaries.size() + boundaries.length);
newBoundaries.addAll(this.boundaries);
newBoundaries.addAll(Arrays.asList(boundaries));
return new BucketOperation(this, newBoundaries, defaultBucket);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#newBucketOperation(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Outputs)
*/
@Override
protected BucketOperation newBucketOperation(Outputs outputs) {
return new BucketOperation(this, outputs);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutputExpression(java.lang.String, java.lang.Object[])
*/
@Override
public ExpressionBucketOperationBuilder andOutputExpression(String expression, Object... params) {
return new ExpressionBucketOperationBuilder(expression, this, params);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public BucketOperationOutputBuilder andOutput(AggregationExpression expression) {
return new BucketOperationOutputBuilder(expression, this);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(java.lang.String)
*/
@Override
public BucketOperationOutputBuilder andOutput(String fieldName) {
return new BucketOperationOutputBuilder(Fields.field(fieldName), this);
}
/**
* {@link OutputBuilder} implementation for {@link BucketOperation}.
*/
public static class BucketOperationOutputBuilder
extends BucketOperationSupport.OutputBuilder<BucketOperationOutputBuilder, BucketOperation> {
/**
* Creates a new {@link BucketOperationOutputBuilder} for the given value and {@link BucketOperation}.
*
* @param value must not be {@literal null}.
* @param operation must not be {@literal null}.
*/
protected BucketOperationOutputBuilder(Object value, BucketOperation operation) {
super(value, operation);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketOperationOutputBuilder(operationOutput, this.operation);
}
}
/**
* {@link ExpressionBucketOperationBuilderSupport} implementation for {@link BucketOperation} using SpEL expression
* based {@link Output}.
*
* @author Mark Paluch
*/
public static class ExpressionBucketOperationBuilder
extends ExpressionBucketOperationBuilderSupport<BucketOperationOutputBuilder, BucketOperation> {
/**
* Creates a new {@link ExpressionBucketOperationBuilder} for the given expression, {@link BucketOperation}
* and parameters.
*
* @param expression must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
*/
protected ExpressionBucketOperationBuilder(String expression, BucketOperation operation, Object[] parameters) {
super(expression, operation, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketOperationOutputBuilder(operationOutput, this.operation);
}
}
}
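
A short sketch of the corresponding $bucket stage (the "score" field, the boundaries and the default bucket value are illustrative assumptions):

    // Group documents into the buckets [0, 50), [50, 80) and [80, 100);
    // anything outside these boundaries lands in the "Other" default bucket.
    BucketOperation bucket = Aggregation.bucket("score")
        .withBoundaries(0, 50, 80, 100)
        .withDefaultBucket("Other")
        .andOutputCount().as("count")
        .andOutput("score").avg().as("avgScore");

    Aggregation aggregation = Aggregation.newAggregation(bucket);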


@@ -0,0 +1,680 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder;
import org.springframework.expression.spel.ast.Projection;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Base class for bucket operations of the aggregation framework that support output expressions. <br />
* Bucket stages collect documents into buckets and can contribute output fields. <br />
* Implementing classes are required to provide an {@link OutputBuilder}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
*/
public abstract class BucketOperationSupport<T extends BucketOperationSupport<T, B>, B extends OutputBuilder<B, T>>
implements FieldsExposingAggregationOperation {
private final Field groupByField;
private final AggregationExpression groupByExpression;
private final Outputs outputs;
/**
* Creates a new {@link BucketOperationSupport} given a {@link Field group-by field}.
*
* @param groupByField must not be {@literal null}.
*/
protected BucketOperationSupport(Field groupByField) {
Assert.notNull(groupByField, "Group by field must not be null!");
this.groupByField = groupByField;
this.groupByExpression = null;
this.outputs = Outputs.EMPTY;
}
/**
* Creates a new {@link BucketOperationSupport} given a {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
*/
protected BucketOperationSupport(AggregationExpression groupByExpression) {
Assert.notNull(groupByExpression, "Group by AggregationExpression must not be null!");
this.groupByExpression = groupByExpression;
this.groupByField = null;
this.outputs = Outputs.EMPTY;
}
/**
* Creates a copy of {@link BucketOperationSupport}.
*
* @param operationSupport must not be {@literal null}.
*/
protected BucketOperationSupport(BucketOperationSupport<?, ?> operationSupport) {
this(operationSupport, operationSupport.outputs);
}
/**
* Creates a copy of {@link BucketOperationSupport} and applies the new {@link Outputs}.
*
* @param operationSupport must not be {@literal null}.
* @param outputs must not be {@literal null}.
*/
protected BucketOperationSupport(BucketOperationSupport<?, ?> operationSupport, Outputs outputs) {
Assert.notNull(operationSupport, "BucketOperationSupport must not be null!");
Assert.notNull(outputs, "Outputs must not be null!");
this.groupByField = operationSupport.groupByField;
this.groupByExpression = operationSupport.groupByExpression;
this.outputs = outputs;
}
/**
* Creates a new {@link ExpressionBucketOperationBuilderSupport} given a SpEL {@literal expression} and optional
* {@literal params} to add an output field to the resulting bucket documents.
*
* @param expression the SpEL expression, must not be {@literal null} or empty.
* @param params must not be {@literal null}
* @return
*/
public abstract ExpressionBucketOperationBuilderSupport<B, T> andOutputExpression(String expression,
Object... params);
/**
* Creates a new {@link BucketOperationSupport} given an {@link AggregationExpression} to add an output field to the
* resulting bucket documents.
*
* @param expression the {@link AggregationExpression}, must not be {@literal null}.
* @return
*/
public abstract B andOutput(AggregationExpression expression);
/**
* Creates a new {@link BucketOperationSupport} given {@literal fieldName} to add an output field to the resulting
* bucket documents. {@link BucketOperationSupport} exposes accumulation operations that can be applied to
* {@literal fieldName}.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
public abstract B andOutput(String fieldName);
/**
* Creates a new {@link BucketOperationSupport} that adds a count field to the resulting bucket documents.
*
* @return
*/
public B andOutputCount() {
return andOutput(new AggregationExpression() {
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return new BasicDBObject("$sum", 1);
}
});
}
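// Illustrative rendering (not part of the original source): a count output such as
// bucketAuto("price", 3).andOutputCount().as("docs") ends up in the stage's output
// section as { "docs" : { "$sum" : 1 } }, since counts are emulated via $sum: 1.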
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject dbObject = new BasicDBObject();
dbObject.put("groupBy", groupByExpression == null ? context.getReference(groupByField).toString()
: groupByExpression.toDbObject(context));
if (!outputs.isEmpty()) {
dbObject.put("output", outputs.toDbObject(context));
}
return dbObject;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return outputs.asExposedFields();
}
/**
* Implementation hook to create a new bucket operation.
*
* @param outputs the outputs
* @return the new bucket operation.
*/
protected abstract T newBucketOperation(Outputs outputs);
protected T andOutput(Output output) {
return newBucketOperation(outputs.and(output));
}
/**
* Builder for SpEL expression-based {@link Output}.
*
* @author Mark Paluch
*/
public abstract static class ExpressionBucketOperationBuilderSupport<B extends OutputBuilder<B, T>, T extends BucketOperationSupport<T, B>>
extends OutputBuilder<B, T> {
/**
* Creates a new {@link ExpressionBucketOperationBuilderSupport} for the given value, {@link BucketOperationSupport}
* and parameters.
*
* @param expression must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
*/
protected ExpressionBucketOperationBuilderSupport(String expression, T operation, Object[] parameters) {
super(new SpelExpressionOutput(expression, parameters), operation);
}
}
/**
* Base class for {@link Output} builders that result in a {@link BucketOperationSupport} providing the built
* {@link Output}.
*
* @author Mark Paluch
*/
public abstract static class OutputBuilder<B extends OutputBuilder<B, T>, T extends BucketOperationSupport<T, B>> {
protected final Object value;
protected final T operation;
/**
* Creates a new {@link OutputBuilder} for the given value and {@link BucketOperationSupport}.
*
* @param value must not be {@literal null}.
* @param operation must not be {@literal null}.
*/
protected OutputBuilder(Object value, T operation) {
Assert.notNull(value, "Value must not be null or empty!");
Assert.notNull(operation, "Operation must not be null!");
this.value = value;
this.operation = operation;
}
/**
* Generates a builder for a {@code $sum}-expression. <br />
* Count expressions are emulated via {@code $sum: 1}.
*
* @return
*/
public B count() {
return sum(1);
}
/**
* Generates a builder for a {@code $sum}-expression for the current value.
*
* @return
*/
public B sum() {
return apply(Accumulators.SUM);
}
/**
* Generates a builder for a {@code $sum}-expression for the given {@literal value}.
*
* @param value
* @return
*/
public B sum(Number value) {
return apply(new OperationOutput(Accumulators.SUM.getMongoOperator(), Collections.singleton(value)));
}
/**
* Generates a builder for a {@code $last}-expression for the current value.
*
* @return
*/
public B last() {
return apply(Accumulators.LAST);
}
/**
* Generates a builder for a {@code $first}-expression for the current value.
*
* @return
*/
public B first() {
return apply(Accumulators.FIRST);
}
/**
* Generates a builder for an {@code $avg}-expression for the current value.
*
* @return
*/
public B avg() {
return apply(Accumulators.AVG);
}
/**
* Generates a builder for a {@code $min}-expression for the current value.
*
* @return
*/
public B min() {
return apply(Accumulators.MIN);
}
/**
* Generates a builder for a {@code $max}-expression for the current value.
*
* @return
*/
public B max() {
return apply(Accumulators.MAX);
}
/**
* Generates a builder for a {@code $push}-expression for the current value.
*
* @return
*/
public B push() {
return apply(Accumulators.PUSH);
}
/**
* Generates a builder for an {@code $addToSet}-expression for the current value.
*
* @return
*/
public B addToSet() {
return apply(Accumulators.ADDTOSET);
}
/**
* Apply an operator to the current value.
*
* @param operation the operation name, must not be {@literal null} or empty.
* @param values must not be {@literal null}.
* @return
*/
public B apply(String operation, Object... values) {
Assert.hasText(operation, "Operation must not be empty or null!");
Assert.notNull(value, "Values must not be null!");
List<Object> objects = new ArrayList<Object>(values.length + 1);
objects.add(value);
objects.addAll(Arrays.asList(values));
return apply(new OperationOutput(operation, objects));
}
/**
* Apply an {@link OperationOutput} to this output.
*
* @param operationOutput must not be {@literal null}.
* @return
*/
protected abstract B apply(OperationOutput operationOutput);
private B apply(Accumulators operation) {
return this.apply(operation.getMongoOperator());
}
/**
* Registers the output under the given alias and returns the {@link BucketOperationSupport bucket operation} to be
* finally applied.
*
* @param alias must not be {@literal null} or empty.
* @return
*/
public T as(String alias) {
if (value instanceof OperationOutput) {
return this.operation.andOutput(((OperationOutput) this.value).withAlias(alias));
}
if (value instanceof Field) {
throw new IllegalStateException("Cannot add a field as top-level output. Use accumulator expressions.");
}
return this.operation
.andOutput(new AggregationExpressionOutput(Fields.field(alias), (AggregationExpression) value));
}
}
private enum Accumulators {
SUM("$sum"), AVG("$avg"), FIRST("$first"), LAST("$last"), MAX("$max"), MIN("$min"), PUSH("$push"), ADDTOSET(
"$addToSet");
private String mongoOperator;
Accumulators(String mongoOperator) {
this.mongoOperator = mongoOperator;
}
public String getMongoOperator() {
return mongoOperator;
}
}
/**
* Encapsulates {@link Output}s.
*
* @author Mark Paluch
*/
protected static class Outputs implements AggregationExpression {
protected static final Outputs EMPTY = new Outputs();
private List<Output> outputs;
/**
* Creates a new, empty {@link Outputs}.
*/
private Outputs() {
this.outputs = new ArrayList<Output>();
}
/**
* Creates new {@link Outputs} containing all given {@link Output}s.
*
* @param current
* @param output
*/
private Outputs(Collection<Output> current, Output output) {
this.outputs = new ArrayList<Output>(current.size() + 1);
this.outputs.addAll(current);
this.outputs.add(output);
}
/**
* @return the {@link ExposedFields} derived from {@link Output}.
*/
protected ExposedFields asExposedFields() {
// The count field is included by default when the output is not specified.
if (isEmpty()) {
return ExposedFields.from(new ExposedField("count", true));
}
ExposedFields fields = ExposedFields.from();
for (Output output : outputs) {
fields = fields.and(output.getExposedField());
}
return fields;
}
/**
* Create a new {@link Outputs} that contains the new {@link Output}.
*
* @param output must not be {@literal null}.
* @return the new {@link Outputs} that contains the new {@link Output}
*/
protected Outputs and(Output output) {
Assert.notNull(output, "BucketOutput must not be null!");
return new Outputs(this.outputs, output);
}
/**
* @return {@literal true} if {@link Outputs} contains no {@link Output}.
*/
protected boolean isEmpty() {
return outputs.isEmpty();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
DBObject dbObject = new BasicDBObject();
for (Output output : outputs) {
dbObject.put(output.getExposedField().getName(), output.toDbObject(context));
}
return dbObject;
}
}
/**
* Encapsulates an output field in a bucket aggregation stage. <br />
* Output fields can be either top-level fields that define a valid field name or nested output fields using
* operators.
*
* @author Mark Paluch
*/
protected abstract static class Output implements AggregationExpression {
private final ExposedField field;
/**
* Creates a new {@link Output} for the given {@link Field}.
*
* @param field must not be {@literal null}.
*/
protected Output(Field field) {
Assert.notNull(field, "Field must not be null!");
this.field = new ExposedField(field, true);
}
/**
* Returns the field exposed by the {@link Output}.
*
* @return will never be {@literal null}.
*/
protected ExposedField getExposedField() {
return field;
}
}
/**
* Output field that uses a Mongo operation (expression object) to generate an output field value. <br />
* {@link OperationOutput} is used either with a regular field name or an operation keyword (e.g.
* {@literal $sum, $count}).
*
* @author Mark Paluch
*/
protected static class OperationOutput extends Output {
private final String operation;
private final List<Object> values;
/**
* Creates a new {@link OperationOutput} for the given operation and values.
*
* @param operation the actual operation key, must not be {@literal null} or empty.
* @param values the values to pass into the operation, must not be {@literal null}.
*/
public OperationOutput(String operation, Collection<? extends Object> values) {
super(Fields.field(operation));
Assert.hasText(operation, "Operation must not be null or empty!");
Assert.notNull(values, "Values must not be null!");
this.operation = operation;
this.values = new ArrayList<Object>(values);
}
private OperationOutput(Field field, OperationOutput operationOutput) {
super(field);
this.operation = operationOutput.operation;
this.values = operationOutput.values;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
List<Object> operationArguments = getOperationArguments(context);
return new BasicDBObject(operation,
operationArguments.size() == 1 ? operationArguments.get(0) : operationArguments);
}
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values != null ? values.size() : 1);
for (Object element : values) {
if (element instanceof Field) {
result.add(context.getReference((Field) element).toString());
} else if (element instanceof Fields) {
for (Field field : (Fields) element) {
result.add(context.getReference(field).toString());
}
} else if (element instanceof AggregationExpression) {
result.add(((AggregationExpression) element).toDbObject(context));
} else {
result.add(element);
}
}
return result;
}
/**
* Returns the field that holds the {@link OperationOutput}.
*
* @return
*/
protected Field getField() {
return getExposedField();
}
/**
* Creates a new instance of this {@link OperationOutput} with the given alias.
*
* @param alias the alias to set
* @return
*/
public OperationOutput withAlias(String alias) {
final Field aliasedField = Fields.field(alias);
return new OperationOutput(aliasedField, this) {
@Override
protected Field getField() {
return aliasedField;
}
@Override
protected List<Object> getOperationArguments(AggregationOperationContext context) {
// We have to make sure that we use the arguments from the "previous" OperationOutput that we replace
// with this new instance.
return OperationOutput.this.getOperationArguments(context);
}
};
}
}
/**
* A {@link Output} based on a SpEL expression.
*/
private static class SpelExpressionOutput extends Output {
private static final SpelExpressionTransformer TRANSFORMER = new SpelExpressionTransformer();
private final String expression;
private final Object[] params;
/**
* Creates a new {@link SpelExpressionOutput} for the given field, SpEL expression and parameters.
*
* @param expression must not be {@literal null} or empty.
* @param parameters must not be {@literal null}.
*/
public SpelExpressionOutput(String expression, Object[] parameters) {
super(Fields.field(expression));
Assert.hasText(expression, "Expression must not be null or empty!");
Assert.notNull(parameters, "Parameters must not be null!");
this.expression = expression;
this.params = parameters.clone();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Output#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return (DBObject) TRANSFORMER.transform(expression, context, params);
}
}
/**
* @author Mark Paluch
*/
private static class AggregationExpressionOutput extends Output {
private final AggregationExpression expression;
/**
* Creates a new {@link AggregationExpressionOutput}.
*
* @param field
* @param expression
*/
protected AggregationExpressionOutput(Field field, AggregationExpression expression) {
super(field);
this.expression = expression;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Output#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return expression.toDbObject(context);
}
}
}
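
A sketch of the SpEL-based output support declared above (the field names, boundaries and discount factor are illustrative assumptions):

    // Adds an output field "discounted" computed by the SpEL expression
    // "netPrice * [0]", where [0] is replaced by the first bound parameter (0.8).
    BucketOperation bucket = Aggregation.bucket("category")
        .withBoundaries("A", "B", "C")
        .andOutputExpression("netPrice * [0]", 0.8).as("discounted");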


@@ -0,0 +1,879 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
/**
* Gateway to {@literal comparison expressions}.
*
* @author Christoph Strobl
* @since 1.10
*/
public class ComparisonOperators {
/**
* Take the field referenced by given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static ComparisonOperatorFactory valueOf(String fieldReference) {
return new ComparisonOperatorFactory(fieldReference);
}
/**
* Take the value resulting from the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static ComparisonOperatorFactory valueOf(AggregationExpression expression) {
return new ComparisonOperatorFactory(expression);
}
public static class ComparisonOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
/**
* Creates new {@link ComparisonOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public ComparisonOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
}
/**
* Creates new {@link ComparisonOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public ComparisonOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
}
/**
* Creates new {@link AggregationExpression} that compares two values.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Cmp compareTo(String fieldReference) {
return createCmp().compareTo(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values.
*
* @param expression must not be {@literal null}.
* @return
*/
public Cmp compareTo(AggregationExpression expression) {
return createCmp().compareTo(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values.
*
* @param value must not be {@literal null}.
* @return
*/
public Cmp compareToValue(Object value) {
return createCmp().compareToValue(value);
}
private Cmp createCmp() {
return usesFieldRef() ? Cmp.valueOf(fieldReference) : Cmp.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is equal to the value of the referenced field.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Eq equalTo(String fieldReference) {
return createEq().equalTo(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is equal to the expression result.
*
* @param expression must not be {@literal null}.
* @return
*/
public Eq equalTo(AggregationExpression expression) {
return createEq().equalTo(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is equal to the given value.
*
* @param value must not be {@literal null}.
* @return
*/
public Eq equalToValue(Object value) {
return createEq().equalToValue(value);
}
private Eq createEq() {
return usesFieldRef() ? Eq.valueOf(fieldReference) : Eq.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than the value of the referenced field.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Gt greaterThan(String fieldReference) {
return createGt().greaterThan(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than the expression result.
*
* @param expression must not be {@literal null}.
* @return
*/
public Gt greaterThan(AggregationExpression expression) {
return createGt().greaterThan(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than the given value.
*
* @param value must not be {@literal null}.
* @return
*/
public Gt greaterThanValue(Object value) {
return createGt().greaterThanValue(value);
}
private Gt createGt() {
return usesFieldRef() ? Gt.valueOf(fieldReference) : Gt.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than or equivalent to the value of the referenced field.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualTo(String fieldReference) {
return createGte().greaterThanEqualTo(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than or equivalent to the expression result.
*
* @param expression must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualTo(AggregationExpression expression) {
return createGte().greaterThanEqualTo(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than or equivalent to the given value.
*
* @param value must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualToValue(Object value) {
return createGte().greaterThanEqualToValue(value);
}
private Gte createGte() {
return usesFieldRef() ? Gte.valueOf(fieldReference) : Gte.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than the value of the referenced field.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Lt lessThan(String fieldReference) {
return createLt().lessThan(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than the expression result.
*
* @param expression must not be {@literal null}.
* @return
*/
public Lt lessThan(AggregationExpression expression) {
return createLt().lessThan(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than the given value.
*
* @param value must not be {@literal null}.
* @return
*/
public Lt lessThanValue(Object value) {
return createLt().lessThanValue(value);
}
private Lt createLt() {
return usesFieldRef() ? Lt.valueOf(fieldReference) : Lt.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than or equivalent to the value of the referenced field.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Lte lessThanEqualTo(String fieldReference) {
return createLte().lessThanEqualTo(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than or equivalent to the expression result.
*
* @param expression must not be {@literal null}.
* @return
*/
public Lte lessThanEqualTo(AggregationExpression expression) {
return createLte().lessThanEqualTo(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than or equivalent to the given value.
*
* @param value must not be {@literal null}.
* @return
*/
public Lte lessThanEqualToValue(Object value) {
return createLte().lessThanEqualToValue(value);
}
private Lte createLte() {
return usesFieldRef() ? Lte.valueOf(fieldReference) : Lte.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the values
* are not equivalent.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Ne notEqualTo(String fieldReference) {
return createNe().notEqualTo(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the values
* are not equivalent.
*
* @param expression must not be {@literal null}.
* @return
*/
public Ne notEqualTo(AggregationExpression expression) {
return createNe().notEqualTo(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the values
* are not equivalent.
*
* @param value must not be {@literal null}.
* @return
*/
public Ne notEqualToValue(Object value) {
return createNe().notEqualToValue(value);
}
private Ne createNe() {
return usesFieldRef() ? Ne.valueOf(fieldReference) : Ne.valueOf(expression);
}
private boolean usesFieldRef() {
return fieldReference != null;
}
}
/**
* {@link AggregationExpression} for {@code $cmp}.
*
* @author Christoph Strobl
*/
public static class Cmp extends AbstractAggregationExpression {
private Cmp(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$cmp";
}
/**
* Creates new {@link Cmp}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Cmp valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Cmp(asFields(fieldReference));
}
/**
* Creates new {@link Cmp}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Cmp valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Cmp(Collections.singletonList(expression));
}
/**
* Creates new {@link Cmp} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Cmp compareTo(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Cmp(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Cmp} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Cmp compareTo(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Cmp(append(expression));
}
/**
* Creates new {@link Cmp} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Cmp compareToValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Cmp(append(value));
}
}
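// Illustrative rendering (not part of the original source):
// ComparisonOperators.valueOf("a").compareToValue(10) maps to { "$cmp" : [ "$a", 10 ] },
// which yields -1, 0 or 1 depending on how "a" compares to 10.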
/**
* {@link AggregationExpression} for {@code $eq}.
*
* @author Christoph Strobl
*/
public static class Eq extends AbstractAggregationExpression {
private Eq(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$eq";
}
/**
* Creates new {@link Eq}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Eq valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Eq(asFields(fieldReference));
}
/**
* Creates new {@link Eq}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Eq valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Eq(Collections.singletonList(expression));
}
/**
* Creates new {@link Eq} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Eq equalTo(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Eq(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Eq} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Eq equalTo(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Eq(append(expression));
}
/**
* Creates new {@link Eq} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Eq equalToValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Eq(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $gt}.
*
* @author Christoph Strobl
*/
public static class Gt extends AbstractAggregationExpression {
private Gt(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$gt";
}
/**
* Creates new {@link Gt}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Gt valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Gt(asFields(fieldReference));
}
/**
* Creates new {@link Gt}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Gt valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Gt(Collections.singletonList(expression));
}
/**
* Creates new {@link Gt} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Gt greaterThan(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Gt(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Gt} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Gt greaterThan(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Gt(append(expression));
}
/**
* Creates new {@link Gt} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Gt greaterThanValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Gt(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $lt}.
*
* @author Christoph Strobl
*/
public static class Lt extends AbstractAggregationExpression {
private Lt(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$lt";
}
/**
* Creates new {@link Lt}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Lt valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Lt(asFields(fieldReference));
}
/**
* Creates new {@link Lt}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Lt valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Lt(Collections.singletonList(expression));
}
/**
* Creates new {@link Lt} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Lt lessThan(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Lt(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Lt} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Lt lessThan(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Lt(append(expression));
}
/**
* Creates new {@link Lt} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Lt lessThanValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Lt(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $gte}.
*
* @author Christoph Strobl
*/
public static class Gte extends AbstractAggregationExpression {
private Gte(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$gte";
}
/**
* Creates new {@link Gte}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Gte valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Gte(asFields(fieldReference));
}
/**
* Creates new {@link Gte}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Gte valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Gte(Collections.singletonList(expression));
}
/**
* Creates new {@link Gte} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualTo(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Gte(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Gte} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualTo(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Gte(append(expression));
}
/**
* Creates new {@link Gte} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualToValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Gte(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $lte}.
*
* @author Christoph Strobl
*/
public static class Lte extends AbstractAggregationExpression {
private Lte(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$lte";
}
/**
* Creates new {@link Lte}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Lte valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Lte(asFields(fieldReference));
}
/**
* Creates new {@link Lte}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Lte valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Lte(Collections.singletonList(expression));
}
/**
* Creates new {@link Lte} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Lte lessThanEqualTo(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Lte(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Lte} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Lte lessThanEqualTo(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Lte(append(expression));
}
/**
* Creates new {@link Lte} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Lte lessThanEqualToValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Lte(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $ne}.
*
* @author Christoph Strobl
*/
public static class Ne extends AbstractAggregationExpression {
private Ne(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$ne";
}
/**
* Creates new {@link Ne}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Ne valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Ne(asFields(fieldReference));
}
/**
* Creates new {@link Ne}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Ne valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Ne(Collections.singletonList(expression));
}
/**
* Creates new {@link Ne} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Ne notEqualTo(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Ne(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Ne} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Ne notEqualTo(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Ne(append(expression));
}
/**
* Creates new {@link Ne} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Ne notEqualToValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Ne(append(value));
}
}
}
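
A short sketch of how these comparison operators are typically combined with a projection (the "name", "score" and "qualified" fields are illustrative assumptions):

    // Projects "name" plus a "qualified" field that renders as
    // { "$gte" : [ "$score", 60 ] } in the $project stage.
    ProjectionOperation projection = Aggregation.project("name")
        .and(ComparisonOperators.valueOf("score").greaterThanEqualToValue(60))
        .as("qualified");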


@@ -0,0 +1,978 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Switch.CaseOperator;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Gateway to {@literal conditional expressions} that evaluate a boolean argument expression to determine the resulting value.
*
* @author Mark Paluch
* @since 1.10
*/
public class ConditionalOperators {
/**
* Take the field referenced by given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static ConditionalOperatorFactory when(String fieldReference) {
return new ConditionalOperatorFactory(fieldReference);
}
/**
* Take the value resulting from the given {@literal expression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static ConditionalOperatorFactory when(AggregationExpression expression) {
return new ConditionalOperatorFactory(expression);
}
/**
* Take the value resulting from the given {@literal criteriaDefinition}.
*
* @param criteriaDefinition must not be {@literal null}.
* @return
*/
public static ConditionalOperatorFactory when(CriteriaDefinition criteriaDefinition) {
return new ConditionalOperatorFactory(criteriaDefinition);
}
/**
* Creates new {@link AggregationExpression} that evaluates an expression and returns the value of the expression if
* the expression evaluates to a non-null value. If the expression evaluates to a {@literal null} value, including
* instances of undefined values or missing fields, returns the value of the replacement expression.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static IfNull.ThenBuilder ifNull(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return IfNull.ifNull(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that evaluates an expression and returns the value of the expression if
* the expression evaluates to a non-null value. If the expression evaluates to a {@literal null} value, including
* instances of undefined values or missing fields, returns the value of the replacement expression.
*
* @param expression must not be {@literal null}.
* @return
*/
public static IfNull.ThenBuilder ifNull(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return IfNull.ifNull(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates a series of {@link CaseOperator} expressions. When it
* finds an expression which evaluates to {@literal true}, {@code $switch} executes a specified expression and breaks
* out of the control flow.
*
* @param conditions must not be {@literal null}.
* @return
*/
public static Switch switchCases(CaseOperator... conditions) {
return Switch.switchCases(conditions);
}
/**
* Creates new {@link AggregationExpression} that evaluates a series of {@link CaseOperator} expressions. When it
* finds an expression which evaluates to {@literal true}, {@code $switch} executes a specified expression and breaks
* out of the control flow.
*
* @param conditions must not be {@literal null}.
* @return
*/
public static Switch switchCases(List<CaseOperator> conditions) {
return Switch.switchCases(conditions);
}
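// Illustrative usage (field name and fallback value are made up, not part of the original source):
// ConditionalOperators.ifNull("optionalCount").then(0) renders as
// { "$ifNull" : [ "$optionalCount", 0 ] }.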
public static class ConditionalOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
private final CriteriaDefinition criteriaDefinition;
/**
* Creates new {@link ConditionalOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public ConditionalOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
this.criteriaDefinition = null;
}
/**
* Creates new {@link ConditionalOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public ConditionalOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
this.criteriaDefinition = null;
}
/**
* Creates new {@link ConditionalOperatorFactory} for given {@link CriteriaDefinition}.
*
* @param criteriaDefinition must not be {@literal null}.
*/
public ConditionalOperatorFactory(CriteriaDefinition criteriaDefinition) {
Assert.notNull(criteriaDefinition, "CriteriaDefinition must not be null!");
this.fieldReference = null;
this.expression = null;
this.criteriaDefinition = criteriaDefinition;
}
/**
* Creates new {@link AggregationExpression} that evaluates a boolean expression to return one of the two specified
* return expressions.
*
* @param value must not be {@literal null}.
* @return
*/
public OtherwiseBuilder then(Object value) {
Assert.notNull(value, "Value must not be null!");
return createThenBuilder().then(value);
}
/**
* Creates new {@link AggregationExpression} that evaluates a boolean expression to return one of the two specified
* return expressions.
*
* @param expression must not be {@literal null}.
* @return
*/
public OtherwiseBuilder thenValueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return createThenBuilder().then(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates a boolean expression to return one of the two specified
* return expressions.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public OtherwiseBuilder thenValueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return createThenBuilder().then(fieldReference);
}
private ThenBuilder createThenBuilder() {
if (usesFieldRef()) {
return Cond.newBuilder().when(fieldReference);
}
return usesCriteriaDefinition() ? Cond.newBuilder().when(criteriaDefinition) : Cond.newBuilder().when(expression);
}
private boolean usesFieldRef() {
return this.fieldReference != null;
}
private boolean usesCriteriaDefinition() {
return this.criteriaDefinition != null;
}
}
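// Usage sketch (illustrative, not part of the original source): the factory obtained via
// ConditionalOperators.when(...) is completed through then(...)/thenValueOf(...) and the otherwise(...)
// methods of the resulting builder, e.g.
//
//   ConditionalOperators.when("isMember").then("discounted").otherwise("full");
//
// which renders to { $cond: { if: "$isMember", then: "discounted", else: "full" } } once mapped against
// an AggregationOperationContext; "isMember" is an assumed boolean field of the input documents.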
/**
* Encapsulates the aggregation framework {@code $ifNull} operator. Replacement values can be either {@link Field
* field references}, {@link AggregationExpression expressions}, values of simple MongoDB types or values that can be
* converted to a simple MongoDB type.
*
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/ifNull/">https://docs.mongodb.com/manual/reference/operator/aggregation/ifNull/</a>
* @author Mark Paluch
*/
public static class IfNull implements AggregationExpression {
private final Object condition;
private final Object value;
private IfNull(Object condition, Object value) {
this.condition = condition;
this.value = value;
}
/**
* Creates new {@link IfNull}.
*
* @param fieldReference the field to check for a {@literal null} value, field reference must not be {@literal null}.
* @return
*/
public static ThenBuilder ifNull(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IfNullOperatorBuilder().ifNull(fieldReference);
}
/**
* Creates new {@link IfNull}.
*
* @param expression the expression to check for a {@literal null} value, must not be {@literal null}.
* @return
*/
public static ThenBuilder ifNull(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new IfNullOperatorBuilder().ifNull(expression);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
List<Object> list = new ArrayList<Object>();
if (condition instanceof Field) {
list.add(context.getReference((Field) condition).toString());
} else if (condition instanceof AggregationExpression) {
list.add(((AggregationExpression) condition).toDbObject(context));
} else {
list.add(condition);
}
list.add(resolve(value, context));
return new BasicDBObject("$ifNull", list);
}
private Object resolve(Object value, AggregationOperationContext context) {
if (value instanceof Field) {
return context.getReference((Field) value).toString();
} else if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
} else if (value instanceof DBObject) {
return value;
}
return context.getMappedObject(new BasicDBObject("$set", value)).get("$set");
}
/**
* @author Mark Paluch
*/
public interface IfNullBuilder {
/**
* @param fieldReference the field to check for a {@literal null} value, field reference must not be
* {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder ifNull(String fieldReference);
/**
* @param expression the expression to check for a {@literal null} value, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder ifNull(AggregationExpression expression);
}
/**
* @author Mark Paluch
*/
public interface ThenBuilder {
/**
* @param value the replacement value to be used if the checked expression evaluates to {@literal null}. Can be a
* {@link DBObject}, a value that is supported by MongoDB or a value that can be converted to a MongoDB
* representation but must not be {@literal null}.
* @return
*/
IfNull then(Object value);
/**
* @param fieldReference the field holding the replacement value, must not be {@literal null}.
* @return
*/
IfNull thenValueOf(String fieldReference);
/**
* @param expression the expression yielding the replacement value, must not be {@literal null}.
* @return
*/
IfNull thenValueOf(AggregationExpression expression);
}
/**
* Builder for fluent {@link IfNull} creation.
*
* @author Mark Paluch
*/
static final class IfNullOperatorBuilder implements IfNullBuilder, ThenBuilder {
private Object condition;
private IfNullOperatorBuilder() {}
/**
* Creates a new builder for {@link IfNull}.
*
* @return never {@literal null}.
*/
public static IfNullOperatorBuilder newBuilder() {
return new IfNullOperatorBuilder();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.IfNullBuilder#ifNull(java.lang.String)
*/
public ThenBuilder ifNull(String fieldReference) {
Assert.hasText(fieldReference, "FieldReference name must not be null or empty!");
this.condition = Fields.field(fieldReference);
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.IfNullBuilder#ifNull(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public ThenBuilder ifNull(AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression name must not be null or empty!");
this.condition = expression;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#then(java.lang.Object)
*/
public IfNull then(Object value) {
return new IfNull(condition, value);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#thenValueOf(java.lang.String)
*/
public IfNull thenValueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IfNull(condition, Fields.field(fieldReference));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#thenValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
public IfNull thenValueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new IfNull(condition, expression);
}
}
}
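// Usage sketch (illustrative, not part of the original source):
//
//   ConditionalOperators.ifNull("optionalField").then("n/a");
//
// renders to { $ifNull: [ "$optionalField", "n/a" ] }, i.e. "n/a" is returned whenever
// "optionalField" (an assumed field name) resolves to null or is missing.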
/**
* {@link AggregationExpression} for {@code $switch}.
*
* @author Christoph Strobl
*/
public static class Switch extends AbstractAggregationExpression {
private Switch(java.util.Map<String, Object> values) {
super(values);
}
@Override
protected String getMongoMethod() {
return "$switch";
}
/**
* Creates new {@link Switch}.
*
* @param conditions must not be {@literal null}.
*/
public static Switch switchCases(CaseOperator... conditions) {
Assert.notNull(conditions, "Conditions must not be null!");
return switchCases(Arrays.asList(conditions));
}
/**
* Creates new {@link Switch}.
*
* @param conditions must not be {@literal null}.
*/
public static Switch switchCases(List<CaseOperator> conditions) {
Assert.notNull(conditions, "Conditions must not be null!");
return new Switch(Collections.<String, Object> singletonMap("branches", new ArrayList<CaseOperator>(conditions)));
}
public Switch defaultTo(Object value) {
return new Switch(append("default", value));
}
/**
* Encapsulates the aggregation framework case document inside a {@code $switch}-operation.
*/
public static class CaseOperator implements AggregationExpression {
private final AggregationExpression when;
private final Object then;
private CaseOperator(AggregationExpression when, Object then) {
this.when = when;
this.then = then;
}
public static ThenBuilder when(final AggregationExpression condition) {
Assert.notNull(condition, "Condition must not be null!");
return new ThenBuilder() {
@Override
public CaseOperator then(Object value) {
Assert.notNull(value, "Value must not be null!");
return new CaseOperator(condition, value);
}
};
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
DBObject dbo = new BasicDBObject("case", when.toDbObject(context));
if (then instanceof AggregationExpression) {
dbo.put("then", ((AggregationExpression) then).toDbObject(context));
} else if (then instanceof Field) {
dbo.put("then", context.getReference((Field) then).toString());
} else {
dbo.put("then", then);
}
return dbo;
}
/**
* @author Christoph Strobl
*/
public interface ThenBuilder {
/**
* Set the then {@literal value}.
*
* @param value must not be {@literal null}.
* @return
*/
CaseOperator then(Object value);
}
}
}
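// Usage sketch (illustrative, not part of the original source): every CaseOperator contributes one
// case/then pair to the "branches" array of $switch, e.g.
//
//   ConditionalOperators.switchCases(
//           CaseOperator.when(ComparisonOperators.valueOf("score").greaterThanEqualToValue(90)).then("A"),
//           CaseOperator.when(ComparisonOperators.valueOf("score").greaterThanEqualToValue(80)).then("B"))
//       .defaultTo("C");
//
// ComparisonOperators.valueOf(...).greaterThanEqualToValue(...) is assumed here as a boolean-yielding
// AggregationExpression; any other expression evaluating to a boolean works the same way.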
/**
* Encapsulates the aggregation framework {@code $cond} operator. A {@link Cond} allows nested conditions
* {@code if-then[if-then-else]-else} using {@link Field}, {@link CriteriaDefinition}, {@link AggregationExpression}
* or a {@link DBObject custom} condition. Replacement values can be either {@link Field field references},
* {@link AggregationExpression expressions}, values of simple MongoDB types or values that can be converted to a
* simple MongoDB type.
*
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/cond/">https://docs.mongodb.com/manual/reference/operator/aggregation/cond/</a>
* @author Mark Paluch
* @author Christoph Strobl
*/
public static class Cond implements AggregationExpression {
private final Object condition;
private final Object thenValue;
private final Object otherwiseValue;
/**
* Creates a new {@link Cond} for a given {@link Field} and {@code then}/{@code otherwise} values.
*
* @param condition must not be {@literal null}.
* @param thenValue must not be {@literal null}.
* @param otherwiseValue must not be {@literal null}.
*/
private Cond(Field condition, Object thenValue, Object otherwiseValue) {
this((Object) condition, thenValue, otherwiseValue);
}
/**
* Creates a new {@link Cond} for a given {@link CriteriaDefinition} and {@code then}/{@code otherwise} values.
*
* @param condition must not be {@literal null}.
* @param thenValue must not be {@literal null}.
* @param otherwiseValue must not be {@literal null}.
*/
private Cond(CriteriaDefinition condition, Object thenValue, Object otherwiseValue) {
this((Object) condition, thenValue, otherwiseValue);
}
private Cond(Object condition, Object thenValue, Object otherwiseValue) {
Assert.notNull(condition, "Condition must not be null!");
Assert.notNull(thenValue, "Then value must not be null!");
Assert.notNull(otherwiseValue, "Otherwise value must not be null!");
assertNotBuilder(condition, "Condition");
assertNotBuilder(thenValue, "Then value");
assertNotBuilder(otherwiseValue, "Otherwise value");
this.condition = condition;
this.thenValue = thenValue;
this.otherwiseValue = otherwiseValue;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
BasicDBObject condObject = new BasicDBObject();
condObject.append("if", resolveCriteria(context, condition));
condObject.append("then", resolveValue(context, thenValue));
condObject.append("else", resolveValue(context, otherwiseValue));
return new BasicDBObject("$cond", condObject);
}
private Object resolveValue(AggregationOperationContext context, Object value) {
if (value instanceof DBObject || value instanceof Field) {
return resolve(context, value);
}
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
return context.getMappedObject(new BasicDBObject("$set", value)).get("$set");
}
private Object resolveCriteria(AggregationOperationContext context, Object value) {
if (value instanceof DBObject || value instanceof Field) {
return resolve(context, value);
}
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
if (value instanceof CriteriaDefinition) {
DBObject mappedObject = context.getMappedObject(((CriteriaDefinition) value).getCriteriaObject());
List<Object> clauses = new ArrayList<Object>();
clauses.addAll(getClauses(context, mappedObject));
return clauses.size() == 1 ? clauses.get(0) : clauses;
}
throw new InvalidDataAccessApiUsageException(
String.format("Invalid value in condition. Supported: DBObject, Field references, Criteria, got: %s", value));
}
private List<Object> getClauses(AggregationOperationContext context, DBObject mappedObject) {
List<Object> clauses = new ArrayList<Object>();
for (String key : mappedObject.keySet()) {
Object predicate = mappedObject.get(key);
clauses.addAll(getClauses(context, key, predicate));
}
return clauses;
}
private List<Object> getClauses(AggregationOperationContext context, String key, Object predicate) {
List<Object> clauses = new ArrayList<Object>();
if (predicate instanceof List) {
List<Object> args = new ArrayList<Object>();
for (Object clause : (List<?>) predicate) {
if (clause instanceof DBObject) {
args.addAll(getClauses(context, (DBObject) clause));
}
}
clauses.add(new BasicDBObject(key, args));
} else if (predicate instanceof DBObject) {
DBObject nested = (DBObject) predicate;
for (String s : nested.keySet()) {
if (!isKeyword(s)) {
continue;
}
List<Object> args = new ArrayList<Object>();
args.add("$" + key);
args.add(nested.get(s));
clauses.add(new BasicDBObject(s, args));
}
} else if (!isKeyword(key)) {
List<Object> args = new ArrayList<Object>();
args.add("$" + key);
args.add(predicate);
clauses.add(new BasicDBObject("$eq", args));
}
return clauses;
}
/**
* Returns whether the given {@link String} is a MongoDB keyword.
*
* @param candidate
* @return
*/
private boolean isKeyword(String candidate) {
return candidate.startsWith("$");
}
private Object resolve(AggregationOperationContext context, Object value) {
if (value instanceof DBObject) {
return context.getMappedObject((DBObject) value);
}
return context.getReference((Field) value).toString();
}
private void assertNotBuilder(Object toCheck, String name) {
Assert.isTrue(!ClassUtils.isAssignableValue(ConditionalExpressionBuilder.class, toCheck),
String.format("%s must not be of type %s", name, ConditionalExpressionBuilder.class.getSimpleName()));
}
/**
* Get a builder that allows fluent creation of {@link Cond}.
*
* @return never {@literal null}.
*/
public static WhenBuilder newBuilder() {
return ConditionalExpressionBuilder.newBuilder();
}
/**
* Start creating new {@link Cond} by providing the boolean expression used in {@code if}.
*
* @param booleanExpression must not be {@literal null}.
* @return never {@literal null}.
*/
public static ThenBuilder when(DBObject booleanExpression) {
return ConditionalExpressionBuilder.newBuilder().when(booleanExpression);
}
/**
* Start creating new {@link Cond} by providing the {@link AggregationExpression} used in {@code if}.
*
* @param expression expression that yields a boolean result, must not be {@literal null}.
* @return never {@literal null}.
*/
public static ThenBuilder when(AggregationExpression expression) {
return ConditionalExpressionBuilder.newBuilder().when(expression);
}
/**
* Start creating new {@link Cond} by providing the field reference used in {@code if}.
*
* @param booleanField name of a field holding a boolean value, must not be {@literal null}.
* @return never {@literal null}.
*/
public static ThenBuilder when(String booleanField) {
return ConditionalExpressionBuilder.newBuilder().when(booleanField);
}
/**
* Start creating new {@link Cond} by providing the {@link CriteriaDefinition} used in {@code if}.
*
* @param criteria criteria to evaluate, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
public static ThenBuilder when(CriteriaDefinition criteria) {
return ConditionalExpressionBuilder.newBuilder().when(criteria);
}
/**
* @author Mark Paluch
*/
public interface WhenBuilder {
/**
* @param booleanExpression expression that yields a boolean result, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder when(DBObject booleanExpression);
/**
* @param expression expression that yields a boolean result, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder when(AggregationExpression expression);
/**
* @param booleanField name of a field holding a boolean value, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder when(String booleanField);
/**
* @param criteria criteria to evaluate, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder when(CriteriaDefinition criteria);
}
/**
* @author Mark Paluch
*/
public interface ThenBuilder {
/**
* @param value the value to be used if the condition evaluates {@literal true}. Can be a {@link DBObject}, a
* value that is supported by MongoDB or a value that can be converted to a MongoDB representation but
* must not be {@literal null}.
* @return the {@link OtherwiseBuilder}
*/
OtherwiseBuilder then(Object value);
/**
* @param fieldReference must not be {@literal null}.
* @return the {@link OtherwiseBuilder}
*/
OtherwiseBuilder thenValueOf(String fieldReference);
/**
* @param expression must not be {@literal null}.
* @return the {@link OtherwiseBuilder}
*/
OtherwiseBuilder thenValueOf(AggregationExpression expression);
}
/**
* @author Mark Paluch
*/
public interface OtherwiseBuilder {
/**
* @param value the value to be used if the condition evaluates {@literal false}. Can be a {@link DBObject}, a
* value that is supported by MongoDB or a value that can be converted to a MongoDB representation but
* must not be {@literal null}.
* @return the {@link Cond}
*/
Cond otherwise(Object value);
/**
* @param fieldReference must not be {@literal null}.
* @return the {@link Cond}
*/
Cond otherwiseValueOf(String fieldReference);
/**
* @param expression must not be {@literal null}.
* @return the {@link Cond}
*/
Cond otherwiseValueOf(AggregationExpression expression);
}
/**
* Builder for fluent {@link Cond} creation.
*
* @author Mark Paluch
*/
static class ConditionalExpressionBuilder implements WhenBuilder, ThenBuilder, OtherwiseBuilder {
private Object condition;
private Object thenValue;
private ConditionalExpressionBuilder() {}
/**
* Creates a new builder for {@link Cond}.
*
* @return never {@literal null}.
*/
public static ConditionalExpressionBuilder newBuilder() {
return new ConditionalExpressionBuilder();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(com.mongodb.DBObject)
*/
@Override
public ConditionalExpressionBuilder when(DBObject booleanExpression) {
Assert.notNull(booleanExpression, "'Boolean expression' must not be null!");
this.condition = booleanExpression;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(org.springframework.data.mongodb.core.query.CriteriaDefinition)
*/
@Override
public ThenBuilder when(CriteriaDefinition criteria) {
Assert.notNull(criteria, "Criteria must not be null!");
this.condition = criteria;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public ThenBuilder when(AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression field must not be null!");
this.condition = expression;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(java.lang.String)
*/
@Override
public ThenBuilder when(String booleanField) {
Assert.hasText(booleanField, "Boolean field name must not be null or empty!");
this.condition = Fields.field(booleanField);
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#then(java.lang.Object)
*/
@Override
public OtherwiseBuilder then(Object thenValue) {
Assert.notNull(thenValue, "Then-value must not be null!");
this.thenValue = thenValue;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#thenValueOf(java.lang.String)
*/
@Override
public OtherwiseBuilder thenValueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.thenValue = Fields.field(fieldReference);
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#thenValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public OtherwiseBuilder thenValueOf(AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression must not be null!");
this.thenValue = expression;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwise(java.lang.Object)
*/
@Override
public Cond otherwise(Object otherwiseValue) {
Assert.notNull(otherwiseValue, "Value must not be null!");
return new Cond(condition, thenValue, otherwiseValue);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwiseValueOf(java.lang.String)
*/
@Override
public Cond otherwiseValueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Cond(condition, thenValue, Fields.field(fieldReference));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwiseValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public Cond otherwiseValueOf(AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression must not be null!");
return new Cond(condition, thenValue, expression);
}
}
}
}
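Putting these pieces together, a conditional expression is typically attached to a projection stage. The following is a minimal sketch only: the field names ("item", "qty") and the "pricingTier" alias are made up for illustration, and it assumes ProjectionOperation's and(AggregationExpression) support from the 1.10 API:

    import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

    import org.springframework.data.mongodb.core.aggregation.Aggregation;
    import org.springframework.data.mongodb.core.aggregation.ConditionalOperators;
    import org.springframework.data.mongodb.core.query.Criteria;

    // classify documents by quantity: "bulk" for qty >= 250, "retail" otherwise
    Aggregation aggregation = newAggregation(
            project("item")
                    .and(ConditionalOperators.when(Criteria.where("qty").gte(250))
                            .then("bulk")
                            .otherwise("retail"))
                    .as("pricingTier"));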


@@ -0,0 +1,82 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $count}-operation. <br />
* We recommend using the static factory method {@link Aggregation#count()} instead of creating instances of this class
* directly.
*
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/count/#pipe._S_count">https://docs.mongodb.com/manual/reference/operator/aggregation/count/</a>
* @author Mark Paluch
* @since 1.10
*/
public class CountOperation implements FieldsExposingAggregationOperation {
private final String fieldName;
/**
* Creates a new {@link CountOperation} for the given field name.
*
* @param fieldName must not be {@literal null} or empty.
*/
public CountOperation(String fieldName) {
Assert.hasText(fieldName, "Field name must not be null or empty!");
this.fieldName = fieldName;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$count", fieldName);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return ExposedFields.from(new ExposedField(fieldName, true));
}
/**
* Builder for {@link CountOperation}.
*
* @author Mark Paluch
*/
public static class CountOperationBuilder {
/**
* Returns the {@link CountOperation} to be applied, exposing the count result under the given field name.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
public CountOperation as(String fieldName) {
return new CountOperation(fieldName);
}
}
}
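As the Javadoc above recommends, the operation is usually obtained via Aggregation.count(). A minimal sketch, with an assumed "status" field and a "total" alias:

    import org.springframework.data.mongodb.core.aggregation.Aggregation;
    import org.springframework.data.mongodb.core.query.Criteria;

    // counts the documents that reach this stage and exposes the result as "total"
    Aggregation aggregation = Aggregation.newAggregation(
            Aggregation.match(Criteria.where("status").is("ACTIVE")),
            Aggregation.count().as("total"));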


@@ -0,0 +1,67 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.util.Assert;
/**
* Gateway to {@literal data type} expressions.
*
* @author Christoph Strobl
* @since 1.10
* @soundtrack Clawfinger - Catch Me
*/
public class DataTypeOperators {
/**
* Return the BSON data type of the given {@literal field}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Type typeOf(String fieldReference) {
return Type.typeOf(fieldReference);
}
/**
* {@link AggregationExpression} for {@code $type}.
*
* @author Christoph Strobl
*/
public static class Type extends AbstractAggregationExpression {
private Type(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$type";
}
/**
* Creates new {@link Type}.
*
* @param field must not be {@literal null}.
* @return
*/
public static Type typeOf(String field) {
Assert.notNull(field, "Field must not be null!");
return new Type(Fields.field(field));
}
}
}
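A minimal usage sketch, projecting the BSON type of an assumed "value" field into a "valueType" alias:

    import org.springframework.data.mongodb.core.aggregation.Aggregation;

    // adds a "valueType" field holding the BSON type name of "value"
    Aggregation aggregation = Aggregation.newAggregation(
            Aggregation.project("value")
                    .and(DataTypeOperators.typeOf("value")).as("valueType"));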


@@ -0,0 +1,837 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.LinkedHashMap;
import org.springframework.data.mongodb.core.aggregation.ArithmeticOperators.ArithmeticOperatorFactory;
import org.springframework.util.Assert;
/**
* Gateway to {@literal Date} aggregation operations.
*
* @author Christoph Strobl
* @since 1.10
*/
public class DateOperators {
/**
* Take the date referenced by given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static DateOperatorFactory dateOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new DateOperatorFactory(fieldReference);
}
/**
* Take the date resulting from the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static DateOperatorFactory dateOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new DateOperatorFactory(expression);
}
/**
* @author Christoph Strobl
*/
public static class DateOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
/**
* Creates new {@link DateOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public DateOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
}
/**
* Creates new {@link DateOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public DateOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
}
/**
* Creates new {@link AggregationExpression} that returns the day of the year for a date as a number between 1 and
* 366.
*
* @return
*/
public DayOfYear dayOfYear() {
return usesFieldRef() ? DayOfYear.dayOfYear(fieldReference) : DayOfYear.dayOfYear(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the day of the month for a date as a number between 1 and
* 31.
*
* @return
*/
public DayOfMonth dayOfMonth() {
return usesFieldRef() ? DayOfMonth.dayOfMonth(fieldReference) : DayOfMonth.dayOfMonth(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the day of the week for a date as a number between 1
* (Sunday) and 7 (Saturday).
*
* @return
*/
public DayOfWeek dayOfWeek() {
return usesFieldRef() ? DayOfWeek.dayOfWeek(fieldReference) : DayOfWeek.dayOfWeek(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the year portion of a date.
*
* @return
*/
public Year year() {
return usesFieldRef() ? Year.yearOf(fieldReference) : Year.yearOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the month of a date as a number between 1 and 12.
*
* @return
*/
public Month month() {
return usesFieldRef() ? Month.monthOf(fieldReference) : Month.monthOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the week of the year for a date as a number between 0 and
* 53.
*
* @return
*/
public Week week() {
return usesFieldRef() ? Week.weekOf(fieldReference) : Week.weekOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the hour portion of a date as a number between 0 and 23.
*
* @return
*/
public Hour hour() {
return usesFieldRef() ? Hour.hourOf(fieldReference) : Hour.hourOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the minute portion of a date as a number between 0 and 59.
*
* @return
*/
public Minute minute() {
return usesFieldRef() ? Minute.minuteOf(fieldReference) : Minute.minuteOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the second portion of a date as a number between 0 and 59,
* but can be 60 to account for leap seconds.
*
* @return
*/
public Second second() {
return usesFieldRef() ? Second.secondOf(fieldReference) : Second.secondOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the millisecond portion of a date as an integer between 0
* and 999.
*
* @return
*/
public Millisecond millisecond() {
return usesFieldRef() ? Millisecond.millisecondOf(fieldReference) : Millisecond.millisecondOf(expression);
}
/**
* Creates new {@link AggregationExpression} that converts a date object to a string according to a user-specified
* {@literal format}.
*
* @param format must not be {@literal null}.
* @return
*/
public DateToString toString(String format) {
return (usesFieldRef() ? DateToString.dateOf(fieldReference) : DateToString.dateOf(expression)).toString(format);
}
/**
* Creates new {@link AggregationExpression} that returns the weekday number in ISO 8601 format, ranging from 1 (for
* Monday) to 7 (for Sunday).
*
* @return
*/
public IsoDayOfWeek isoDayOfWeek() {
return usesFieldRef() ? IsoDayOfWeek.isoDayOfWeek(fieldReference) : IsoDayOfWeek.isoDayOfWeek(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the week number in ISO 8601 format, ranging from 1 to 53.
*
* @return
*/
public IsoWeek isoWeek() {
return usesFieldRef() ? IsoWeek.isoWeekOf(fieldReference) : IsoWeek.isoWeekOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the year number in ISO 8601 format.
*
* @return
*/
public IsoWeekYear isoWeekYear() {
return usesFieldRef() ? IsoWeekYear.isoWeekYearOf(fieldReference) : IsoWeekYear.isoWeekYearOf(expression);
}
private boolean usesFieldRef() {
return fieldReference != null;
}
}
/**
* {@link AggregationExpression} for {@code $dayOfYear}.
*
* @author Christoph Strobl
*/
public static class DayOfYear extends AbstractAggregationExpression {
private DayOfYear(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$dayOfYear";
}
/**
* Creates new {@link DayOfYear}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static DayOfYear dayOfYear(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new DayOfYear(Fields.field(fieldReference));
}
/**
* Creates new {@link DayOfYear}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static DayOfYear dayOfYear(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new DayOfYear(expression);
}
}
/**
* {@link AggregationExpression} for {@code $dayOfMonth}.
*
* @author Christoph Strobl
*/
public static class DayOfMonth extends AbstractAggregationExpression {
private DayOfMonth(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$dayOfMonth";
}
/**
* Creates new {@link DayOfMonth}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static DayOfMonth dayOfMonth(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new DayOfMonth(Fields.field(fieldReference));
}
/**
* Creates new {@link DayOfMonth}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static DayOfMonth dayOfMonth(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new DayOfMonth(expression);
}
}
/**
* {@link AggregationExpression} for {@code $dayOfWeek}.
*
* @author Christoph Strobl
*/
public static class DayOfWeek extends AbstractAggregationExpression {
private DayOfWeek(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$dayOfWeek";
}
/**
* Creates new {@link DayOfWeek}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static DayOfWeek dayOfWeek(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new DayOfWeek(Fields.field(fieldReference));
}
/**
* Creates new {@link DayOfWeek}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static DayOfWeek dayOfWeek(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new DayOfWeek(expression);
}
}
/**
* {@link AggregationExpression} for {@code $year}.
*
* @author Christoph Strobl
*/
public static class Year extends AbstractAggregationExpression {
private Year(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$year";
}
/**
* Creates new {@link Year}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Year yearOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Year(Fields.field(fieldReference));
}
/**
* Creates new {@link Year}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Year yearOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Year(expression);
}
}
/**
* {@link AggregationExpression} for {@code $month}.
*
* @author Christoph Strobl
*/
public static class Month extends AbstractAggregationExpression {
private Month(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$month";
}
/**
* Creates new {@link Month}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Month monthOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Month(Fields.field(fieldReference));
}
/**
* Creates new {@link Month}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Month monthOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Month(expression);
}
}
/**
* {@link AggregationExpression} for {@code $week}.
*
* @author Christoph Strobl
*/
public static class Week extends AbstractAggregationExpression {
private Week(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$week";
}
/**
* Creates new {@link Week}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Week weekOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Week(Fields.field(fieldReference));
}
/**
* Creates new {@link Week}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Week weekOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Week(expression);
}
}
/**
* {@link AggregationExpression} for {@code $hour}.
*
* @author Christoph Strobl
*/
public static class Hour extends AbstractAggregationExpression {
private Hour(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$hour";
}
/**
* Creates new {@link Hour}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Hour hourOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Hour(Fields.field(fieldReference));
}
/**
* Creates new {@link Hour}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Hour hourOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Hour(expression);
}
}
/**
* {@link AggregationExpression} for {@code $minute}.
*
* @author Christoph Strobl
*/
public static class Minute extends AbstractAggregationExpression {
private Minute(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$minute";
}
/**
* Creates new {@link Minute}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Minute minuteOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Minute(Fields.field(fieldReference));
}
/**
* Creates new {@link Minute}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Minute minuteOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Minute(expression);
}
}
/**
* {@link AggregationExpression} for {@code $second}.
*
* @author Christoph Strobl
*/
public static class Second extends AbstractAggregationExpression {
private Second(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$second";
}
/**
* Creates new {@link Second}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Second secondOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Second(Fields.field(fieldReference));
}
/**
* Creates new {@link Second}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Second secondOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Second(expression);
}
}
/**
* {@link AggregationExpression} for {@code $millisecond}.
*
* @author Christoph Strobl
*/
public static class Millisecond extends AbstractAggregationExpression {
private Millisecond(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$millisecond";
}
/**
* Creates new {@link Millisecond}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Millisecond millisecondOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Millisecond(Fields.field(fieldReference));
}
/**
* Creates new {@link Millisecond}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Millisecond millisecondOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Millisecond(expression);
}
}
/**
* {@link AggregationExpression} for {@code $dateToString}.
*
* @author Christoph Strobl
*/
public static class DateToString extends AbstractAggregationExpression {
private DateToString(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$dateToString";
}
/**
* Creates new {@link FormatBuilder} that allows defining the date format to apply.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static FormatBuilder dateOf(final String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new FormatBuilder() {
@Override
public DateToString toString(String format) {
Assert.notNull(format, "Format must not be null!");
return new DateToString(argumentMap(Fields.field(fieldReference), format));
}
};
}
/**
* Creates new {@link FormatBuilder} that allows defining the date format to apply.
*
* @param expression must not be {@literal null}.
* @return
*/
public static FormatBuilder dateOf(final AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new FormatBuilder() {
@Override
public DateToString toString(String format) {
Assert.notNull(format, "Format must not be null!");
return new DateToString(argumentMap(expression, format));
}
};
}
private static java.util.Map<String, Object> argumentMap(Object date, String format) {
java.util.Map<String, Object> args = new LinkedHashMap<String, Object>(2);
args.put("format", format);
args.put("date", date);
return args;
}
public interface FormatBuilder {
/**
* Creates new {@link DateToString} that formats the previously defined date using the given {@literal format}.
*
* @param format must not be {@literal null}.
* @return
*/
DateToString toString(String format);
}
}
/**
* {@link AggregationExpression} for {@code $isoDayOfWeek}.
*
* @author Christoph Strobl
*/
public static class IsoDayOfWeek extends AbstractAggregationExpression {
private IsoDayOfWeek(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$isoDayOfWeek";
}
/**
* Creates new {@link IsoDayOfWeek}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static IsoDayOfWeek isoDayOfWeek(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IsoDayOfWeek(Fields.field(fieldReference));
}
/**
* Creates new {@link IsoDayOfWeek}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static IsoDayOfWeek isoDayOfWeek(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new IsoDayOfWeek(expression);
}
}
/**
* {@link AggregationExpression} for {@code $isoWeek}.
*
* @author Christoph Strobl
*/
public static class IsoWeek extends AbstractAggregationExpression {
private IsoWeek(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$isoWeek";
}
/**
* Creates new {@link IsoWeek}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static IsoWeek isoWeekOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IsoWeek(Fields.field(fieldReference));
}
/**
* Creates new {@link IsoWeek}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static IsoWeek isoWeekOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new IsoWeek(expression);
}
}
/**
* {@link AggregationExpression} for {@code $isoWeekYear}.
*
* @author Christoph Strobl
*/
public static class IsoWeekYear extends AbstractAggregationExpression {
private IsoWeekYear(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$isoWeekYear";
}
/**
* Creates new {@link IsoWeekYear}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static IsoWeekYear isoWeekYearOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IsoWeekYear(Fields.field(fieldReference));
}
/**
* Creates new {@link IsoWeekYear}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static IsoWeekYear isoWeekYearOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new IsoWeekYear(expression);
}
}
}
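A combined usage sketch of the DateOperatorFactory above; "purchaseDate" and the output aliases are assumed names:

    import org.springframework.data.mongodb.core.aggregation.Aggregation;

    // extracts date parts and a formatted representation of "purchaseDate"
    Aggregation aggregation = Aggregation.newAggregation(
            Aggregation.project()
                    .and(DateOperators.dateOf("purchaseDate").year()).as("year")
                    .and(DateOperators.dateOf("purchaseDate").month()).as("month")
                    .and(DateOperators.dateOf("purchaseDate").toString("%Y-%m-%d")).as("formatted"));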


@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,14 +22,17 @@ import java.util.Iterator;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.util.Assert;
import org.springframework.util.CompositeIterator;
import org.springframework.util.ObjectUtils;
/**
* Value object to capture the fields exposed by an {@link AggregationOperation}.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Mark Paluch
* @since 1.3
*/
public final class ExposedFields implements Iterable<ExposedField> {
@@ -88,7 +91,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
/**
* Creates a new {@link ExposedFields} instance for the given fields in either sythetic or non-synthetic way.
* Creates a new {@link ExposedFields} instance for the given fields in either synthetic or non-synthetic way.
*
* @param fields must not be {@literal null}.
* @param synthetic
@@ -103,11 +106,11 @@ public final class ExposedFields implements Iterable<ExposedField> {
result.add(new ExposedField(field, synthetic));
}
return ExposedFields.from(result);
return from(result);
}
/**
* Creates a new {@link ExposedFields} with the given orignals and synthetics.
* Creates a new {@link ExposedFields} with the given originals and synthetics.
*
* @param originals must not be {@literal null}.
* @param synthetic must not be {@literal null}.
@@ -203,8 +206,13 @@ public final class ExposedFields implements Iterable<ExposedField> {
public Iterator<ExposedField> iterator() {
CompositeIterator<ExposedField> iterator = new CompositeIterator<ExposedField>();
iterator.add(syntheticFields.iterator());
iterator.add(originalFields.iterator());
if (!syntheticFields.isEmpty()) {
iterator.add(syntheticFields.iterator());
}
if (!originalFields.isEmpty()) {
iterator.add(originalFields.iterator());
}
return iterator;
}
@@ -268,14 +276,21 @@ public final class ExposedFields implements Iterable<ExposedField> {
return field.isAliased();
}
/**
* @return the synthetic
*/
public boolean isSynthetic() {
return synthetic;
}
/**
* Returns whether the field can be referred to using the given name.
*
* @param input
* @param name
* @return
*/
public boolean canBeReferredToBy(String input) {
return getTarget().equals(input);
public boolean canBeReferredToBy(String name) {
return getName().equals(name) || getTarget().equals(name);
}
/*
@@ -323,12 +338,36 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
}
/**
* A reference to an {@link ExposedField}.
*
* @author Christoph Strobl
* @since 1.10
*/
interface FieldReference {
/**
* Returns the raw, unqualified reference, i.e. the field reference without a {@literal $} prefix.
*
* @return
*/
String getRaw();
/**
* Returns the reference value for the given field reference. Will return 1 for a synthetic, unaliased field or the
* raw rendering of the reference otherwise.
*
* @return
*/
Object getReferenceValue();
}
/**
* A reference to an {@link ExposedField}.
*
* @author Oliver Gierke
*/
static class FieldReference {
static class DirectFieldReference implements FieldReference {
private final ExposedField field;
@@ -337,16 +376,16 @@ public final class ExposedFields implements Iterable<ExposedField> {
*
* @param field must not be {@literal null}.
*/
public FieldReference(ExposedField field) {
public DirectFieldReference(ExposedField field) {
Assert.notNull(field, "ExposedField must not be null!");
this.field = field;
}
/**
* Returns the raw, unqualified reference, i.e. the field reference without a {@literal $} prefix.
*
* @return
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference#getRaw()
*/
public String getRaw() {
@@ -354,11 +393,9 @@ public final class ExposedFields implements Iterable<ExposedField> {
return field.synthetic ? target : String.format("%s.%s", Fields.UNDERSCORE_ID, target);
}
/**
* Returns the referenve value for the given field reference. Will return 1 for a synthetic, unaliased field or the
* raw rendering of the reference otherwise.
*
* @return
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference#getReferenceValue()
*/
public Object getReferenceValue() {
return field.synthetic && !field.isAliased() ? 1 : toString();
@@ -370,6 +407,11 @@ public final class ExposedFields implements Iterable<ExposedField> {
*/
@Override
public String toString() {
if(getRaw().startsWith("$")) {
return getRaw();
}
return String.format("$%s", getRaw());
}
@@ -384,11 +426,11 @@ public final class ExposedFields implements Iterable<ExposedField> {
return true;
}
if (!(obj instanceof FieldReference)) {
if (!(obj instanceof DirectFieldReference)) {
return false;
}
FieldReference that = (FieldReference) obj;
DirectFieldReference that = (DirectFieldReference) obj;
return this.field.equals(that.field);
}
@@ -402,4 +444,78 @@ public final class ExposedFields implements Iterable<ExposedField> {
return field.hashCode();
}
}
/**
* A {@link FieldReference} to a {@link Field} used within a nested {@link AggregationExpression}.
*
* @author Christoph Strobl
* @since 1.10
*/
static class ExpressionFieldReference implements FieldReference {
private FieldReference delegate;
/**
* Creates a new {@link FieldReference} for the given {@link ExposedField}.
*
* @param field must not be {@literal null}.
*/
public ExpressionFieldReference(FieldReference field) {
delegate = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference#getRaw()
*/
@Override
public String getRaw() {
return delegate.getRaw();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference#getReferenceValue()
*/
@Override
public Object getReferenceValue() {
return delegate.getReferenceValue();
}
@Override
public String toString() {
String fieldRef = delegate.toString();
if (fieldRef.startsWith("$$")) {
return fieldRef;
}
if (fieldRef.startsWith("$")) {
return "$" + fieldRef;
}
return fieldRef;
}
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof ExpressionFieldReference)) {
return false;
}
ExpressionFieldReference that = (ExpressionFieldReference) obj;
return ObjectUtils.nullSafeEquals(this.delegate, that.delegate);
}
@Override
public int hashCode() {
return delegate.hashCode();
}
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;
@@ -24,24 +25,32 @@ import com.mongodb.DBObject;
/**
* {@link AggregationOperationContext} that combines the available field references from a given
* {@code AggregationOperationContext} and an {@link FieldsExposingAggregationOperation}.
*
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @since 1.4
*/
class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
private final ExposedFields exposedFields;
private final AggregationOperationContext rootContext;
/**
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}.
*
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
* {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
*
* @param exposedFields must not be {@literal null}.
* @param rootContext must not be {@literal null}.
*/
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields) {
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields,
AggregationOperationContext rootContext) {
Assert.notNull(exposedFields, "ExposedFields must not be null!");
Assert.notNull(rootContext, "RootContext must not be null!");
this.exposedFields = exposedFields;
this.rootContext = rootContext;
}
/*
@@ -50,7 +59,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
return rootContext.getMappedObject(dbObject);
}
/*
@@ -59,7 +68,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public FieldReference getReference(Field field) {
return getReference(field.getTarget());
return getReference(field, field.getTarget());
}
/*
@@ -68,13 +77,59 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
*/
@Override
public FieldReference getReference(String name) {
return getReference(null, name);
}
ExposedField field = exposedFields.getField(name);
/**
* Returns a {@link FieldReference} to the given {@link Field} with the given {@code name}.
*
* @param field may be {@literal null}
* @param name must not be {@literal null}
* @return
*/
private FieldReference getReference(Field field, String name) {
if (field != null) {
return new FieldReference(field);
Assert.notNull(name, "Name must not be null!");
FieldReference exposedField = resolveExposedField(field, name);
if (exposedField != null) {
return exposedField;
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
}
/**
* Resolves a {@literal field}/{@literal name} to a {@link FieldReference} if possible.
*
* @param field may be {@literal null}
* @param name must not be {@literal null}
* @return the resolved reference or {@literal null}
*/
protected FieldReference resolveExposedField(Field field, String name) {
ExposedField exposedField = exposedFields.getField(name);
if (exposedField != null) {
if (field != null) {
// we return a FieldReference to the given field directly to make sure that we reference the proper alias here.
return new DirectFieldReference(new ExposedField(field, exposedField.isSynthetic()));
}
return new DirectFieldReference(exposedField);
}
if (name.contains(".")) {
// for nested field references we only check that the root field exists.
ExposedField rootField = exposedFields.getField(name.split("\\.")[0]);
if (rootField != null) {
// We have to set synthetic to true in order to render the field name as-is.
return new DirectFieldReference(new ExposedField(name, true));
}
}
return null;
}
}
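In practice this context is what lets a later stage reference a field under the alias exposed by an earlier stage. A minimal sketch, assuming static imports of the Aggregation factory methods and purely illustrative field names:

    // "first" is exposed by the projection stage; the subsequent sort resolves it
    // through an ExposedFieldsAggregationOperationContext rather than against the raw document.
    Aggregation aggregation = newAggregation(
            project("lastname").and("firstname").as("first"),
            sort(Sort.Direction.ASC, "first"));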

View File

@@ -0,0 +1,228 @@
/*
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Output;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $facet}-operation. <br />
* Facet of {@link AggregationOperation}s to be used in an {@link Aggregation}. Processes multiple
* {@link AggregationOperation} pipelines within a single stage on the same set of input documents. Each sub-pipeline
* has its own field in the output document where its results are stored as an array of documents.
* {@link FacetOperation} enables various aggregations on the same set of input documents, without needing to retrieve
* the input documents multiple times. <br />
* As of MongoDB 3.4, {@link FacetOperation} cannot be used with nested pipelines containing {@link GeoNearOperation},
* {@link OutOperation} and {@link FacetOperation}. <br />
* We recommend using the static factory method {@link Aggregation#facet()} instead of creating instances of this class
* directly.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
* @see <a href="https://docs.mongodb.org/manual/reference/aggregation/facet/">MongoDB Aggregation Framework: $facet</a>
*/
public class FacetOperation implements FieldsExposingAggregationOperation {
/**
* Empty (initial) {@link FacetOperation}.
*/
public static final FacetOperation EMPTY = new FacetOperation();
private final Facets facets;
/**
* Creates a new {@link FacetOperation}.
*/
public FacetOperation() {
this(Facets.EMPTY);
}
private FacetOperation(Facets facets) {
this.facets = facets;
}
/**
* Creates a new {@link FacetOperationBuilder} to append a new facet using {@literal operations}. <br />
* {@link FacetOperationBuilder} takes a pipeline of {@link AggregationOperation} to categorize documents into a
* single facet.
*
* @param operations must not be {@literal null} or empty.
* @return
*/
public FacetOperationBuilder and(AggregationOperation... operations) {
Assert.notNull(operations, "AggregationOperations must not be null!");
Assert.notEmpty(operations, "AggregationOperations must not be empty!");
return new FacetOperationBuilder(facets, Arrays.asList(operations));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$facet", facets.toDBObject(context));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return facets.asExposedFields();
}
/**
* Builder for {@link FacetOperation} that adds the existing facets and a new pipeline of {@link AggregationOperation}s to
* a new {@link FacetOperation}.
*
* @author Mark Paluch
*/
public static class FacetOperationBuilder {
private final Facets current;
private final List<AggregationOperation> operations;
private FacetOperationBuilder(Facets current, List<AggregationOperation> operations) {
this.current = current;
this.operations = operations;
}
/**
* Creates a new {@link FacetOperation} that contains the configured pipeline of {@link AggregationOperation}
* exposed as {@literal fieldName} in the resulting facet document.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
public FacetOperation as(String fieldName) {
Assert.hasText(fieldName, "FieldName must not be null or empty!");
return new FacetOperation(current.and(fieldName, operations));
}
}
/**
* Encapsulates multiple {@link Facet}s.
*
* @author Mark Paluch
*/
private static class Facets {
private static final Facets EMPTY = new Facets(Collections.<Facet> emptyList());
private List<Facet> facets;
/**
* Creates a new {@link Facets} from the given {@link List} of {@link Facet}s.
*
* @param facets
*/
private Facets(List<Facet> facets) {
this.facets = facets;
}
/**
* @return the {@link ExposedFields} derived from {@link Output}.
*/
ExposedFields asExposedFields() {
ExposedFields fields = ExposedFields.from();
for (Facet facet : facets) {
fields = fields.and(facet.getExposedField());
}
return fields;
}
DBObject toDBObject(AggregationOperationContext context) {
DBObject dbObject = new BasicDBObject(facets.size());
for (Facet facet : facets) {
dbObject.put(facet.getExposedField().getName(), facet.toDBObjects(context));
}
return dbObject;
}
/**
* Adds a facet to this {@link Facets}.
*
* @param fieldName must not be {@literal null}.
* @param operations must not be {@literal null}.
* @return the new {@link Facets}.
*/
Facets and(String fieldName, List<AggregationOperation> operations) {
Assert.hasText(fieldName, "FieldName must not be null or empty!");
Assert.notNull(operations, "AggregationOperations must not be null!");
List<Facet> facets = new ArrayList<Facet>(this.facets.size() + 1);
facets.addAll(this.facets);
facets.add(new Facet(new ExposedField(fieldName, true), operations));
return new Facets(facets);
}
}
/**
* A single facet with a {@link ExposedField} and its {@link AggregationOperation} pipeline.
*
* @author Mark Paluch
*/
private static class Facet {
private final ExposedField exposedField;
private final List<AggregationOperation> operations;
/**
* Creates a new {@link Facet} from the given {@link ExposedField} and {@link AggregationOperation} pipeline.
*
* @param exposedField must not be {@literal null}.
* @param operations must not be {@literal null}.
*/
Facet(ExposedField exposedField, List<AggregationOperation> operations) {
Assert.notNull(exposedField, "ExposedField must not be null!");
Assert.notNull(operations, "AggregationOperations must not be null!");
this.exposedField = exposedField;
this.operations = operations;
}
ExposedField getExposedField() {
return exposedField;
}
List<DBObject> toDBObjects(AggregationOperationContext context) {
return AggregationOperationRenderer.toDBObject(operations, context);
}
}
}
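A hedged usage sketch; the collection layout and the price/category field names are assumptions for illustration (static imports of the Aggregation factory methods plus Criteria and Sort assumed):

    FacetOperation facets = facet()
            .and(match(Criteria.where("price").gt(0)),
                    group("category").avg("price").as("avgPrice")).as("avgPriceByCategory")
            .and(sort(Sort.Direction.DESC, "price"), limit(5)).as("top5ByPrice");

    Aggregation aggregation = newAggregation(facets);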

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
@@ -30,6 +31,7 @@ import org.springframework.util.StringUtils;
* Value object to capture a list of {@link Field} instances.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.3
*/
public final class Fields implements Iterable<Field> {
@@ -83,6 +85,15 @@ public final class Fields implements Iterable<Field> {
return new AggregationField(name);
}
/**
* Creates a {@link Field} with the given {@code name} and {@code target}.
* <p>
* The {@code target} is the name of the backing document field that will be aliased with {@code name}.
*
* @param name
* @param target must not be {@literal null} or empty
* @return
*/
public static Field field(String name, String target) {
Assert.hasText(target, "Target must not be null or empty!");
return new AggregationField(name, target);
@@ -175,6 +186,14 @@ public final class Fields implements Iterable<Field> {
return fields.iterator();
}
/**
* @return
* @since 1.10
*/
public List<Field> asList() {
return Collections.unmodifiableList(fields);
}
/**
* Value object to encapsulate a field in an aggregation operation.
*
@@ -182,21 +201,32 @@ public final class Fields implements Iterable<Field> {
*/
static class AggregationField implements Field {
private final String raw;
private final String name;
private final String target;
/**
* Creates an aggregation fieldwith the given name. As no target is set explicitly, the name will be used as target
* as well.
* Creates an aggregation field with the given {@code name}.
*
* @param key
* @see AggregationField#AggregationField(String, String).
* @param name must not be {@literal null} or empty
*/
public AggregationField(String key) {
this(key, null);
public AggregationField(String name) {
this(name, null);
}
/**
* Creates an aggregation field with the given {@code name} and {@code target}.
* <p>
* The {@code name} serves as an alias for the actual backing document field denoted by {@code target}. If no target
* is set explicitly, the name will be used as target.
*
* @param name must not be {@literal null} or empty
* @param target
*/
public AggregationField(String name, String target) {
raw = name;
String nameToSet = cleanUp(name);
String targetToSet = cleanUp(target);
@@ -217,6 +247,10 @@ public final class Fields implements Iterable<Field> {
return source;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(source)) {
return source;
}
int dollarIndex = source.lastIndexOf('$');
return dollarIndex == -1 ? source : source.substring(dollarIndex + 1);
}
@@ -234,6 +268,11 @@ public final class Fields implements Iterable<Field> {
* @see org.springframework.data.mongodb.core.aggregation.Field#getAlias()
*/
public String getTarget() {
if (isLocalVar()) {
return this.getRaw();
}
return StringUtils.hasText(this.target) ? this.target : this.name;
}
@@ -246,6 +285,22 @@ public final class Fields implements Iterable<Field> {
return !getName().equals(getTarget());
}
/**
* @return {@literal true} in case the field name starts with {@code $$}.
* @since 1.10
*/
public boolean isLocalVar() {
return raw.startsWith("$$") && !raw.startsWith("$$$");
}
/**
* @return
* @since 1.10
*/
public String getRaw() {
return raw;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()

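To make the name/target distinction concrete, a small sketch with illustrative field names:

    Field price = Fields.field("price", "netPrice");     // exposes backing field "netPrice" under the alias "price"
    String alias = price.getName();                       // "price"
    String target = price.getTarget();                    // "netPrice"

    Fields fields = Fields.from(price).and("category");   // immutable value object holding both fields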
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,17 +16,28 @@
package org.springframework.data.mongodb.core.aggregation;
/**
* {@link AggregationOperation} that exposes new {@link ExposedFields} that can be used for later aggregation pipeline
* {@code AggregationOperation}s.
*
* {@link AggregationOperation} that exposes {@link ExposedFields} that can be used for later aggregation pipeline
* {@code AggregationOperation}s. A {@link FieldsExposingAggregationOperation} implementing the
* {@link InheritsFieldsAggregationOperation} will expose fields from its parent operations. Operations not implementing
* {@link InheritsFieldsAggregationOperation} replace the previously exposed fields.
*
* @author Thomas Darimont
* @author Mark Paluch
*/
public interface FieldsExposingAggregationOperation extends AggregationOperation {
/**
* Returns the fields exposed by the {@link AggregationOperation}.
*
*
* @return will never be {@literal null}.
*/
ExposedFields getFields();
/**
* Marker interface for {@link AggregationOperation} that inherits fields from previous operations.
*/
static interface InheritsFieldsAggregationOperation extends FieldsExposingAggregationOperation {
}
}
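As a sketch of what the marker interface implies for a custom stage (the operation, its use of $addFields and the score field are hypothetical and not part of this change):

    class AddScoreOperation implements FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation {

        @Override
        public DBObject toDBObject(AggregationOperationContext context) {
            // hypothetical stage adding a constant "score" field ($addFields requires MongoDB 3.4)
            return new BasicDBObject("$addFields", new BasicDBObject("score", 1));
        }

        @Override
        public ExposedFields getFields() {
            // "score" is exposed in addition to the inherited fields because this
            // operation implements InheritsFieldsAggregationOperation
            return ExposedFields.synthetic(Fields.fields("score"));
        }
    }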

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,17 +22,33 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Represents a {@code geoNear} aggregation operation.
* <p>
* We recommend using the static factory method {@link Aggregation#geoNear(NearQuery, String)} instead of creating
* instances of this class directly.
*
* @author Thomas Darimont
* @since 1.3
*/
public class GeoNearOperation implements AggregationOperation {
private final NearQuery nearQuery;
private final String distanceField;
public GeoNearOperation(NearQuery nearQuery) {
/**
* Creates a new {@link GeoNearOperation} from the given {@link NearQuery} and the given distance field. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
* @param nearQuery must not be {@literal null}.
* @param distanceField must not be {@literal null}.
*/
public GeoNearOperation(NearQuery nearQuery, String distanceField) {
Assert.notNull(nearQuery, "NearQuery must not be null.");
Assert.hasLength(distanceField, "Distance field must not be null or empty.");
Assert.notNull(nearQuery);
this.nearQuery = nearQuery;
this.distanceField = distanceField;
}
/*
@@ -41,6 +57,10 @@ public class GeoNearOperation implements AggregationOperation {
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$geoNear", context.getMappedObject(nearQuery.toDBObject()));
BasicDBObject command = (BasicDBObject) context.getMappedObject(nearQuery.toDBObject());
command.put("distanceField", distanceField);
return new BasicDBObject("$geoNear", command);
}
}
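A brief usage sketch; the coordinates and the "distance" output field name are illustrative (Point, Distance and Metrics from org.springframework.data.geo):

    NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73))
            .maxDistance(new Distance(2, Metrics.KILOMETERS))
            .spherical(true);

    // adds "distanceField" : "distance" to the rendered $geoNear stage
    Aggregation aggregation = Aggregation.newAggregation(Aggregation.geoNear(nearQuery, "distance"));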

View File

@@ -0,0 +1,411 @@
/*
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $graphLookup}-operation. <br />
* Performs a recursive search on a collection, with options for restricting the search by recursion depth and query
* filter. <br />
* We recommend using the static factory method {@link Aggregation#graphLookup(String)} instead of creating instances
* of this class directly.
*
* @see <a href=
* "https://docs.mongodb.org/manual/reference/aggregation/graphLookup/">https://docs.mongodb.org/manual/reference/aggregation/graphLookup/</a>
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
*/
public class GraphLookupOperation implements InheritsFieldsAggregationOperation {
private static final Set<Class<?>> ALLOWED_START_TYPES = new HashSet<Class<?>>(
Arrays.<Class<?>> asList(AggregationExpression.class, String.class, Field.class, DBObject.class));
private final String from;
private final List<Object> startWith;
private final Field connectFrom;
private final Field connectTo;
private final Field as;
private final Long maxDepth;
private final Field depthField;
private final CriteriaDefinition restrictSearchWithMatch;
private GraphLookupOperation(String from, List<Object> startWith, Field connectFrom, Field connectTo, Field as,
Long maxDepth, Field depthField, CriteriaDefinition restrictSearchWithMatch) {
this.from = from;
this.startWith = startWith;
this.connectFrom = connectFrom;
this.connectTo = connectTo;
this.as = as;
this.maxDepth = maxDepth;
this.depthField = depthField;
this.restrictSearchWithMatch = restrictSearchWithMatch;
}
/**
* Creates a new {@link FromBuilder} to build {@link GraphLookupOperation}.
*
* @return a new {@link FromBuilder}.
*/
public static FromBuilder builder() {
return new GraphLookupOperationFromBuilder();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject graphLookup = new BasicDBObject();
graphLookup.put("from", from);
List<Object> mappedStartWith = new ArrayList<Object>(startWith.size());
for (Object startWithElement : startWith) {
if (startWithElement instanceof AggregationExpression) {
mappedStartWith.add(((AggregationExpression) startWithElement).toDbObject(context));
} else if (startWithElement instanceof Field) {
mappedStartWith.add(context.getReference((Field) startWithElement).toString());
} else {
mappedStartWith.add(startWithElement);
}
}
graphLookup.put("startWith", mappedStartWith.size() == 1 ? mappedStartWith.iterator().next() : mappedStartWith);
graphLookup.put("connectFromField", connectFrom.getName());
graphLookup.put("connectToField", connectTo.getName());
graphLookup.put("as", as.getName());
if (maxDepth != null) {
graphLookup.put("maxDepth", maxDepth);
}
if (depthField != null) {
graphLookup.put("depthField", depthField.getName());
}
if (restrictSearchWithMatch != null) {
graphLookup.put("restrictSearchWithMatch", context.getMappedObject(restrictSearchWithMatch.getCriteriaObject()));
}
return new BasicDBObject("$graphLookup", graphLookup);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return ExposedFields.from(new ExposedField(as, true));
}
/**
* @author Mark Paluch
*/
public interface FromBuilder {
/**
* Set the {@literal collectionName} to apply the {@code $graphLookup} to.
*
* @param collectionName must not be {@literal null} or empty.
* @return
*/
StartWithBuilder from(String collectionName);
}
/**
* @author Mark Paluch
* @author Christoph Strobl
*/
public interface StartWithBuilder {
/**
* Set the startWith {@literal fieldReferences} to apply the {@code $graphLookup} to.
*
* @param fieldReferences must not be {@literal null}.
* @return
*/
ConnectFromBuilder startWith(String... fieldReferences);
/**
* Set the startWith {@literal expressions} to apply the {@code $graphLookup} to.
*
* @param expressions must not be {@literal null}.
* @return
*/
ConnectFromBuilder startWith(AggregationExpression... expressions);
/**
* Set the startWith as either {@literal fieldReferences}, {@link Fields}, {@link DBObject} or
* {@link AggregationExpression} to apply the {@code $graphLookup} to.
*
* @param expressions must not be {@literal null}.
* @return
* @throws IllegalArgumentException
*/
ConnectFromBuilder startWith(Object... expressions);
}
/**
* @author Mark Paluch
*/
public interface ConnectFromBuilder {
/**
* Set the connectFrom {@literal fieldName} to apply the {@code $graphLookup} to.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
ConnectToBuilder connectFrom(String fieldName);
}
/**
* @author Mark Paluch
*/
public interface ConnectToBuilder {
/**
* Set the connectTo {@literal fieldName} to apply the {@code $graphLookup} to.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
GraphLookupOperationBuilder connectTo(String fieldName);
}
/**
* Builder that configures the initial, mandatory set of {@link GraphLookupOperation} properties and creates the
* {@link GraphLookupOperationBuilder}.
*
* @author Mark Paluch
*/
static final class GraphLookupOperationFromBuilder
implements FromBuilder, StartWithBuilder, ConnectFromBuilder, ConnectToBuilder {
private String from;
private List<? extends Object> startWith;
private String connectFrom;
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.FromBuilder#from(java.lang.String)
*/
@Override
public StartWithBuilder from(String collectionName) {
Assert.hasText(collectionName, "CollectionName must not be null or empty!");
this.from = collectionName;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.StartWithBuilder#startWith(java.lang.String[])
*/
@Override
public ConnectFromBuilder startWith(String... fieldReferences) {
Assert.notNull(fieldReferences, "FieldReferences must not be null!");
Assert.noNullElements(fieldReferences, "FieldReferences must not contain null elements!");
List<Object> fields = new ArrayList<Object>(fieldReferences.length);
for (String fieldReference : fieldReferences) {
fields.add(Fields.field(fieldReference));
}
this.startWith = fields;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.StartWithBuilder#startWith(org.springframework.data.mongodb.core.aggregation.AggregationExpression[])
*/
@Override
public ConnectFromBuilder startWith(AggregationExpression... expressions) {
Assert.notNull(expressions, "AggregationExpressions must not be null!");
Assert.noNullElements(expressions, "AggregationExpressions must not contain null elements!");
this.startWith = Arrays.asList(expressions);
return this;
}
@Override
public ConnectFromBuilder startWith(Object... expressions) {
Assert.notNull(expressions, "Expressions must not be null!");
Assert.noNullElements(expressions, "Expressions must not contain null elements!");
this.startWith = verifyAndPotentiallyTransformStartsWithTypes(expressions);
return this;
}
private List<Object> verifyAndPotentiallyTransformStartsWithTypes(Object... expressions) {
List<Object> expressionsToUse = new ArrayList<Object>(expressions.length);
for (Object expression : expressions) {
assertStartWithType(expression);
if (expression instanceof String) {
expressionsToUse.add(Fields.field((String) expression));
} else {
expressionsToUse.add(expression);
}
}
return expressionsToUse;
}
private void assertStartWithType(Object expression) {
for (Class<?> type : ALLOWED_START_TYPES) {
if (ClassUtils.isAssignable(type, expression.getClass())) {
return;
}
}
throw new IllegalArgumentException(
String.format("Expression must be any of %s but was %s", ALLOWED_START_TYPES, expression.getClass()));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.ConnectFromBuilder#connectFrom(java.lang.String)
*/
@Override
public ConnectToBuilder connectFrom(String fieldName) {
Assert.hasText(fieldName, "ConnectFrom must not be null or empty!");
this.connectFrom = fieldName;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.ConnectToBuilder#connectTo(java.lang.String)
*/
@Override
public GraphLookupOperationBuilder connectTo(String fieldName) {
Assert.hasText(fieldName, "ConnectTo must not be null or empty!");
return new GraphLookupOperationBuilder(from, startWith, connectFrom, fieldName);
}
}
/**
* @author Mark Paluch
*/
public static final class GraphLookupOperationBuilder {
private final String from;
private final List<Object> startWith;
private final Field connectFrom;
private final Field connectTo;
private Long maxDepth;
private Field depthField;
private CriteriaDefinition restrictSearchWithMatch;
protected GraphLookupOperationBuilder(String from, List<? extends Object> startWith, String connectFrom,
String connectTo) {
this.from = from;
this.startWith = new ArrayList<Object>(startWith);
this.connectFrom = Fields.field(connectFrom);
this.connectTo = Fields.field(connectTo);
}
/**
* Optionally limit the number of recursions.
*
* @param numberOfRecursions must be greater than or equal to zero.
* @return
*/
public GraphLookupOperationBuilder maxDepth(long numberOfRecursions) {
Assert.isTrue(numberOfRecursions >= 0, "Max depth must be >= 0!");
this.maxDepth = numberOfRecursions;
return this;
}
/**
* Optionally add a depth field {@literal fieldName} to each traversed document in the search path.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
public GraphLookupOperationBuilder depthField(String fieldName) {
Assert.hasText(fieldName, "Depth field name must not be null or empty!");
this.depthField = Fields.field(fieldName);
return this;
}
/**
* Optionally add a query specifying conditions to the recursive search.
*
* @param criteriaDefinition must not be {@literal null}.
* @return
*/
public GraphLookupOperationBuilder restrict(CriteriaDefinition criteriaDefinition) {
Assert.notNull(criteriaDefinition, "CriteriaDefinition must not be null!");
this.restrictSearchWithMatch = criteriaDefinition;
return this;
}
/**
* Set the name of the array field added to each output document and return the final {@link GraphLookupOperation}.
* Contains the documents traversed in the {@literal $graphLookup} stage to reach the document.
*
* @param fieldName must not be {@literal null} or empty.
* @return the final {@link GraphLookupOperation}.
*/
public GraphLookupOperation as(String fieldName) {
Assert.hasText(fieldName, "As field name must not be null or empty!");
return new GraphLookupOperation(from, startWith, connectFrom, connectTo, Fields.field(fieldName), maxDepth,
depthField, restrictSearchWithMatch);
}
}
}
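A usage sketch along the lines of the MongoDB reporting-hierarchy example; the employees collection and its fields are assumptions:

    GraphLookupOperation graphLookup = GraphLookupOperation.builder()
            .from("employees")
            .startWith("reportsTo")
            .connectFrom("reportsTo")
            .connectTo("name")
            .maxDepth(5)
            .depthField("depth")
            .as("reportingHierarchy");

    Aggregation aggregation = Aggregation.newAggregation(
            Aggregation.match(Criteria.where("name").is("Dev")),
            graphLookup);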

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,12 +31,17 @@ import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $group}-operation.
* <p>
* We recommend using the static factory method {@link Aggregation#group(Fields)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/group/#stage._S_group
* @author Sebastian Herold
* @author Thomas Darimont
* @author Oliver Gierke
* @author Gustavo de Geus
* @author Christoph Strobl
* @since 1.3
* @see <a href="https://docs.mongodb.org/manual/reference/aggregation/group/">MongoDB Aggregation Framework: $group</a>
*/
public class GroupOperation implements FieldsExposingAggregationOperation {
@@ -190,6 +195,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.LAST, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $last}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder last(AggregationExpression expr) {
return newBuilder(GroupOps.LAST, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $first}-expression for the given field-reference.
*
@@ -200,6 +215,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.FIRST, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $first}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder first(AggregationExpression expr) {
return newBuilder(GroupOps.FIRST, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $avg}-expression for the given field-reference.
*
@@ -210,6 +235,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.AVG, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $avg}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder avg(AggregationExpression expr) {
return newBuilder(GroupOps.AVG, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $push}-expression for the given field-reference.
*
@@ -244,6 +279,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.MIN, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $min}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder min(AggregationExpression expr) {
return newBuilder(GroupOps.MIN, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $max}-expression for the given field-reference.
*
@@ -254,6 +299,61 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.MAX, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $max}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder max(AggregationExpression expr) {
return newBuilder(GroupOps.MAX, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $stdDevSamp}-expression for the given
* field-reference.
*
* @param reference must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public GroupOperationBuilder stdDevSamp(String reference) {
return newBuilder(GroupOps.STD_DEV_SAMP, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $stdDevSamp}-expression for the given {@link AggregationExpression}.
*
* @param expr must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public GroupOperationBuilder stdDevSamp(AggregationExpression expr) {
return newBuilder(GroupOps.STD_DEV_SAMP, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $stdDevPop}-expression for the given field-reference.
*
* @param reference must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public GroupOperationBuilder stdDevPop(String reference) {
return newBuilder(GroupOps.STD_DEV_POP, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $stdDevPop}-expression for the given {@link AggregationExpression}.
*
* @param expr must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public GroupOperationBuilder stdDevPop(AggregationExpression expr) {
return newBuilder(GroupOps.STD_DEV_POP, null, expr);
}
private GroupOperationBuilder newBuilder(Keyword keyword, String reference, Object value) {
return new GroupOperationBuilder(this, new Operation(keyword, null, reference, value));
}
@@ -318,21 +418,18 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
private static enum GroupOps implements Keyword {
SUM, LAST, FIRST, PUSH, AVG, MIN, MAX, ADD_TO_SET, COUNT;
SUM("$sum"), LAST("$last"), FIRST("$first"), PUSH("$push"), AVG("$avg"), MIN("$min"), MAX("$max"), ADD_TO_SET("$addToSet"), STD_DEV_POP("$stdDevPop"), STD_DEV_SAMP("$stdDevSamp");
private String mongoOperator;
GroupOps(String mongoOperator) {
this.mongoOperator = mongoOperator;
}
@Override
public String toString() {
String[] parts = name().split("_");
StringBuilder builder = new StringBuilder();
for (String part : parts) {
String lowerCase = part.toLowerCase(Locale.US);
builder.append(builder.length() == 0 ? lowerCase : StringUtils.capitalize(lowerCase));
}
return "$" + builder.toString();
return mongoOperator;
}
}
@@ -364,7 +461,21 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
}
public Object getValue(AggregationOperationContext context) {
return reference == null ? value : context.getReference(reference).toString();
if (reference == null) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
return value;
}
if (Aggregation.SystemVariable.isReferingToSystemVariable(reference)) {
return reference;
}
return context.getReference(reference).toString();
}
@Override

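A short sketch of the new standard deviation accumulators; the category and price fields are assumptions:

    GroupOperation group = Aggregation.group("category")
            .avg("price").as("avgPrice")
            .stdDevSamp("price").as("priceStdDevSamp")
            .stdDevPop("price").as("priceStdDevPop");

    // renders roughly: { "$group" : { "_id" : "$category", "avgPrice" : { "$avg" : "$price" },
    //                    "priceStdDevSamp" : { "$stdDevSamp" : "$price" },
    //                    "priceStdDevPop" : { "$stdDevPop" : "$price" } } }
    Aggregation aggregation = Aggregation.newAggregation(group);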
View File

@@ -0,0 +1,64 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
/**
* {@link ExposedFieldsAggregationOperationContext} that inherits fields from its parent
* {@link AggregationOperationContext}.
*
* @author Mark Paluch
* @since 1.9
*/
class InheritingExposedFieldsAggregationOperationContext extends ExposedFieldsAggregationOperationContext {
private final AggregationOperationContext previousContext;
/**
* Creates a new {@link InheritingExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
* {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
*
* @param exposedFields must not be {@literal null}.
* @param previousContext must not be {@literal null}.
*/
public InheritingExposedFieldsAggregationOperationContext(ExposedFields exposedFields,
AggregationOperationContext previousContext) {
super(exposedFields, previousContext);
this.previousContext = previousContext;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFieldsAggregationOperationContext#resolveExposedField(org.springframework.data.mongodb.core.aggregation.Field, java.lang.String)
*/
@Override
protected FieldReference resolveExposedField(Field field, String name) {
FieldReference fieldReference = super.resolveExposedField(field, name);
if (fieldReference != null) {
return fieldReference;
}
if (field != null) {
return previousContext.getReference(field);
}
return previousContext.getReference(name);
}
}
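A rough illustration of the inheritance behaviour, with made-up stage and field names: because GraphLookupOperation implements InheritsFieldsAggregationOperation, a stage following it can still reference the fields exposed before it as well as the lookup result.

    Aggregation aggregation = Aggregation.newAggregation(
            Aggregation.project("name", "reportsTo"),
            GraphLookupOperation.builder()
                    .from("employees")
                    .startWith("reportsTo")
                    .connectFrom("reportsTo")
                    .connectTo("name")
                    .as("hierarchy"),
            // "name" resolves through the inherited (projected) fields,
            // "hierarchy" through the fields exposed by $graphLookup itself
            Aggregation.sort(Sort.Direction.ASC, "name"));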

Some files were not shown because too many files have changed in this diff.