Compare commits

312 Commits

Author SHA1 Message Date
Oliver Gierke
d5fec0989c DATAMONGO-1574 - Release version 1.10 GA (Ingalls). 2017-01-26 10:27:10 +01:00
Oliver Gierke
52b52dba93 DATAMONGO-1574 - Prepare 1.10 GA (Ingalls). 2017-01-26 10:26:35 +01:00
Oliver Gierke
d046f30862 DATAMONGO-1574 - Updated changelog. 2017-01-26 10:26:33 +01:00
Christoph Strobl
7413a031c1 DATAMONGO-1517 - Polishing.
Remove ReflectiveSimpleTypes in favor of MongoSimpleTypes.
Add integration test.
2017-01-25 16:57:20 +01:00
Mark Paluch
8e3d7f96c4 DATAMONGO-1517 - Add support for Decimal128 BSON type.
Support Decimal128 as Mongo simple type if present. Decimal128 is stored as NumberDecimal.

class Person {

  String id;
  Decimal128 decimal128;

  Person(String id, Decimal128 decimal128) {
    this.id = id;
    this.decimal128 = decimal128;
  }
}

mongoTemplate.save(new Person("foo", new Decimal128(new BigDecimal("123.456"))));

is represented as:

{ "_id" : "foo", "decimal128" : NumberDecimal("123.456") }
2017-01-25 16:57:20 +01:00
Mark Paluch
d2d162dee6 DATAMONGO-1596 - Fix typo in JavaDoc.
Use correct @RelatedDocument annotation in MongoDB cross store reference documentation.
2017-01-25 16:53:26 +01:00
Mark Paluch
b9aa410ac1 DATAMONGO-1575 - Polishing.
Extend year range in license headers. Use MongoDB JSON serializer for String escaping. Move unquoting/quote checking to inner QuotedString utility class. Reformat code.
2017-01-25 11:48:14 +01:00
Christoph Strobl
6f278ce838 DATAMONGO-1575 - Escape Strings correctly.
Use regex groups and parameter index values for replacement in string based queries.
2017-01-25 11:48:14 +01:00
Christoph Strobl
3ed72a922a DATAMONGO-1594 - Update "what’s new" section in reference documentation. 2017-01-23 08:16:02 +01:00
Oliver Gierke
98b3c1678f DATAMONGO-1592 - Adapt AuditingEventListenerUnitTests to changes in core auditing.
The core auditing implementation now skips the invocation of auditing in case the candidate aggregate doesn't need any auditing in the first place. We needed to adapt the sample class we use to actually carry the necessary auditing annotations.

Related ticket: DATACMNS-957.
2017-01-20 16:33:59 +01:00
Oliver Gierke
45818f26b3 DATAMONGO-1590 - Polishing.
Removed some compiler warnings. Hide newly introduced class in package scope and made use of Lombok annotations to avoid boilerplate code.

Original pull request: #436.
2017-01-18 19:41:11 +01:00
Christoph Strobl
822e29524d DATAMONGO-1590 - EntityInformation selection now correctly considers Persistable.
We now wrap the MappingMongoEntityInformation into one that delegates the methods implemented by Persistable to the actual entity in case it implements said interface.
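
A minimal sketch of an entity taking control of its persistence state via Persistable (the User type and the isNew flag are hypothetical); the wrapping EntityInformation now delegates getId() and isNew() to such an entity:

class User implements Persistable<String> {

  String id;

  @Transient
  boolean isNew = true; // not persisted, only drives the insert-vs-update decision

  @Override
  public String getId() {
    return id;
  }

  @Override
  public boolean isNew() {
    return isNew;
  }
}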

Original pull request: #436.
2017-01-18 19:39:53 +01:00
Mark Paluch
3476c639c2 DATAMONGO-1588 - Polishing.
Remove unused fields. Fix typo in method name. Reformat inner class to align formatting.

Original pull request: #435.
2017-01-16 09:12:33 +01:00
Christoph Strobl
35fa922daa DATAMONGO-1588 - Fix derived finder not accepting subclass of parameter type.
We now allow using sub types as arguments for derived queries. This makes it possible to use e.g. a GeoJsonPoint for querying while the declared property type in the domain object remains a regular (legacy) Point.
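
A minimal sketch (the Venue type and its location property are hypothetical): the repository method declares the legacy Point type but can be invoked with the GeoJsonPoint subtype:

interface VenueRepository extends Repository<Venue, String> {
  List<Venue> findByLocation(Point location);
}

venueRepository.findByLocation(new GeoJsonPoint(-73.99171, 40.738868));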

Original pull request: #435.
2017-01-16 09:12:33 +01:00
Mark Paluch
8f7ea3a2ed DATAMONGO-1589 - Update project documentation with the CLA tool integration. 2017-01-13 11:46:51 +01:00
Mark Paluch
7ab7212edc DATAMONGO-1587 - Polishing.
Convert @see http://… links to valid format using @see <a href=…>…</a>
2017-01-12 17:08:07 +01:00
Mark Paluch
27651ef0be DATAMONGO-1587 - Migrate ticket references in test code to Spring Framework style. 2017-01-12 16:17:39 +01:00
Mark Paluch
174f7f0886 DATAMONGO-1586 - Consider field name in TypeBasedAggregationOperationContext.getReferenceFor(…).
We now consider the provided field name (alias) with which a mapped field is exposed. The alias applies to the exposed field after property path resolution in TypeBasedAggregationOperationContext. Previously, the field reference used the property name, which caused fields to be considered non-aliased, so aggregation projection operations dropped the alias and exposed the field under its leaf property name.

Original Pull Request: #434
2017-01-12 09:51:13 +01:00
Christoph Strobl
437bf89533 DATAMONGO-1585 - Polishing.
Update documentation for better readability in html and pdf format.

Original Pull Request: #433
2017-01-12 09:42:20 +01:00
Mark Paluch
c3e5fca73d DATAMONGO-1585 - Expose synthetic fields in $project aggregation stage.
Field projections now expose their fields as synthetic simple fields. The projection aggregation stage entirely redefines the field set available to later aggregation stages, so projected fields are considered synthetic. A simple synthetic field has no target field, which causes later aggregation stages to pick up the exposed field name instead of the underlying target when rendering aggregation operations to Mongo documents.

The change is motivated by a bug where an aggregation consisting of a projection of an aliased field and a sort caused the sort stage to render with the original field name instead of the alias. The sort did not apply any sorting since the projection entirely redefines the available field set and the original field is no longer accessible.
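
A sketch of the now-working scenario (field names are hypothetical): the sort stage refers to the alias introduced by the preceding projection and renders against the exposed field:

newAggregation(
  project("tags").and("title").as("name"),
  sort(Direction.ASC, "name"));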

Original Pull Request: #433
2017-01-12 09:41:26 +01:00
Christoph Strobl
8340c02d9a DATAMONGO-1576 - Update lifecycle event documentation.
Add note on lifecycle event handling for property types.
2017-01-11 13:10:48 +01:00
Mark Paluch
3c6db34870 DATAMONGO-1578 - Polishing.
Add ticket references to test methods. Extend license years in copyright header.

Original pull request: #398.
2017-01-02 11:39:24 +01:00
Martin Macko
9f3319928b DATAMONGO-1578 - Add missing @Test annotation to ProjectionOperationUnitTests.
Original pull request: #398.
2017-01-02 11:38:49 +01:00
Mark Paluch
97a03d824a DATAMONGO-1508 - Improve reference documentation.
Replace Spring Data Document with Spring Data MongoDB. Extend copyright year range. Replace static Spring version leftover with variable. Fix typos.
2017-01-02 11:18:53 +01:00
Lukasz Kryger
8149273df9 DATAMONGO-1577 - Fix wording repetition in MongoRepository JavaDoc.
Original pull request: #407.
2017-01-02 11:18:53 +01:00
Ken Dombeck
e838662536 DATAMONGO-1577 - Fix Reference and JavaDoc spelling issues.
Replaced invalid class name MongoMappingConverter with actual class name of MappingMongoConverter. Fix typos.

Original pull request: #425.
2017-01-02 11:18:49 +01:00
Mark Paluch
e784e58a0a DATAMONGO-1508 - Polishing.
Highlight attribute name. Replace tabs with spaces.

Original pull request: #399.
2017-01-02 10:43:49 +01:00
John Lilley
fda72d6eb2 DATAMONGO-1508 - Document authentication-dbname attribute in db-factory.
Original pull request: #399.
2017-01-02 10:43:39 +01:00
Oliver Gierke
a96752da80 DATAMONGO-1522 - Updated changelog. 2016-12-21 19:35:31 +01:00
Oliver Gierke
320a28740a DATAMONGO-1469 - After release cleanups. 2016-12-21 16:33:13 +01:00
Oliver Gierke
e55e748cfd DATAMONGO-1469 - Prepare next development iteration. 2016-12-21 16:33:10 +01:00
Oliver Gierke
737f7b4f30 DATAMONGO-1469 - Release version 1.10 RC1 (Ingalls). 2016-12-21 16:16:51 +01:00
Oliver Gierke
408c5d8684 DATAMONGO-1469 - Prepare 1.10 RC1 (Ingalls). 2016-12-21 16:15:47 +01:00
Oliver Gierke
982adf317e DATAMONGO-1469 - Updated changelog. 2016-12-21 16:15:39 +01:00
Oliver Gierke
7914e8a630 DATAMONGO-1467 - Polishing.
Original pull request: #431.
2016-12-19 19:44:47 +01:00
Christoph Strobl
dc4a30a7f8 DATAMONGO-1467 - Add support for MongoDB 3.2 partialFilterExpression for index creation.
We now support partial filter expressions on indexes via Index.partial(…). This allows creating partial indexes that only index the documents in a collection that meet a specified filter expression.

new Index().named("idx").on("k3y", ASC).partial(filter(where("age").gte(10)))

The filter expression can be set via a plain DBObject or a CriteriaDefinition and is mapped against the associated domain type.

Original pull request: #431.
2016-12-19 19:44:47 +01:00
Oliver Gierke
29e405b800 DATAMONGO-1565 - Polishing.
Formatting in ExpressionEvaluatingParameterBinder and StringBasedMongoQueryUnitTests. Turned Placeholder into value object.
2016-12-19 14:37:26 +01:00
Mark Paluch
1a105333aa DATAMONGO-1565 - Polishing.
Consider quoted/unquoted parameter use with the same parameter reference. Extend date range in license headers.
2016-12-19 14:37:26 +01:00
Christoph Strobl
14e326dc09 DATAMONGO-1565 - Ignore placeholder pattern in replacement values for annotated queries.
We now make sure to escape single and double quotes in the replacement values before actually appending them to the query. We also replace single ticks around parameters in the actual raw annotated query with double quotes to make sure they are treated as a single string parameter.
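
For illustration, a string-based query binding a parameter inside single ticks (repository method and field name are hypothetical); the binder now treats the bound value as a single string and escapes quotes contained in it:

@Query("{ 'lastname' : '?0' }")
List<Person> findByLastname(String lastname);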
2016-12-19 14:36:44 +01:00
Mark Paluch
f026ab419d DATAMONGO-1564 - Polishing.
Fix JavaDoc references and a minor import formatting issue.

Original pull request: #429.
2016-12-16 14:11:18 +01:00
Christoph Strobl
75139042e0 DATAMONGO-1564 - Split up AggregationExpressions.
Refactored to multiple smaller Aggregation Operator classes reflecting the grouping (array operators, string operators,…) predefined by MongoDB.
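
A usage sketch under the new structure (field names are hypothetical and the example assumes the ArithmeticOperators factory that is part of this split):

newAggregation(
  project("title")
    .and(ArithmeticOperators.valueOf("netPrice").multiplyBy(1.19)).as("grossPrice"));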

Original pull request: #429.
2016-12-16 14:11:18 +01:00
Mark Paluch
6236384c1d DATAMONGO-1567 - Use newer Java 8 on Travis CI. 2016-12-16 10:47:02 +01:00
Mark Paluch
6993054d6a DATAMONGO-1533 - Polishing.
Enhance JavaDoc. Minor reformatting.

Original pull request: #428.
2016-12-16 09:25:00 +01:00
Christoph Strobl
35bfb92ace DATAMONGO-1533 - Add AggregationExpression derived from SpEL AST.
We added an AggregationExpression that renders a MongoDB Aggregation Framework expression from the AST of a SpEL expression. This allows usage with various stages (e.g. $project, $group) throughout the aggregation support.

  // { $and: [ { $gt: [ "$qty", 100 ] }, { $lt: [ "$qty", 250 ] } ] }
  expressionOf("qty > 100 && qty < 250);

  // { $cond : { if : { $gte : [ "$a", 42 ]}, then : "answer", else : "no-answer" } }
  expressionOf("cond(a >= 42, 'answer', 'no-answer')");

Original pull request: #428.
2016-12-16 09:02:15 +01:00
Oliver Gierke
89a02bb822 DATAMONGO-1566 - Adapt API in MongoRepositoryFactoryBean.
Related tickets: DATACMNS-891.
2016-12-15 14:57:26 +01:00
Christoph Strobl
c9c5fe62ca DATAMONGO-1552 - Polishing.
Updated doc, removed whitespaces, minor method wording changes.

Original Pull Request: #426
2016-12-14 12:05:26 +01:00
Mark Paluch
d250f88c38 DATAMONGO-1552 - Update Documentation.
Original Pull Request: #426
2016-12-14 11:09:46 +01:00
Mark Paluch
bcb63b2732 DATAMONGO-1552 - Add $facet aggregation stage.
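
A usage sketch (field and facet names are hypothetical; assumes the facet()/and(…)/as(…) builder introduced here):

newAggregation(
  facet()
    .and(match(where("price").exists(true)), bucketAuto("price", 5)).as("priceFacet")
    .and(match(where("year").gte(2000))).as("recentFacet"));
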
Original Pull Request: #426
2016-12-14 11:09:01 +01:00
Mark Paluch
2c4377c9a6 DATAMONGO-1552 - Add $bucketAuto aggregation stage.
Original Pull Request: #426
2016-12-14 11:08:26 +01:00
Mark Paluch
e992d813fb DATAMONGO-1552 - Add $bucket aggregation stage.
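
A usage sketch (field names and boundaries are hypothetical; assumes the BucketOperation builder methods introduced here):

newAggregation(
  bucket("price")
    .withBoundaries(0, 100, 200)
    .withDefaultBucket("other")
    .andOutputCount().as("count"));
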
Original Pull Request: #426
2016-12-14 11:07:40 +01:00
Mark Paluch
aa1e91c761 DATAMONGO-442 - Polishing.
Reformat code according to Spring Data style. Add test for authenticated use. Add JavaDoc to newly introduced methods. Allow configuration of an authentication database. Update reference documentation.

Original pull request: #419.
2016-12-13 16:50:37 +01:00
Ricardo Espirito Santo
9737464f9a DATAMONGO-442 - Support username/password authentication with MongoLog4jAppender.
Added optional username and password for authentication support on Log4jAppender.

Original pull request: #419.
2016-12-13 15:56:56 +01:00
Christoph Strobl
204a0515c4 DATAMONGO-1551 - Polishing.
Add startWith overload allowing to mix expressions, removed white spaces, updated doc.

Original Pull Request: #424
2016-12-13 15:15:55 +01:00
Mark Paluch
c6dae3c444 DATAMONGO-1551 - Add $graphLookup aggregation stage.
We now support the $graphLookup aggregation pipeline stage via Aggregation to perform a recursive lookup, adding the lookup results as an array to the documents entering the $graphLookup stage.

TypedAggregation<Employee> agg = Aggregation.newAggregation(Employee.class,
		graphLookup("employee")
			.startWith("reportsTo")
			.connectFrom("reportsTo")
			.connectTo("name")
			.depthField("depth")
			.maxDepth(5)
			.as("reportingHierarchy"));

Original Pull Request: #424
2016-12-13 14:23:39 +01:00
Christoph Strobl
c1c7daf0ed DATAMONGO-1550 - Polishing $replaceRoot (aggregation stage).
Original Pull Request: #422.
2016-12-13 08:22:20 +01:00
Mark Paluch
14678ce7a9 DATAMONGO-1550 - Add $replaceRoot aggregation stage.
We now support the $replaceRoot stage in aggregation pipelines. $replaceRoot can reference either a field, an aggregation expression or it can be used to compose a replacement document.

newAggregation(
	replaceRoot().withDocument()
		.andValue("value").as("field")
		.and(MULTIPLY.of(field("total"), field("discounted")))
);

newAggregation(
	replaceRoot("item")));

Original Pull Request: #422
2016-12-13 08:22:20 +01:00
Christoph Strobl
4e56d9c575 DATAMONGO-1549 - Polishing $count (aggregation stage).
Original Pull Request: #422
2016-12-13 08:22:10 +01:00
Mark Paluch
cab35759db DATAMONGO-1549 - Add $count aggregation stage.
We now support the $count stage in aggregation pipelines.

newAggregation(
	match(where("hotelCode").is("0360")),
	count().as("documents"));

Original Pull Request: #422
2016-12-13 07:58:58 +01:00
Christoph Strobl
7b49b120e3 DATAMONGO-1558 - Upgrade MongoDB server version to, and add build profile for, MongoDB 3.4.
Added MongoDB 3.4 profile to pom.xml and upgraded to MongoDB 3.4 on travis-ci.
2016-12-13 07:58:37 +01:00
Mark Paluch
dc57b66adf DATAMONGO-1548 - Polishing.
Enhance JavaDoc. Minor formatting. Fix typos.

Original pull request: #423.
2016-12-12 12:05:32 +01:00
Christoph Strobl
0449719a16 DATAMONGO-1548 - Add support for MongoDB 3.4 aggregation operators.
We now support the following MongoDB 3.4 aggregation operators:

$indexOfBytes, $indexOfCP, $split, $strLenBytes, $strLenCP, $substrCP, $indexOfArray, $range, $reverseArray, $reduce, $zip, $in, $isoDayOfWeek, $isoWeek, $isoWeekYear, $switch and $type.

Original pull request: #423.
2016-12-12 12:05:09 +01:00
Mark Paluch
3d8b6868c7 DATAMONGO-1538 - Polishing.
Use InheritingExposedFieldsAggregationOperationContext instead of anonymous context class for condition mapping. Drop aggregation input collections before tests. Minor reformatting.

Original pull request: #417.
2016-12-07 09:35:08 +01:00
Christoph Strobl
68db0d4cb0 DATAMONGO-1538 - Add support for $let to aggregation.
We now support $let in the aggregation $project stage.

ExpressionVariable total = newExpressionVariable("total").forExpression(ADD.of(field("price"), field("tax")));
ExpressionVariable discounted = newExpressionVariable("discounted").forExpression(Cond.when("applyDiscount").then(0.9D).otherwise(1.0D));

newAggregation(Sales.class,
	project()
		.and(define(total, discounted)
			.andApply(MULTIPLY.of(field("total"), field("discounted"))))
		.as("finalTotal"));

Original pull request: #417.
2016-12-07 09:33:58 +01:00
Christoph Strobl
c9dfeea0c7 DATAMONGO-1542 - Polishing.
Added some static entry points for better readability.

Original Pull Request: #421
2016-12-06 10:13:32 +01:00
Mark Paluch
1a11877ae9 DATAMONGO-1542 - Refactor CondOperator and IfNullOperator to children of AggregationExpressions.
Renamed CondOperator to Cond and IfNullOperator to IfNull. Both conditional operations are now available from ConditionalOperators.when and ConditionalOperators.ifNull and accept AggregationExpressions for conditions and values.
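
Usage sketches with hypothetical field names and values, assuming the new entry points:

// { $cond : { if : { $gte : [ "$qty", 250 ] }, then : 30, else : 20 } }
ConditionalOperators.when(Criteria.where("qty").gte(250)).then(30).otherwise(20);

// { $ifNull : [ "$description", "Unspecified" ] }
ConditionalOperators.ifNull("description").then("Unspecified");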

Original Pull Request: #421
2016-12-06 09:19:28 +01:00
Christoph Strobl
ea4782c421 DATAMONGO-1520 - Add overload for aggregation $match accepting CriteriaDefinition.
We now also accept CriteriaDefinition in addition to Criteria for Aggregation.match. The existing match(Criteria) method remains to preserve binary compatibility.
2016-12-06 09:05:55 +01:00
Mark Paluch
a0ac3510a0 DATAMONGO-1540 - Polishing.
Reduce Map aggregation expression builder entrypoint. Fix JavaDoc.

Original pull request: #420.
2016-12-05 16:47:20 +01:00
Christoph Strobl
192399413d DATAMONGO-1540 - Add support for $map (aggregation).
We now support $map operator in aggregation.

Original pull request: #420.
2016-12-05 16:47:17 +01:00
Oliver Gierke
a0be890437 DATAMONGO-1547 - Register MongoRepositoryFactory in spring.factories.
This is required for the switch in multi-store detection support.

Related ticket: DATACMNS-952.
2016-12-05 14:27:24 +01:00
Oliver Gierke
438dbc4b33 DATAMONGO-1546 - Register GeoJsonConfiguration via spring.factories.
Related tickets: DATACMNS-952.
2016-12-05 14:22:09 +01:00
Oliver Gierke
407affb458 DATAMONGO-1141 - Polishing.
Aligned assertion messages for consistency. Fixed imports in UpdateMapperUnitTests.

Original pull request: #405.
2016-12-02 18:13:22 +01:00
Mark Paluch
c6a4e7166c DATAMONGO-1141 - Polishing.
Add property to field name mapping for Sort orders by moving Sort mapping to UpdateMapper. Fix typo. Add JavaDoc. Reformat code. Remove trailing whitespaces.

Original pull request: #405.
2016-12-02 18:13:18 +01:00
Pavel Vodrážka
7f39c42eb7 DATAMONGO-1141 - Add support for $push $sort in Update.
Sorting update modifier added. Supports sorting arrays by document fields and element values.
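
A usage sketch with hypothetical field names and values, assuming the push builder gained sort(…) variants for element values and document fields:

// push new scores and keep the array sorted by element value, descending
new Update().push("scores").sort(Direction.DESC).each(89, 92, 75);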

Original pull request: #405.
2016-12-02 18:13:02 +01:00
Oliver Gierke
40da4701de DATAMONGO-1454 - Polishing.
Formatting in test case.

Original pull request: #381.
2016-12-02 16:33:13 +01:00
Mark Paluch
3ae6aebebb DATAMONGO-1454 - Add support for exists projection in repository query methods.
We now support exists projections for derived and string-based repository query methods.

public interface PersonRepository extends Repository<Person, String> {

  boolean existsByFirstname(String firstname);

  @ExistsQuery(value = "{ 'lastname' : ?0 }")
  boolean someExistQuery(String lastname);

  @Query(value = "{ 'lastname' : ?0 }", exists = true)
  boolean anotherExistQuery(String lastname);
}

Original pull request: #381.
2016-12-02 16:32:58 +01:00
Mark Paluch
bbfa0f7b83 DATAMONGO-1536 - Polishing.
Add JavaDoc. Change visibility of AbstractAggregationExpression.getMongoMethod() to protected.

Original pull request: #418.
2016-12-02 15:32:16 +01:00
Christoph Strobl
63d6234446 DATAMONGO-1536 - Add aggregation operators for array, arithmetic, date and set operations.
We now support the following aggregation framework operators:

- setEquals, setIntersection, setUnion, setDifference, setIsSubset, anyElementTrue, allElementsTrue
- stdDevPop, stdDevSamp
- abs, ceil, exp, floor, ln, log, log10, pow, sqrt, trunc
- arrayElementAt, concatArrays, isArray
- literal
- dayOfYear, dayOfMonth, dayOfWeek, year, month, week, hour, minute, second, millisecond, dateToString

Original pull request: #418.
2016-12-02 15:30:01 +01:00
Oliver Gierke
9ff86feb4f DATAMONGO-1539 - Polishing.
Renamed @Count and @Delete to @CountQuery and @DeleteQuery. Minor polishing in test cases and test repository methods. JavaDoc, formatting.

Original pull request: #416.
2016-12-02 11:55:14 +01:00
Fırat KÜÇÜK
8c838e8350 DATAMONGO-1539 - Introduce @CountQuery and @DeleteQuery.
Introduce dedicated annotations for manually defined count and delete queries to avoid misconfiguration and to generally simplify the declaration.
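
A sketch of the intended declaration style (repository, domain type and field name are hypothetical):

interface PersonRepository extends Repository<Person, String> {

  @CountQuery("{ 'lastname' : ?0 }")
  long countByLastname(String lastname);

  @DeleteQuery("{ 'lastname' : ?0 }")
  long removeByLastname(String lastname);
}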

Original pull request: #416.
2016-12-02 11:55:07 +01:00
Oliver Gierke
a79930145d DATAMONGO-1525 - Improved creation of empty collections, esp. EnumSet.
We now use more type information to create a better empty collection in the first place. The previous algorithm always used an empty HashSet plus a subsequent conversion using the raw collection type. Especially the latter caused problems for EnumSets as the conversion into one requires the presence of component type information.

We now use Spring's collection factory and more available type information to create a proper collection in the first place and only rely on a subsequent conversion for arrays.
2016-12-01 20:10:41 +01:00
Christoph Strobl
9059a77712 DATAMONGO-1534 - Fix bulk operations missing to write type info.
We now correctly convert entities into their MongoDB representation including type information via _class property.

Original pull request: #415.
2016-11-28 09:06:13 +01:00
Christoph Strobl
a741400e9b DATAMONGO-1530 - Polishing.
Add missing transformations for ConstructorReference, OperatorNot, OpNE, OpEQ, OpGT, OpGE, OpLT, OpLE, OperatorPower, OpOr and OpAnd. This allows usage of the logical operators &&, || and ! as part of the expression, while ConstructorReference allows instantiating e.g. arrays via an expression like `new int[]{4,5,6}`. This can be useful e.g. when comparing arrays using $setEquals.

More complex aggregation operators like $filter can be created by defining the variable references as strings inside the expression, like filter(a, 'num', '$$num' > 10).
Commands like $let require usage of InlineMap to pass in the required arguments, e.g. let({low:1, high:'$$low'}, gt('$$low', '$$high')).

Original Pull Request: #410
2016-11-25 15:45:14 +01:00
Sebastien Gerard
b786b8220a DATAMONGO-1530 - Add support for missing MongoDB 3.2 aggregation pipeline operators.
Original Pull Request: #410
2016-11-25 15:43:43 +01:00
Mark Paluch
710770e88d DATAMONGO-784 - Polishing.
Add JavaDoc for compareValue.

Original pull request: #414.
2016-11-24 13:56:27 +01:00
Christoph Strobl
e631e2d7c5 DATAMONGO-784 - Add support for comparison aggregation operators to group & project.
We now directly support comparison aggregation operators ($cmp, $eq, $gt, $gte, $lt, $lte and $ne) on both group and project stages.

Original pull request: #414.
2016-11-24 13:56:27 +01:00
Mark Paluch
3dc1e9355a DATAMONGO-1491 - Polishing.
Remove variable before returning value. Add generics for list creation.

Original pull request: #412.
2016-11-24 12:48:15 +01:00
Christoph Strobl
2985b4ca3d DATAMONGO-1491 - Add support for $filter (aggregation).
We now support $filter in the aggregation pipeline.

Aggregation.newAggregation(Sales.class,
	Aggregation.project()
		.and(filter("items").as("item").by(GTE.of(field("item.price"), 100)))
		.as("items"))

Original pull request: #412.
2016-11-24 12:47:07 +01:00
Christoph Strobl
578441ee9f DATAMONGO-1327 - Polishing.
Just added overloads for stdDevSamp and stdDevPop taking an AggregationExpression and updated the docs.
Also replaced String-based MongoDB operation building by using the operators directly.

Original Pull Request: #360
2016-11-23 15:39:01 +01:00
gustavodegeus
36838ffe31 DATAMONGO-1327 - Added support for $stdDevSamp and $stdDevPop to aggregation $group stage.
Original Pull Request: #360
CLA: 171720160409030719 (Gustavo de Geus)
2016-11-23 15:38:35 +01:00
Oliver Gierke
5bd0e21173 DATAMONGO-1527 - Updated changelog. 2016-11-23 13:52:43 +01:00
Oliver Gierke
255d32513c DATAMONGO-1502 - Updated changelog. 2016-11-03 18:56:45 +01:00
Oliver Gierke
6a9823fd24 DATAMONGO-1521 - Added Aggregation.skip(…) overload to support longs.
Deprecated the one taking an int.
2016-11-03 15:00:20 +01:00
Christoph Strobl
2ae75a4ff9 DATAMONGO-1500 - Fix JSON serialization error in derived queries with field spec.
We now make sure not to eagerly convert given query parameters into a MongoDB-specific format by calling toString() on the query object, but rather delegate this to a later step in the chain.

Original pull request: #404.
2016-11-03 09:36:42 +01:00
Christoph Strobl
9c20da3e8f DATAMONGO-1504 - Assert compatibility with MongoDB 3.4.
We now make sure to comply with the API requirements of mongo-java-driver 3.4 (currently beta1) by using empty DBObjects instead of null, ignoring inappropriate replication settings and cleaning up tests after execution.

Original pull request: #394.
2016-11-03 09:32:56 +01:00
Oliver Gierke
f782338581 DATAMONGO-1513 - Fixed identifier population for event-listener-generated, non-ObjectId identifiers on batch inserts.
The methods in MongoTemplate inserting a batch of documents previously only returned database-generated identifiers, more specifically ObjectId ones. This caused non-ObjectId identifiers potentially generated by other parties (i.e. an event listener reacting to a BeforeSaveEvent) not to be considered for source object identifier population.

This commit adds a workaround augmenting the list of database-generated identifiers with the ones actually present in the documents to be inserted. A follow-up ticket, DATAMONGO-1519, was created to track the removal of the workaround in favor of a proper fix, which unfortunately requires a change in public API (so a 2.0 candidate only).

Related tickets: DATAMONGO-1519.
2016-11-02 09:52:47 +01:00
Oliver Gierke
cb90bfc6a6 DATAMONGO-1514 - Polishing.
Extended license years in copyright header.

Original pull request: #401.
2016-10-27 14:32:37 +02:00
Martin Macko
189d4dd1b7 DATAMONGO-1514 - SpringDataMongodbQuery needs to be public.
SpringDataMongodbQuery is exposed publicly in QuerydslRepositorySupport, so we have to make it public to ensure that calls to the exposed methods from outside the package actually compile.

Original pull request: #401.
2016-10-27 14:31:46 +02:00
Christoph Strobl
10208001f8 DATAMONGO-1480 - Polishing.
Opened up Meta attributes to allow usage of more than one cursor option via a dedicated enum.

new Query().noCursorTimeout();

and

interface PersonRepository extends CrudRepository<Person, String> {

    @Meta(flags = {CursorOptions.NO_TIMEOUT})
    Iterable<Person> findBy();
}

Original Pull Request: #390
2016-10-24 18:44:35 +02:00
Mark Paluch
98dca5a65e DATAMONGO-1480 - Add support for noCursorTimeout in Query.
We now allow setting noCursorTimeout for queries using `Query` and `@Meta`.

Query query = new Query().noCursorTimeout();

and

interface PersonRepository extends CrudRepository<Person, String> {

    @Meta(noCursorTimeout = true)
    Iterable<Person> findBy();

    @Meta(noCursorTimeout = true)
    Stream<Person> streamBy();
}

Original Pull Request: #390
2016-10-24 18:43:40 +02:00
Christoph Strobl
b6bc0ea316 DATAMONGO-1490 - Polishing.
Restore 1.8.xsd and create new 1.10 one reflecting the changes made.

Original Pull Request: #389
2016-10-24 13:26:51 +02:00
Mark Paluch
b67c551e19 DATAMONGO-1490 - Replace space indentation by tabs.
Original Pull Request: #389
2016-10-24 13:26:51 +02:00
Mark Paluch
e0bc1e0f20 DATAMONGO-1490 - Change the XML data type of boolean flags to String.
We now accept String data types for boolean flags in XML configuration. Boolean data types in the XSD don't allow the use of property placeholders even if the resolved value could be converted to boolean. The fields affected by this change are:

* `<mongo:repositories create-query-indexes=… />`
* `<mongo:options ssl=…/>`
* `<mongo:client-options ssl=… />`

Original Pull Request: #389
2016-10-24 13:26:40 +02:00
Oliver Gierke
fcb436dd30 DATAMONGO-1495 - Updated changelog. 2016-09-29 14:42:09 +02:00
Oliver Gierke
0e57dd473c DATAMONGO-1499 - Updated changelog. 2016-09-29 11:42:09 +02:00
Oliver Gierke
8d36e42b1b DATAMONGO-1498 - Removed defaulting of MongoMappingContext for repositories and auditing.
Previously we created a default bean definition for MongoMappingContext if none was present in the application context. That lookup for an existing one unfortunately comes too early, especially with Spring Boot in place. This then caused the MappingContext not to be aware of the custom conversions and simple types registered by Boot.

We now removed the defaulting, relying on a MappingMongoConverter being present in the application context (which usually is the case when using AbstractMongoConfiguration or the XML <mongo:mapping-converter /> alternative). We use that bean to look up the MappingContext.
2016-09-28 16:22:20 +02:00
Oliver Gierke
b66bfae105 DATAMONGO-1497 - MappingMongoConverter now consistently uses DbObjectAccessor.
We now use DbObjectAccessor also for preliminary inspections of the source DBObject (e.g. whether a value is present at all). Previously we operated on the DBObject directly which caused issues with properties mapped to nested fields as the keys weren't exploded correctly and thus the check always failed.
2016-09-22 17:55:50 +02:00
Oliver Gierke
eb3d55e0bd DATAMONGO-1494 - Updated changelog. 2016-09-21 08:04:20 +02:00
Oliver Gierke
84dbfdfd5e DATAMONGO-1450 - Updated changelog. 2016-09-21 08:04:14 +02:00
Mark Paluch
1813b1aea0 DATAMONGO-1493 - Fix minor typo in reference documentation.
Related pull request: #391.
2016-09-19 17:03:29 +02:00
Jordan Jennings
69241737b7 DATAMONGO-1493 - Fix minor typo in docs.
Original pull request: #391.
2016-09-19 16:21:12 +02:00
Christoph Strobl
f053bed447 DATAMONGO-1492 - Make o.s.d.m.core.aggregation.AggregationExpression public.
By turning `AggregationExpression` public we allow adding custom expressions without workarounds. It is now possible to create e.g. a `ProjectionOperation` like:

ProjectionOperation agg = Aggregation.project()
      .and(new AggregationExpression() {

        @Override
        public DBObject toDbObject(AggregationOperationContext context) {

          DBObject filterExpression = new BasicDBObject();
          filterExpression.put("input", "$x");
          filterExpression.put("as", "y");
          filterExpression.put("cond", new BasicDBObject("$eq", Arrays.<Object> asList("$$y.z", 2)));

          return new BasicDBObject("$filter", filterExpression);
        }
      }).as("profile");

Original pull request: #392.
2016-09-19 16:02:44 +02:00
Christoph Strobl
1a1cd9ef14 DATAMONGO-1485 - Consider potential custom conversion for enums in Querydsl paths.
We now take potential registered converters for enums into account when serializing path expressions via SpringDataMongodbSerializer.

Original pull request: #388.
2016-09-19 07:00:21 +02:00
Oliver Gierke
7effc0e10f DATAMONGO-1468 - Polishing.
Slightly touched test case.

Original pull request: #387
2016-09-08 10:29:23 +02:00
Christoph Strobl
395bb1faa4 DATAMONGO-1486 - Fix ClassCastException when mapping non-String Map key for updates.
We now make sure to convert Map keys into Strings when mapping update values for Map properties.

Original pull request: #387.
2016-09-08 10:29:22 +02:00
Christoph Strobl
eb1392cc1a DATAMONGO-861 - Polishing.
Favor usage of List over BasicDBList.
Rename ProjectionOperation.transform to applyCondition.
Add missing author and since tags, remove trailing white spaces and fix reference documentation headline clash.

Original Pull Request: #385
2016-08-30 14:40:24 +02:00
Mark Paluch
ace01e4e6d DATAMONGO-861 - Add support for $cond and $ifNull operators in aggregation operations.
We now support the $cond and $ifNull operators for projection and grouping operations. ConditionalOperator and IfNullOperator are AggregationExpressions that can be applied to transform or generate values during aggregation.

TypedAggregation<InventoryItem> agg = newAggregation(InventoryItem.class,
  project().and("discount")
    .transform(ConditionalOperator.newBuilder().when(Criteria.where("qty").gte(250))
      .then(30)
      .otherwise(20))
    .and(ifNull("description", "Unspecified")).as("description")
);

corresponds to

{ "$project": { "discount": { "$cond": { "if": { "$gte": [ "$qty", 250 ] },
        "then": 30, "else": 20 } },
    "description": { "$ifNull": [ "$description", "Unspecified"] }
  }
}

Original Pull Request: #385
2016-08-30 14:39:43 +02:00
Mark Paluch
116dda63c2 DATAMONGO-1406 - Propagate PersistentEntity when mapping query criteria for nested keywords.
We now propagate the PersistentEntity when mapping nested keywords so that the criteria mapping chain for nested keywords and properties has access to the PersistentEntity and can use configured field names.

Previously the plain property names were used as field names and potential customizations via @Field were ignored.

Original Pull Request: #384
2016-08-25 13:09:04 +02:00
Mark Paluch
4649872394 DATAMONGO-1465 - Polishing.
Replace boolean flag in convertAndJoinScriptArgs with a literal. Joined args are rendered to JavaScript and always require string quotation.

Original pull request: #383.
2016-08-23 14:59:43 +02:00
Christoph Strobl
ecc6f3fc4e DATAMONGO-1465 - Fix String quotation in DefaultScriptOperations.execute().
This change prevents Strings from being quoted prior to sending them as args of a script.

Original pull request: #383.
2016-08-23 14:57:10 +02:00
Mark Paluch
512f68a611 DATAMONGO-1476 - Polishing.
Extend year range in license header. Use the specified collection name in doRemove.

Original pull request: #382.
2016-08-23 10:06:53 +02:00
Niko Schmuck
fbcd4ba367 DATAMONGO-1476 - Consistently use specified collection name in MongoTemplate.stream().
We now use the collection name given as an argument instead of DBCollection.getName() when invoking ReadDbObjectCallback.

Original pull request: #382.
2016-08-23 10:01:59 +02:00
Oliver Gierke
760d7d6a32 DATAMONGO-1471 - Converter only applies identifier values if actually available.
Setting the value for the identifier property is an explicit step in MappingMongoConverter and always executed if the type to be created has an identifier property. If the source document doesn't contain an _id field (e.g. because it has been excluded explicitly) that previously caused null to be set on the identifier. This caused an exception if the identifier property is a primitive type.

We now explicitly check whether the field backing the identifier property is actually present in the source document and only explicitly set the value if so.
2016-08-17 16:54:10 +02:00
Oliver Gierke
29a6688e8c DATAMONGO-1470 - AbstractMongoConfiguration now supports multiple base packages for @Document scanning.
Introduced AbstractMongoConfiguration.getMappingBasePackages() to return multiple base packages in addition to the previously existing ….getMappingBasePackage(). The former is now used by the code triggering the scanning and returns what the latter provides by default.
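
A configuration sketch (package names are hypothetical; the other two overrides follow the AbstractMongoConfiguration contract of this release):

@Configuration
class MongoConfig extends AbstractMongoConfiguration {

  @Override
  protected String getDatabaseName() {
    return "database";
  }

  @Override
  public Mongo mongo() throws Exception {
    return new MongoClient();
  }

  @Override
  protected Collection<String> getMappingBasePackages() {
    return Arrays.asList("com.acme.orders", "com.acme.customers");
  }
}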
2016-07-28 13:10:48 +02:00
Oliver Gierke
b6e7683202 DATAMONGO-1409 - After release cleanups. 2016-07-27 14:32:37 +02:00
Oliver Gierke
286a977575 DATAMONGO-1409 - Prepare next development iteration. 2016-07-27 14:32:35 +02:00
Oliver Gierke
d7f70e219b DATAMONGO-1409 - Release version 1.10 M1 (Ingalls). 2016-07-27 13:52:12 +02:00
Oliver Gierke
728cc390f6 DATAMONGO-1409 - Prepare 1.10 M1 (Ingalls). 2016-07-27 13:51:38 +02:00
Oliver Gierke
ddcc3914ff DATAMONGO-1409 - Updated changelog. 2016-07-27 13:51:32 +02:00
Oliver Gierke
9a385599af DATAMONGO-1394 - Polishing.
Some internal refactorings to avoid deeply nested if-clauses.

Original pull request: #373.
2016-07-27 10:54:21 +02:00
Christoph Strobl
c14c42fb0c DATAMONGO-1394 - Support identifier references on Querydsl expressions for DBRefs.
We now allow direct usage of path.eq(…) on id properties of DBRef-referenced objects. This allows writing the query as person.coworker.id.eq(coworker.getId()) instead of person.coworker.eq(coworker), which helps build the query using just the plain id without having to create a new object wrapping it.

Original pull request: #373.
2016-07-27 10:54:21 +02:00
Oliver Gierke
5a8e4f3dae DATAMONGO-1194 - Polishing.
Some missing JavaDoc and slight code polish.

Original pull request: #377.
2016-07-26 15:30:31 +02:00
Christoph Strobl
5d50155d81 DATAMONGO-1194 - Improve DBRef resolution for maps.
We now bulk load maps of referenced objects as long as they are stored in the same collection. This reduces database roundtrips and network traffic.

Original pull request: #377.
2016-07-26 15:30:31 +02:00
Christoph Strobl
babab54ffd DATAMONGO-1194 - Improve DBRef resolution for collections.
We now bulk load collections of referenced objects as long as they are stored in the same collection. This reduces database roundtrips and network traffic.

Original pull request: #377.
2016-07-26 15:30:30 +02:00
Mark Paluch
1ba137b98a DATAMONGO-1464 - Polishing.
Added JavaDoc. Simplified if-check in MongoQueryExecution.isListOfGeoResult(…).

Original pull request: #379.
2016-07-26 14:51:55 +02:00
Mark Paluch
353b836a77 DATAMONGO-1464 - Optimize query execution for pagination queries.
We now execute paged queries in an optimized way. The data is obtained for each paged execution but the count query is deferred. We determine the total from the pageable and the results whenever we don't hit the page size bounds (i.e. the results are less than a full page without an offset, or the results are greater than 0 and less than a full page with an offset). In all other cases we issue an additional count query.

Original pull request: #379.
2016-07-26 14:51:29 +02:00
Oliver Gierke
325bcd11b9 DATAMONGO-1431 - Added MongoOperations.stream(…) with explicit collection. 2016-07-20 15:59:52 +02:00
Mark Paluch
9db2dde19b DATAMONGO-1463 - Upgrade to mongo-java-driver 2.14.3.
Upgrade to mongo-java-driver 2.14.3 and upgrade the mongo33 profile to use 3.3.0 (release).
2016-07-20 10:33:23 +02:00
Mark Paluch
318ba53e2f DATAMONGO-1462 - Integrate version badge from spring.io.
Add version badge from spring.io and replace fixed version numbers with a placeholder.
2016-07-20 08:43:37 +02:00
Mark Paluch
3db30bd4a6 DATAMONGO-1460 - Use placeholder property for JSR-303 API. 2016-07-15 12:48:37 +02:00
Oliver Gierke
dd1fbfeb66 DATAMONGO-1459 - Polishing.
Added missing @Overrides to MongoRepository interface and polished non-JavaDoc references.
2016-07-14 15:52:32 +02:00
Oliver Gierke
3fa17272bb DATAMONGO-1459 - Added support for any-match mode in Query-by-example.
MongoExampleMapper now $or-concatenates the predicates derived from the example in case the ExampleMatcher requests any-match binding.
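
For illustration (the Person probe and repository are hypothetical), assuming the ExampleMatcher API from Spring Data Commons:

Person probe = new Person();
probe.setFirstname("Dave");
probe.setLastname("Matthews");

// predicates derived from the probe are $or-concatenated
personRepository.findAll(Example.of(probe, ExampleMatcher.matchingAny()));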

Moved integration tests for Query-by-example to the appropriate package and polished the code a little.

Related ticket: DATACMNS-879.
2016-07-14 15:52:32 +02:00
Christoph Strobl
9361fc3c71 DATAMONGO-1455, DATAMONGO-1456 - Polishing.
Use static imports for org.junit.Assert and org.hamcrest.core. Fix spelling.

Original pull request: #375.
2016-07-12 15:25:49 +02:00
Christoph Strobl
ac55f5e77f DATAMONGO-1455, DATAMONGO-1456 - Add support for $caseSensitive and $diacriticSensitive to $text.
We added methods to set values for $caseSensitive and $diacriticSensitive when using TextCriteria. Both operators are optional and will not be used until explicitly set.

    // { "$text" : { "$search" : "coffee" , "$caseSensitive" : true } }
    TextCriteria.forDefaultLanguage().matching("coffee").caseSensitive(true);

    // { "$text" : { "$search" : "coffee" , "$diacriticSensitive" : true } }
    TextCriteria.forDefaultLanguage().matching("coffee").diacriticSensitive(true);

Original pull request: #375.
2016-07-12 15:25:49 +02:00
Christoph Strobl
116baf9a92 DATAMONGO-832 - Polishing.
Moved newly introduced types into order. Added missing @since tag and additional test.
Updated reference documentation for update operators and added $slice operator to "what’s new" section.

Original Pull Request: #374
2016-07-11 10:02:22 +02:00
Mark Paluch
026dce2612 DATAMONGO-832 - Add support for $slice in Update.push.
We now support $slice in Update operations via the PushOperatorBuilder.

    new Update().push("key").slice(5).each(Arrays.asList("one", "two", "three"));

Original Pull Request: #374
2016-07-11 09:49:45 +02:00
Mark Paluch
eae32be568 DATAMONGO-1457 - Polishing.
Add missing Javadoc to size operator. Mention Array Aggregation Operators in reference docs. Fix typos in reference docs.

Original pull request: #372.
2016-07-08 10:34:08 +02:00
Christoph Strobl
9d51ea4c01 DATAMONGO-1457 - Add support for $slice in aggregation.
We now support $slice in aggregation projections via the ProjectionOperationBuilder.

    Aggregation.project().and("field").slice(10, 20)

Original pull request: #372.
2016-07-08 10:08:29 +02:00
Mark Paluch
f4a5482005 DATAMONGO-1418 - Polishing.
Added ticket references. Simplified code.

 Original pull request: #361.
2016-06-24 15:32:58 +02:00
Nikolai Bogdanov
0db36aff8f DATAMONGO-1418 - Add support for $out operator in aggregations.
We now support the $out operator via Aggregation.out(…) to store aggregation results in a collection. Using the $out operator returns an empty list in AggregationResults.
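
A usage sketch (collection and field names are hypothetical):

newAggregation(
  match(where("status").is("active")),
  out("activeOrders"));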

Original pull request: #361.
CLA: 172720160413124705 (Nikolai Bogdanov)
2016-06-24 15:28:45 +02:00
Christoph Strobl
ba8ece334a DATAMONGO-1453 - Fix GeoJson conversion when coordinates are Integers.
We now use Number instead of Double for reading "coordinates" from GeoJSON representations.

Original pull request: #369.
2016-06-24 14:04:58 +02:00
Oliver Gierke
a5394074c5 DATAMONGO-1410 - Updated changelog. 2016-06-15 14:31:49 +02:00
Kevin Dosey
fe5bb515b7 DATAMONGO-1449 - Switched to foreach loop in collection handling of MappingMongoConverter.
This should result in minor to moderate performance improvement for iteration on Collections/Arrays during DBObject to object mapping.

Original pull request: #368.
2016-06-11 18:29:32 +02:00
Mark Paluch
c84bfbccf4 DATAMONGO-1437 - Polishing.
Renamed test. Added JavaDoc. Simplify throws declaration.

Original pull request: #367.
2016-06-02 08:55:09 +02:00
Christoph Strobl
d147f80a39 DATAMONGO-1437 - Preserve non-translatable Exception cause when lazily resolving DBRef.
We now preserve the cause of Exceptions that cannot be translated into DataAccessExceptions when an error occurs during lazily loading DBRefs.

Original pull request: #367.
2016-06-02 08:52:53 +02:00
Christoph Strobl
7b8dadeb74 DATAMONGO-1271 - Polishing.
Removed non-Java 6 language features, reworked and added a few tests.

Original Pull Request: #322
2016-05-27 07:53:53 +02:00
Jordi Llach Fernandez
d1251c42ca DATAMONGO-1271 - Provide lifecycle events for DBRefs.
We now publish lifecycle events when loading DBRefs.

Original Pull Request: #322
CLA: 121620150519031801 (Jordi Llach Fernandez)
2016-05-27 07:53:45 +02:00
Oliver Gierke
4140dd573f DATAMONGO-1423 - Polishing.
Original pull request: #365.
2016-05-25 17:26:31 +02:00
Christoph Strobl
0e60630393 DATAMONGO-1423 - Map keys now get registered conversions applied for Updates.
We now pipe map keys through the potentially registered conversions when mapping Updates.

Original pull request: #365.
2016-05-25 17:26:30 +02:00
Oliver Gierke
9bc35512fd DATAMONGO-1416 - Polishing.
Just use instanceOf(…) from Hamcrest's Matchers class instead of a dedicated class.

Original pull request: #362.
2016-05-24 15:57:21 +02:00
Christoph Strobl
b626c2f82b DATAMONGO-1416 - Get rid of the warnings for Atomic… type conversions.
We now use explicit converters instead of a ConverterFactory. This reduces noise in the log when registering converters.

Original pull request: #362.
2016-05-24 15:57:21 +02:00
Mark Paluch
c3e894ee8d DATAMONGO-1424 - Polishing.
Remove EndingWith from NotLike. Remove superfluous white-spaces. Split combined highlighted keywords to individual highlighting.

Original pull request: #364.
2016-05-09 11:06:44 +02:00
Christoph Strobl
2f713bede5 DATAMONGO-1424 - Add support for NOT_LIKE.
We now support `notLike` and `isNotLike` in query derivation.
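
For illustration, a derived query using the new keyword (repository and property names are hypothetical):

interface PersonRepository extends Repository<Person, String> {
  List<Person> findByFirstnameNotLike(String firstname);
}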

Original pull request: #364.
2016-05-09 11:05:44 +02:00
Mark Paluch
e03520d2fb DATAMONGO-1425 - Polishing.
Add NotContaining to documentation. Add integration test for Containing/NotContaining on collection properties.

Original pull request: #363.
2016-05-09 10:42:19 +02:00
Christoph Strobl
3829d58dc2 DATAMONGO-1425 - Fix query derivation for notContaining on String properties.
We now correctly build up the criteria for derived queries using notContaining keyword on String properties.

Original pull request: #363.
2016-05-09 10:42:15 +02:00
Mark Paluch
7b87fa9509 DATAMONGO-1412 - Fix backticks and code element highlighting.
Fixed broken highlighting using backticks followed by chars/single quotes. Convert single quote emphasis of id to backtick code fences. Add missing spaces between words and backticks.

Original Pull Request: #359
2016-05-02 13:20:29 +02:00
Mark Paluch
b2b9f3406a DATAMONGO-1412 - Document mapping rules for Java types to MongoDB representation.
Original Pull Request: #359
Related pull request: #353
Related ticket: DATAMONGO-1404
2016-05-02 13:20:22 +02:00
Mark Paluch
d610761019 DATAMONGO-1404 - Polishing.
Add author and since tags. Update license headers. Reformat code. Replace FQCN with import and simple class name. Remove final keyword in test methods. Add tests for numeric values. Update documentation.

Original pull request: #353.
2016-05-02 13:20:10 +02:00
Alexey Plotnik
8983bd26ce DATAMONGO-1404 - Add support for $min and $max update operators.
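
A usage sketch with hypothetical field names and values, assuming Update gained min(…)/max(…) methods mirroring the other update operators:

// { $max : { "highScore" : 950 }, $min : { "lowScore" : 120 } }
new Update().max("highScore", 950).min("lowScore", 120);
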
Original pull request: #353.
CLA: 169820160330091912 (Alexey Plotnik)
2016-05-02 13:19:31 +02:00
Christoph Strobl
5485f2fcd4 DATAMONGO-1391 - Polishing.
Removed white spaces, updated Javadoc and return early when using non 3.2 $unwind options.

Original Pull Request: #355
2016-05-02 12:56:38 +02:00
Mark Paluch
f8681fec66 DATAMONGO-1391 - Support Mongo 3.2 syntax for $unwind in aggregation.
We now support both the simple {$unwind: path} and the MongoDB 3.2 {$unwind: {…}} syntax.

Original Pull Request: #355
2016-05-02 12:52:39 +02:00
Mark Paluch
0dd904894d DATAMONGO-1399 - Polishing.
Update since version to 1.10. Remove trailing whitespaces.

Original pull request: #352.
2016-05-02 09:44:37 +02:00
Christoph Strobl
7d70a8677e DATAMONGO-1399 - Allow adding hole to GeoJSON Polygon.
We now allow creation of a GeoJsonPolygon with an outer ring and multiple inner rings.

Original pull request: #352.
2016-05-02 09:40:44 +02:00
Mark Paluch
13a52b5ac9 DATAMONGO-1403 - Add maxExecutionTimeMs alias for @Meta(maxExcecutionTime).
We added maxExecutionTimeMs as an alias for maxExcecutionTime which has been deprecated due to spelling issues.

Original pull request: #356.
2016-04-26 12:43:37 +02:00
Mark Paluch
f5cfcda673 DATAMONGO-1411 - Enable build on TravisCI.
We now start the MongoDB server via apt-get instead of relying on the TravisCI-managed 2.4.2 installation. In doing so we altered tests to just check the port and not the host part of the URIs.

Additionally we upgraded build profiles, removed promoted snapshot versions, renamed mongo32-next to mongo32 and added the mongo33-next build profile.

Original pull request: #358
2016-04-26 10:40:29 +02:00
Raja Dilip Kolli
cf44a7105f DATAMONGO-1420 - Update version numbers in Github readme.
Bumped to the latest versions available.

Original pull request: #354
2016-04-15 08:30:32 +02:00
Oliver Gierke
0228255d2b DATAMONGO-1356 - AuditingEventListener now has an explicit order.
AuditingEventListener now has a fixed ordering of 100. This allows other listeners to be registered to be executed before or after it.
2016-04-14 22:27:46 +02:00
Oliver Gierke
50e37355d4 DATAMONGO-1419 - Removed deprecated methods of AbstractMongoEventListener. 2016-04-14 22:27:42 +02:00
Oliver Gierke
a15dababfa DATAMONGO-1408 - Updated changelog. 2016-04-06 23:14:25 +02:00
Oliver Gierke
9942451017 DATAMONGO-1405 - After release cleanups. 2016-04-06 16:37:03 +02:00
Oliver Gierke
e144c29316 DATAMONGO-1405 - Prepare next development iteration. 2016-04-06 16:36:59 +02:00
Oliver Gierke
64d4880983 DATAMONGO-1405 - Release version 1.9 GA (Hopper). 2016-04-06 16:35:59 +02:00
Oliver Gierke
47c348e03a DATAMONGO-1405 - Prepare 1.9 GA (Hopper). 2016-04-06 16:34:45 +02:00
Oliver Gierke
dea86535c1 DATAMONGO-1405 - Updated changelog. 2016-04-06 16:34:39 +02:00
Artur Konczak
eee6b62589 DATAMONGO-1407 - Updated JIRA link to point to the correct project on JIRA.
Original pull request: #357.
2016-04-05 14:13:28 +02:00
Mark Paluch
771ca8d84c DATAMONGO-1407 - Add pull request template. 2016-04-05 09:50:18 +02:00
Christoph Strobl
8f5b334951 DATAMONGO-1398 - Mention QBE and add links.
Original Pull Request: #349
2016-03-31 21:00:27 +02:00
Mark Paluch
0dc6169282 DATAMONGO-1398, DATAMONGO-1395 - Update Lifecycle Events examples in Reference Documentation.
Replace deprecated methods by the supported API.

Original Pull Request: #349
2016-03-31 20:59:53 +02:00
Mark Paluch
abe78f0428 DATAMONGO-1398 - Updated what's new section and general improvements.
Update Spring Framework documentation links to point always to the Spring Framework version specified in the pom, where possible. Mention $lookup in aggregation.

Original Pull Request: #349
2016-03-31 20:59:16 +02:00
Christoph Strobl
9930ec2d19 DATAMONGO-1401 - Fix error when updating entity with both GeoJsonPoint and Version property.
We now ignore property reference exceptions when resolving field values that have already been mapped, e.g. in the case of an already mapped Update extracted from an actual domain type instance.

Original pull request: #351.
2016-03-31 09:15:29 +02:00
Oliver Gierke
83d7f4477e DATAMONGO-1392 - After release cleanups. 2016-03-18 11:16:07 +01:00
Oliver Gierke
18c3704c2e DATAMONGO-1392 - Prepare next development iteration. 2016-03-18 11:15:51 +01:00
Oliver Gierke
bef581caa5 DATAMONGO-1392 - Release version 1.9 RC1 (Hopper). 2016-03-18 11:15:00 +01:00
Oliver Gierke
2f0abe0604 DATAMONGO-1392 - Prepare 1.9 RC1 (Hopper). 2016-03-18 11:06:57 +01:00
Oliver Gierke
4235b44c47 DATAMONGO-1392 - Updated changelog. 2016-03-18 11:06:52 +01:00
Oliver Gierke
f318185ad0 DATAMONGO-1400 - Adapt to rename of Tuple to Pair in Spring Data Commons.
Related tickets: DATACMNS-818.
2016-03-18 09:59:06 +01:00
Oliver Gierke
43b496287c DATAMONGO-1245 - Final tweaks to Query by Example documentation.
Tweaked section anchor to match conventions. Use level offsets to accommodate changes in Spring Data Commons.
2016-03-18 09:29:46 +01:00
Oliver Gierke
9d0c8ecdc3 DATAMONGO-1245 - Polishing.
Adapt to API changes in Spring Data Commons.

Related tickets: DATACMNS-810.
Original pull request: #341.
2016-03-17 18:39:18 +01:00
Mark Paluch
5a78d99af0 DATAMONGO-1245 - Initial documentation for Query by Example.
Adopt changes from query by example API refactoring.

Related tickets: DATACMNS-810.
Original pull request: #341.
2016-03-17 18:39:18 +01:00
Christoph Strobl
693f5ddf6e DATAMONGO-1245 - Add support for Query By Example.
An explorative approach to QBE trying to find possibilities and limitations. We now support querying documents by providing a sample of the given object holding the values to compare. For the sake of partial matching we flatten out nested structures so we can create different queries for matching like:

{ _id : 1, nested : { value : "conflux" } }
{ _id : 1, nested.value : { "conflux" } }

This is useful when you want to search using an only partially filled nested document. String matching can be configured to wrap strings with $regex, which creates { firstname : { $regex : "^foo", $options: "i" } } when using StringMatchMode.STARTING along with the ignoreCase option. DBRefs and geo structures such as Point or GeoJsonPoint are converted to their corresponding structure.

Related tickets: DATACMNS-810.
Original pull request: #341.
2016-03-17 18:39:18 +01:00
Christoph Strobl
ece655f67d DATAMONGO-1387 - Polishing.
Added a few more tests and append values if present on Query.

Original Pull Request: #345
2016-03-17 13:10:51 +01:00
John Willemin
119692c979 DATAMONGO-1387 - Fix BasicQuery getFieldsObject() inconsistency.
We changed BasicQuery to consider its parent getFieldsObject() when not given an explicit fields DBObject.

Original Pull Request: #345
CLA: 165520160303021604 (John Willemin)
2016-03-17 13:09:58 +01:00
Oliver Gierke
6068f3243a DATAMONGO-1397 - Polishing.
Switched to Slf4J-native placeholder replacement in debug logging for MongoTemplate.

Original pull request: #348.
2016-03-16 17:28:04 +01:00
Mark Paluch
a7cda2e793 DATAMONGO-1397 - Log command, entity and collection name in MongoTemplate.geoNear(…).
Original pull request: #348.
2016-03-16 17:28:03 +01:00
Oliver Gierke
2687cb85f0 DATAMONGO-1373 - Polishing.
Added method, field and annotation target to @Field annotation explicitly. Fixed copyright date ranges where needed.

Tweaked formatting in test cases.

Original pull request: #347.
Related ticket: DATACMNS-825.
2016-03-15 15:35:10 +01:00
Mark Paluch
b2ce1700d2 DATAMONGO-1373 - Allow usage of @AliasFor with mapping and indexing annotations.
We now support @AliasFor to build composed annotations with: @Document, @Id, @Field, @Indexed, @CompoundIndex, @GeoSpatialIndexed, @TextIndexed, @Query, and @Meta. Added missing license header to @Field.

Original pull request: #347.
Related tickets: DATACMNS-825.
2016-03-15 15:20:45 +01:00
Christoph Strobl
0b634f8340 DATAMONGO-1373 - Allow usage of @AliasFor for composed @Document annotation.
We now resolve aliased attribute values when reading @Document on entity types. This allows creation of composed annotations like:

@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.TYPE })
@Document
static @interface ComposedDocumentAnnotation {

  @AliasFor(annotation = Document.class, attribute = "collection")
  String name() default "custom-collection-name";
}

Original pull request: #347.
Related issue: DATACMNS-825.
2016-03-15 15:20:22 +01:00
Mark Paluch
9a078b743f DATAMONGO-1326 - Support field inheritance for $lookup aggregation operator.
We now distinguish between aggregation operations that replace fields in the aggregation pipeline and those which inherit fields from previous operations. InheritsFieldsAggregationOperation, a nested interface of FieldsExposingAggregationOperation, is a marker to look up fields along the aggregation context chain. Added unit and integration tests. Mention the lookup operator in the docs.

Original pull request: #344.
2016-03-08 09:00:55 +01:00
Christoph Strobl
65b6576cfc DATAMONGO-1326 - Add Builder, update javadoc and remove additional interface.
Updated javadoc and formatting. Added tests and removed marker interface.

Original Pull Request: #344
2016-03-08 08:59:54 +01:00
Alessio Fachechi
78e99e6df2 DATAMONGO-1326 - Add support for $lookup to aggregation.
Original Pull Request: #344
CLA: 164120160221125037 (Alessio Fachechi)
2016-03-07 14:20:14 +01:00
Oliver Gierke
bb0a42733d DATAMONGO-1389 - Fixed test case to verify type predicting bean registration.
Related ticket: DATACMNS-821.
2016-03-01 15:02:18 +01:00
Oliver Gierke
a2ae08e263 DATAMONGO-1381 - Updated changelog. 2016-02-23 14:27:23 +01:00
Oliver Gierke
eaa9d6c7e6 DATAMONGO-1366 - After release cleanups. 2016-02-12 15:43:57 +01:00
Oliver Gierke
8900695153 DATAMONGO-1366 - Prepare next development iteration. 2016-02-12 15:43:39 +01:00
Oliver Gierke
bfe548d573 DATAMONGO-1366 - Release version 1.9 M1 (Hopper). 2016-02-12 15:42:47 +01:00
Oliver Gierke
7ab4002771 DATAMONGO-1366 - Prepare 1.9 M1 (Hopper). 2016-02-12 15:36:19 +01:00
Oliver Gierke
6eace856aa DATAMONGO-1366 - Updated changelog. 2016-02-12 15:36:11 +01:00
Oliver Gierke
f10e5a19c5 DATAMONGO-1345 - Finalized application of projections in query methods.
Refactored the query execution out of AbstractMongoQuery into MongoQueryExecution. Made sure the streaming execution lazily applies the projections, too.

Added a DtoInstantiatingConverter to be able to copy data from created entities into DTOs, as we cannot hand the DTO type into the MongoTemplate execution in the first place because it is currently used for the query mapping.
2016-02-12 14:14:54 +01:00
Uxío Fuentefría
90a4a63776 DATAMONGO-1378 - Update reference documentation: Change Query.sort() to Query.with(Sort sort).
sort() is not a method of Query; to sort a query you have to use with().

Original pull request: #320.
CLA: 162620160211060822 (Uxío Fuentefría)
2016-02-11 20:22:36 +01:00
Oliver Gierke
0f14e35ba3 DATAMONGO-1288 - Polishing.
Some JavaDoc here and there. Moved converter factory registration into MongoConverters.getConvertersToRegister() for consistency with others.

Original pull request: #331.
2016-02-11 14:08:31 +01:00
Christoph Strobl
ad0c4207d6 DATAMONGO-1288 - Add conversion for AtomicInteger & AtomicLong.
We now convert AtomicInteger and AtomicLong to the required Number target type by calling get() followed by the actual conversion. This allows these types to be used directly, e.g. as part of an Update: new Update().set("intValue", new AtomicInteger(10));

Original pull request: #331.
2016-02-11 14:08:19 +01:00
Mark Paluch
97da43645a DATAMONGO-1380 - Polishing.
Add credits, use message formatting instead string concatenation.

Original pull request: #317.
2016-02-11 12:02:09 +01:00
Alex Vengrovsk
42b7c42617 DATAMONGO-1380 - Improve logging in MongoChangeSetPersister.
Add a check for whether debug logging is enabled in the getPersistentId method.

Original pull request: #317.
2016-02-11 11:53:15 +01:00
Timo Kockert
bd81e25e6b DATAMONGO-1270 - Update documentation to reflect deprecation of MongoFactoryBean.
Original pull request: #315.
2016-02-10 15:57:15 +01:00
Thomas Dudouet
debe6aa649 DATAMONGO-1377 - Update JavaDoc: Use @EnableMongoRepositories instead of @EnableJpaRepositories.
The JavaDoc description references the EnableJpaRepositories annotation instead of the EnableMongoRepositories annotation.

Original pull request: #340.
2016-02-10 15:13:24 +01:00
Oliver Gierke
6f433902f0 DATAMONGO-1376 - Moved away from SimpleTypeInformationMapper.INSTANCE.
Related tickets: DATACMNS-815.
2016-02-09 14:31:05 +01:00
Martin Macko
ba902e7f8e DATAMONGO-1375 - Fix typo in MongoOperations JavaDoc.
Original pull request: #343.
2016-02-09 11:29:30 +01:00
Oliver Gierke
7e8ec21684 DATAMONGO-1372 - Polishing.
Tiny formattings, collapsed if-clause into ternary operation.

Original pull request: #342.
2016-02-04 15:19:51 +01:00
Christoph Strobl
b7131b7efc DATAMONGO-1372 - Add and register Converters for java.util.Currency.
We now support conversion from currency into ISO 4217 String and back.
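A minimal sketch of the effect (the Invoice type and its fields are made up for illustration):

```
import java.util.Currency;

class Invoice {
  String id;
  Currency currency;
}

Invoice invoice = new Invoice();
invoice.currency = Currency.getInstance("EUR");
mongoTemplate.save(invoice);

// persisted roughly as: { "_id" : …, "currency" : "EUR" }
```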

Original pull request: #342.
2016-02-04 15:19:48 +01:00
Oliver Gierke
ace99c3464 DATAMONGO-1371 - Added code of conduct.
Moved to Asciidoctor for the CONTRIBUTING file.
2016-02-02 09:42:48 +01:00
Oliver Gierke
83fc5bc113 DATAMONGO-1366 - Declare Artifactory Maven plugin to be able to distribute build artifacts. 2016-01-28 14:51:55 +01:00
Oliver Gierke
160de0adf6 DATAMONGO-1361 - Guard command result statistics evaluation against changes in MongoDB 3.2.
MongoDB 3.2 RC1 decided to remove fields from statistics JSON documents returned in case no result was found for a geo near query. The avgDistance field is unfortunately missing as of that version.

Introduced a value object to encapsulate the mitigation behavior and make client code unaware of that.
2016-01-21 12:45:10 +01:00
Oliver Gierke
b4753f3a83 DATAMONGO-1360 - Query instances contained in a Near Query now get mapped during geoNear(…) execution.
A Query instance which might be part of a NearQuery definition is now passed through the QueryMapper to make sure complex types contained in it or even in more general types that have custom conversions registered are mapped correctly before the near command is actually executed.
2016-01-20 13:10:50 +01:00
Oliver Gierke
bce6e2c78c DATAMONGO-1163 - Polishing.
Fixed indentation changes in IndexingIntegrationTests. Separated test cases from each other.

Original pull request: #325.
2015-12-27 12:05:19 +01:00
Jordi Llach
b5ea0eccd2 DATAMONGO-1163 - Allow usage of @Indexed as meta-annotation.
@Indexed can now be used as meta-annotation so that user annotations can be annotated with it and the index creation facilities still pick up the configuration information.
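A sketch of such a composed annotation (the BusinessKey annotation and the Customer type are invented for illustration):

```
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.springframework.data.mongodb.core.index.Indexed;

@Indexed
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface BusinessKey {
}

class Customer {

  @BusinessKey // the meta-annotated @Indexed is still picked up for index creation
  String customerNumber;
}
```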

Original pull request: #325.
2015-12-27 12:05:17 +01:00
Oliver Gierke
87865b9761 DATAMONGO-1355 - Updated changelog. 2015-12-18 10:55:56 +01:00
Christoph Strobl
13fa4703c0 DATAMONGO-1334 - Map-reduce operations now honor MapReduceOptions.limit.
We now also consider the limit set via MapReduceOptions when executing mapReduce operations via MongoTemplate.mapReduce(…).

MapReduceOptions.limit(…) supersedes a potential limit set via the Query itself. This change also allows defining a limit even when no explicit Query is used.

Original pull request: #338.
2015-12-16 11:57:44 +01:00
Christoph Strobl
5a21e00322 DATAMONGO-1317 - Assert compatibility with mongo-java-driver 3.2.
We now do a defensive check against the actual WObject of WriteConcern to avoid the IllegalStateException raised by the new java-driver in case _w is null or not an Integer. This allows us to run against recent 2.13, 2.14, 3.0, 3.1 and the latest 3.2.0.

Original pull request: #337.
2015-12-16 11:49:01 +01:00
Oliver Gierke
3feed2bc5a DATAMONGO-1289 - Polishing.
Some additional JavaDoc and comment removal.

Original pull request: #333.
2015-12-16 11:38:31 +01:00
Christoph Strobl
501b9501e0 DATAMONGO-1289 - MappingMongoEntityInformation now uses fallback identifier type derived from repository declaration.
We now use RepositoryMetadata.getIdType() to provide a fallback identifier type in case the entity information does not hold an id property, which is perfectly valid for MongoDB.

Original pull request: #333.
2015-12-16 11:37:51 +01:00
Oliver Gierke
727271e68c DATAMONGO-1345 - Added support for projections on repository methods.
Related tickets: DATACMNS-89.
2015-12-14 19:56:42 +01:00
Christoph Strobl
63a619dddf DATAMONGO-1349 - Upgrade to mongo-java-driver 2.14.0. 2015-12-11 10:38:36 +01:00
Oliver Gierke
113566a6ab DATAMONGO-1346 - Update.pullAll(…) now registers multiple invocations correctly.
Previously calling the method multiple times overrode the result of previous calls. We now use addMultiFieldOperation(…) to make sure already existing values are kept.
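Illustrative sketch (field names are assumptions):

```
import org.springframework.data.mongodb.core.query.Update;

Update update = new Update()
    .pullAll("tags", new Object[] { "obsolete" })
    .pullAll("categories", new Object[] { "archived" });
// both $pullAll entries are now retained instead of the second call overriding the first
```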
2015-12-10 15:38:40 +01:00
Oliver Gierke
7862841b48 DATAMONGO-934 - Polishing.
Polished JavaDoc and implementation as well as tests. Extracted Tuple to Spring Data Commons. Moved exception translation into MongoExceptionTranslator.

Changed implementation of DefaultBulkOperations to consider the WriteConcernResolver of the underlying MongoTemplate to avoid exposing the WriteConcern on execution.

Original pull request: #327.
Related tickets: DATACMNS-790.
2015-11-26 17:56:39 +01:00
Tobias Trelle
fe6cbaa03d DATAMONGO-934 - Added support for bulk operations.
Introduced BulkOperations that can be obtained via MongoOperations, register operations to be eventually executed in a bulk.
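A hedged usage sketch (the Person type and its properties are hypothetical):

```
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;

import org.springframework.data.mongodb.core.BulkOperations;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.query.Update;

// hypothetical domain type used only for illustration
class Person {
  String firstname, lastname;
  Person(String firstname, String lastname) {
    this.firstname = firstname;
    this.lastname = lastname;
  }
}

BulkOperations bulkOps = mongoTemplate.bulkOps(BulkMode.UNORDERED, Person.class);
bulkOps.insert(new Person("Jon", "Snow"));
bulkOps.updateOne(query(where("lastname").is("Stark")), new Update().set("firstname", "Arya"));
bulkOps.execute(); // all registered operations are sent as one bulk
```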

Original pull request: #327.
2015-11-26 17:56:35 +01:00
Oliver Gierke
9ef1fc7304 DATAMONGO-1337 - Another round of polishes on SonarQube complaints. 2015-11-26 12:27:22 +01:00
Oliver Gierke
cf3a9d3ced DATAMONGO-1337 - Reverted making some of the loggers static.
The logger instance in AbstractMonitor is supposed to pick up the type of the actual implementation class and thus cannot be static.

Related pull request: #336.
2015-11-26 12:00:40 +01:00
Christian Ivan
1d1c80db7b DATAMONGO-1337 - General code quality improvements.
A round of code polish regarding the PMD and Squid rules referred to in the ticket.

Original pull request: #336.
2015-11-26 11:53:06 +01:00
Oliver Gierke
eeb37e9104 DATAMONGO-1342 - Fixed potential NullPointerException in MongoQueryCreator.
MongoQueryCreator.nextAsArray(…) now returns a single element object array in case null is handed to the method. It previously failed with a NullPointerException.
2015-11-25 17:23:15 +01:00
Oliver Gierke
18bf0daee7 DATAMONGO-1335 - DBObjectAccessor now writes all nested fields correctly.
Previously, DBObjectAccessor always reset the in-between values when traversing nested properties. This caused previously written values to be erased when subsequent values were written. We now reuse an already existing BasicDBObject if present.
2015-11-25 16:06:52 +01:00
Oliver Gierke
1e9189aee7 DATAMONGO-1341 - Moved MongoDbErrorCodes into utility package.
This resolves a package cycle introduced by MongoPersistentEntityIndexCreator referring to error codes now.

Updated Sonargraph architecture description along the way.
2015-11-25 15:36:22 +01:00
Oliver Gierke
95f6dfafdd DATAMONGO-1287 - Optimizations in reading associations as constructor arguments.
As per the discussion on the ticket, we now omit looking up the value for an association used as a constructor argument, as the simple check whether the currently handled property is a constructor argument is sufficient to potentially skip handling the value.

Related pull requests: #335, #322.
2015-11-23 11:13:07 +01:00
Christoph Strobl
bedaae8a90 DATAMONGO-1287 - Fix double fetching for lazy DbRefs used in entity constructor.
We now check properties for their usage as constructor arguments, that might already have been resolved, before setting the actual value. This prevents turning already eagerly fetched DBRefs back into LazyLoadingProxies.

Original pull request: #335.
Related pull request: #322.
2015-11-20 13:39:00 +01:00
Oliver Gierke
7bfa3fe7fd DATAMONGO-1290 - Polishing.
Removed a level of indentation from ExpressionEvaluationParameterBinder.replacePlaceholders(…). Polished JavaDoc.

Original pull request: #332.
2015-11-20 13:20:11 +01:00
Christoph Strobl
143b0b73b9 DATAMONGO-1290 - Move parameter binding for String based queries.
Moved parameter binding for string based queries into separate class.

Original pull request: #332.
2015-11-20 13:20:09 +01:00
Christoph Strobl
cbfc46270e DATAMONGO-1290 - Convert byte[] parameter in @Query to $binary representation.
We now convert non-quoted binary parameters to the $binary format. This allows using them along with the @Query annotation.
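A sketch of the intended usage (the repository, domain type and property names are made up):

```
import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.CrudRepository;

class Attachment {
  String id;
  byte[] content;
}

interface AttachmentRepository extends CrudRepository<Attachment, String> {

  // the byte[] argument is bound as a $binary value instead of a quoted string
  @Query("{ 'content' : ?0 }")
  Attachment findByContent(byte[] content);
}
```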

Original pull request: #332.
2015-11-20 13:06:22 +01:00
Christoph Strobl
b31efb46ec DATAMONGO-1204 - ObjectPath now uses raw id values to track resolved objects.
We now use the native id within ObjectPath for checking if a DBref has already been resolved. This is required as MongoDB Java driver 3 generation changed ObjectId.equals(…) which now performs a type check.

Original pull request: #334.
Related pull request: #288.
2015-11-20 12:47:52 +01:00
Oliver Gierke
ef3477098f DATAMONGO-1324 - Register ObjectId converters unconditionally to make sure they really get used.
The presence of ObjectToObjectConverter in a DefaultConversionService causes the guard trying to register converters for ObjectIds in AbstractMongoConverter to not trigger the registration. This in turn caused ObjectId conversions to be executed via reflection instead of the straightforward method calls and thus caused a drop in performance for such operations.

We now unconditionally register the converters to make sure they really get applied.

Related tickets: SPR-13703.
2015-11-19 12:02:41 +01:00
Oliver Gierke
9dce117555 DATAMONGO-1238 - Upgraded to Querydsl 4. 2015-11-17 13:42:38 +01:00
Oliver Gierke
e66e1e0502 DATAMONGO-1316 - Updated changelog. 2015-11-16 08:31:45 +01:00
Christoph Strobl
19e1e9daeb DATAMONGO-1297 - Allow @Indexed annotation on DBRef.
We now also treat references as source of a potential index. This enforces index creation for Objects like:

@Document
class WithDbRef {

  @Indexed
  @DBRef
  ReferencedObject reference;
}

Combining @TextIndexed or @GeoSpatialIndexed with a DBRef will lead to a MappingException.

Original pull request: #329.
2015-11-13 17:54:42 +01:00
Christoph Strobl
ec8a948f3f DATAMONGO-1302 - Allow ConverterFactory to be registered in CustomConversions.
We now allow registration of ConverterFactory within CustomConversions by inspecting the generic type arguments for determining the conversion source and target types.

Original pull request: #330.
2015-11-10 14:37:02 +01:00
Ilho Ahn
38fc7641a0 DATAMONGO-1314 - Fix typo in Exception message.
Original Pull Request: #265
2015-11-09 20:37:26 +01:00
Christoph Strobl
ddc3925659 DATAMONGO-1291 - Made @Document usable as meta-annotation.
We now use Spring's AnnotationUtils.findAnnotation(…) for @Document lookup which enables the full power of Spring 4.2's composable annotations.
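A sketch of a composed annotation (the EventDocument annotation and LoginEvent type are invented for illustration):

```
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.springframework.data.mongodb.core.mapping.Document;

@Document(collection = "events")
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface EventDocument {
}

@EventDocument // resolved via AnnotationUtils.findAnnotation(…), so "events" is used as the collection
class LoginEvent {
  String id;
}
```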

Original pull request: #326.
2015-11-06 14:34:43 +01:00
Christoph Strobl
f8416edf8f DATAMONGO-1293 - Polishing.
Move configuration parsing error into method actually responsible for reading uri/client-uri attributes.

Original Pull Request: #328
2015-10-29 12:47:16 +01:00
Viktor Khoroshko
4f94f37ce8 DATAMONGO-1293 - Allowed id attribute in addition to client-uri attribute in MongoDbFactoryParser.
We now allow write-concern and id to be configured along with the uri or client-uri attribute of <mongo:db-factory />.

Original Pull Request: #328
CLA: 140120150929074128 (Viktor Khoroshko)
2015-10-29 12:47:08 +01:00
Oliver Gierke
528de58418 DATAMONGO-1276 - Fixed potential NullPointerExceptions in MongoTemplate.
Triggering data access exception translation could lead to NullPointerException in cases where PersistenceExceptionTranslator returned null because the original exception couldn't be translated and the result was directly used from a throw clause.

This is now fixed by consistently using the potentiallyConvertRuntimeException(…) method, which was made static to be able to refer to it from nested static classes.

Refactored Scanner usage to actually close the Scanner instance to prevent a resource leak.
2015-10-21 15:04:12 +02:00
Oliver Gierke
e6ea34aed8 DATAMONGO-1304 - Updated changelog. 2015-10-14 13:46:21 +02:00
Oliver Gierke
f171938b00 DATAMONGO-1303 - Added build profiles for MongoDB Java driver 3.1 and 3.2 snapshots.
Added new build profiles mongod31 and mongo32-next to build the project against the latest MongoDB 3.1 driver as well as upcoming snapshots of the 3.2 generation.
2015-10-12 15:41:30 +02:00
Oliver Gierke
7b27368d2d DATAMONGO-1282 - After release cleanups. 2015-09-01 12:11:02 +02:00
Spring Buildmaster
f754df51bc DATAMONGO-1282 - Prepare next development iteration. 2015-09-01 02:12:29 -07:00
Spring Buildmaster
77dce53c7a DATAMONGO-1282 - Release version 1.8.0.RELEASE (Gosling GA). 2015-09-01 02:12:26 -07:00
Oliver Gierke
73f268e7c4 DATAMONGO-1282 - Prepare 1.8.0.RELEASE (Gosling GA). 2015-09-01 09:44:21 +02:00
Oliver Gierke
075d7d8131 DATAMONGO-1282 - Updated changelog. 2015-09-01 09:44:11 +02:00
Christoph Strobl
206337044a DATAMONGO-1280 - Updated "What’s new" section in reference documentation.
Original pull request: #319.
2015-08-31 12:55:30 +02:00
Christoph Strobl
55b44ff7aa DATAMONGO-1275 - Fixed broken links in reference documentation.
Original pull request: #318.
2015-08-22 13:16:49 +02:00
Christoph Strobl
ae48639ae9 DATAMONGO-1275 - Added documentation for optimistic locking.
Original pull request: #318.
2015-08-22 13:16:45 +02:00
Oliver Gierke
6b5e78f810 DATAMONGO-1256 - Polishing.
Minor Javadoc polishing.

Original pull request: #316.
2015-08-07 14:04:49 +02:00
Christoph Strobl
3e485e0a88 DATAMONGO-1256 - MongoMappingEvents now expose the collection name they're issued for.
We now directly expose the collection name via MongoMappingEvent.getCollectionName(). Therefore we added new constructors to all the events, deprecating the previous ones. 

Several overloads have been added to MongoEventListener, deprecating previous API. We’ll call the deprecated methods from the new ones until their removal.
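A sketch using the new event-based overload (the listener name is made up):

```
import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;

class CollectionLoggingListener extends AbstractMongoEventListener<Object> {

  @Override
  public void onBeforeSave(BeforeSaveEvent<Object> event) {
    // the collection name is now available directly on the event
    System.out.println("Saving into collection " + event.getCollectionName());
  }
}
```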

Original pull request: #316.
2015-08-07 14:04:47 +02:00
Oliver Gierke
335c78f908 DATAMONGO-1269 - Polishing.
Original pull request: #314.
2015-08-06 11:00:36 +02:00
Christoph Strobl
b103e4eaf6 DATAMONGO-1269 - Retain position parameter in property path.
We now retain position parameters in paths used in queries when mapping the field name. This allows mapping "list.1.name" to the name property of the element at index 1 in the list.

The change also fixes a glitch in mapping java.util.Map like structures having numeric keys.
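For illustration (property names are assumptions):

```
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;

import org.springframework.data.mongodb.core.query.Query;

// the positional "1" is retained while "name" is still translated to its mapped field name
Query query = query(where("list.1.name").is("Jon"));
```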

Original pull request: #314.
2015-08-06 11:00:36 +02:00
Oliver Gierke
c4a6c63d23 DATAMONGO-1268 - After release cleanups. 2015-08-04 14:09:20 +02:00
Spring Buildmaster
4a4f10f97b DATAMONGO-1268 - Prepare next development iteration. 2015-08-04 04:37:14 -07:00
Spring Buildmaster
a5712daab7 DATAMONGO-1268 - Release version 1.8.0.RC1 (Gosling RC1). 2015-08-04 04:37:12 -07:00
Oliver Gierke
28cb1ef106 DATAMONGO-1268 - Prepare 1.8.0.RC1 (Gosling RC1). 2015-08-04 13:08:49 +02:00
Oliver Gierke
0d99a3e527 DATAMONGO-1268 - Updated changelog. 2015-08-04 13:08:49 +02:00
Christoph Strobl
9da43263ce DATAMONGO-1263 - Index resolver considers generic type argument of collection elements.
We now consider the potential generic type argument of collection elements. 
Prior to this change an index within List<GenericWrapper<ConcreteWithIndex>> would not have been resolved.
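Roughly, the structure from the description looks like this (field names are illustrative):

```
import java.util.List;

import org.springframework.data.mongodb.core.index.Indexed;

class GenericWrapper<T> {
  T value;
}

class ConcreteWithIndex {
  @Indexed String name;
}

class Root {
  // the index on "name" is now resolved through the generic type argument
  List<GenericWrapper<ConcreteWithIndex>> wrappers;
}
```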

Original pull request: #312.
2015-08-04 08:48:57 +02:00
Oliver Gierke
784e199068 DATAMONGO-1266 - Fixed domain type lookup for methods returning primitives.
If a repository query method returned a primitive, that primitive was exposed as the domain type, which e.g. caused deleteBy…(…) methods returning void to fail.

We now shortcut the MongoEntityMetadata lookup in MongoQueryMethod to use the repository's domain type if a primitive or wrapper is returned.
2015-08-03 11:53:10 +02:00
Oliver Gierke
1ffee802c0 DATAMONGO-1261 - Updated changelog. 2015-07-28 16:42:58 +02:00
Christoph Strobl
6f0ac7f0c2 DATAMONGO-1254 - Grouping after projection in aggregation now uses correct aliased field name.
We now push the aliased field name down the aggregation pipeline for projections including operations. This allows referencing them in a later stage. Prior to this change the field reference was potentially resolved to the target field of the operation, which did not result in an error but led to false results.
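A hedged sketch of the scenario (field names invented); the alias of a calculated projection can now be referenced by a later group stage:

```
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;

Aggregation aggregation = newAggregation(
    project().and("netPrice").plus(10).as("total"), // projection operation with alias "total"
    group("total").count().as("count"));            // "total" now resolves to the alias above
```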

Original pull request: #311.
2015-07-27 14:15:33 +02:00
Christoph Strobl
941d4d8985 DATAMONGO-1260 - Prevent accidental authentication misconfiguration on SimpleMongoDbFactory.
We now reject configuration using MongoClient along with UserCredentials in SimpleMongoDbFactory. This move favors the native authentication mechanism provided via MongoCredential.

<mongo:mongo-client id="mongo-client-with-credentials" credentials="jon:warg@snow?uri.authMechanism=PLAIN" />

Original pull request: #309.
2015-07-27 14:08:42 +02:00
Oliver Gierke
44c76d8ffb DATAMONGO-1257 - We now hint to credential quoting from the XSD.
The namespace XSD now mentions the capability of quoting more complex credentials in case they validly contain a comma.
2015-07-27 13:47:11 +02:00
Oliver Gierke
df9a9f5fb6 DATAMONGO-1257 - Polishing.
Made internal helper methods in MongoCredentialPropertyEditor static where possible.

Original pull request: #310.
2015-07-24 18:40:57 +02:00
Christoph Strobl
bebd0fa0e6 DATAMONGO-1257 - <mongo:mongo-client /> element now supports usernames with a comma.
We now allow grouping credentials by enclosing them in single quotes like this:

credentials='CN=myName,OU=myOrgUnit,O=myOrg,L=myLocality,ST=myState,C=myCountry?uri.authMechanism=MONGODB-X509'

We also changed the required argument checks to be more authentication mechanism specific which means the pattern is now username[:password@database][?options].

Original pull request: #310.
2015-07-24 18:40:56 +02:00
Oliver Gierke
594e90789d DATAMONGO-1244 - Polishing.
Minor reformattings and extracted a method to improve digestibility.

Original pull request: #306.
2015-07-08 10:18:35 +02:00
Thomas Darimont
f2ab42cb80 DATAMONGO-1244 - Improved handling of expression parameters in StringBasedMongoQuery.
Replaced regex based parsing of dynamic expression based parameters with custom parsing to make sure we also support complex nested expression objects.
Previously we only supported simple named or positional expressions. Since MongoDB's JSON-based query language uses deeply nested objects to express queries, we needed to improve the handling here.

Manual parsing is tedious and more verbose than regex based parsing but it gives us more control over the whole parsing process.

We also dynamically adjust the quoting so that we only output quoted parameters if necessary.

This enables expressing complex filtering queries that use Spring Security constructs like:
```
@Query("{id: ?#{ hasRole('ROLE_ADMIN') ? {$exists:true} : principal.id}}")
List<User> findAllForCurrentUserById();
```

Original pull request: #306.
2015-07-08 10:18:27 +02:00
Christoph Strobl
3224fa8ce7 DATAMONGO-1251 - Fixed potential NullPointerException in UpdateMapper.
We now explicitly handle the case where the source object for which a type hint needs to be calculated is null.
2015-07-07 09:57:46 +02:00
Oliver Gierke
ce156c1344 DATAMONGO-1250 - Fixed inline code formatting in reference docs. 2015-07-04 19:07:09 +02:00
Oliver Gierke
434e553022 DATAMONGO-1250 - Fixed accidental duplicate invocation of value conversion in UpdateMapper.
UpdateMapper.getMappedObjectForField(…) invoked the very same method of the super class but handed in an already mapped value, so that value conversion was invoked twice.

This was especially problematic in cases where a dedicated converter had been registered for an object that is already a Mongo-storable one (e.g. an enum-to-string converter and back) without indicating which of the two converters is the reading or the writing one. This basically caused the source value to be converted back and forth during the update mapping, creating the impression the value wasn't converted at all.

This is now fixed by removing the superfluous mapping.
2015-07-04 19:00:21 +02:00
Oliver Gierke
de5b5ee4b0 DATAMONGO-1246 - Updated changelog. 2015-07-01 10:00:16 +02:00
Oliver Gierke
60636bf56d DATAMONGO-1247 - Updated changelog. 2015-07-01 07:48:31 +02:00
Oliver Gierke
1ca71f93e9 DATAMONGO-1248 - Updated changelog. 2015-06-30 13:58:37 +02:00
Oliver Gierke
63ff39bed6 DATAMONGO-1236 - Polishing.
Removed the creation of a BasicMongoPersistentEntity in favor of always handing ClassTypeInformation.OBJECT into the converter in case no entity can be found.

This makes sure type information is written for updates on properties of type Object (which essentially leads to no PersistentEntity being available).

Original pull request: #301.
2015-06-30 09:54:53 +02:00
Christoph Strobl
cb0b9604d4 DATAMONGO-1236 - Updates now include type hints correctly.
We now use property type information when mapping fields affected by an update in case we do not have proper entity information within the context. This allows more precise type resolution required for determining the need to write type hints for a given property.

Original pull request: #301.
2015-06-30 09:54:53 +02:00
Christoph Strobl
1dbe3b62d7 DATAMONGO-1125 - Improve exception message for index creation errors.
We now use MongoExceptionTranslator to potentially convert exceptions during index creation into Spring's DataAccessException hierarchy. In case we encounter an error code indicating DataIntegrityViolation we try to fetch existing index data and append it to the exception's message.

Original pull request: #302.
2015-06-24 20:28:23 +02:00
Christoph Strobl
5c0707d221 DATAMONGO-1232 - IgnoreCase in criteria now escapes the query.
We now quote the original criteria before actually wrapping it inside a regular expression for case-insensitive search. This happens not only to case-insensitive is, startsWith, and endsWith criteria but also to those using like. In that case we quote the part between the leading and trailing wildcard if required.

Original pull request: #301.
2015-06-22 12:50:05 +02:00
Christoph Strobl
c4ffc37dd5 DATAMONGO-1166 - ReadPreference is now used for aggregations.
We now use MongoTemplate.readPreference(…) when executing commands such as geoNear(…) and aggregate(…).

Original pull request: #303.
2015-06-22 08:21:23 +02:00
Christoph Strobl
aaf93b0f6f DATAMONGO-1157 - Throw meaningful exception when @DBRef is used with unsupported types.
We now eagerly check DBRef properties for invalid definitions such as a final class or an array. In such cases we throw a MappingException when verify is called.
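Illustrative sketch of the kind of definitions the description above says are now rejected (the types are invented):

```
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

class Book {}

final class FinalCatalog {}

@Document
class Library {

  @DBRef
  Book[] books; // array-typed DBRef: per the description above, rejected with a MappingException on verify

  @DBRef
  FinalCatalog catalog; // DBRef to a final class: per the description above, rejected with a MappingException on verify
}
```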
2015-06-19 15:54:19 +02:00
Thomas Darimont
23eab1e84f DATAMONGO-1242 - Update MongoDB Java driver to 3.0.2 in mongo3 profile.
Update mongo driver.

Original pull request: #304.
2015-06-19 15:37:47 +02:00
Oliver Gierke
218f32e552 DATAMONGO-1229 - Fixed application of ignore case flag on nested properties.
Previously we tried to apply the ignore case settings found in the PartTree to the root PropertyPath we handle in MongoQueryCreator.create(). This is now changed to work on the leaf property of the PropertyPath.
2015-06-05 06:49:03 +02:00
Eddú Meléndez
62fbe4d08c DATAMONGO-1234 - Fix typos in JavaDoc. 2015-06-05 06:37:22 +02:00
Oliver Gierke
41ffd00619 DATAMONGO-1228 - After release cleanups. 2015-06-02 11:58:11 +02:00
Spring Buildmaster
98b9a604cf DATAMONGO-1228 - Prepare next development iteration. 2015-06-02 01:29:04 -07:00
381 changed files with 34899 additions and 6719 deletions

12
.github/PULL_REQUEST_TEMPLATE.md vendored Normal file
View File

@@ -0,0 +1,12 @@
<!--
Thank you for proposing a pull request. This template will guide you through the essential steps necessary for a pull request.
Make sure that:
-->
- [ ] You have read the [Spring Data contribution guidelines](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc).
- [ ] There is a ticket in the bug tracker for the project in our [JIRA](https://jira.spring.io/browse/DATAMONGO).
- [ ] You use the code formatters provided [here](https://github.com/spring-projects/spring-data-build/tree/master/etc/ide) and have them applied to your changes. Don't submit any formatting related changes.
- [ ] You submit test cases (unit or integration tests) that back your changes.
- [ ] You added yourself as author in the headers of the classes you touched. Amend the date range in the Apache license header if needed. For new types, add the license header (copy from another file and set the current year only).

View File

@@ -3,13 +3,31 @@ language: java
jdk:
- oraclejdk8
services:
- mongodb
before_script:
- mongod --version
env:
matrix:
- PROFILE=ci
- PROFILE=mongo-next
- PROFILE=mongo3
- PROFILE=mongo3-next
- PROFILE=mongo31
- PROFILE=mongo32
- PROFILE=mongo33
- PROFILE=mongo34
- PROFILE=mongo34-next
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script
addons:
apt:
sources:
- mongodb-3.4-precise
packages:
- mongodb-org-server
- mongodb-org-shell
- oracle-java8-installer
sudo: false

27
CODE_OF_CONDUCT.adoc Normal file
View File

@@ -0,0 +1,27 @@
= Contributor Code of Conduct
As contributors and maintainers of this project, and in the interest of fostering an open and welcoming community, we pledge to respect all people who contribute through reporting issues, posting feature requests, updating documentation, submitting pull requests or patches, and other activities.
We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, or nationality.
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery
* Personal attacks
* Trolling or insulting/derogatory comments
* Public or private harassment
* Publishing other's private information, such as physical or electronic addresses,
without explicit permission
* Other unethical or unprofessional conduct
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
By adopting this Code of Conduct, project maintainers commit themselves to fairly and consistently applying these principles to every aspect of managing this project. Project maintainers who do not follow or enforce the Code of Conduct may be permanently removed from the project team.
This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community.
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting a project maintainer at spring-code-of-conduct@pivotal.io.
All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances.
Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident.
This Code of Conduct is adapted from the http://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at http://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].

View File

@@ -1 +0,0 @@
You find the contribution guidelines for Spring Data projects [here](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.md).

3
CONTRIBUTING.adoc Normal file
View File

@@ -0,0 +1,3 @@
= Spring Data contribution guidelines
You find the contribution guidelines for Spring Data projects https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc[here].

View File

@@ -1,3 +1,6 @@
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/ga.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/snapshot.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
# Spring Data MongoDB
The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
@@ -26,7 +29,7 @@ Add the Maven dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.5.0.RELEASE</version>
<version>${version}.RELEASE</version>
</dependency>
```
@@ -36,7 +39,7 @@ If you'd rather like the latest snapshots of the upcoming major version, use our
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.6.0.BUILD-SNAPSHOT</version>
<version>${version}.BUILD-SNAPSHOT</version>
</dependency>
<repository>
@@ -140,8 +143,8 @@ public class MyService {
Here are some ways for you to get involved in the community:
* Get involved with the Spring community on Stackoverflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Create [JIRA](https://jira.springframework.org/browse/DATADOC) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Create [JIRA](https://jira.spring.io/browse/DATAMONGO) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.
Before we accept a non-trivial patch or pull request we will need you to sign the [contributor's agreement](https://support.springsource.com/spring_committer_signup). Signing the contributor's agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. Active contributors might be asked to join the core team, and given the ability to merge pull requests.
Before we accept a non-trivial patch or pull request we will need you to [sign the Contributor License Agreement](https://cla.pivotal.io/sign/spring). Signing the contributors agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. If you forget to do so, you'll be reminded when you submit a pull request. Active contributors might be asked to join the core team, and given the ability to merge pull requests.

93
pom.xml
View File

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.M1</version>
<version>1.10.0.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.7.0.M1</version>
<version>1.9.0.RELEASE</version>
</parent>
<modules>
@@ -28,8 +28,8 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.11.0.M1</springdata.commons>
<mongo>2.13.0</mongo>
<springdata.commons>1.13.0.RELEASE</springdata.commons>
<mongo>2.14.3</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
</properties>
@@ -107,7 +107,7 @@
<id>mongo-next</id>
<properties>
<mongo>2.14.0-SNAPSHOT</mongo>
<mongo>2.15.0-SNAPSHOT</mongo>
</properties>
<repositories>
@@ -123,7 +123,7 @@
<id>mongo3</id>
<properties>
<mongo>3.0.0</mongo>
<mongo>3.0.4</mongo>
</properties>
</profile>
@@ -132,7 +132,7 @@
<id>mongo3-next</id>
<properties>
<mongo>3.0.0-SNAPSHOT</mongo>
<mongo>3.0.5-SNAPSHOT</mongo>
</properties>
<repositories>
@@ -143,6 +143,79 @@
</repositories>
</profile>
<profile>
<id>mongo31</id>
<properties>
<mongo>3.1.1</mongo>
</properties>
</profile>
<profile>
<id>mongo32</id>
<properties>
<mongo>3.2.2</mongo>
</properties>
</profile>
<profile>
<id>mongo33</id>
<properties>
<mongo>3.3.0</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo34</id>
<properties>
<mongo>3.4.0</mongo>
</properties>
</profile>
<profile>
<id>mongo34-next</id>
<properties>
<mongo>3.4.1-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.jfrog.buildinfo</groupId>
<artifactId>artifactory-maven-plugin</artifactId>
<inherited>false</inherited>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<dependencies>
@@ -156,8 +229,8 @@
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>https://repo.spring.io/libs-milestone</url>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
</repository>
</repositories>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.M1</version>
<version>1.10.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.8.0.M1</version>
<version>1.10.0.RELEASE</version>
</dependency>
<dependency>
@@ -81,7 +81,7 @@
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.0.0.GA</version>
<version>${validation}</version>
<scope>test</scope>
</dependency>
<dependency>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -37,6 +37,8 @@ import com.mongodb.MongoException;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
*/
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
@@ -45,7 +47,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Logger log = LoggerFactory.getLogger(getClass());
private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
@@ -76,25 +78,25 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for " + dbk);
log.debug("Loading MongoDB data for {}", dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: " + key);
log.debug("Processing key: {}", key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException("Unble to convert property " + key + ": Invalid metadata, "
+ ENTITY_FIELD_CLASS + " not available");
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: " + key);
log.debug("Adding to ChangeSet: {}", key);
}
changeSet.set(key, value);
}
@@ -109,9 +111,9 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
@@ -130,7 +132,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: " + cs.getValues());
log.debug("Flush: changeset: {}", cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
@@ -152,7 +154,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: " + dbQuery);
log.debug("Flush: removing: {}", dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
@@ -164,7 +166,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: " + dbQuery);
log.debug("Flush: saving: {}", dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.M1</version>
<version>1.10.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.M1</version>
<version>1.10.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,8 @@ package org.springframework.data.mongodb.log4j;
import java.net.UnknownHostException;
import java.util.Arrays;
import java.util.Calendar;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import org.apache.log4j.AppenderSkeleton;
@@ -30,13 +32,18 @@ import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import com.mongodb.WriteConcern;
/**
* Log4j appender writing log entries into a MongoDB instance.
*
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Christoph Strobl
* @author Ricardo Espirito Santo
*/
public class MongoLog4jAppender extends AppenderSkeleton {
@@ -54,17 +61,19 @@ public class MongoLog4jAppender extends AppenderSkeleton {
protected String host = "localhost";
protected int port = 27017;
protected String username;
protected String password;
protected String authenticationDatabase;
protected String database = "logs";
protected String collectionPattern = "%c";
protected PatternLayout collectionLayout = new PatternLayout(collectionPattern);
protected String applicationId = System.getProperty("APPLICATION_ID", null);
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.SAFE;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.NORMAL;
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.ACKNOWLEDGED;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.UNACKNOWLEDGED;
protected Mongo mongo;
protected DB db;
public MongoLog4jAppender() {
}
public MongoLog4jAppender() {}
public MongoLog4jAppender(boolean isActive) {
super(isActive);
@@ -86,6 +95,53 @@ public class MongoLog4jAppender extends AppenderSkeleton {
this.port = port;
}
/**
* @return
* @since 1.10
*/
public String getUsername() {
return username;
}
/**
* @param username may be {@literal null} for unauthenticated access.
* @since 1.10
*/
public void setUsername(String username) {
this.username = username;
}
/**
* @return
* @since 1.10
*/
public String getPassword() {
return password;
}
/**
* @param password may be {@literal null} for unauthenticated access.
* @since 1.10
*/
public void setPassword(String password) {
this.password = password;
}
/**
* @return
*/
public String getAuthenticationDatabase() {
return authenticationDatabase;
}
/**
* @param authenticationDatabase may be {@literal null} to use {@link #getDatabase()} as authentication database.
* @since 1.10
*/
public void setAuthenticationDatabase(String authenticationDatabase) {
this.authenticationDatabase = authenticationDatabase;
}
public String getDatabase() {
return database;
}
@@ -111,14 +167,14 @@ public class MongoLog4jAppender extends AppenderSkeleton {
this.applicationId = applicationId;
}
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public String getWarnOrHigherWriteConcern() {
return warnOrHigherWriteConcern.toString();
}
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public String getInfoOrLowerWriteConcern() {
return infoOrLowerWriteConcern.toString();
}
@@ -128,10 +184,26 @@ public class MongoLog4jAppender extends AppenderSkeleton {
}
protected void connectToMongo() throws UnknownHostException {
this.mongo = new Mongo(host, port);
this.mongo = createMongoClient();
this.db = mongo.getDB(database);
}
private MongoClient createMongoClient() throws UnknownHostException {
ServerAddress serverAddress = new ServerAddress(host, port);
if (null == password || null == username) {
return new MongoClient(serverAddress);
}
String authenticationDatabaseToUse = authenticationDatabase == null ? this.database : authenticationDatabase;
MongoCredential mongoCredential = MongoCredential.createCredential(username,
authenticationDatabaseToUse, password.toCharArray());
List<MongoCredential> credentials = Collections.singletonList(mongoCredential);
return new MongoClient(serverAddress, credentials);
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#append(org.apache.log4j.spi.LoggingEvent)
@@ -160,7 +232,7 @@ public class MongoLog4jAppender extends AppenderSkeleton {
// Copy properties into document
Map<Object, Object> props = event.getProperties();
if (null != props && props.size() > 0) {
if (null != props && !props.isEmpty()) {
BasicDBObject propsDbo = new BasicDBObject();
for (Map.Entry<Object, Object> entry : props.entrySet()) {
propsDbo.put(entry.getKey().toString(), entry.getValue().toString());

View File

@@ -0,0 +1,114 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.Calendar;
import java.util.Collections;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.apache.log4j.PropertyConfigurator;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
/**
* Integration tests for {@link MongoLog4jAppender} using authentication.
*
* @author Mark Paluch
*/
public class MongoLog4jAppenderAuthenticationIntegrationTests {
private final static String username = "admin";
private final static String password = "test";
private final static String authenticationDatabase = "logs";
MongoClient mongo;
DB db;
String collection;
ServerAddress serverLocation;
Logger log;
@Before
public void setUp() throws Exception {
serverLocation = new ServerAddress("localhost", 27017);
mongo = new MongoClient(serverLocation);
db = mongo.getDB("logs");
BasicDBList roles = new BasicDBList();
roles.add("dbOwner");
db.command(new BasicDBObjectBuilder().add("createUser", username).add("pwd", password).add("roles", roles).get());
mongo.close();
mongo = new MongoClient(serverLocation, Collections
.singletonList(MongoCredential.createCredential(username, authenticationDatabase, password.toCharArray())));
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
LogManager.resetConfiguration();
PropertyConfigurator.configure(getClass().getResource("/log4j-with-authentication.properties"));
log = Logger.getLogger(MongoLog4jAppenderIntegrationTests.class.getName());
}
@After
public void tearDown() {
if (db != null) {
db.getCollection(collection).remove(new BasicDBObject());
db.command(new BasicDBObject("dropUser", username));
}
LogManager.resetConfiguration();
PropertyConfigurator.configure(getClass().getResource("/log4j.properties"));
}
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,39 +20,51 @@ import static org.junit.Assert.*;
import java.util.Calendar;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.apache.log4j.PropertyConfigurator;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
/**
* Integration tests for {@link MongoLog4jAppender}.
*
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoLog4jAppenderIntegrationTests {
static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName();
Logger log = Logger.getLogger(NAME);
Mongo mongo;
MongoClient mongo;
DB db;
String collection;
ServerAddress serverLocation;
Logger log;
@Before
public void setUp() throws Exception {
serverLocation = new ServerAddress("localhost", 27017);
mongo = new Mongo("localhost", 27017);
mongo = new MongoClient(serverLocation);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
db.getCollection(collection).drop();
log = Logger.getLogger(MongoLog4jAppenderIntegrationTests.class.getName());
}
@After
public void tearDown() {
db.getCollection(collection).remove(new BasicDBObject());
}
@Test
@@ -64,13 +76,11 @@ public class MongoLog4jAppenderIntegrationTests {
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,10 +24,7 @@ import org.junit.Test;
*/
public class MongoLog4jAppenderUnitTests {
/**
* @see DATAMONGO-641
*/
@Test
@Test // DATAMONGO-641
public void closesWithoutMongoInstancePresent() {
new MongoLog4jAppender().close();
}

View File

@@ -0,0 +1,16 @@
log4j.rootCategory=INFO, mongo
log4j.appender.mongo=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.mongo.layout=org.apache.log4j.PatternLayout
log4j.appender.mongo.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.mongo.host = localhost
log4j.appender.mongo.port = 27017
log4j.appender.mongo.database = logs
log4j.appender.mongo.username = admin
log4j.appender.mongo.password = test
log4j.appender.mongo.authenticationDatabase = logs
log4j.appender.mongo.collectionPattern = %X{year}%X{month}
log4j.appender.mongo.applicationId = my.application
log4j.appender.mongo.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.springframework.data.mongodb=DEBUG

View File

@@ -1,13 +1,13 @@
log4j.rootCategory=INFO, stdout
log4j.rootCategory=INFO, mongo
log4j.appender.stdout=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.host = localhost
log4j.appender.stdout.port = 27017
log4j.appender.stdout.database = logs
log4j.appender.stdout.collectionPattern = %X{year}%X{month}
log4j.appender.stdout.applicationId = my.application
log4j.appender.stdout.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.appender.mongo=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.mongo.layout=org.apache.log4j.PatternLayout
log4j.appender.mongo.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.mongo.host = localhost
log4j.appender.mongo.port = 27017
log4j.appender.mongo.database = logs
log4j.appender.mongo.collectionPattern = %X{year}%X{month}
log4j.appender.mongo.applicationId = my.application
log4j.appender.mongo.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.springframework.data.mongodb=DEBUG

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.1.10.209">
<context version="7.2.2.230">
<scope type="Project" name="spring-data-mongodb">
<element type="TypeFilterReferenceOverridden" name="Filter">
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
@@ -35,6 +35,12 @@
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="CDI">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.cdi.**"/>
</element>
<stereotype name="Unrestricted"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
</element>
@@ -76,6 +82,11 @@
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Script">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.script.**"/>
</element>
</element>
<element type="Subsystem" name="Conversion">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.convert.**"/>
@@ -83,6 +94,7 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Script" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="SpEL">
<element type="TypeFilter" name="Assignment">
@@ -105,6 +117,11 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="MapReduce">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.mapreduce.**"/>
</element>
</element>
<element type="Subsystem" name="Core">
<element type="TypeFilter" name="Assignment">
<element type="WeakTypePattern" name="**.core.**"/>
@@ -113,8 +130,10 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|MapReduce" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Script" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Util">
<element type="TypeFilter" name="Assignment">
@@ -169,7 +188,32 @@
</element>
<element type="Subsystem" name="Querydsl">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.mysema.query.**"/>
<element type="IncludeTypePattern" name="com.querydsl.**"/>
</element>
</element>
<element type="Subsystem" name="Slf4j">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.slf4j.**"/>
</element>
</element>
<element type="Subsystem" name="Jackson">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.fasterxml.jackson.**"/>
</element>
</element>
<element type="Subsystem" name="DOM">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.w3c.dom.**"/>
</element>
</element>
<element type="Subsystem" name="AOP Alliance">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.aopalliance.**"/>
</element>
</element>
<element type="Subsystem" name="Guava">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.google.common.**"/>
</element>
</element>
</architecture>

View File

@@ -11,12 +11,11 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.M1</version>
<version>1.10.0.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<validation>1.0.0.GA</validation>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
</properties>
@@ -59,14 +58,14 @@
</dependency>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-mongodb</artifactId>
<version>${querydsl}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
<scope>provided</scope>
@@ -183,7 +182,7 @@
<version>${apt}</version>
<dependencies>
<dependency>
<groupId>com.mysema.querydsl</groupId>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
</dependency>

View File

@@ -0,0 +1,61 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.util.List;
import org.springframework.dao.DataAccessException;
import com.mongodb.BulkWriteError;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteResult;
/**
* Is thrown when errors occur during bulk operations.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
*/
public class BulkOperationException extends DataAccessException {
private static final long serialVersionUID = 73929601661154421L;
private final List<BulkWriteError> errors;
private final BulkWriteResult result;
/**
* Creates a new {@link BulkOperationException} with the given message and source {@link BulkWriteException}.
*
* @param message must not be {@literal null}.
* @param source must not be {@literal null}.
*/
public BulkOperationException(String message, BulkWriteException source) {
super(message, source);
this.errors = source.getWriteErrors();
this.result = source.getWriteResult();
}
public List<BulkWriteError> getErrors() {
return errors;
}
public BulkWriteResult getResult() {
return result;
}
}
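A minimal usage sketch, assuming a previously prepared BulkOperations instance named bulkOps (not part of this change), showing how calling code might inspect the wrapped write errors:

try {
    bulkOps.execute();
} catch (BulkOperationException e) {
    for (BulkWriteError error : e.getErrors()) {
        // index of the failed operation within the batch plus the server-side message
        System.out.println(error.getIndex() + ": " + error.getMessage());
    }
}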

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.config;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
@@ -118,17 +119,33 @@ public abstract class AbstractMongoConfiguration {
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration} the base package will be considered {@code com.acme} unless the method is
* overriden to implement alternate behaviour.
* overridden to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Returns the base packages to scan for MongoDB mapped entities at startup. Will return the package name of the
* configuration class' (the concrete class, not this one here) by default. So if you have a
* {@code com.acme.AppConfig} extending {@link AbstractMongoConfiguration} the base package will be considered
* {@code com.acme} unless the method is overridden to implement alternate behavior.
*
* @return the base packages to scan for mapped {@link Document} classes or an empty collection to not enable scanning
* for entities.
* @since 1.10
*/
protected Collection<String> getMappingBasePackages() {
return Collections.singleton(getMappingBasePackage());
}
/**
* Return {@link UserCredentials} to be used when connecting to the MongoDB instance or {@literal null} if none shall
* be used.
@@ -204,26 +221,52 @@ public abstract class AbstractMongoConfiguration {
}
/**
* Scans the mapping base package for classes annotated with {@link Document}.
* Scans the mapping base package for classes annotated with {@link Document}. By default, it scans for entities in
* all packages returned by {@link #getMappingBasePackages()}.
*
* @see #getMappingBasePackage()
* @see #getMappingBasePackages()
* @return
* @throws ClassNotFoundException
*/
protected Set<Class<?>> getInitialEntitySet() throws ClassNotFoundException {
String basePackage = getMappingBasePackage();
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
for (String basePackage : getMappingBasePackages()) {
initialEntitySet.addAll(scanForEntities(basePackage));
}
return initialEntitySet;
}
/**
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document} and
* {@link Persistent}.
*
* @param basePackage must not be {@literal null}.
* @return
* @throws ClassNotFoundException
* @since 1.10
*/
protected Set<Class<?>> scanForEntities(String basePackage) throws ClassNotFoundException {
if (!StringUtils.hasText(basePackage)) {
return Collections.emptySet();
}
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
if (StringUtils.hasText(basePackage)) {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
initialEntitySet.add(ClassUtils.forName(candidate.getBeanClassName(),
AbstractMongoConfiguration.class.getClassLoader()));
initialEntitySet
.add(ClassUtils.forName(candidate.getBeanClassName(), AbstractMongoConfiguration.class.getClassLoader()));
}
}
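A hedged configuration sketch of the new getMappingBasePackages() hook; the package and database names below are illustrative only:

@Configuration
class AppConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "example";
    }

    @Override
    public Mongo mongo() throws Exception {
        return new MongoClient();
    }

    @Override
    protected Collection<String> getMappingBasePackages() {
        // both packages are scanned for @Document/@Persistent types at startup
        return Arrays.asList("com.acme.orders", "com.acme.customers");
    }
}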

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,24 +15,24 @@
*/
package org.springframework.data.mongodb.config;
import static org.springframework.beans.factory.config.BeanDefinition.*;
import static org.springframework.data.mongodb.config.BeanNames.*;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.Assert;
/**
@@ -71,7 +71,6 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
defaultDependenciesIfNecessary(registry, annotationMetadata);
super.registerBeanDefinitions(annotationMetadata, registry);
}
@@ -85,7 +84,11 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
builder.addConstructorArgReference(MAPPING_CONTEXT_BEAN_NAME);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(MongoMappingContextLookup.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
@@ -102,29 +105,58 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
BeanDefinitionBuilder listenerBeanDefinitionBuilder = BeanDefinitionBuilder
.rootBeanDefinition(AuditingEventListener.class);
listenerBeanDefinitionBuilder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(
getAuditingHandlerBeanName(), registry));
listenerBeanDefinitionBuilder
.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
registerInfrastructureBeanWithId(listenerBeanDefinitionBuilder.getBeanDefinition(),
AuditingEventListener.class.getName(), registry);
}
/**
* Register default bean definitions for a {@link MongoMappingContext} and an {@link IsNewStrategyFactory} in case we
* don't find beans with the assumed names in the registry.
*
* @param registry the {@link BeanDefinitionRegistry} to use to register the components into.
* @param source the source which the registered components shall be registered with
* Simple helper to be able to wire the {@link MappingContext} from a {@link MappingMongoConverter} bean available in
* the application context.
*
* @author Oliver Gierke
*/
private void defaultDependenciesIfNecessary(BeanDefinitionRegistry registry, Object source) {
static class MongoMappingContextLookup
implements FactoryBean<MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty>> {
if (!registry.containsBeanDefinition(MAPPING_CONTEXT_BEAN_NAME)) {
private final MappingMongoConverter converter;
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);
definition.setRole(ROLE_INFRASTRUCTURE);
definition.setSource(source);
/**
* Creates a new {@link MongoMappingContextLookup} for the given {@link MappingMongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public MongoMappingContextLookup(MappingMongoConverter converter) {
this.converter = converter;
}
registry.registerBeanDefinition(MAPPING_CONTEXT_BEAN_NAME, definition);
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getObject() throws Exception {
return converter.getMappingContext();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return MappingContext.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
@Override
public boolean isSingleton() {
return true;
}
}
}
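For context, this registrar backs the @EnableMongoAuditing setup; a hedged sketch of a typical consumer (the entity and property names are made up):

@Configuration
@EnableMongoAuditing
class AuditingConfig {
    // MongoTemplate/MappingMongoConverter beans are assumed to be defined elsewhere
}

class Order {

    @Id String id;
    @CreatedDate Date createdAt;       // set on first persist
    @LastModifiedDate Date updatedAt;  // refreshed on every save
}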

View File

@@ -17,8 +17,11 @@ package org.springframework.data.mongodb.config;
import java.beans.PropertyEditorSupport;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.springframework.util.StringUtils;
@@ -28,10 +31,13 @@ import com.mongodb.MongoCredential;
* Parse a {@link String} to a Collection of {@link MongoCredential}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
private static final Pattern GROUP_PATTERN = Pattern.compile("(\\\\?')(.*?)\\1");
private static final String AUTH_MECHANISM_KEY = "uri.authMechanism";
private static final String USERNAME_PASSWORD_DELIMINATOR = ":";
private static final String DATABASE_DELIMINATOR = "@";
@@ -51,11 +57,7 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
List<MongoCredential> credentials = new ArrayList<MongoCredential>();
for (String credentialString : text.split(",")) {
if (!text.contains(USERNAME_PASSWORD_DELIMINATOR) || !text.contains(DATABASE_DELIMINATOR)) {
throw new IllegalArgumentException("Credentials need to be in format 'username:password@database'!");
}
for (String credentialString : extractCredentialsString(text)) {
String[] userNameAndPassword = extractUserNameAndPassword(credentialString);
String database = extractDB(credentialString);
@@ -68,43 +70,83 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
String authMechanism = options.getProperty(AUTH_MECHANISM_KEY);
if (MongoCredential.GSSAPI_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createGSSAPICredential(userNameAndPassword[0]));
} else if (MongoCredential.MONGODB_CR_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createMongoCRCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.MONGODB_X509_MECHANISM.equals(authMechanism)) {
verifyUserNamePresent(userNameAndPassword);
credentials.add(MongoCredential.createMongoX509Credential(userNameAndPassword[0]));
} else if (MongoCredential.PLAIN_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createPlainCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else if (MongoCredential.SCRAM_SHA_1_MECHANISM.equals(authMechanism)) {
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(MongoCredential.createScramSha1Credential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
} else {
throw new IllegalArgumentException(String.format(
"Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
throw new IllegalArgumentException(
String.format("Cannot create MongoCredentials for unknown auth mechanism '%s'!", authMechanism));
}
}
} else {
credentials.add(MongoCredential.createCredential(userNameAndPassword[0], database,
userNameAndPassword[1].toCharArray()));
verifyUsernameAndPasswordPresent(userNameAndPassword);
verifyDatabasePresent(database);
credentials.add(
MongoCredential.createCredential(userNameAndPassword[0], database, userNameAndPassword[1].toCharArray()));
}
}
setValue(credentials);
}
private List<String> extractCredentialsString(String source) {
Matcher matcher = GROUP_PATTERN.matcher(source);
List<String> list = new ArrayList<String>();
while (matcher.find()) {
String value = StringUtils.trimLeadingCharacter(matcher.group(), '\'');
list.add(StringUtils.trimTrailingCharacter(value, '\''));
}
if (!list.isEmpty()) {
return list;
}
return Arrays.asList(source.split(","));
}
private static String[] extractUserNameAndPassword(String text) {
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
String userNameAndPassword = text.substring(0, dbSeperationIndex);
return userNameAndPassword.split(USERNAME_PASSWORD_DELIMINATOR);
int index = text.lastIndexOf(DATABASE_DELIMINATOR);
index = index != -1 ? index : text.lastIndexOf(OPTIONS_DELIMINATOR);
return index == -1 ? new String[] {} : text.substring(0, index).split(USERNAME_PASSWORD_DELIMINATOR);
}
private static String extractDB(String text) {
int dbSeperationIndex = text.lastIndexOf(DATABASE_DELIMINATOR);
if (dbSeperationIndex == -1) {
return "";
}
String tmp = text.substring(dbSeperationIndex + 1);
int optionsSeperationIndex = tmp.lastIndexOf(OPTIONS_DELIMINATOR);
@@ -129,4 +171,28 @@ public class MongoCredentialPropertyEditor extends PropertyEditorSupport {
return properties;
}
private static void verifyUsernameAndPasswordPresent(String[] source) {
verifyUserNamePresent(source);
if (source.length != 2) {
throw new IllegalArgumentException(
"Credentials need to specify username and password like in 'username:password@database'!");
}
}
private static void verifyDatabasePresent(String source) {
if (!StringUtils.hasText(source)) {
throw new IllegalArgumentException("Credentials need to specify database like in 'username:password@database'!");
}
}
private static void verifyUserNamePresent(String[] source) {
if (source.length == 0 || !StringUtils.hasText(source[0])) {
throw new IllegalArgumentException("Credentials need to specify username!");
}
}
}
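A hedged sketch of the credential format the editor accepts; the user, password, database and mechanism values are illustrative, and the '?' options delimiter follows the existing OPTIONS_DELIMINATOR constant not shown in this hunk:

MongoCredentialPropertyEditor editor = new MongoCredentialPropertyEditor();
editor.setAsText("jon:warg@snow?uri.authMechanism=PLAIN");

@SuppressWarnings("unchecked")
List<MongoCredential> credentials = (List<MongoCredential>) editor.getValue();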

View File

@@ -18,6 +18,10 @@ package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
@@ -44,9 +48,21 @@ import com.mongodb.MongoURI;
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author Viktor Khoroshko
*/
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
private static final Set<String> MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES;
static {
Set<String> mongoUriAllowedAdditionalAttributes = new HashSet<String>();
mongoUriAllowedAdditionalAttributes.add("id");
mongoUriAllowedAdditionalAttributes.add("write-concern");
MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES = Collections.unmodifiableSet(mongoUriAllowedAdditionalAttributes);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
@@ -70,13 +86,10 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
setPropertyValue(dbFactoryBuilder, element, "write-concern", "writeConcern");
BeanDefinition mongoUri = getMongoUri(element);
BeanDefinition mongoUri = getMongoUri(element, parserContext);
if (mongoUri != null) {
if (element.getAttributes().getLength() >= 2 && !element.hasAttribute("write-concern")) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!",
parserContext.extractSource(element));
}
dbFactoryBuilder.addConstructorArgValue(mongoUri);
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
@@ -149,12 +162,15 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
/**
* Creates a {@link BeanDefinition} for a {@link MongoURI} or {@link MongoClientURI} depending on configured
* attributes.
* attributes. <br />
* Errors when configured element contains {@literal uri} or {@literal client-uri} along with other attributes except
* {@literal write-concern} and/or {@literal id}.
*
* @param element must not be {@literal null}.
* @param parserContext
* @return {@literal null} in case no client-/uri defined.
*/
private BeanDefinition getMongoUri(Element element) {
private BeanDefinition getMongoUri(Element element, ParserContext parserContext) {
boolean hasClientUri = element.hasAttribute("client-uri");
@@ -162,6 +178,21 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
return null;
}
int allowedAttributesCount = 1;
for (String attribute : MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES) {
if (element.hasAttribute(attribute)) {
allowedAttributesCount++;
}
}
if (element.getAttributes().getLength() > allowedAttributesCount) {
parserContext.getReaderContext().error(
"Configure either " + (hasClientUri ? "Mongo Client URI" : "Mongo URI") + " or details individually!",
parserContext.extractSource(element));
}
Class<?> type = hasClientUri ? MongoClientURI.class : MongoURI.class;
String uri = hasClientUri ? element.getAttribute("client-uri") : element.getAttribute("uri");

View File

@@ -0,0 +1,145 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import com.mongodb.BulkWriteResult;
/**
* Bulk operations for insert/update/remove actions on a collection. These bulk operations are available since MongoDB
* 2.6 and make use of low-level bulk commands on the protocol level. This interface defines a fluent API to add
* multiple single operations or lists of similar operations in sequence, which can then be executed by calling
* {@link #execute()}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
*/
public interface BulkOperations {
/**
* Mode for bulk operation.
**/
public enum BulkMode {
/** Perform bulk operations in sequence. The first error will cancel processing. */
ORDERED,
/** Perform bulk operations in parallel. Processing will continue on errors. */
UNORDERED
};
/**
* Add a single insert to the bulk operation.
*
* @param documents the document to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
BulkOperations insert(Object documents);
/**
* Add a list of inserts to the bulk operation.
*
* @param documents List of documents to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
BulkOperations insert(List<? extends Object> documents);
/**
* Add a single update to the bulk operation. For the update request, only the first matching document is updated.
*
* @param query update criteria, must not be {@literal null}.
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(Query query, Update update);
/**
* Add a list of updates to the bulk operation. For each update request, only the first matching document is updated.
*
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(List<Pair<Query, Update>> updates);
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(Query query, Update update);
/**
* Add a list of updates to the bulk operation. For each update request, all matching documents are updated.
*
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(List<Pair<Query, Update>> updates);
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(Query query, Update update);
/**
* Add a list of upserts to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param updates Updates/insert operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(List<Pair<Query, Update>> updates);
/**
* Add a single remove operation to the bulk operation.
*
* @param remove the {@link Query} to select the documents to be removed, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
BulkOperations remove(Query remove);
/**
* Add a list of remove operations to the bulk operation.
*
* @param removes the remove operations to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
BulkOperations remove(List<Query> removes);
/**
* Execute all bulk operations using the default write concern.
*
* @return Result of the bulk operation providing counters for inserts/updates etc.
* @throws BulkOperationException if an error occurred during bulk processing.
*/
BulkWriteResult execute();
}
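A hedged usage sketch of the fluent API; the Person type, field names, and mongoTemplate instance are illustrative, and query/where are the usual static imports from Query and Criteria:

BulkWriteResult result = mongoTemplate.bulkOps(BulkMode.UNORDERED, Person.class)
    .insert(new Person("arya", "stark"))
    .updateOne(query(where("lastName").is("stark")), new Update().set("house", "winterfell"))
    .remove(query(where("inactive").is(true)))
    .execute();

// result.getInsertedCount() / getModifiedCount() / getRemovedCount() summarize the batch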

View File

@@ -25,7 +25,7 @@ import com.mongodb.DBCursor;
interface CursorPreparer {
/**
* Prepare the given cursor (apply limits, skips and so on). Returns th eprepared cursor.
* Prepare the given cursor (apply limits, skips and so on). Returns the prepared cursor.
*
* @param cursor
*/

View File

@@ -0,0 +1,337 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteOperation;
import com.mongodb.BulkWriteRequestBuilder;
import com.mongodb.BulkWriteResult;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
/**
* Default implementation for {@link BulkOperations}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
private final MongoOperations mongoOperations;
private final BulkMode bulkMode;
private final String collectionName;
private final Class<?> entityType;
private PersistenceExceptionTranslator exceptionTranslator;
private WriteConcernResolver writeConcernResolver;
private WriteConcern defaultWriteConcern;
private BulkWriteOperation bulk;
/**
* Creates a new {@link DefaultBulkOperations} for the given {@link MongoOperations}, {@link BulkMode}, collection
* name and {@link WriteConcern}.
*
* @param mongoOperations The underlying {@link MongoOperations}, must not be {@literal null}.
* @param bulkMode must not be {@literal null}.
* @param collectionName Name of the collection to work on, must not be {@literal null} or empty.
* @param entityType the entity type, can be {@literal null}.
*/
DefaultBulkOperations(MongoOperations mongoOperations, BulkMode bulkMode, String collectionName,
Class<?> entityType) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.notNull(bulkMode, "BulkMode must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.mongoOperations = mongoOperations;
this.bulkMode = bulkMode;
this.collectionName = collectionName;
this.entityType = entityType;
this.exceptionTranslator = new MongoExceptionTranslator();
this.writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
this.bulk = initBulkOperation();
}
/**
* Configures the {@link PersistenceExceptionTranslator} to be used. Defaults to {@link MongoExceptionTranslator}.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? new MongoExceptionTranslator() : exceptionTranslator;
}
/**
* Configures the {@link WriteConcernResolver} to be used. Defaults to {@link DefaultWriteConcernResolver}.
*
* @param writeConcernResolver can be {@literal null}.
*/
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) {
this.writeConcernResolver = writeConcernResolver == null ? DefaultWriteConcernResolver.INSTANCE
: writeConcernResolver;
}
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
* @param defaultWriteConcern can be {@literal null}.
*/
public void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
this.defaultWriteConcern = defaultWriteConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.lang.Object)
*/
@Override
public BulkOperations insert(Object document) {
Assert.notNull(document, "Document must not be null!");
if (document instanceof DBObject) {
bulk.insert((DBObject) document);
return this;
}
DBObject sink = new BasicDBObject();
mongoOperations.getConverter().write(document, sink);
bulk.insert(sink);
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.util.List)
*/
@Override
public BulkOperations insert(List<? extends Object> documents) {
Assert.notNull(documents, "Documents must not be null!");
for (Object document : documents) {
insert(document);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateOne(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateOne(Arrays.asList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(java.util.List)
*/
@Override
public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, false);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateMulti(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateMulti(Arrays.asList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(java.util.List)
*/
@Override
public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, true);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
public BulkOperations upsert(Query query, Update update) {
return update(query, update, true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(java.util.List)
*/
@Override
public BulkOperations upsert(List<Pair<Query, Update>> updates) {
for (Pair<Query, Update> update : updates) {
upsert(update.getFirst(), update.getSecond());
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public BulkOperations remove(Query query) {
Assert.notNull(query, "Query must not be null!");
bulk.find(query.getQueryObject()).remove();
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(java.util.List)
*/
@Override
public BulkOperations remove(List<Query> removes) {
Assert.notNull(removes, "Removals must not be null!");
for (Query query : removes) {
remove(query);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#executeBulk()
*/
@Override
public BulkWriteResult execute() {
MongoAction action = new MongoAction(defaultWriteConcern, MongoActionOperation.BULK, collectionName, entityType,
null, null);
WriteConcern writeConcern = writeConcernResolver.resolve(action);
try {
return writeConcern == null ? bulk.execute() : bulk.execute(writeConcern);
} catch (BulkWriteException o_O) {
DataAccessException toThrow = exceptionTranslator.translateExceptionIfPossible(o_O);
throw toThrow == null ? o_O : toThrow;
} finally {
this.bulk = initBulkOperation();
}
}
/**
* Performs update and upsert bulk operations.
*
* @param query the {@link Query} to determine documents to update.
* @param update the {@link Update} to perform, must not be {@literal null}.
* @param upsert whether to upsert.
* @param multi whether to issue a multi-update.
* @return the {@link BulkOperations} with the update registered.
*/
private BulkOperations update(Query query, Update update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
BulkWriteRequestBuilder builder = bulk.find(query.getQueryObject());
if (upsert) {
if (multi) {
builder.upsert().update(update.getUpdateObject());
} else {
builder.upsert().updateOne(update.getUpdateObject());
}
} else {
if (multi) {
builder.update(update.getUpdateObject());
} else {
builder.updateOne(update.getUpdateObject());
}
}
return this;
}
private final BulkWriteOperation initBulkOperation() {
DBCollection collection = mongoOperations.getCollection(collectionName);
switch (bulkMode) {
case ORDERED:
return collection.initializeOrderedBulkOperation();
case UNORDERED:
return collection.initializeUnorderedBulkOperation();
}
throw new IllegalStateException("BulkMode was null!");
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,17 +15,15 @@
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.domain.Sort.Direction.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
@@ -42,12 +40,12 @@ import com.mongodb.MongoException;
*/
public class DefaultIndexOperations implements IndexOperations {
private static final Double ONE = Double.valueOf(1);
private static final Double MINUS_ONE = Double.valueOf(-1);
private static final Collection<String> TWO_D_IDENTIFIERS = Arrays.asList("2d", "2dsphere");
private static final String PARTIAL_FILTER_EXPRESSION_KEY = "partialFilterExpression";
private final MongoOperations mongoOperations;
private final String collectionName;
private final QueryMapper mapper;
private final Class<?> type;
/**
* Creates a new {@link DefaultIndexOperations}.
@@ -56,12 +54,26 @@ public class DefaultIndexOperations implements IndexOperations {
* @param collectionName must not be {@literal null}.
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName) {
this(mongoOperations, collectionName, null);
}
/**
* Creates a new {@link DefaultIndexOperations}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param type Type used for mapping potential partial index filter expression. Can be {@literal null}.
* @since 1.10
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName, Class<?> type) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.notNull(collectionName, "Collection name can not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.mapper = new QueryMapper(mongoOperations.getConverter());
this.type = type;
}
/*
@@ -69,9 +81,20 @@ public class DefaultIndexOperations implements IndexOperations {
* @see org.springframework.data.mongodb.core.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public void ensureIndex(final IndexDefinition indexDefinition) {
mongoOperations.execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null && indexOptions.containsField(PARTIAL_FILTER_EXPRESSION_KEY)) {
Assert.isInstanceOf(DBObject.class, indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));
indexOptions.put(PARTIAL_FILTER_EXPRESSION_KEY,
mapper.getMappedObject((DBObject) indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY),
lookupPersistentEntity(type, collectionName)));
}
if (indexOptions != null) {
collection.createIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
@@ -79,6 +102,24 @@ public class DefaultIndexOperations implements IndexOperations {
}
return null;
}
private MongoPersistentEntity<?> lookupPersistentEntity(Class<?> entityType, String collection) {
if (entityType != null) {
return mongoOperations.getConverter().getMappingContext().getPersistentEntity(entityType);
}
Collection<? extends MongoPersistentEntity<?>> entities = mongoOperations.getConverter().getMappingContext()
.getPersistentEntities();
for (MongoPersistentEntity<?> entity : entities) {
if (entity.getCollection().equals(collection)) {
return entity;
}
}
return null;
}
});
}
@@ -126,7 +167,9 @@ public class DefaultIndexOperations implements IndexOperations {
public List<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, new CollectionCallback<List<IndexInfo>>() {
public List<IndexInfo> doInCollection(DBCollection collection) throws MongoException, DataAccessException {
List<DBObject> dbObjectList = collection.getIndexInfo();
return getIndexData(dbObjectList);
}
@@ -136,44 +179,7 @@ public class DefaultIndexOperations implements IndexOperations {
List<IndexInfo> indexInfoList = new ArrayList<IndexInfo>();
for (DBObject ix : dbObjectList) {
DBObject keyDbObject = (DBObject) ix.get("key");
int numberOfElements = keyDbObject.keySet().size();
List<IndexField> indexFields = new ArrayList<IndexField>(numberOfElements);
for (String key : keyDbObject.keySet()) {
Object value = keyDbObject.get(key);
if (TWO_D_IDENTIFIERS.contains(value)) {
indexFields.add(IndexField.geo(key));
} else if ("text".equals(value)) {
DBObject weights = (DBObject) ix.get("weights");
for (String fieldName : weights.keySet()) {
indexFields.add(IndexField.text(fieldName, Float.valueOf(weights.get(fieldName).toString())));
}
} else {
Double keyValue = new Double(value.toString());
if (ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, ASC));
} else if (MINUS_ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, DESC));
}
}
}
String name = ix.get("name").toString();
boolean unique = ix.containsField("unique") ? (Boolean) ix.get("unique") : false;
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
String language = ix.containsField("default_language") ? (String) ix.get("default_language") : "";
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse, language));
indexInfoList.add(IndexInfo.indexInfoOf(ix));
}
return indexInfoList;
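The reworked getIndexInfo() delegates per-index parsing to IndexInfo.indexInfoOf(..), and ensureIndex(..) now maps a partialFilterExpression option against the entity when a type is available. A hedged sketch of reading index metadata through the template (entity type and field name are illustrative):

IndexOperations indexOps = mongoTemplate.indexOps(Person.class);
indexOps.ensureIndex(new Index().on("lastName", Direction.ASC).unique());

for (IndexInfo info : indexOps.getIndexInfo()) {
    System.out.println(info.getName() + " -> " + info.getIndexFields());
}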

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014-2015 the original author or authors.
* Copyright 2014-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -98,7 +98,7 @@ class DefaultScriptOperations implements ScriptOperations {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(script.getCode(), convertScriptArgs(args));
return db.eval(script.getCode(), convertScriptArgs(false, args));
}
});
}
@@ -155,7 +155,7 @@ class DefaultScriptOperations implements ScriptOperations {
return scriptNames;
}
private Object[] convertScriptArgs(Object... args) {
private Object[] convertScriptArgs(boolean quote, Object... args) {
if (ObjectUtils.isEmpty(args)) {
return args;
@@ -164,15 +164,15 @@ class DefaultScriptOperations implements ScriptOperations {
List<Object> convertedValues = new ArrayList<Object>(args.length);
for (Object arg : args) {
convertedValues.add(arg instanceof String ? String.format("'%s'", arg) : this.mongoOperations.getConverter()
.convertToMongoType(arg));
convertedValues.add(arg instanceof String && quote ? String.format("'%s'", arg)
: this.mongoOperations.getConverter().convertToMongoType(arg));
}
return convertedValues.toArray();
}
private String convertAndJoinScriptArgs(Object... args) {
return ObjectUtils.isEmpty(args) ? "" : StringUtils.arrayToCommaDelimitedString(convertScriptArgs(args));
return ObjectUtils.isEmpty(args) ? "" : StringUtils.arrayToCommaDelimitedString(convertScriptArgs(true, args));
}
/**

View File

@@ -0,0 +1,32 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.WriteConcern;
/**
* Default {@link WriteConcernResolver} resolving the {@link WriteConcern} from the given {@link MongoAction}.
*
* @author Oliver Gierke
*/
enum DefaultWriteConcernResolver implements WriteConcernResolver {
INSTANCE;
public WriteConcern resolve(MongoAction action) {
return action.getDefaultWriteConcern();
}
}
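The enum simply falls back to the action's default write concern; a custom resolver could differentiate by operation, for example (a sketch, not part of this change):

WriteConcernResolver resolver = new WriteConcernResolver() {

    public WriteConcern resolve(MongoAction action) {
        // assume stricter durability for bulk writes, otherwise keep the configured default
        return MongoActionOperation.BULK.equals(action.getMongoActionOperation())
                ? WriteConcern.MAJORITY
                : action.getDefaultWriteConcern();
    }
};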

View File

@@ -0,0 +1,72 @@
/*
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Value object to mitigate different representations of geo command execution results in MongoDB.
*
* @author Oliver Gierke
* @soundtrack Fruitcake - Jeff Coffin (The Inside of the Outside)
*/
class GeoCommandStatistics {
private static final GeoCommandStatistics NONE = new GeoCommandStatistics(new BasicDBObject());
private final DBObject source;
/**
* Creates a new {@link GeoCommandStatistics} instance with the given source document.
*
* @param source must not be {@literal null}.
*/
private GeoCommandStatistics(DBObject source) {
Assert.notNull(source, "Source document must not be null!");
this.source = source;
}
/**
* Creates a new {@link GeoCommandStatistics} from the given command result extracting the statistics.
*
* @param commandResult must not be {@literal null}.
* @return
*/
public static GeoCommandStatistics from(DBObject commandResult) {
Assert.notNull(commandResult, "Command result must not be null!");
Object stats = commandResult.get("stats");
return stats == null ? NONE : new GeoCommandStatistics((DBObject) stats);
}
/**
* Returns the average distance reported by the command result. Mitigates the removal of the field (introduced in
* MongoDB 3.2 RC1) in case the command didn't return any results.
*
* @return
* @see <a href="https://jira.mongodb.org/browse/SERVER-21024">MongoDB Jira SERVER-21024</a>
*/
public double getAverageDistance() {
Object averageDistance = source.get("avgDistance");
return averageDistance == null ? Double.NaN : (Double) averageDistance;
}
}
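Internal usage is roughly as follows; commandResult is assumed to be the raw DBObject returned by a geoNear command:

GeoCommandStatistics stats = GeoCommandStatistics.from(commandResult);
double averageDistance = stats.getAverageDistance(); // Double.NaN when the server omitted "stats"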

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,15 +17,14 @@ package org.springframework.data.mongodb.core;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.geo.GeoJsonModule;
import org.springframework.data.web.config.SpringDataWebConfigurationMixin;
import org.springframework.data.web.config.SpringDataJacksonModules;
/**
* Configuration class to expose {@link GeoJsonModule} as a Spring bean.
*
* @author Oliver Gierke
*/
@SpringDataWebConfigurationMixin
public class GeoJsonConfiguration {
public class GeoJsonConfiguration implements SpringDataJacksonModules {
@Bean
public GeoJsonModule geoJsonModule() {

View File

@@ -49,7 +49,7 @@ public class MongoAction {
* @param collectionName the collection name, must not be {@literal null} or empty.
* @param entityType the POJO that is being operated against
* @param document the converted DBObject from the POJO or Spring Update object
* @param query the converted DBOjbect from the Spring Query object
* @param query the converted DBObject from the Spring Query object
*/
public MongoAction(WriteConcern defaultWriteConcern, MongoActionOperation mongoActionOperation,
String collectionName, Class<?> entityType, DBObject document, DBObject query) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,5 +25,5 @@ package org.springframework.data.mongodb.core;
*/
public enum MongoActionOperation {
REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE
REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE, BULK;
}

View File

@@ -199,8 +199,8 @@ public abstract class MongoDbUtils {
}
/**
* Check if credentials present. In case we're using a monog-java-driver version 3 or above we do not have the need
* for authentication as the auth data has to be provied within the MongoClient
* Check if credentials present. In case we're using a mongo-java-driver version 3 or above we do not have the need
* for authentication as the auth data has to be provided within the MongoClient
*
* @param credentials
* @return

View File

@@ -25,10 +25,14 @@ import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.PermissionDeniedDataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.data.mongodb.util.MongoDbErrorCodes;
import org.springframework.util.ClassUtils;
import com.mongodb.BulkWriteException;
import com.mongodb.MongoException;
/**
@@ -42,12 +46,12 @@ import com.mongodb.MongoException;
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(Arrays.asList(
"MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(Arrays.asList(
"MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoInternalException"));
@@ -81,17 +85,24 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
return new DataIntegrityViolationException(ex.getMessage(), ex);
}
if (ex instanceof BulkWriteException) {
return new BulkOperationException(ex.getMessage(), (BulkWriteException) ex);
}
// All other MongoExceptions
if (ex instanceof MongoException) {
int code = ((MongoException) ex).getCode();
if (code == 11000 || code == 11001) {
if (MongoDbErrorCodes.isDuplicateKeyCode(code)) {
throw new DuplicateKeyException(ex.getMessage(), ex);
} else if (code == 12000 || code == 13440) {
} else if (MongoDbErrorCodes.isDataAccessResourceFailureCode(code)) {
throw new DataAccessResourceFailureException(ex.getMessage(), ex);
} else if (code == 10003 || code == 12001 || code == 12010 || code == 12011 || code == 12012) {
} else if (MongoDbErrorCodes.isInvalidDataAccessApiUsageCode(code) || code == 10003 || code == 12001
|| code == 12010 || code == 12011 || code == 12012) {
throw new InvalidDataAccessApiUsageException(ex.getMessage(), ex);
} else if (MongoDbErrorCodes.isPermissionDeniedCode(code)) {
throw new PermissionDeniedDataAccessException(ex.getMessage(), ex);
}
return new UncategorizedMongoDbException(ex.getMessage(), ex);
}
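The translator now consults MongoDbErrorCodes rather than hard-coded code lists and wraps BulkWriteException into BulkOperationException. A hedged sketch of funnelling a raw driver exception through it (the collection call is illustrative):

PersistenceExceptionTranslator translator = new MongoExceptionTranslator();

try {
    collection.insert(dbObject);
} catch (RuntimeException e) {
    DataAccessException translated = translator.translateExceptionIfPossible(e);
    throw translated == null ? e : translated;
}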

View File

@@ -20,6 +20,7 @@ import java.util.List;
import java.util.Set;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
@@ -174,13 +175,28 @@ public interface MongoOperations {
* Returns a {@link CloseableIterator} that wraps the a Mongo DB {@link Cursor} that needs to be closed.
*
* @param <T> element return type
* @param query
* @param entityType
* @return
* @param query must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @return will never be {@literal null}.
* @since 1.7
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType);
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} and collection backed
* by a Mongo DB {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps the Mongo DB {@link Cursor} that needs to be closed.
*
* @param <T> element return type
* @param query must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return will never be {@literal null}.
* @since 1.10
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType, String collectionName);
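Since the returned CloseableIterator wraps a live cursor it should be closed after consumption; a hedged sketch, assuming close() declares no checked exception as in Spring Data Commons (Person and the collection name are illustrative):

CloseableIterator<Person> people = mongoTemplate.stream(new Query(), Person.class, "people");
try {
    while (people.hasNext()) {
        Person person = people.next();
        // process person ...
    }
} finally {
    people.close(); // releases the underlying cursor
}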
/**
* Create an uncapped collection with a name based on the provided entity class.
*
@@ -190,7 +206,7 @@ public interface MongoOperations {
<T> DBCollection createCollection(Class<T> entityClass);
/**
* Create a collect with a name based on the provided entity class using the options.
* Create a collection with a name based on the provided entity class using the options.
*
* @param entityClass class that determines the collection to create
* @param collectionOptions options to use when creating the collection.
@@ -207,7 +223,7 @@ public interface MongoOperations {
DBCollection createCollection(String collectionName);
/**
* Create a collect with the provided name and options.
* Create a collection with the provided name and options.
*
* @param collectionName name of the collection
* @param collectionOptions options to use when creating the collection.
@@ -292,6 +308,34 @@ public interface MongoOperations {
*/
ScriptOperations scriptOps();
/**
* Returns a new {@link BulkOperations} for the given collection.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
* @return {@link BulkOperations} on the named collection
*/
BulkOperations bulkOps(BulkMode mode, String collectionName);
/**
* Returns a new {@link BulkOperations} for the given entity type.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityType the entity class, must not be {@literal null}.
* @return {@link BulkOperations} on the named collection associated of the given entity class.
*/
BulkOperations bulkOps(BulkMode mode, Class<?> entityType);
/**
* Returns a new {@link BulkOperations} for the given entity type and collection name.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityType the entity class, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
* @return {@link BulkOperations} on the named collection associated with the given entity class.
*/
BulkOperations bulkOps(BulkMode mode, Class<?> entityType, String collectionName);
/**
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
@@ -600,8 +644,8 @@ public interface MongoOperations {
<T> T findById(Object id, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
@@ -612,8 +656,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* </a> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
@@ -625,8 +669,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* </a> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -639,8 +683,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
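For illustration, a minimal sketch of an atomic update that returns the modified document (the `template` instance, the `Person` class and the `visits` field are assumed):

Person updated = template.findAndModify(
    new Query(Criteria.where("firstname").is("Dave")),
    new Update().inc("visits", 1),
    FindAndModifyOptions.options().returnNew(true),
    Person.class);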
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* </a> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -728,9 +772,9 @@ public interface MongoOperations {
* <p/>
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" >
* Spring's Type Conversion</a> for more details.
* <p/>
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
@@ -785,9 +829,9 @@ public interface MongoOperations {
* <p/>
* If your object has an "Id" property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" >
* Spring's Type Conversion</a> for more details.
*
* @param objectToSave the object to store in the collection
*/
@@ -1002,5 +1046,4 @@ public interface MongoOperations {
* @return
*/
MongoConverter getConverter();
}

View File

@@ -19,6 +19,7 @@ import java.net.UnknownHostException;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.MongoDbFactory;
@@ -103,8 +104,8 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
*/
@Deprecated
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())),
true, uri.getDatabase());
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), parseChars(uri.getPassword())), true,
uri.getDatabase());
}
/**
@@ -132,6 +133,11 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
private SimpleMongoDbFactory(Mongo mongo, String databaseName, UserCredentials credentials,
boolean mongoInstanceCreated, String authenticationDatabaseName) {
if (mongo instanceof MongoClient && (credentials != null && !UserCredentials.NO_CREDENTIALS.equals(credentials))) {
throw new InvalidDataAccessApiUsageException(
"Usage of 'UserCredentials' with 'MongoClient' is no longer supported. Please use 'MongoCredential' for 'MongoClient' or just 'Mongo'.");
}
Assert.notNull(mongo, "Mongo must not be null");
Assert.hasText(databaseName, "Database name must not be empty");
Assert.isTrue(databaseName.matches("[\\w-]+"),

View File

@@ -0,0 +1,153 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Christoph Strobl
* @since 1.10
*/
abstract class AbstractAggregationExpression implements AggregationExpression {
private final Object value;
protected AbstractAggregationExpression(Object value) {
this.value = value;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return toDbObject(this.value, context);
}
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
Object valueToUse;
if (value instanceof List) {
List<Object> arguments = (List<Object>) value;
List<Object> args = new ArrayList<Object>(arguments.size());
for (Object val : arguments) {
args.add(unpack(val, context));
}
valueToUse = args;
} else if (value instanceof java.util.Map) {
DBObject dbo = new BasicDBObject();
for (java.util.Map.Entry<String, Object> entry : ((java.util.Map<String, Object>) value).entrySet()) {
dbo.put(entry.getKey(), unpack(entry.getValue(), context));
}
valueToUse = dbo;
} else {
valueToUse = unpack(value, context);
}
return new BasicDBObject(getMongoMethod(), valueToUse);
}
protected static List<Field> asFields(String... fieldRefs) {
if (ObjectUtils.isEmpty(fieldRefs)) {
return Collections.emptyList();
}
return Fields.fields(fieldRefs).asList();
}
@SuppressWarnings("unchecked")
private Object unpack(Object value, AggregationOperationContext context) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
if (value instanceof Field) {
return context.getReference((Field) value).toString();
}
if (value instanceof List) {
List<Object> sourceList = (List<Object>) value;
List<Object> mappedList = new ArrayList<Object>(sourceList.size());
for (Object item : sourceList) {
mappedList.add(unpack(item, context));
}
return mappedList;
}
return value;
}
protected List<Object> append(Object value) {
if (this.value instanceof List) {
List<Object> clone = new ArrayList<Object>((List) this.value);
if (value instanceof List) {
for (Object val : (List) value) {
clone.add(val);
}
} else {
clone.add(value);
}
return clone;
}
return Arrays.asList(this.value, value);
}
@SuppressWarnings("unchecked")
protected java.util.Map<String, Object> append(String key, Object value) {
if (!(this.value instanceof java.util.Map)) {
throw new IllegalArgumentException("o_O");
}
java.util.Map<String, Object> clone = new LinkedHashMap<String, Object>((java.util.Map<String, Object>) this.value);
clone.put(key, value);
return clone;
}
protected List<Object> values() {
if (value instanceof List) {
return new ArrayList<Object>((List) value);
}
if (value instanceof java.util.Map) {
return new ArrayList<Object>(((java.util.Map) value).values());
}
return new ArrayList<Object>(Collections.singletonList(value));
}
protected abstract String getMongoMethod();
}
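For illustration, a hypothetical concrete operator built on this base class; `Abs` is not part of this change and only shows how getMongoMethod() and the wrapped value cooperate:

class Abs extends AbstractAggregationExpression {

  private Abs(Object value) {
    super(value);
  }

  static Abs absOf(String fieldReference) {
    return new Abs(Fields.field(fieldReference));
  }

  @Override
  protected String getMongoMethod() {
    return "$abs"; // renders as { "$abs" : "$<field>" }
  }
}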

View File

@@ -0,0 +1,648 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Gateway to {@literal accumulator} aggregation operations.
*
* @author Christoph Strobl
* @since 1.10
* @soundtrack Rage Against The Machine - Killing In The Name
*/
public class AccumulatorOperators {
/**
* Take the numeric value referenced by given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static AccumulatorOperatorFactory valueOf(String fieldReference) {
return new AccumulatorOperatorFactory(fieldReference);
}
/**
* Take the numeric value referenced resulting from given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static AccumulatorOperatorFactory valueOf(AggregationExpression expression) {
return new AccumulatorOperatorFactory(expression);
}
/**
* @author Christoph Strobl
*/
public static class AccumulatorOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
/**
* Creates new {@link AccumulatorOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public AccumulatorOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
}
/**
* Creates new {@link AccumulatorOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public AccumulatorOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and calculates and
* returns the sum.
*
* @return
*/
public Sum sum() {
return usesFieldRef() ? Sum.sumOf(fieldReference) : Sum.sumOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and returns the
* average value.
*
* @return
*/
public Avg avg() {
return usesFieldRef() ? Avg.avgOf(fieldReference) : Avg.avgOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and returns the
* maximum value.
*
* @return
*/
public Max max() {
return usesFieldRef() ? Max.maxOf(fieldReference) : Max.maxOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and returns the
* minimum value.
*
* @return
*/
public Min min() {
return usesFieldRef() ? Min.minOf(fieldReference) : Min.minOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and calculates the
* population standard deviation of the input values.
*
* @return
*/
public StdDevPop stdDevPop() {
return usesFieldRef() ? StdDevPop.stdDevPopOf(fieldReference) : StdDevPop.stdDevPopOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and calculates the
* sample standard deviation of the input values.
*
* @return
*/
public StdDevSamp stdDevSamp() {
return usesFieldRef() ? StdDevSamp.stdDevSampOf(fieldReference) : StdDevSamp.stdDevSampOf(expression);
}
private boolean usesFieldRef() {
return fieldReference != null;
}
}
/**
* {@link AggregationExpression} for {@code $sum}.
*
* @author Christoph Strobl
*/
public static class Sum extends AbstractAggregationExpression {
private Sum(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$sum";
}
/**
* Creates new {@link Sum}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Sum sumOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Sum(asFields(fieldReference));
}
/**
* Creates new {@link Sum}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Sum sumOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Sum(Collections.singletonList(expression));
}
/**
* Creates new {@link Sum} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Sum and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Sum(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Sum} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Sum and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Sum(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $avg}.
*
* @author Christoph Strobl
*/
public static class Avg extends AbstractAggregationExpression {
private Avg(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$avg";
}
/**
* Creates new {@link Avg}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Avg avgOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Avg(asFields(fieldReference));
}
/**
* Creates new {@link Avg}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Avg avgOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Avg(Collections.singletonList(expression));
}
/**
* Creates new {@link Avg} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Avg and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Avg(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Avg} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Avg and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Avg(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $max}.
*
* @author Christoph Strobl
*/
public static class Max extends AbstractAggregationExpression {
private Max(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$max";
}
/**
* Creates new {@link Max}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Max maxOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Max(asFields(fieldReference));
}
/**
* Creates new {@link Max}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Max maxOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Max(Collections.singletonList(expression));
}
/**
* Creates new {@link Max} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Max and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Max(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Max} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Max and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Max(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $min}.
*
* @author Christoph Strobl
*/
public static class Min extends AbstractAggregationExpression {
private Min(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$min";
}
/**
* Creates new {@link Min}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Min minOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Min(asFields(fieldReference));
}
/**
* Creates new {@link Min}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Min minOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Min(Collections.singletonList(expression));
}
/**
* Creates new {@link Min} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Min and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Min(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Min} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Min and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Min(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $stdDevPop}.
*
* @author Christoph Strobl
*/
public static class StdDevPop extends AbstractAggregationExpression {
private StdDevPop(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$stdDevPop";
}
/**
* Creates new {@link StdDevPop}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static StdDevPop stdDevPopOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevPop(asFields(fieldReference));
}
/**
* Creates new {@link StdDevPop}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static StdDevPop stdDevPopOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevPop(Collections.singletonList(expression));
}
/**
* Creates new {@link StdDevPop} with all previously added arguments appending the given one. <br/>
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public StdDevPop and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevPop(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link StdDevPop} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public StdDevPop and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevPop(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $stdDevSamp}.
*
* @author Christoph Strobl
*/
public static class StdDevSamp extends AbstractAggregationExpression {
private StdDevSamp(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$stdDevSamp";
}
/**
* Creates new {@link StdDevSamp}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static StdDevSamp stdDevSampOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevSamp(asFields(fieldReference));
}
/**
* Creates new {@link StdDevSamp}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static StdDevSamp stdDevSampOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevSamp(Collections.singletonList(expression));
}
/**
* Creates new {@link StdDevSamp} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public StdDevSamp and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevSamp(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link StdDevSamp} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public StdDevSamp and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevSamp(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
}
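For illustration, a minimal sketch composing one of the accumulator expressions and rendering it against the default context (the field names are assumed):

AggregationExpression total = AccumulatorOperators.valueOf("score").sum().and("bonus");
DBObject dbObject = total.toDbObject(Aggregation.DEFAULT_CONTEXT);
// expected to render roughly as { "$sum" : [ "$score", "$bonus" ] }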

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,10 +23,18 @@ import java.util.List;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.CountOperation.CountOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.aggregation.FacetOperation.FacetOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.StartWithBuilder;
import org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.ReplaceRootDocumentOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.ReplaceRootOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.Fields.*;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
@@ -37,10 +45,14 @@ import com.mongodb.DBObject;
/**
* An {@code Aggregation} is a representation of a list of aggregation steps to be performed by the MongoDB Aggregation
* Framework.
*
*
* @author Tobias Trelle
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @author Alessio Fachechi
* @author Christoph Strobl
* @author Nikolay Bogdanov
* @since 1.3
*/
public class Aggregation {
@@ -57,7 +69,7 @@ public class Aggregation {
*/
public static final String CURRENT = SystemVariable.CURRENT.toString();
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
public static final AggregationOperationContext DEFAULT_CONTEXT = AggregationOperationRenderer.DEFAULT_CONTEXT;
public static final AggregationOptions DEFAULT_OPTIONS = newAggregationOptions().build();
protected final List<AggregationOperation> operations;
@@ -65,7 +77,7 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param operations must not be {@literal null} or empty.
*/
public static Aggregation newAggregation(List<? extends AggregationOperation> operations) {
@@ -74,7 +86,7 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param operations must not be {@literal null} or empty.
*/
public static Aggregation newAggregation(AggregationOperation... operations) {
@@ -84,7 +96,7 @@ public class Aggregation {
/**
* Returns a copy of this {@link Aggregation} with the given {@link AggregationOptions} set. Note that options are
* supported in MongoDB version 2.6+.
*
*
* @param options must not be {@literal null}.
* @return
* @since 1.6
@@ -97,7 +109,7 @@ public class Aggregation {
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
*
* @param type must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
@@ -107,7 +119,7 @@ public class Aggregation {
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
*
* @param type must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
@@ -117,7 +129,7 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(AggregationOperation... aggregationOperations) {
@@ -137,7 +149,7 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations) {
@@ -146,23 +158,34 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param aggregationOperations must not be {@literal null} or empty.
* @param options must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations, AggregationOptions options) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
Assert.isTrue(aggregationOperations.size() > 0, "At least one AggregationOperation has to be provided");
Assert.isTrue(!aggregationOperations.isEmpty(), "At least one AggregationOperation has to be provided");
Assert.notNull(options, "AggregationOptions must not be null!");
// check $out is the last operation if it exists
for (AggregationOperation aggregationOperation : aggregationOperations) {
if (aggregationOperation instanceof OutOperation && !isLast(aggregationOperation, aggregationOperations)) {
throw new IllegalArgumentException("The $out operator must be the last stage in the pipeline.");
}
}
this.operations = aggregationOperations;
this.options = options;
}
private boolean isLast(AggregationOperation aggregationOperation, List<AggregationOperation> aggregationOperations) {
return aggregationOperations.indexOf(aggregationOperation) == aggregationOperations.size() - 1;
}
/**
* A pointer to the previous {@link AggregationOperation}.
*
*
* @return
*/
public static String previousOperation() {
@@ -171,7 +194,7 @@ public class Aggregation {
/**
* Creates a new {@link ProjectionOperation} including the given fields.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -180,8 +203,8 @@ public class Aggregation {
}
/**
* Creates a new {@link ProjectionOperation} includeing the given {@link Fields}.
*
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -191,17 +214,95 @@ public class Aggregation {
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name.
*
* @param fieldName must not be {@literal null} or empty.
*
* @param field must not be {@literal null} or empty.
* @return
*/
public static UnwindOperation unwind(String field) {
return new UnwindOperation(field(field));
}
/**
* Factory method to create a new {@link ReplaceRootOperation} for the field with the given name.
*
* @param fieldName must not be {@literal null} or empty.
* @return
* @since 1.10
*/
public static ReplaceRootOperation replaceRoot(String fieldName) {
return ReplaceRootOperation.builder().withValueOf(fieldName);
}
/**
* Factory method to create a new {@link ReplaceRootOperation} for the field with the given
* {@link AggregationExpression}.
*
* @param aggregationExpression must not be {@literal null}.
* @return
* @since 1.10
*/
public static ReplaceRootOperation replaceRoot(AggregationExpression aggregationExpression) {
return ReplaceRootOperation.builder().withValueOf(aggregationExpression);
}
/**
* Factory method to create a new {@link ReplaceRootDocumentOperationBuilder} to configure a
* {@link ReplaceRootOperation}.
*
* @return the {@literal ReplaceRootDocumentOperationBuilder}.
* @since 1.10
*/
public static ReplaceRootOperationBuilder replaceRoot() {
return ReplaceRootOperation.builder();
}
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name and
* {@code preserveNullAndEmptyArrays}. Note that extended unwind is supported in MongoDB version 3.2+.
*
* @param field must not be {@literal null} or empty.
* @param preserveNullAndEmptyArrays {@literal true} to output the document if path is {@literal null}, missing or
* array is empty.
* @return new {@link UnwindOperation}
* @since 1.10
*/
public static UnwindOperation unwind(String field, boolean preserveNullAndEmptyArrays) {
return new UnwindOperation(field(field), preserveNullAndEmptyArrays);
}
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name including the name of a
* new field to hold the array index of the element as {@code arrayIndex}. Note that extended unwind is supported in
* MongoDB version 3.2+.
*
* @param field must not be {@literal null} or empty.
* @param arrayIndex must not be {@literal null} or empty.
* @return new {@link UnwindOperation}
* @since 1.10
*/
public static UnwindOperation unwind(String field, String arrayIndex) {
return new UnwindOperation(field(field), field(arrayIndex), false);
}
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name, including the name of a new
* field to hold the array index of the element as {@code arrayIndex} using {@code preserveNullAndEmptyArrays}. Note
* that extended unwind is supported in MongoDB version 3.2+.
*
* @param field must not be {@literal null} or empty.
* @param arrayIndex must not be {@literal null} or empty.
* @param preserveNullAndEmptyArrays {@literal true} to output the document if path is {@literal null}, missing or
* array is empty.
* @return new {@link UnwindOperation}
* @since 1.10
*/
public static UnwindOperation unwind(String field, String arrayIndex, boolean preserveNullAndEmptyArrays) {
return new UnwindOperation(field(field), field(arrayIndex), preserveNullAndEmptyArrays);
}
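For illustration, a minimal sketch of the extended unwind support (collection, field and index names are assumed):

Aggregation aggregation = Aggregation.newAggregation(
    Aggregation.match(Criteria.where("status").is("ACTIVE")),
    Aggregation.unwind("items", "itemIndex", true)); // keep docs with null/empty 'items', expose element index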
/**
* Creates a new {@link GroupOperation} for the given fields.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -211,7 +312,7 @@ public class Aggregation {
/**
* Creates a new {@link GroupOperation} for the given {@link Fields}.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -219,9 +320,21 @@ public class Aggregation {
return new GroupOperation(fields);
}
/**
* Creates a new {@link GraphLookupOperation.GraphLookupOperationFromBuilder} to construct a
* {@link GraphLookupOperation} given {@literal fromCollection}.
*
* @param fromCollection must not be {@literal null} or empty.
* @return
* @since 1.10
*/
public static StartWithBuilder graphLookup(String fromCollection) {
return GraphLookupOperation.builder().from(fromCollection);
}
/**
* Factory method to create a new {@link SortOperation} for the given {@link Sort}.
*
*
* @param sort must not be {@literal null}.
* @return
*/
@@ -231,7 +344,7 @@ public class Aggregation {
/**
* Factory method to create a new {@link SortOperation} for the given sort {@link Direction} and {@code fields}.
*
*
* @param direction must not be {@literal null}.
* @param fields must not be {@literal null}.
* @return
@@ -242,17 +355,28 @@ public class Aggregation {
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
*
* @param elementsToSkip must not be less than zero.
* @return
* @deprecated to be removed in favor of {@link #skip(long)}.
*/
public static SkipOperation skip(int elementsToSkip) {
return new SkipOperation(elementsToSkip);
}
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
* @param elementsToSkip must not be less than zero.
* @return
*/
public static SkipOperation skip(long elementsToSkip) {
return new SkipOperation(elementsToSkip);
}
/**
* Creates a new {@link LimitOperation} limiting the result to the given number of elements.
*
*
* @param maxElements must not be less than zero.
* @return
*/
@@ -262,7 +386,7 @@ public class Aggregation {
/**
* Creates a new {@link MatchOperation} using the given {@link Criteria}.
*
*
* @param criteria must not be {@literal null}.
* @return
*/
@@ -270,12 +394,142 @@ public class Aggregation {
return new MatchOperation(criteria);
}
/**
* Creates a new {@link MatchOperation} using the given {@link CriteriaDefinition}.
*
* @param criteria must not be {@literal null}.
* @return
* @since 1.10
*/
public static MatchOperation match(CriteriaDefinition criteria) {
return new MatchOperation(criteria);
}
/**
* Creates a new {@link OutOperation} using the given collection name. This operation must be the last operation in
* the pipeline.
*
* @param outCollectionName collection name to export aggregation results. The {@link OutOperation} creates a new
* collection in the current database if one does not already exist. The collection is not visible until the
* aggregation completes. If the aggregation fails, MongoDB does not create the collection. Must not be
* {@literal null}.
* @return
*/
public static OutOperation out(String outCollectionName) {
return new OutOperation(outCollectionName);
}
/**
* Creates a new {@link BucketOperation} given {@literal groupByField}.
*
* @param groupByField must not be {@literal null} or empty.
* @return
* @since 1.10
*/
public static BucketOperation bucket(String groupByField) {
return new BucketOperation(field(groupByField));
}
/**
* Creates a new {@link BucketOperation} given {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
* @return
* @since 1.10
*/
public static BucketOperation bucket(AggregationExpression groupByExpression) {
return new BucketOperation(groupByExpression);
}
/**
* Creates a new {@link BucketAutoOperation} given {@literal groupByField}.
*
* @param groupByField must not be {@literal null} or empty.
* @param buckets number of buckets, must be a positive integer.
* @return
* @since 1.10
*/
public static BucketAutoOperation bucketAuto(String groupByField, int buckets) {
return new BucketAutoOperation(field(groupByField), buckets);
}
/**
* Creates a new {@link BucketAutoOperation} given {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
* @param buckets number of buckets, must be a positive integer.
* @return
* @since 1.10
*/
public static BucketAutoOperation bucketAuto(AggregationExpression groupByExpression, int buckets) {
return new BucketAutoOperation(groupByExpression, buckets);
}
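For illustration, a minimal sketch of a fixed-boundary bucket stage (field name and boundary values are assumed; the output builder calls follow the documented BucketOperation API):

BucketOperation byPrice = Aggregation.bucket("price")
    .withBoundaries(0, 100, 200, 400)
    .withDefaultBucket("Other")
    .andOutputCount().as("count");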
/**
* Creates a new {@link FacetOperation}.
*
* @return
* @since 1.10
*/
public static FacetOperation facet() {
return FacetOperation.EMPTY;
}
/**
* Creates a new {@link FacetOperationBuilder} given {@link Aggregation}.
*
* @param aggregationOperations the sub-pipeline, must not be {@literal null}.
* @return
* @since 1.10
*/
public static FacetOperationBuilder facet(AggregationOperation... aggregationOperations) {
return facet().and(aggregationOperations);
}
/**
* Creates a new {@link LookupOperation}.
*
* @param from must not be {@literal null}.
* @param localField must not be {@literal null}.
* @param foreignField must not be {@literal null}.
* @param as must not be {@literal null}.
* @return never {@literal null}.
* @since 1.9
*/
public static LookupOperation lookup(String from, String localField, String foreignField, String as) {
return lookup(field(from), field(localField), field(foreignField), field(as));
}
/**
* Creates a new {@link LookupOperation} for the given {@link Fields}.
*
* @param from must not be {@literal null}.
* @param localField must not be {@literal null}.
* @param foreignField must not be {@literal null}.
* @param as must not be {@literal null}.
* @return never {@literal null}.
* @since 1.9
*/
public static LookupOperation lookup(Field from, Field localField, Field foreignField, Field as) {
return new LookupOperation(from, localField, foreignField, as);
}
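For illustration, a minimal sketch of a left outer join followed by an unwind (collection and field names are assumed):

Aggregation aggregation = Aggregation.newAggregation(
    Aggregation.lookup("orders", "_id", "customerId", "orderData"),
    Aggregation.unwind("orderData", true));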
/**
* Creates a new {@link CountOperationBuilder}.
*
* @return never {@literal null}.
* @since 1.10
*/
public static CountOperationBuilder count() {
return new CountOperationBuilder();
}
/**
* Creates a new {@link Fields} instance for the given field names.
*
* @see Fields#fields(String...)
*
* @param fields must not be {@literal null}.
* @return
* @see Fields#fields(String...)
*/
public static Fields fields(String... fields) {
return Fields.fields(fields);
@@ -283,7 +537,7 @@ public class Aggregation {
/**
* Creates a new {@link Fields} instance from the given field name and target reference.
*
*
* @param name must not be {@literal null} or empty.
* @param target must not be {@literal null} or empty.
* @return
@@ -295,7 +549,7 @@ public class Aggregation {
/**
* Creates a new {@link GeoNearOperation} instance from the given {@link NearQuery} and the {@code distanceField}. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null} or empty.
* @return
@@ -307,7 +561,7 @@ public class Aggregation {
/**
* Returns a new {@link AggregationOptions.Builder}.
*
*
* @return
* @since 1.6
*/
@@ -317,24 +571,13 @@ public class Aggregation {
/**
* Converts this {@link Aggregation} specification to a {@link DBObject}.
*
*
* @param inputCollectionName the name of the input collection
* @return the {@code DBObject} representing this aggregation
*/
public DBObject toDbObject(String inputCollectionName, AggregationOperationContext rootContext) {
AggregationOperationContext context = rootContext;
List<DBObject> operationDocuments = new ArrayList<DBObject>(operations.size());
for (AggregationOperation operation : operations) {
operationDocuments.add(operation.toDBObject(context));
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), rootContext);
}
}
List<DBObject> operationDocuments = AggregationOperationRenderer.toDBObject(operations, rootContext);
DBObject command = new BasicDBObject("aggregate", inputCollectionName);
command.put("pipeline", operationDocuments);
@@ -350,50 +593,14 @@ public class Aggregation {
*/
@Override
public String toString() {
return SerializationUtils
.serializeToJsonSafely(toDbObject("__collection__", new NoOpAggregationOperationContext()));
}
/**
* Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
*
* @author Oliver Gierke
*/
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
return new FieldReference(new ExposedField(field, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return new FieldReference(new ExposedField(new AggregationField(name), true));
}
return SerializationUtils.serializeToJsonSafely(toDbObject("__collection__", DEFAULT_CONTEXT));
}
/**
* Describes the system variables available in MongoDB aggregation framework pipeline expressions.
*
*
* @author Thomas Darimont
* @see http://docs.mongodb.org/manual/reference/aggregation-variables
* @see <a href="https://docs.mongodb.org/manual/reference/aggregation-variables">Aggregation Variables</a>
*/
enum SystemVariable {
@@ -404,7 +611,7 @@ public class Aggregation {
/**
* Return {@literal true} if the given {@code fieldRef} denotes a well-known system variable, {@literal false}
* otherwise.
*
*
* @param fieldRef may be {@literal null}.
* @return
*/
@@ -426,7 +633,7 @@ public class Aggregation {
return false;
}
/*
/*
* (non-Javadoc)
* @see java.lang.Enum#toString()
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,16 +20,17 @@ import com.mongodb.DBObject;
/**
* An {@link AggregationExpression} can be used with field expressions in aggregation pipeline stages like
* {@code project} and {@code group}.
*
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
*/
interface AggregationExpression {
public interface AggregationExpression {
/**
* Turns the {@link AggregationExpression} into a {@link DBObject} within the given
* {@link AggregationOperationContext}.
*
*
* @param context
* @return
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.util.Assert;
@@ -28,11 +29,14 @@ import com.mongodb.DBObject;
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.10
* @author Christoph Strobl
* @since 1.7
* @deprecated since 1.10. Please use {@link ArithmeticOperators} and {@link ComparisonOperators} instead.
*/
@Deprecated
public enum AggregationFunctionExpressions {
SIZE;
SIZE, CMP, EQ, GT, GTE, LT, LTE, NE, SUBTRACT, ADD, MULTIPLY;
/**
* Returns an {@link AggregationExpression} build from the current {@link Enum} name and the given parameters.
@@ -51,12 +55,12 @@ public enum AggregationFunctionExpressions {
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.10
* @since 1.7
*/
static class FunctionExpression implements AggregationExpression {
private final String name;
private final Object[] values;
private final List<Object> values;
/**
* Creates a new {@link FunctionExpression} for the given name and values.
@@ -70,7 +74,7 @@ public enum AggregationFunctionExpressions {
Assert.notNull(values, "Values must not be null!");
this.name = name;
this.values = values;
this.values = Arrays.asList(values);
}
/*
@@ -80,10 +84,10 @@ public enum AggregationFunctionExpressions {
@Override
public DBObject toDbObject(AggregationOperationContext context) {
List<Object> args = new ArrayList<Object>(values.length);
List<Object> args = new ArrayList<Object>(values.size());
for (int i = 0; i < values.length; i++) {
args.add(unpack(values[i], context));
for (Object value : values) {
args.add(unpack(value, context));
}
return new BasicDBObject("$" + name, args);

View File

@@ -0,0 +1,109 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import com.mongodb.DBObject;
/**
* Rendering support for {@link AggregationOperation} into a {@link List} of {@link com.mongodb.DBObject}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
*/
class AggregationOperationRenderer {
static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
/**
* Render a {@link List} of {@link AggregationOperation} given {@link AggregationOperationContext} into their
* {@link DBObject} representation.
*
* @param operations must not be {@literal null}.
* @param rootContext must not be {@literal null}.
* @return the {@link List} of {@link DBObject}.
*/
static List<DBObject> toDBObject(List<AggregationOperation> operations, AggregationOperationContext rootContext) {
List<DBObject> operationDocuments = new ArrayList<DBObject>(operations.size());
AggregationOperationContext contextToUse = rootContext;
for (AggregationOperation operation : operations) {
operationDocuments.add(operation.toDBObject(contextToUse));
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
ExposedFields fields = exposedFieldsOperation.getFields();
if (operation instanceof InheritsFieldsAggregationOperation) {
contextToUse = new InheritingExposedFieldsAggregationOperationContext(fields, contextToUse);
} else {
contextToUse = fields.exposesNoFields() ? DEFAULT_CONTEXT
: new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), contextToUse);
}
}
}
return operationDocuments;
}
/**
* Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
*
* @author Oliver Gierke
*/
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
return new DirectFieldReference(new ExposedField(field, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return new DirectFieldReference(new ExposedField(new AggregationField(name), true));
}
}
}

View File

@@ -21,7 +21,7 @@ import com.mongodb.DBObject;
/**
* Holds a set of configurable aggregation options that can be used within an aggregation pipeline. A list of supported
* aggregation options can be found in the MongoDB reference documentation
* http://docs.mongodb.org/manual/reference/command/aggregate/#aggregate
* https://docs.mongodb.org/manual/reference/command/aggregate/#aggregate
*
* @author Thomas Darimont
* @author Oliver Gierke

View File

@@ -0,0 +1,74 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* An {@link AggregationExpression} that renders a MongoDB Aggregation Framework expression from the AST of a
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/expressions.html">SpEL
* expression</a>. <br />
* <br />
* <strong>Samples:</strong> <br />
* <code>
* <pre>
* // { $and: [ { $gt: [ "$qty", 100 ] }, { $lt: [ "$qty", 250 ] } ] }
* expressionOf("qty > 100 && qty < 250);
*
* // { $cond : { if : { $gte : [ "$a", 42 ]}, then : "answer", else : "no-answer" } }
* expressionOf("cond(a >= 42, 'answer', 'no-answer')");
* </pre>
* </code>
*
* @author Christoph Strobl
* @see SpelExpressionTransformer
* @since 1.10
*/
public class AggregationSpELExpression implements AggregationExpression {
private static final SpelExpressionTransformer TRANSFORMER = new SpelExpressionTransformer();
private final String rawExpression;
private final Object[] parameters;
private AggregationSpELExpression(String rawExpression, Object[] parameters) {
this.rawExpression = rawExpression;
this.parameters = parameters;
}
/**
* Creates new {@link AggregationSpELExpression} for the given {@literal expressionString} and {@literal parameters}.
*
* @param expressionString must not be {@literal null}.
* @param parameters can be empty.
* @return
*/
public static AggregationSpELExpression expressionOf(String expressionString, Object... parameters) {
Assert.notNull(expressionString, "ExpressionString must not be null!");
return new AggregationSpELExpression(expressionString, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return (DBObject) TRANSFORMER.transform(rawExpression, context, parameters);
}
}
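For illustration, a minimal sketch using the SpEL-backed expression in a projection (field names are assumed, and it is assumed that ProjectionOperation accepts an AggregationExpression via and(...).as(...)):

ProjectionOperation projection = Aggregation.project("item")
    .and(AggregationSpELExpression.expressionOf("netPrice + surcharge")).as("totalPrice");
// renders roughly as { $project : { item : 1, totalPrice : { $add : [ "$netPrice", "$surcharge" ] } } }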

View File

@@ -0,0 +1,353 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
/**
* Gateway to {@literal boolean expressions} that evaluate their argument expressions as booleans and return a boolean
* as the result.
*
* @author Christoph Strobl
* @since 1.10
*/
public class BooleanOperators {
/**
* Take the value of the field referenced by the given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static BooleanOperatorFactory valueOf(String fieldReference) {
return new BooleanOperatorFactory(fieldReference);
}
/**
* Take the value resulting from the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static BooleanOperatorFactory valueOf(AggregationExpression expression) {
return new BooleanOperatorFactory(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates the boolean value of the referenced field and returns the
* opposite boolean value.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Not not(String fieldReference) {
return Not.not(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that evaluates the boolean value of {@link AggregationExpression} result
* and returns the opposite boolean value.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Not not(AggregationExpression expression) {
return Not.not(expression);
}
/**
* @author Christoph Strobl
*/
public static class BooleanOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
/**
* Creates new {@link BooleanOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public BooleanOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
}
/**
* Creates new {@link BooleanOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public BooleanOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
}
/**
* Creates new {@link AggregationExpression} that evaluates one or more expressions and returns {@literal true} if
* all of the expressions are {@literal true}.
*
* @param expression must not be {@literal null}.
* @return
*/
public And and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return createAnd().andExpression(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates one or more expressions and returns {@literal true} if
* all of the expressions are {@literal true}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public And and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return createAnd().andField(fieldReference);
}
private And createAnd() {
return usesFieldRef() ? And.and(Fields.field(fieldReference)) : And.and(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates one or more expressions and returns {@literal true} if
* any of the expressions are {@literal true}.
*
* @param expression must not be {@literal null}.
* @return
*/
public Or or(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return createOr().orExpression(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates one or more expressions and returns {@literal true} if
* any of the expressions are {@literal true}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Or or(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return createOr().orField(fieldReference);
}
private Or createOr() {
return usesFieldRef() ? Or.or(Fields.field(fieldReference)) : Or.or(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates a boolean and returns the opposite boolean value.
*
* @return
*/
public Not not() {
return usesFieldRef() ? Not.not(fieldReference) : Not.not(expression);
}
private boolean usesFieldRef() {
return this.fieldReference != null;
}
}
/**
* {@link AggregationExpression} for {@code $and}.
*
* @author Christoph Strobl
*/
public static class And extends AbstractAggregationExpression {
private And(List<?> values) {
super(values);
}
@Override
protected String getMongoMethod() {
return "$and";
}
/**
* Creates new {@link And} that evaluates one or more expressions and returns {@literal true} if all of the
* expressions are {@literal true}.
*
* @param expressions
* @return
*/
public static And and(Object... expressions) {
return new And(Arrays.asList(expressions));
}
/**
* Creates new {@link And} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public And andExpression(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new And(append(expression));
}
/**
* Creates new {@link And} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public And andField(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new And(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link And} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public And andValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new And(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $or}.
*
* @author Christoph Strobl
*/
public static class Or extends AbstractAggregationExpression {
private Or(List<?> values) {
super(values);
}
@Override
protected String getMongoMethod() {
return "$or";
}
/**
* Creates new {@link Or} that evaluates one or more expressions and returns {@literal true} if any of the
* expressions are {@literal true}.
*
* @param expressions must not be {@literal null}.
* @return
*/
public static Or or(Object... expressions) {
Assert.notNull(expressions, "Expressions must not be null!");
return new Or(Arrays.asList(expressions));
}
/**
* Creates new {@link Or} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Or orExpression(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Or(append(expression));
}
/**
* Creates new {@link Or} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Or orField(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Or(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Or} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Or orValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Or(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $not}.
*
* @author Christoph Strobl
*/
public static class Not extends AbstractAggregationExpression {
private Not(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$not";
}
/**
* Creates new {@link Not} that evaluates the boolean value of the referenced field and returns the opposite boolean
* value.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Not not(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Not(asFields(fieldReference));
}
/**
* Creates new {@link Not} that evaluates the resulting boolean value of the given {@link AggregationExpression} and
* returns the opposite boolean value.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Not not(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Not(Collections.singletonList(expression));
}
}
}
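A short sketch of how the factory above composes $and and $not inside a projection; the field names a and b and the output aliases are assumptions for illustration only.

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.BooleanOperators;

class BooleanOperatorsSample {

	// Projects (approximately) { bothSet: { $and: [ "$a", "$b" ] }, notA: { $not: [ "$a" ] } }
	Aggregation flags() {
		return newAggregation(
				project()
						.and(BooleanOperators.valueOf("a").and("b")).as("bothSet")
						.and(BooleanOperators.not("a")).as("notA"));
	}
}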

View File

@@ -0,0 +1,275 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.BucketAutoOperation.BucketAutoOperationOutputBuilder;
import org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $bucketAuto}-operation. <br />
* Bucket stage is typically used with {@link Aggregation} and {@code $facet}. Categorizes incoming documents into a
* specific number of groups, called buckets, based on a specified expression. Bucket boundaries are automatically
* determined in an attempt to evenly distribute the documents into the specified number of buckets. <br />
* We recommend using the static factory method {@link Aggregation#bucketAuto(String, int)} instead of creating
* instances of this class directly.
*
* @see <a href=
* "https://docs.mongodb.org/manual/reference/aggregation/bucketAuto/">https://docs.mongodb.org/manual/reference/aggregation/bucketAuto/</a>
* @see BucketOperationSupport
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
*/
public class BucketAutoOperation extends BucketOperationSupport<BucketAutoOperation, BucketAutoOperationOutputBuilder>
implements FieldsExposingAggregationOperation {
private final int buckets;
private final String granularity;
/**
* Creates a new {@link BucketAutoOperation} given a {@link Field group-by field}.
*
* @param groupByField must not be {@literal null}.
* @param buckets number of buckets, must be a positive integer.
*/
public BucketAutoOperation(Field groupByField, int buckets) {
super(groupByField);
Assert.isTrue(buckets > 0, "Number of buckets must be greater than 0!");
this.buckets = buckets;
this.granularity = null;
}
/**
* Creates a new {@link BucketAutoOperation} given a {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
* @param buckets number of buckets, must be a positive integer.
*/
public BucketAutoOperation(AggregationExpression groupByExpression, int buckets) {
super(groupByExpression);
Assert.isTrue(buckets > 0, "Number of buckets must be greater than 0!");
this.buckets = buckets;
this.granularity = null;
}
private BucketAutoOperation(BucketAutoOperation bucketOperation, Outputs outputs) {
super(bucketOperation, outputs);
this.buckets = bucketOperation.buckets;
this.granularity = bucketOperation.granularity;
}
private BucketAutoOperation(BucketAutoOperation bucketOperation, int buckets, String granularity) {
super(bucketOperation);
this.buckets = buckets;
this.granularity = granularity;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject options = new BasicDBObject();
options.put("buckets", buckets);
options.putAll(super.toDBObject(context));
if (granularity != null) {
options.put("granularity", granularity);
}
return new BasicDBObject("$bucketAuto", options);
}
/**
* Configures the number of {@literal buckets} and returns a new {@link BucketAutoOperation}.
*
* @param buckets must be a positive number.
* @return
*/
public BucketAutoOperation withBuckets(int buckets) {
Assert.isTrue(buckets > 0, "Number of buckets must be greater than 0!");
return new BucketAutoOperation(this, buckets, granularity);
}
/**
* Configures {@link Granularity granularity} that specifies the preferred number series to use to ensure that the
* calculated boundary edges end on preferred round numbers or their powers of 10 and returns a new
* {@link BucketAutoOperation}. <br />
* Use either a predefined {@link Granularities} value or provide your own.
*
* @param granularity must not be {@literal null}.
* @return
*/
public BucketAutoOperation withGranularity(Granularity granularity) {
Assert.notNull(granularity, "Granularity must not be null!");
return new BucketAutoOperation(this, buckets, granularity.getMongoRepresentation());
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#newBucketOperation(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Outputs)
*/
@Override
protected BucketAutoOperation newBucketOperation(Outputs outputs) {
return new BucketAutoOperation(this, outputs);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutputExpression(java.lang.String, java.lang.Object[])
*/
@Override
public ExpressionBucketAutoOperationBuilder andOutputExpression(String expression, Object... params) {
return new ExpressionBucketAutoOperationBuilder(expression, this, params);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public BucketAutoOperationOutputBuilder andOutput(AggregationExpression expression) {
return new BucketAutoOperationOutputBuilder(expression, this);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(java.lang.String)
*/
@Override
public BucketAutoOperationOutputBuilder andOutput(String fieldName) {
return new BucketAutoOperationOutputBuilder(Fields.field(fieldName), this);
}
/**
* {@link OutputBuilder} implementation for {@link BucketAutoOperation}.
*/
public static class BucketAutoOperationOutputBuilder
extends OutputBuilder<BucketAutoOperationOutputBuilder, BucketAutoOperation> {
/**
* Creates a new {@link BucketAutoOperationOutputBuilder} for the given value and {@link BucketAutoOperation}.
*
* @param value must not be {@literal null}.
* @param operation must not be {@literal null}.
*/
protected BucketAutoOperationOutputBuilder(Object value, BucketAutoOperation operation) {
super(value, operation);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketAutoOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketAutoOperationOutputBuilder(operationOutput, this.operation);
}
}
/**
* {@link ExpressionBucketOperationBuilderSupport} implementation for {@link BucketAutoOperation} using SpEL
* expression based {@link Output}.
*
* @author Mark Paluch
*/
public static class ExpressionBucketAutoOperationBuilder
extends ExpressionBucketOperationBuilderSupport<BucketAutoOperationOutputBuilder, BucketAutoOperation> {
/**
* Creates a new {@link ExpressionBucketAutoOperationBuilder} for the given value, {@link BucketAutoOperation} and
* parameters.
*
* @param expression must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
*/
protected ExpressionBucketAutoOperationBuilder(String expression, BucketAutoOperation operation,
Object[] parameters) {
super(expression, operation, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketAutoOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketAutoOperationOutputBuilder(operationOutput, this.operation);
}
}
/**
* @author Mark Paluch
*/
public interface Granularity {
/**
* @return a String that represents a MongoDB granularity to be used with {@link BucketAutoOperation}. Never
* {@literal null}.
*/
String getMongoRepresentation();
}
/**
* Supported MongoDB granularities.
*
* @see <a
* href="https://docs.mongodb.com/manual/reference/operator/aggregation/bucketAuto/#granularity>https://docs.mongodb.com/manual/reference/operator/aggregation/bucketAuto/#granularity</a>
* @author Mark Paluch
*/
public enum Granularities implements Granularity {
R5, R10, R20, R40, R80, //
SERIES_1_2_5("1-2-5"), //
E6, E12, E24, E48, E96, E192, //
POWERSOF2;
private final String granularity;
Granularities() {
this.granularity = name();
}
Granularities(String granularity) {
this.granularity = granularity;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketAutoOperation.Granularity#getMongoRepresentation()
*/
@Override
public String getMongoRepresentation() {
return granularity;
}
}
}
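A minimal sketch of the $bucketAuto stage defined above; the price field, the bucket count, and the output alias are illustrative assumptions.

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.BucketAutoOperation.Granularities;

class BucketAutoSample {

	// Renders (approximately) { $bucketAuto: { groupBy: "$price", buckets: 5, granularity: "POWERSOF2",
	//                                          output: { titles: { $push: "$title" } } } }
	Aggregation priceBuckets() {
		return newAggregation(
				bucketAuto("price", 5)
						.withGranularity(Granularities.POWERSOF2)
						.andOutput("title").push().as("titles"));
	}
}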

View File

@@ -0,0 +1,227 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.BucketOperation.BucketOperationOutputBuilder;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $bucket}-operation. <br />
*
* Bucket stage is typically used with {@link Aggregation} and {@code $facet}. Categorizes incoming documents into
* groups, called buckets, based on a specified expression and bucket boundaries. <br />
*
* We recommend using the static factory method {@link Aggregation#bucket(String)} instead of creating instances of
* this class directly.
*
* @see <a href="https://docs.mongodb.org/manual/reference/aggregation/bucket/">https://docs.mongodb.org/manual/reference/aggregation/bucket/</a>
* @see BucketOperationSupport
* @author Mark Paluch
* @since 1.10
*/
public class BucketOperation extends BucketOperationSupport<BucketOperation, BucketOperationOutputBuilder>
implements FieldsExposingAggregationOperation {
private final List<Object> boundaries;
private final Object defaultBucket;
/**
* Creates a new {@link BucketOperation} given a {@link Field group-by field}.
*
* @param groupByField must not be {@literal null}.
*/
public BucketOperation(Field groupByField) {
super(groupByField);
this.boundaries = Collections.emptyList();
this.defaultBucket = null;
}
/**
* Creates a new {@link BucketOperation} given a {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
*/
public BucketOperation(AggregationExpression groupByExpression) {
super(groupByExpression);
this.boundaries = Collections.emptyList();
this.defaultBucket = null;
}
private BucketOperation(BucketOperation bucketOperation, Outputs outputs) {
super(bucketOperation, outputs);
this.boundaries = bucketOperation.boundaries;
this.defaultBucket = bucketOperation.defaultBucket;
}
private BucketOperation(BucketOperation bucketOperation, List<Object> boundaries, Object defaultBucket) {
super(bucketOperation);
this.boundaries = new ArrayList<Object>(boundaries);
this.defaultBucket = defaultBucket;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject options = new BasicDBObject();
options.put("boundaries", context.getMappedObject(new BasicDBObject("$set", boundaries)).get("$set"));
if (defaultBucket != null) {
options.put("default", context.getMappedObject(new BasicDBObject("$set", defaultBucket)).get("$set"));
}
options.putAll(super.toDBObject(context));
return new BasicDBObject("$bucket", options);
}
/**
* Configures a default bucket {@literal literal} and returns a new {@link BucketOperation}.
*
* @param literal must not be {@literal null}.
* @return
*/
public BucketOperation withDefaultBucket(Object literal) {
Assert.notNull(literal, "Default bucket literal must not be null!");
return new BucketOperation(this, boundaries, literal);
}
/**
* Configures {@literal boundaries} and returns a new {@link BucketOperation}. Existing {@literal boundaries} are
* preserved and the new {@literal boundaries} are appended.
*
* @param boundaries must not be {@literal null}.
* @return
*/
public BucketOperation withBoundaries(Object... boundaries) {
Assert.notNull(boundaries, "Boundaries must not be null!");
Assert.noNullElements(boundaries, "Boundaries must not contain null values!");
List<Object> newBoundaries = new ArrayList<Object>(this.boundaries.size() + boundaries.length);
newBoundaries.addAll(this.boundaries);
newBoundaries.addAll(Arrays.asList(boundaries));
return new BucketOperation(this, newBoundaries, defaultBucket);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#newBucketOperation(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Outputs)
*/
@Override
protected BucketOperation newBucketOperation(Outputs outputs) {
return new BucketOperation(this, outputs);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutputExpression(java.lang.String, java.lang.Object[])
*/
@Override
public ExpressionBucketOperationBuilder andOutputExpression(String expression, Object... params) {
return new ExpressionBucketOperationBuilder(expression, this, params);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public BucketOperationOutputBuilder andOutput(AggregationExpression expression) {
return new BucketOperationOutputBuilder(expression, this);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport#andOutput(java.lang.String)
*/
@Override
public BucketOperationOutputBuilder andOutput(String fieldName) {
return new BucketOperationOutputBuilder(Fields.field(fieldName), this);
}
/**
* {@link OutputBuilder} implementation for {@link BucketOperation}.
*/
public static class BucketOperationOutputBuilder
extends BucketOperationSupport.OutputBuilder<BucketOperationOutputBuilder, BucketOperation> {
/**
* Creates a new {@link BucketOperationOutputBuilder} for the given value and {@link BucketOperation}.
*
* @param value must not be {@literal null}.
* @param operation must not be {@literal null}.
*/
protected BucketOperationOutputBuilder(Object value, BucketOperation operation) {
super(value, operation);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketOperationOutputBuilder(operationOutput, this.operation);
}
}
/**
* {@link ExpressionBucketOperationBuilderSupport} implementation for {@link BucketOperation} using SpEL expression
* based {@link Output}.
*
* @author Mark Paluch
*/
public static class ExpressionBucketOperationBuilder
extends ExpressionBucketOperationBuilderSupport<BucketOperationOutputBuilder, BucketOperation> {
/**
* Creates a new {@link ExpressionBucketOperationBuilder} for the given expression, {@link BucketOperation}
* and parameters.
*
* @param expression must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
*/
protected ExpressionBucketOperationBuilder(String expression, BucketOperation operation, Object[] parameters) {
super(expression, operation, parameters);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder#apply(org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OperationOutput)
*/
@Override
protected BucketOperationOutputBuilder apply(OperationOutput operationOutput) {
return new BucketOperationOutputBuilder(operationOutput, this.operation);
}
}
}
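A minimal sketch of the $bucket stage above; the field name, boundaries, default bucket, and output alias are assumptions chosen for illustration.

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;

class BucketSample {

	// Renders (approximately) { $bucket: { groupBy: "$price", boundaries: [ 0, 100, 200 ], default: "Other",
	//                                      output: { count: { $sum: 1 } } } }
	Aggregation priceRanges() {
		return newAggregation(
				bucket("price")
						.withBoundaries(0, 100, 200)
						.withDefaultBucket("Other")
						.andOutputCount().as("count"));
	}
}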

View File

@@ -0,0 +1,680 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.OutputBuilder;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder;
import org.springframework.expression.spel.ast.Projection;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Base class for bucket operations in the aggregation framework that support output expressions. <br />
* Bucket stages collect documents into buckets and can contribute output fields. <br />
* Implementing classes are required to provide an {@link OutputBuilder}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
*/
public abstract class BucketOperationSupport<T extends BucketOperationSupport<T, B>, B extends OutputBuilder<B, T>>
implements FieldsExposingAggregationOperation {
private final Field groupByField;
private final AggregationExpression groupByExpression;
private final Outputs outputs;
/**
* Creates a new {@link BucketOperationSupport} given a {@link Field group-by field}.
*
* @param groupByField must not be {@literal null}.
*/
protected BucketOperationSupport(Field groupByField) {
Assert.notNull(groupByField, "Group by field must not be null!");
this.groupByField = groupByField;
this.groupByExpression = null;
this.outputs = Outputs.EMPTY;
}
/**
* Creates a new {@link BucketOperationSupport} given a {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
*/
protected BucketOperationSupport(AggregationExpression groupByExpression) {
Assert.notNull(groupByExpression, "Group by AggregationExpression must not be null!");
this.groupByExpression = groupByExpression;
this.groupByField = null;
this.outputs = Outputs.EMPTY;
}
/**
* Creates a copy of {@link BucketOperationSupport}.
*
* @param operationSupport must not be {@literal null}.
*/
protected BucketOperationSupport(BucketOperationSupport<?, ?> operationSupport) {
this(operationSupport, operationSupport.outputs);
}
/**
* Creates a copy of {@link BucketOperationSupport} and applies the new {@link Outputs}.
*
* @param operationSupport must not be {@literal null}.
* @param outputs must not be {@literal null}.
*/
protected BucketOperationSupport(BucketOperationSupport<?, ?> operationSupport, Outputs outputs) {
Assert.notNull(operationSupport, "BucketOperationSupport must not be null!");
Assert.notNull(outputs, "Outputs must not be null!");
this.groupByField = operationSupport.groupByField;
this.groupByExpression = operationSupport.groupByExpression;
this.outputs = outputs;
}
/**
* Creates a new {@link ExpressionBucketOperationBuilderSupport} given a SpEL {@literal expression} and optional
* {@literal params} to add an output field to the resulting bucket documents.
*
* @param expression the SpEL expression, must not be {@literal null} or empty.
* @param params must not be {@literal null}
* @return
*/
public abstract ExpressionBucketOperationBuilderSupport<B, T> andOutputExpression(String expression,
Object... params);
/**
* Creates a new {@link BucketOperationSupport} given an {@link AggregationExpression} to add an output field to the
* resulting bucket documents.
*
* @param expression must not be {@literal null}.
* @return
*/
public abstract B andOutput(AggregationExpression expression);
/**
* Creates a new {@link BucketOperationSupport} given {@literal fieldName} to add an output field to the resulting
* bucket documents. {@link BucketOperationSupport} exposes accumulation operations that can be applied to
* {@literal fieldName}.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
public abstract B andOutput(String fieldName);
/**
* Creates a new {@link BucketOperationSupport} that adds a count field to the resulting bucket documents.
*
* @return
*/
public B andOutputCount() {
return andOutput(new AggregationExpression() {
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return new BasicDBObject("$sum", 1);
}
});
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject dbObject = new BasicDBObject();
dbObject.put("groupBy", groupByExpression == null ? context.getReference(groupByField).toString()
: groupByExpression.toDbObject(context));
if (!outputs.isEmpty()) {
dbObject.put("output", outputs.toDbObject(context));
}
return dbObject;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return outputs.asExposedFields();
}
/**
* Implementation hook to create a new bucket operation.
*
* @param outputs the outputs
* @return the new bucket operation.
*/
protected abstract T newBucketOperation(Outputs outputs);
protected T andOutput(Output output) {
return newBucketOperation(outputs.and(output));
}
/**
* Builder for SpEL expression-based {@link Output}.
*
* @author Mark Paluch
*/
public abstract static class ExpressionBucketOperationBuilderSupport<B extends OutputBuilder<B, T>, T extends BucketOperationSupport<T, B>>
extends OutputBuilder<B, T> {
/**
* Creates a new {@link ExpressionBucketOperationBuilderSupport} for the given value, {@link BucketOperationSupport}
* and parameters.
*
* @param expression must not be {@literal null}.
* @param operation must not be {@literal null}.
* @param parameters
*/
protected ExpressionBucketOperationBuilderSupport(String expression, T operation, Object[] parameters) {
super(new SpelExpressionOutput(expression, parameters), operation);
}
}
/**
* Base class for {@link Output} builders that result in a {@link BucketOperationSupport} providing the built
* {@link Output}.
*
* @author Mark Paluch
*/
public abstract static class OutputBuilder<B extends OutputBuilder<B, T>, T extends BucketOperationSupport<T, B>> {
protected final Object value;
protected final T operation;
/**
* Creates a new {@link OutputBuilder} for the given value and {@link BucketOperationSupport}.
*
* @param value must not be {@literal null}.
* @param operation must not be {@literal null}.
*/
protected OutputBuilder(Object value, T operation) {
Assert.notNull(value, "Value must not be null or empty!");
Assert.notNull(operation, "ProjectionOperation must not be null!");
this.value = value;
this.operation = operation;
}
/**
* Generates a builder for a {@code $sum}-expression. <br />
* Count expressions are emulated via {@code $sum: 1}.
*
* @return
*/
public B count() {
return sum(1);
}
/**
* Generates a builder for a {@code $sum}-expression for the current value.
*
* @return
*/
public B sum() {
return apply(Accumulators.SUM);
}
/**
* Generates a builder for a {@code $sum}-expression for the given {@literal value}.
*
* @param value
* @return
*/
public B sum(Number value) {
return apply(new OperationOutput(Accumulators.SUM.getMongoOperator(), Collections.singleton(value)));
}
/**
* Generates a builder for a {@code $last}-expression for the current value.
*
* @return
*/
public B last() {
return apply(Accumulators.LAST);
}
/**
* Generates a builder for a {@code $first}-expression for the current value.
*
* @return
*/
public B first() {
return apply(Accumulators.FIRST);
}
/**
* Generates a builder for an {@code $avg}-expression for the current value.
*
* @return
*/
public B avg() {
return apply(Accumulators.AVG);
}
/**
* Generates a builder for a {@code $min}-expression for the current value.
*
* @return
*/
public B min() {
return apply(Accumulators.MIN);
}
/**
* Generates a builder for a {@code $max}-expression for the current value.
*
* @return
*/
public B max() {
return apply(Accumulators.MAX);
}
/**
* Generates a builder for a {@code $push}-expression for the current value.
*
* @return
*/
public B push() {
return apply(Accumulators.PUSH);
}
/**
* Generates a builder for an {@code $addToSet}-expression for the current value.
*
* @return
*/
public B addToSet() {
return apply(Accumulators.ADDTOSET);
}
/**
* Apply an operator to the current value.
*
* @param operation the operation name, must not be {@literal null} or empty.
* @param values must not be {@literal null}.
* @return
*/
public B apply(String operation, Object... values) {
Assert.hasText(operation, "Operation must not be empty or null!");
Assert.notNull(value, "Values must not be null!");
List<Object> objects = new ArrayList<Object>(values.length + 1);
objects.add(value);
objects.addAll(Arrays.asList(values));
return apply(new OperationOutput(operation, objects));
}
/**
* Apply an {@link OperationOutput} to this output.
*
* @param operationOutput must not be {@literal null}.
* @return
*/
protected abstract B apply(OperationOutput operationOutput);
private B apply(Accumulators operation) {
return this.apply(operation.getMongoOperator());
}
/**
* Returns the resulting {@link BucketOperation}, exposing the output under the given alias.
*
* @param alias must not be {@literal null} or empty.
* @return
*/
public T as(String alias) {
if (value instanceof OperationOutput) {
return this.operation.andOutput(((OperationOutput) this.value).withAlias(alias));
}
if (value instanceof Field) {
throw new IllegalStateException("Cannot add a field as top-level output. Use accumulator expressions.");
}
return this.operation
.andOutput(new AggregationExpressionOutput(Fields.field(alias), (AggregationExpression) value));
}
}
private enum Accumulators {
SUM("$sum"), AVG("$avg"), FIRST("$first"), LAST("$last"), MAX("$max"), MIN("$min"), PUSH("$push"), ADDTOSET(
"$addToSet");
private String mongoOperator;
Accumulators(String mongoOperator) {
this.mongoOperator = mongoOperator;
}
public String getMongoOperator() {
return mongoOperator;
}
}
/**
* Encapsulates {@link Output}s.
*
* @author Mark Paluch
*/
protected static class Outputs implements AggregationExpression {
protected static final Outputs EMPTY = new Outputs();
private List<Output> outputs;
/**
* Creates a new, empty {@link Outputs}.
*/
private Outputs() {
this.outputs = new ArrayList<Output>();
}
/**
* Creates new {@link Outputs} containing all given {@link Output}s.
*
* @param current
* @param output
*/
private Outputs(Collection<Output> current, Output output) {
this.outputs = new ArrayList<Output>(current.size() + 1);
this.outputs.addAll(current);
this.outputs.add(output);
}
/**
* @return the {@link ExposedFields} derived from {@link Output}.
*/
protected ExposedFields asExposedFields() {
// The count field is included by default when the output is not specified.
if (isEmpty()) {
return ExposedFields.from(new ExposedField("count", true));
}
ExposedFields fields = ExposedFields.from();
for (Output output : outputs) {
fields = fields.and(output.getExposedField());
}
return fields;
}
/**
* Create a new {@link Outputs} that contains the new {@link Output}.
*
* @param output must not be {@literal null}.
* @return the new {@link Outputs} that contains the new {@link Output}
*/
protected Outputs and(Output output) {
Assert.notNull(output, "BucketOutput must not be null!");
return new Outputs(this.outputs, output);
}
/**
* @return {@literal true} if {@link Outputs} contains no {@link Output}.
*/
protected boolean isEmpty() {
return outputs.isEmpty();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
DBObject dbObject = new BasicDBObject();
for (Output output : outputs) {
dbObject.put(output.getExposedField().getName(), output.toDbObject(context));
}
return dbObject;
}
}
/**
* Encapsulates an output field in a bucket aggregation stage. <br />
* Output fields can be either top-level fields that define a valid field name or nested output fields using
* operators.
*
* @author Mark Paluch
*/
protected abstract static class Output implements AggregationExpression {
private final ExposedField field;
/**
* Creates a new {@link Output} for the given {@link Field}.
*
* @param field must not be {@literal null}.
*/
protected Output(Field field) {
Assert.notNull(field, "Field must not be null!");
this.field = new ExposedField(field, true);
}
/**
* Returns the field exposed by the {@link Output}.
*
* @return will never be {@literal null}.
*/
protected ExposedField getExposedField() {
return field;
}
}
/**
* Output field that uses a Mongo operation (expression object) to generate an output field value. <br />
* {@link OperationOutput} is used either with a regular field name or an operation keyword (e.g.
* {@literal $sum, $count}).
*
* @author Mark Paluch
*/
protected static class OperationOutput extends Output {
private final String operation;
private final List<Object> values;
/**
* Creates a new {@link Output} for the given field.
*
* @param operation the actual operation key, must not be {@literal null} or empty.
* @param values the values to pass into the operation, must not be {@literal null}.
*/
public OperationOutput(String operation, Collection<? extends Object> values) {
super(Fields.field(operation));
Assert.hasText(operation, "Operation must not be null or empty!");
Assert.notNull(values, "Values must not be null!");
this.operation = operation;
this.values = new ArrayList<Object>(values);
}
private OperationOutput(Field field, OperationOutput operationOutput) {
super(field);
this.operation = operationOutput.operation;
this.values = operationOutput.values;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
List<Object> operationArguments = getOperationArguments(context);
return new BasicDBObject(operation,
operationArguments.size() == 1 ? operationArguments.get(0) : operationArguments);
}
protected List<Object> getOperationArguments(AggregationOperationContext context) {
List<Object> result = new ArrayList<Object>(values != null ? values.size() : 1);
for (Object element : values) {
if (element instanceof Field) {
result.add(context.getReference((Field) element).toString());
} else if (element instanceof Fields) {
for (Field field : (Fields) element) {
result.add(context.getReference(field).toString());
}
} else if (element instanceof AggregationExpression) {
result.add(((AggregationExpression) element).toDbObject(context));
} else {
result.add(element);
}
}
return result;
}
/**
* Returns the field exposed by this {@link OperationOutput}.
*
* @return
*/
protected Field getField() {
return getExposedField();
}
/**
* Creates a new instance of this {@link OperationOutput} with the given alias.
*
* @param alias the alias to set
* @return
*/
public OperationOutput withAlias(String alias) {
final Field aliasedField = Fields.field(alias);
return new OperationOutput(aliasedField, this) {
@Override
protected Field getField() {
return aliasedField;
}
@Override
protected List<Object> getOperationArguments(AggregationOperationContext context) {
// We have to make sure that we use the arguments from the "previous" OperationOutput that we replace
// with this new instance.
return OperationOutput.this.getOperationArguments(context);
}
};
}
}
/**
* A {@link Output} based on a SpEL expression.
*/
private static class SpelExpressionOutput extends Output {
private static final SpelExpressionTransformer TRANSFORMER = new SpelExpressionTransformer();
private final String expression;
private final Object[] params;
/**
* Creates a new {@link SpelExpressionOutput} for the given field, SpEL expression and parameters.
*
* @param expression must not be {@literal null} or empty.
* @param parameters must not be {@literal null}.
*/
public SpelExpressionOutput(String expression, Object[] parameters) {
super(Fields.field(expression));
Assert.hasText(expression, "Expression must not be null!");
Assert.notNull(parameters, "Parameters must not be null!");
this.expression = expression;
this.params = parameters.clone();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Output#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return (DBObject) TRANSFORMER.transform(expression, context, params);
}
}
/**
* @author Mark Paluch
*/
private static class AggregationExpressionOutput extends Output {
private final AggregationExpression expression;
/**
* Creates a new {@link AggregationExpressionOutput}.
*
* @param field
* @param expression
*/
protected AggregationExpressionOutput(Field field, AggregationExpression expression) {
super(field);
this.expression = expression;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Output#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return expression.toDbObject(context);
}
}
}
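To illustrate the SpEL-based output builder provided by this base class, a hypothetical sketch follows; the price field, boundaries, factor, and netPrice alias are assumptions, not part of the change.

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;

class BucketOutputExpressionSample {

	// Renders (approximately) { $bucket: { groupBy: "$price", boundaries: [ 0, 100, 200 ],
	//                                      output: { netPrice: { $multiply: [ "$price", 0.9 ] } } } }
	Aggregation discountedBuckets() {
		return newAggregation(
				bucket("price")
						.withBoundaries(0, 100, 200)
						.andOutputExpression("price * 0.9").as("netPrice"));
	}
}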

View File

@@ -0,0 +1,879 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
/**
* Gateway to {@literal comparison expressions}.
*
* @author Christoph Strobl
* @since 1.10
*/
public class ComparisonOperators {
/**
* Take the value of the field referenced by the given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static ComparisonOperatorFactory valueOf(String fieldReference) {
return new ComparisonOperatorFactory(fieldReference);
}
/**
* Take the value resulting from the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static ComparisonOperatorFactory valueOf(AggregationExpression expression) {
return new ComparisonOperatorFactory(expression);
}
public static class ComparisonOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
/**
* Creates new {@link ComparisonOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public ComparisonOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
}
/**
* Creates new {@link ComparisonOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public ComparisonOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
}
/**
* Creates new {@link AggregationExpression} that compares two values.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Cmp compareTo(String fieldReference) {
return createCmp().compareTo(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values.
*
* @param expression must not be {@literal null}.
* @return
*/
public Cmp compareTo(AggregationExpression expression) {
return createCmp().compareTo(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values.
*
* @param value must not be {@literal null}.
* @return
*/
public Cmp compareToValue(Object value) {
return createCmp().compareToValue(value);
}
private Cmp createCmp() {
return usesFieldRef() ? Cmp.valueOf(fieldReference) : Cmp.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is equal to the value of the referenced field.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Eq equalTo(String fieldReference) {
return createEq().equalTo(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is equal to the expression result.
*
* @param expression must not be {@literal null}.
* @return
*/
public Eq equalTo(AggregationExpression expression) {
return createEq().equalTo(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is equal to the given value.
*
* @param value must not be {@literal null}.
* @return
*/
public Eq equalToValue(Object value) {
return createEq().equalToValue(value);
}
private Eq createEq() {
return usesFieldRef() ? Eq.valueOf(fieldReference) : Eq.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than the value of the referenced field.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Gt greaterThan(String fieldReference) {
return createGt().greaterThan(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than the expression result.
*
* @param expression must not be {@literal null}.
* @return
*/
public Gt greaterThan(AggregationExpression expression) {
return createGt().greaterThan(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than the given value.
*
* @param value must not be {@literal null}.
* @return
*/
public Gt greaterThanValue(Object value) {
return createGt().greaterThanValue(value);
}
private Gt createGt() {
return usesFieldRef() ? Gt.valueOf(fieldReference) : Gt.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than or equivalent to the value of the referenced field.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualTo(String fieldReference) {
return createGte().greaterThanEqualTo(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than or equivalent to the expression result.
*
* @param expression must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualTo(AggregationExpression expression) {
return createGte().greaterThanEqualTo(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is greater than or equivalent to the given value.
*
* @param value must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualToValue(Object value) {
return createGte().greaterThanEqualToValue(value);
}
private Gte createGte() {
return usesFieldRef() ? Gte.valueOf(fieldReference) : Gte.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than the value of the referenced field.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Lt lessThan(String fieldReference) {
return createLt().lessThan(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than the expression result.
*
* @param expression must not be {@literal null}.
* @return
*/
public Lt lessThan(AggregationExpression expression) {
return createLt().lessThan(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than the given value.
*
* @param value must not be {@literal null}.
* @return
*/
public Lt lessThanValue(Object value) {
return createLt().lessThanValue(value);
}
private Lt createLt() {
return usesFieldRef() ? Lt.valueOf(fieldReference) : Lt.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than or equivalent to the value of the referenced field.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Lte lessThanEqualTo(String fieldReference) {
return createLte().lessThanEqualTo(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than or equivalent to the expression result.
*
* @param expression must not be {@literal null}.
* @return
*/
public Lte lessThanEqualTo(AggregationExpression expression) {
return createLte().lessThanEqualTo(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the first
* value is less than or equivalent to the given value.
*
* @param value
* @return
*/
public Lte lessThanEqualToValue(Object value) {
return createLte().lessThanEqualToValue(value);
}
private Lte createLte() {
return usesFieldRef() ? Lte.valueOf(fieldReference) : Lte.valueOf(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the values
* are not equivalent.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Ne notEqualTo(String fieldReference) {
return createNe().notEqualTo(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the values
* are not equivalent.
*
* @param expression must not be {@literal null}.
* @return
*/
public Ne notEqualTo(AggregationExpression expression) {
return createNe().notEqualTo(expression);
}
/**
* Creates new {@link AggregationExpression} that compares two values and returns {@literal true} when the values
* are not equivalent.
*
* @param value must not be {@literal null}.
* @return
*/
public Ne notEqualToValue(Object value) {
return createNe().notEqualToValue(value);
}
private Ne createNe() {
return usesFieldRef() ? Ne.valueOf(fieldReference) : Ne.valueOf(expression);
}
private boolean usesFieldRef() {
return fieldReference != null;
}
}
/**
* {@link AggregationExpression} for {@code $cmp}.
*
* @author Christoph Strobl
*/
public static class Cmp extends AbstractAggregationExpression {
private Cmp(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$cmp";
}
/**
* Creates new {@link Cmp}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Cmp valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Cmp(asFields(fieldReference));
}
/**
* Creates new {@link Cmp}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Cmp valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Cmp(Collections.singletonList(expression));
}
/**
* Creates new {@link Cmp} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Cmp compareTo(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Cmp(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Cmp} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Cmp compareTo(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Cmp(append(expression));
}
/**
* Creates new {@link Cmp} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Cmp compareToValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Cmp(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $eq}.
*
* @author Christoph Strobl
*/
public static class Eq extends AbstractAggregationExpression {
private Eq(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$eq";
}
/**
* Creates new {@link Eq}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Eq valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Eq(asFields(fieldReference));
}
/**
* Creates new {@link Eq}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Eq valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Eq(Collections.singletonList(expression));
}
/**
* Creates new {@link Eq} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Eq equalTo(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Eq(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Eq} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Eq equalTo(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Eq(append(expression));
}
/**
* Creates new {@link Eq} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Eq equalToValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Eq(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $gt}.
*
* @author Christoph Strobl
*/
public static class Gt extends AbstractAggregationExpression {
private Gt(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$gt";
}
/**
* Creates new {@link Gt}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Gt valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Gt(asFields(fieldReference));
}
/**
* Creates new {@link Gt}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Gt valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Gt(Collections.singletonList(expression));
}
/**
* Creates new {@link Gt} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Gt greaterThan(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Gt(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Gt} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Gt greaterThan(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Gt(append(expression));
}
/**
* Creates new {@link Gt} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Gt greaterThanValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Gt(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $lt}.
*
* @author Christoph Strobl
*/
public static class Lt extends AbstractAggregationExpression {
private Lt(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$lt";
}
/**
* Creates new {@link Lt}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Lt valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Lt(asFields(fieldReference));
}
/**
* Creates new {@link Lt}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Lt valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Lt(Collections.singletonList(expression));
}
/**
* Creates new {@link Lt} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Lt lessThan(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Lt(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Lt} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Lt lessThan(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Lt(append(expression));
}
/**
* Creates new {@link Lt} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Lt lessThanValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Lt(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $gte}.
*
* @author Christoph Strobl
*/
public static class Gte extends AbstractAggregationExpression {
private Gte(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$gte";
}
/**
* Creates new {@link Gte}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Gte valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Gte(asFields(fieldReference));
}
/**
* Creates new {@link Gte}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Gte valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Gte(Collections.singletonList(expression));
}
/**
* Creates new {@link Gte} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualTo(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Gte(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Gte} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualTo(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Gte(append(expression));
}
/**
* Creates new {@link Gte} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Gte greaterThanEqualToValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Gte(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $lte}.
*
* @author Christoph Strobl
*/
public static class Lte extends AbstractAggregationExpression {
private Lte(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$lte";
}
/**
* Creates new {@link Lte}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Lte valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Lte(asFields(fieldReference));
}
/**
* Creates new {@link Lte}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Lte valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Lte(Collections.singletonList(expression));
}
/**
* Creates new {@link Lte} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Lte lessThanEqualTo(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Lte(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Lte} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Lte lessThanEqualTo(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Lte(append(expression));
}
/**
* Creates new {@link Lte} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Lte lessThanEqualToValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Lte(append(value));
}
}
/**
* {@link AggregationExpression} for {@code $ne}.
*
* @author Christoph Strobl
*/
public static class Ne extends AbstractAggregationExpression {
private Ne(List<?> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$ne";
}
/**
* Creates new {@link Ne}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Ne valueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Ne(asFields(fieldReference));
}
/**
* Creates new {@link Ne}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Ne valueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Ne(Collections.singletonList(expression));
}
/**
* Creates new {@link Ne} with all previously added arguments appending the given one.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Ne notEqualTo(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Ne(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Ne} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public Ne notEqualTo(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Ne(append(expression));
}
/**
* Creates new {@link Ne} with all previously added arguments appending the given one.
*
* @param value must not be {@literal null}.
* @return
*/
public Ne notEqualToValue(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Ne(append(value));
}
}
}
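/*
 * Editor's note: a minimal usage sketch, not part of the original source. Assuming a static import of the
 * nested comparison classes above and a hypothetical numeric field "qty", a comparison expression can be
 * built and passed wherever an AggregationExpression is accepted, e.g.:
 *
 *   Gte.valueOf("qty").greaterThanEqualToValue(250)
 *
 * which renders as { "$gte" : [ "$qty", 250 ] }.
 */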


@@ -0,0 +1,978 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Switch.CaseOperator;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Gateway to {@literal conditional expressions} that evaluate their argument expressions as booleans to a value.
*
* @author Mark Paluch
* @since 1.10
*/
public class ConditionalOperators {
/**
* Take the field referenced by given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static ConditionalOperatorFactory when(String fieldReference) {
return new ConditionalOperatorFactory(fieldReference);
}
/**
* Take the value resulting from the given {@literal expression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static ConditionalOperatorFactory when(AggregationExpression expression) {
return new ConditionalOperatorFactory(expression);
}
/**
* Take the value resulting from the given {@literal criteriaDefinition}.
*
* @param criteriaDefinition must not be {@literal null}.
* @return
*/
public static ConditionalOperatorFactory when(CriteriaDefinition criteriaDefinition) {
return new ConditionalOperatorFactory(criteriaDefinition);
}
/**
* Creates new {@link AggregationExpression} that evaluates an expression and returns the value of the expression if
* the expression evaluates to a non-null value. If the expression evaluates to a {@literal null} value, including
* instances of undefined values or missing fields, returns the value of the replacement expression.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static IfNull.ThenBuilder ifNull(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return IfNull.ifNull(fieldReference);
}
/**
* Creates new {@link AggregationExpression} that evaluates an expression and returns the value of the expression if
* the expression evaluates to a non-null value. If the expression evaluates to a {@literal null} value, including
* instances of undefined values or missing fields, returns the value of the replacement expression.
*
* @param expression must not be {@literal null}.
* @return
*/
public static IfNull.ThenBuilder ifNull(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return IfNull.ifNull(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates a series of {@link CaseOperator} expressions. When it
* finds an expression which evaluates to {@literal true}, {@code $switch} executes a specified expression and breaks
* out of the control flow.
*
* @param conditions must not be {@literal null}.
* @return
*/
public static Switch switchCases(CaseOperator... conditions) {
return Switch.switchCases(conditions);
}
/**
* Creates new {@link AggregationExpression} that evaluates a series of {@link CaseOperator} expressions. When it
* finds an expression which evaluates to {@literal true}, {@code $switch} executes a specified expression and breaks
* out of the control flow.
*
* @param conditions must not be {@literal null}.
* @return
*/
public static Switch switchCases(List<CaseOperator> conditions) {
return Switch.switchCases(conditions);
}
public static class ConditionalOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
private final CriteriaDefinition criteriaDefinition;
/**
* Creates new {@link ConditionalOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public ConditionalOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
this.criteriaDefinition = null;
}
/**
* Creates new {@link ConditionalOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public ConditionalOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
this.criteriaDefinition = null;
}
/**
* Creates new {@link ConditionalOperatorFactory} for given {@link CriteriaDefinition}.
*
* @param criteriaDefinition must not be {@literal null}.
*/
public ConditionalOperatorFactory(CriteriaDefinition criteriaDefinition) {
Assert.notNull(criteriaDefinition, "CriteriaDefinition must not be null!");
this.fieldReference = null;
this.expression = null;
this.criteriaDefinition = criteriaDefinition;
}
/**
* Creates new {@link AggregationExpression} that evaluates a boolean expression to return one of the two specified
* return expressions.
*
* @param value must not be {@literal null}.
* @return
*/
public OtherwiseBuilder then(Object value) {
Assert.notNull(value, "Value must not be null!");
return createThenBuilder().then(value);
}
/**
* Creates new {@link AggregationExpression} that evaluates a boolean expression to return one of the two specified
* return expressions.
*
* @param expression must not be {@literal null}.
* @return
*/
public OtherwiseBuilder thenValueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return createThenBuilder().then(expression);
}
/**
* Creates new {@link AggregationExpression} that evaluates a boolean expression to return one of the two specified
* return expressions.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public OtherwiseBuilder thenValueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return createThenBuilder().then(fieldReference);
}
private ThenBuilder createThenBuilder() {
if (usesFieldRef()) {
return Cond.newBuilder().when(fieldReference);
}
return usesCriteriaDefinition() ? Cond.newBuilder().when(criteriaDefinition) : Cond.newBuilder().when(expression);
}
private boolean usesFieldRef() {
return this.fieldReference != null;
}
private boolean usesCriteriaDefinition() {
return this.criteriaDefinition != null;
}
}
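/*
 * Editor's note: a minimal usage sketch, not part of the original source. Assuming an import of
 * org.springframework.data.mongodb.core.query.Criteria and a hypothetical numeric field "qty":
 *
 *   ConditionalOperators.when(Criteria.where("qty").gte(250)).then("bulk").otherwise("standard")
 *
 * yields a $cond expression roughly equivalent to
 * { "$cond" : { "if" : { "$gte" : [ "$qty", 250 ] }, "then" : "bulk", "else" : "standard" } }.
 */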
/**
* Encapsulates the aggregation framework {@code $ifNull} operator. Replacement values can be either {@link Field
* field references}, {@link AggregationExpression expressions}, values of simple MongoDB types or values that can be
* converted to a simple MongoDB type.
*
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/ifNull/">https://docs.mongodb.com/manual/reference/operator/aggregation/ifNull/</a>
* @author Mark Paluch
*/
public static class IfNull implements AggregationExpression {
private final Object condition;
private final Object value;
private IfNull(Object condition, Object value) {
this.condition = condition;
this.value = value;
}
/**
* Creates new {@link IfNull}.
*
* @param fieldReference the field to check for a {@literal null} value, must not be {@literal null}.
* @return
*/
public static ThenBuilder ifNull(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IfNullOperatorBuilder().ifNull(fieldReference);
}
/**
* Creates new {@link IfNull}.
*
* @param expression the expression to check for a {@literal null} value, must not be {@literal null}.
* @return
*/
public static ThenBuilder ifNull(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new IfNullOperatorBuilder().ifNull(expression);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
List<Object> list = new ArrayList<Object>();
if (condition instanceof Field) {
list.add(context.getReference((Field) condition).toString());
} else if (condition instanceof AggregationExpression) {
list.add(((AggregationExpression) condition).toDbObject(context));
} else {
list.add(condition);
}
list.add(resolve(value, context));
return new BasicDBObject("$ifNull", list);
}
private Object resolve(Object value, AggregationOperationContext context) {
if (value instanceof Field) {
return context.getReference((Field) value).toString();
} else if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
} else if (value instanceof DBObject) {
return value;
}
return context.getMappedObject(new BasicDBObject("$set", value)).get("$set");
}
/**
* @author Mark Paluch
*/
public interface IfNullBuilder {
/**
* @param fieldReference the field to check for a {@literal null} value, field reference must not be
* {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder ifNull(String fieldReference);
/**
* @param expression the expression to check for a {@literal null} value, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder ifNull(AggregationExpression expression);
}
/**
* @author Mark Paluch
*/
public interface ThenBuilder {
/**
* @param value the value to be used if the {@code $ifNull} condition evaluates {@literal true}. Can be a
* {@link DBObject}, a value that is supported by MongoDB or a value that can be converted to a MongoDB
* representation but must not be {@literal null}.
* @return
*/
IfNull then(Object value);
/**
* @param fieldReference the field holding the replacement value, must not be {@literal null}.
* @return
*/
IfNull thenValueOf(String fieldReference);
/**
* @param expression the expression yielding to the replacement value, must not be {@literal null}.
* @return
*/
IfNull thenValueOf(AggregationExpression expression);
}
/**
* Builder for fluent {@link IfNull} creation.
*
* @author Mark Paluch
*/
static final class IfNullOperatorBuilder implements IfNullBuilder, ThenBuilder {
private Object condition;
private IfNullOperatorBuilder() {}
/**
* Creates a new builder for {@link IfNull}.
*
* @return never {@literal null}.
*/
public static IfNullOperatorBuilder newBuilder() {
return new IfNullOperatorBuilder();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.IfNullBuilder#ifNull(java.lang.String)
*/
public ThenBuilder ifNull(String fieldReference) {
Assert.hasText(fieldReference, "FieldReference name must not be null or empty!");
this.condition = Fields.field(fieldReference);
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.IfNullBuilder#ifNull(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public ThenBuilder ifNull(AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression name must not be null or empty!");
this.condition = expression;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#then(java.lang.Object)
*/
public IfNull then(Object value) {
return new IfNull(condition, value);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#thenValueOf(java.lang.String)
*/
public IfNull thenValueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IfNull(condition, Fields.field(fieldReference));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull.ThenBuilder#thenValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
public IfNull thenValueOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new IfNull(condition, expression);
}
}
}
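/*
 * Editor's note: a minimal usage sketch, not part of the original source; the field names are hypothetical.
 *
 *   ConditionalOperators.ifNull("middleName").then("n/a")              // { "$ifNull" : [ "$middleName", "n/a" ] }
 *   ConditionalOperators.ifNull("middleName").thenValueOf("firstName") // replacement taken from another field
 */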
/**
* {@link AggregationExpression} for {@code $switch}.
*
* @author Christoph Strobl
*/
public static class Switch extends AbstractAggregationExpression {
private Switch(java.util.Map<String, Object> values) {
super(values);
}
@Override
protected String getMongoMethod() {
return "$switch";
}
/**
* Creates new {@link Switch}.
*
* @param conditions must not be {@literal null}.
*/
public static Switch switchCases(CaseOperator... conditions) {
Assert.notNull(conditions, "Conditions must not be null!");
return switchCases(Arrays.asList(conditions));
}
/**
* Creates new {@link Switch}.
*
* @param conditions must not be {@literal null}.
*/
public static Switch switchCases(List<CaseOperator> conditions) {
Assert.notNull(conditions, "Conditions must not be null!");
return new Switch(Collections.<String, Object> singletonMap("branches", new ArrayList<CaseOperator>(conditions)));
}
public Switch defaultTo(Object value) {
return new Switch(append("default", value));
}
/**
* Encapsulates a single case document used within a {@code $switch} operation.
*/
public static class CaseOperator implements AggregationExpression {
private final AggregationExpression when;
private final Object then;
private CaseOperator(AggregationExpression when, Object then) {
this.when = when;
this.then = then;
}
public static ThenBuilder when(final AggregationExpression condition) {
Assert.notNull(condition, "Condition must not be null!");
return new ThenBuilder() {
@Override
public CaseOperator then(Object value) {
Assert.notNull(value, "Value must not be null!");
return new CaseOperator(condition, value);
}
};
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
DBObject dbo = new BasicDBObject("case", when.toDbObject(context));
if (then instanceof AggregationExpression) {
dbo.put("then", ((AggregationExpression) then).toDbObject(context));
} else if (then instanceof Field) {
dbo.put("then", context.getReference((Field) then).toString());
} else {
dbo.put("then", then);
}
return dbo;
}
/**
* @author Christoph Strobl
*/
public interface ThenBuilder {
/**
* Set the then {@literal value}.
*
* @param value must not be {@literal null}.
* @return
*/
CaseOperator then(Object value);
}
}
}
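/*
 * Editor's note: a minimal usage sketch, not part of the original source. Assuming static imports of
 * CaseOperator and of the Gte comparison class shown earlier, with a hypothetical "score" field:
 *
 *   ConditionalOperators.switchCases(
 *       CaseOperator.when(Gte.valueOf("score").greaterThanEqualToValue(90)).then("A"),
 *       CaseOperator.when(Gte.valueOf("score").greaterThanEqualToValue(80)).then("B"))
 *     .defaultTo("F")
 *
 * renders as { "$switch" : { "branches" : [ { "case" : ..., "then" : "A" }, ... ], "default" : "F" } }.
 */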
/**
* Encapsulates the aggregation framework {@code $cond} operator. A {@link Cond} allows nested conditions
* {@code if-then[if-then-else]-else} using {@link Field}, {@link CriteriaDefinition}, {@link AggregationExpression}
* or a {@link DBObject custom} condition. Replacement values can be either {@link Field field references},
* {@link AggregationExpression expressions}, values of simple MongoDB types or values that can be converted to a
* simple MongoDB type.
*
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/cond/">https://docs.mongodb.com/manual/reference/operator/aggregation/cond/</a>
* @author Mark Paluch
* @author Christoph Strobl
*/
public static class Cond implements AggregationExpression {
private final Object condition;
private final Object thenValue;
private final Object otherwiseValue;
/**
* Creates a new {@link Cond} for a given {@link Field} and {@code then}/{@code otherwise} values.
*
* @param condition must not be {@literal null}.
* @param thenValue must not be {@literal null}.
* @param otherwiseValue must not be {@literal null}.
*/
private Cond(Field condition, Object thenValue, Object otherwiseValue) {
this((Object) condition, thenValue, otherwiseValue);
}
/**
* Creates a new {@link Cond} for a given {@link CriteriaDefinition} and {@code then}/{@code otherwise} values.
*
* @param condition must not be {@literal null}.
* @param thenValue must not be {@literal null}.
* @param otherwiseValue must not be {@literal null}.
*/
private Cond(CriteriaDefinition condition, Object thenValue, Object otherwiseValue) {
this((Object) condition, thenValue, otherwiseValue);
}
private Cond(Object condition, Object thenValue, Object otherwiseValue) {
Assert.notNull(condition, "Condition must not be null!");
Assert.notNull(thenValue, "Then value must not be null!");
Assert.notNull(otherwiseValue, "Otherwise value must not be null!");
assertNotBuilder(condition, "Condition");
assertNotBuilder(thenValue, "Then value");
assertNotBuilder(otherwiseValue, "Otherwise value");
this.condition = condition;
this.thenValue = thenValue;
this.otherwiseValue = otherwiseValue;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
BasicDBObject condObject = new BasicDBObject();
condObject.append("if", resolveCriteria(context, condition));
condObject.append("then", resolveValue(context, thenValue));
condObject.append("else", resolveValue(context, otherwiseValue));
return new BasicDBObject("$cond", condObject);
}
private Object resolveValue(AggregationOperationContext context, Object value) {
if (value instanceof DBObject || value instanceof Field) {
return resolve(context, value);
}
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
return context.getMappedObject(new BasicDBObject("$set", value)).get("$set");
}
private Object resolveCriteria(AggregationOperationContext context, Object value) {
if (value instanceof DBObject || value instanceof Field) {
return resolve(context, value);
}
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
if (value instanceof CriteriaDefinition) {
DBObject mappedObject = context.getMappedObject(((CriteriaDefinition) value).getCriteriaObject());
List<Object> clauses = new ArrayList<Object>();
clauses.addAll(getClauses(context, mappedObject));
return clauses.size() == 1 ? clauses.get(0) : clauses;
}
throw new InvalidDataAccessApiUsageException(
String.format("Invalid value in condition. Supported: DBObject, Field references, Criteria, got: %s", value));
}
private List<Object> getClauses(AggregationOperationContext context, DBObject mappedObject) {
List<Object> clauses = new ArrayList<Object>();
for (String key : mappedObject.keySet()) {
Object predicate = mappedObject.get(key);
clauses.addAll(getClauses(context, key, predicate));
}
return clauses;
}
private List<Object> getClauses(AggregationOperationContext context, String key, Object predicate) {
List<Object> clauses = new ArrayList<Object>();
if (predicate instanceof List) {
List<Object> args = new ArrayList<Object>();
for (Object clause : (List<?>) predicate) {
if (clause instanceof DBObject) {
args.addAll(getClauses(context, (DBObject) clause));
}
}
clauses.add(new BasicDBObject(key, args));
} else if (predicate instanceof DBObject) {
DBObject nested = (DBObject) predicate;
for (String s : nested.keySet()) {
if (!isKeyword(s)) {
continue;
}
List<Object> args = new ArrayList<Object>();
args.add("$" + key);
args.add(nested.get(s));
clauses.add(new BasicDBObject(s, args));
}
} else if (!isKeyword(key)) {
List<Object> args = new ArrayList<Object>();
args.add("$" + key);
args.add(predicate);
clauses.add(new BasicDBObject("$eq", args));
}
return clauses;
}
/**
* Returns whether the given {@link String} is a MongoDB keyword.
*
* @param candidate
* @return
*/
private boolean isKeyword(String candidate) {
return candidate.startsWith("$");
}
private Object resolve(AggregationOperationContext context, Object value) {
if (value instanceof DBObject) {
return context.getMappedObject((DBObject) value);
}
return context.getReference((Field) value).toString();
}
private void assertNotBuilder(Object toCheck, String name) {
Assert.isTrue(!ClassUtils.isAssignableValue(ConditionalExpressionBuilder.class, toCheck),
String.format("%s must not be of type %s", name, ConditionalExpressionBuilder.class.getSimpleName()));
}
/**
* Get a builder that allows fluent creation of {@link Cond}.
*
* @return never {@literal null}.
*/
public static WhenBuilder newBuilder() {
return ConditionalExpressionBuilder.newBuilder();
}
/**
* Start creating new {@link Cond} by providing the boolean expression used in {@code if}.
*
* @param booleanExpression must not be {@literal null}.
* @return never {@literal null}.
*/
public static ThenBuilder when(DBObject booleanExpression) {
return ConditionalExpressionBuilder.newBuilder().when(booleanExpression);
}
/**
* Start creating new {@link Cond} by providing the {@link AggregationExpression} used in {@code if}.
*
* @param expression expression that yields a boolean result, must not be {@literal null}.
* @return never {@literal null}.
*/
public static ThenBuilder when(AggregationExpression expression) {
return ConditionalExpressionBuilder.newBuilder().when(expression);
}
/**
* Start creating new {@link Cond} by providing the field reference used in {@code if}.
*
* @param booleanField name of a field holding a boolean value, must not be {@literal null}.
* @return never {@literal null}.
*/
public static ThenBuilder when(String booleanField) {
return ConditionalExpressionBuilder.newBuilder().when(booleanField);
}
/**
* Start creating new {@link Cond} by providing the {@link CriteriaDefinition} used in {@code if}.
*
* @param criteria criteria to evaluate, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
public static ThenBuilder when(CriteriaDefinition criteria) {
return ConditionalExpressionBuilder.newBuilder().when(criteria);
}
/**
* @author Mark Paluch
*/
public interface WhenBuilder {
/**
* @param booleanExpression expression that yields a boolean result, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder when(DBObject booleanExpression);
/**
* @param expression expression that yields a boolean result, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder when(AggregationExpression expression);
/**
* @param booleanField name of a field holding a boolean value, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder when(String booleanField);
/**
* @param criteria criteria to evaluate, must not be {@literal null}.
* @return the {@link ThenBuilder}
*/
ThenBuilder when(CriteriaDefinition criteria);
}
/**
* @author Mark Paluch
*/
public interface ThenBuilder {
/**
* @param value the value to be used if the condition evaluates {@literal true}. Can be a {@link DBObject}, a
* value that is supported by MongoDB or a value that can be converted to a MongoDB representation but
* must not be {@literal null}.
* @return the {@link OtherwiseBuilder}
*/
OtherwiseBuilder then(Object value);
/**
* @param fieldReference must not be {@literal null}.
* @return the {@link OtherwiseBuilder}
*/
OtherwiseBuilder thenValueOf(String fieldReference);
/**
* @param expression must not be {@literal null}.
* @return the {@link OtherwiseBuilder}
*/
OtherwiseBuilder thenValueOf(AggregationExpression expression);
}
/**
* @author Mark Paluch
*/
public interface OtherwiseBuilder {
/**
* @param value the value to be used if the condition evaluates {@literal false}. Can be a {@link DBObject}, a
* value that is supported by MongoDB or a value that can be converted to a MongoDB representation but
* must not be {@literal null}.
* @return the {@link Cond}
*/
Cond otherwise(Object value);
/**
* @param fieldReference must not be {@literal null}.
* @return the {@link Cond}
*/
Cond otherwiseValueOf(String fieldReference);
/**
* @param expression must not be {@literal null}.
* @return the {@link Cond}
*/
Cond otherwiseValueOf(AggregationExpression expression);
}
/**
* Builder for fluent {@link Cond} creation.
*
* @author Mark Paluch
*/
static class ConditionalExpressionBuilder implements WhenBuilder, ThenBuilder, OtherwiseBuilder {
private Object condition;
private Object thenValue;
private ConditionalExpressionBuilder() {}
/**
* Creates a new builder for {@link Cond}.
*
* @return never {@literal null}.
*/
public static ConditionalExpressionBuilder newBuilder() {
return new ConditionalExpressionBuilder();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(com.mongodb.DBObject)
*/
@Override
public ConditionalExpressionBuilder when(DBObject booleanExpression) {
Assert.notNull(booleanExpression, "'Boolean expression' must not be null!");
this.condition = booleanExpression;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(org.springframework.data.mongodb.core.query.CriteriaDefinition)
*/
@Override
public ThenBuilder when(CriteriaDefinition criteria) {
Assert.notNull(criteria, "Criteria must not be null!");
this.condition = criteria;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public ThenBuilder when(AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression field must not be null!");
this.condition = expression;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.WhenBuilder#when(java.lang.String)
*/
@Override
public ThenBuilder when(String booleanField) {
Assert.hasText(booleanField, "Boolean field name must not be null or empty!");
this.condition = Fields.field(booleanField);
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#then(java.lang.Object)
*/
@Override
public OtherwiseBuilder then(Object thenValue) {
Assert.notNull(thenValue, "Then-value must not be null!");
this.thenValue = thenValue;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#thenValueOf(java.lang.String)
*/
@Override
public OtherwiseBuilder thenValueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.thenValue = Fields.field(fieldReference);
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.ThenBuilder#thenValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public OtherwiseBuilder thenValueOf(AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression must not be null!");
this.thenValue = expression;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwise(java.lang.Object)
*/
@Override
public Cond otherwise(Object otherwiseValue) {
Assert.notNull(otherwiseValue, "Value must not be null!");
return new Cond(condition, thenValue, otherwiseValue);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwiseValueOf(java.lang.String)
*/
@Override
public Cond otherwiseValueOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Cond(condition, thenValue, Fields.field(fieldReference));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond.OtherwiseBuilder#otherwiseValueOf(org.springframework.data.mongodb.core.aggregation.AggregationExpression)
*/
@Override
public Cond otherwiseValueOf(AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression must not be null!");
return new Cond(condition, thenValue, expression);
}
}
}
}


@@ -0,0 +1,82 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $count}-operation. <br />
* We recommend using the static factory method {@link Aggregation#count()} instead of creating instances of this
* class directly.
*
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/aggregation/count/#pipe._S_count">https://docs.mongodb.com/manual/reference/operator/aggregation/count/</a>
* @author Mark Paluch
* @since 1.10
*/
public class CountOperation implements FieldsExposingAggregationOperation {
private final String fieldName;
/**
* Creates a new {@link CountOperation} for the given {@literal fieldName}.
*
* @param fieldName must not be {@literal null} or empty.
*/
public CountOperation(String fieldName) {
Assert.hasText(fieldName, "Field name must not be null or empty!");
this.fieldName = fieldName;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$count", fieldName);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return ExposedFields.from(new ExposedField(fieldName, true));
}
/**
* Builder for {@link CountOperation}.
*
* @author Mark Paluch
*/
public static class CountOperationBuilder {
/**
* Returns the {@link CountOperation} to be applied, writing the count result to the field with the given name.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
public CountOperation as(String fieldName) {
return new CountOperation(fieldName);
}
}
}
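/*
 * Editor's note: a minimal usage sketch, not part of the original source. Assuming static imports of the
 * Aggregation factory methods, an import of Criteria, and a hypothetical "status" field:
 *
 *   newAggregation(match(Criteria.where("status").is("active")), count().as("activeCount"))
 *
 * appends a { "$count" : "activeCount" } stage to the pipeline.
 */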


@@ -0,0 +1,67 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.util.Assert;
/**
* Gateway to {@literal data type} expressions.
*
* @author Christoph Strobl
* @since 1.10
* @soundtrack Clawfinger - Catch Me
*/
public class DataTypeOperators {
/**
* Return the BSON data type of the given {@literal field}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Type typeOf(String fieldReference) {
return Type.typeOf(fieldReference);
}
/**
* {@link AggregationExpression} for {@code $type}.
*
* @author Christoph Strobl
*/
public static class Type extends AbstractAggregationExpression {
private Type(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$type";
}
/**
* Creates new {@link Type}.
*
* @param field must not be {@literal null}.
* @return
*/
public static Type typeOf(String field) {
Assert.notNull(field, "Field must not be null!");
return new Type(Fields.field(field));
}
}
}
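/*
 * Editor's note: a minimal usage sketch, not part of the original source. Assuming a static import of
 * Aggregation.project and a hypothetical "price" field, the expression might be used in a projection:
 *
 *   project("price").and(DataTypeOperators.typeOf("price")).as("priceType")
 *
 * which adds a { "$type" : "$price" } expression to the projection.
 */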


@@ -0,0 +1,837 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.LinkedHashMap;
import org.springframework.data.mongodb.core.aggregation.ArithmeticOperators.ArithmeticOperatorFactory;
import org.springframework.util.Assert;
/**
* Gateway to {@literal Date} aggregation operations.
*
* @author Christoph Strobl
* @since 1.10
*/
public class DateOperators {
/**
* Take the date referenced by given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static DateOperatorFactory dateOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new DateOperatorFactory(fieldReference);
}
/**
* Take the date resulting from the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static DateOperatorFactory dateOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new DateOperatorFactory(expression);
}
/**
* @author Christoph Strobl
*/
public static class DateOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
/**
* Creates new {@link DateOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public DateOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
}
/**
* Creates new {@link DateOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public DateOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
}
/**
* Creates new {@link AggregationExpression} that returns the day of the year for a date as a number between 1 and
* 366.
*
* @return
*/
public DayOfYear dayOfYear() {
return usesFieldRef() ? DayOfYear.dayOfYear(fieldReference) : DayOfYear.dayOfYear(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the day of the month for a date as a number between 1 and
* 31.
*
* @return
*/
public DayOfMonth dayOfMonth() {
return usesFieldRef() ? DayOfMonth.dayOfMonth(fieldReference) : DayOfMonth.dayOfMonth(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the day of the week for a date as a number between 1
* (Sunday) and 7 (Saturday).
*
* @return
*/
public DayOfWeek dayOfWeek() {
return usesFieldRef() ? DayOfWeek.dayOfWeek(fieldReference) : DayOfWeek.dayOfWeek(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the year portion of a date.
*
* @return
*/
public Year year() {
return usesFieldRef() ? Year.yearOf(fieldReference) : Year.yearOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the month of a date as a number between 1 and 12.
*
* @return
*/
public Month month() {
return usesFieldRef() ? Month.monthOf(fieldReference) : Month.monthOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the week of the year for a date as a number between 0 and
* 53.
*
* @return
*/
public Week week() {
return usesFieldRef() ? Week.weekOf(fieldReference) : Week.weekOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the hour portion of a date as a number between 0 and 23.
*
* @return
*/
public Hour hour() {
return usesFieldRef() ? Hour.hourOf(fieldReference) : Hour.hourOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the minute portion of a date as a number between 0 and 59.
*
* @return
*/
public Minute minute() {
return usesFieldRef() ? Minute.minuteOf(fieldReference) : Minute.minuteOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the second portion of a date as a number between 0 and 59,
* but can be 60 to account for leap seconds.
*
* @return
*/
public Second second() {
return usesFieldRef() ? Second.secondOf(fieldReference) : Second.secondOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the millisecond portion of a date as an integer between 0
* and 999.
*
* @return
*/
public Millisecond millisecond() {
return usesFieldRef() ? Millisecond.millisecondOf(fieldReference) : Millisecond.millisecondOf(expression);
}
/**
* Creates new {@link AggregationExpression} that converts a date object to a string according to a user-specified
* {@literal format}.
*
* @param format must not be {@literal null}.
* @return
*/
public DateToString toString(String format) {
return (usesFieldRef() ? DateToString.dateOf(fieldReference) : DateToString.dateOf(expression)).toString(format);
}
/**
* Creates new {@link AggregationExpression} that returns the weekday number in ISO 8601 format, ranging from 1 (for
* Monday) to 7 (for Sunday).
*
* @return
*/
public IsoDayOfWeek isoDayOfWeek() {
return usesFieldRef() ? IsoDayOfWeek.isoDayOfWeek(fieldReference) : IsoDayOfWeek.isoDayOfWeek(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the week number in ISO 8601 format, ranging from 1 to 53.
*
* @return
*/
public IsoWeek isoWeek() {
return usesFieldRef() ? IsoWeek.isoWeekOf(fieldReference) : IsoWeek.isoWeekOf(expression);
}
/**
* Creates new {@link AggregationExpression} that returns the year number in ISO 8601 format.
*
* @return
*/
public IsoWeekYear isoWeekYear() {
return usesFieldRef() ? IsoWeekYear.isoWeekYearOf(fieldReference) : IsoWeekYear.isoWeekYearOf(expression);
}
private boolean usesFieldRef() {
return fieldReference != null;
}
}
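/*
 * Editor's note: a minimal usage sketch, not part of the original source. Assuming a static import of
 * Aggregation.project and a hypothetical date field "purchaseDate":
 *
 *   project()
 *       .and(DateOperators.dateOf("purchaseDate").year()).as("year")
 *       .and(DateOperators.dateOf("purchaseDate").dayOfYear()).as("dayOfYear")
 *
 * which maps to the $year and $dayOfYear operators respectively.
 */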
/**
* {@link AggregationExpression} for {@code $dayOfYear}.
*
* @author Christoph Strobl
*/
public static class DayOfYear extends AbstractAggregationExpression {
private DayOfYear(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$dayOfYear";
}
/**
* Creates new {@link DayOfYear}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static DayOfYear dayOfYear(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new DayOfYear(Fields.field(fieldReference));
}
/**
* Creates new {@link DayOfYear}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static DayOfYear dayOfYear(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new DayOfYear(expression);
}
}
/**
* {@link AggregationExpression} for {@code $dayOfMonth}.
*
* @author Christoph Strobl
*/
public static class DayOfMonth extends AbstractAggregationExpression {
private DayOfMonth(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$dayOfMonth";
}
/**
* Creates new {@link DayOfMonth}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static DayOfMonth dayOfMonth(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new DayOfMonth(Fields.field(fieldReference));
}
/**
* Creates new {@link DayOfMonth}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static DayOfMonth dayOfMonth(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new DayOfMonth(expression);
}
}
/**
* {@link AggregationExpression} for {@code $dayOfWeek}.
*
* @author Christoph Strobl
*/
public static class DayOfWeek extends AbstractAggregationExpression {
private DayOfWeek(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$dayOfWeek";
}
/**
* Creates new {@link DayOfWeek}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static DayOfWeek dayOfWeek(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new DayOfWeek(Fields.field(fieldReference));
}
/**
* Creates new {@link DayOfWeek}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static DayOfWeek dayOfWeek(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new DayOfWeek(expression);
}
}
/**
* {@link AggregationExpression} for {@code $year}.
*
* @author Christoph Strobl
*/
public static class Year extends AbstractAggregationExpression {
private Year(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$year";
}
/**
* Creates new {@link Year}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Year yearOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Year(Fields.field(fieldReference));
}
/**
* Creates new {@link Year}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Year yearOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Year(expression);
}
}
/**
* {@link AggregationExpression} for {@code $month}.
*
* @author Christoph Strobl
*/
public static class Month extends AbstractAggregationExpression {
private Month(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$month";
}
/**
* Creates new {@link Month}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Month monthOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Month(Fields.field(fieldReference));
}
/**
* Creates new {@link Month}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Month monthOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Month(expression);
}
}
/**
* {@link AggregationExpression} for {@code $week}.
*
* @author Christoph Strobl
*/
public static class Week extends AbstractAggregationExpression {
private Week(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$week";
}
/**
* Creates new {@link Week}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Week weekOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Week(Fields.field(fieldReference));
}
/**
* Creates new {@link Week}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Week weekOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Week(expression);
}
}
/**
* {@link AggregationExpression} for {@code $hour}.
*
* @author Christoph Strobl
*/
public static class Hour extends AbstractAggregationExpression {
private Hour(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$hour";
}
/**
* Creates new {@link Hour}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Hour hourOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Hour(Fields.field(fieldReference));
}
/**
* Creates new {@link Hour}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Hour hourOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Hour(expression);
}
}
/**
* {@link AggregationExpression} for {@code $minute}.
*
* @author Christoph Strobl
*/
public static class Minute extends AbstractAggregationExpression {
private Minute(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$minute";
}
/**
* Creates new {@link Minute}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Minute minuteOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Minute(Fields.field(fieldReference));
}
/**
* Creates new {@link Minute}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Minute minuteOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Minute(expression);
}
}
/**
* {@link AggregationExpression} for {@code $second}.
*
* @author Christoph Strobl
*/
public static class Second extends AbstractAggregationExpression {
private Second(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$second";
}
/**
* Creates new {@link Second}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Second secondOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Second(Fields.field(fieldReference));
}
/**
* Creates new {@link Second}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Second secondOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Second(expression);
}
}
/**
* {@link AggregationExpression} for {@code $millisecond}.
*
* @author Christoph Strobl
*/
public static class Millisecond extends AbstractAggregationExpression {
private Millisecond(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$millisecond";
}
/**
* Creates new {@link Millisecond}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Millisecond millisecondOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Millisecond(Fields.field(fieldReference));
}
/**
* Creates new {@link Millisecond}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Millisecond millisecondOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Millisecond(expression);
}
}
/**
* {@link AggregationExpression} for {@code $dateToString}.
*
* @author Christoph Strobl
*/
public static class DateToString extends AbstractAggregationExpression {
private DateToString(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$dateToString";
}
/**
* Creates new {@link FormatBuilder} that allows defining the date format to apply.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static FormatBuilder dateOf(final String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new FormatBuilder() {
@Override
public DateToString toString(String format) {
Assert.notNull(format, "Format must not be null!");
return new DateToString(argumentMap(Fields.field(fieldReference), format));
}
};
}
/**
* Creates new {@link FormatBuilder} that allows defining the date format to apply.
*
* @param expression must not be {@literal null}.
* @return
*/
public static FormatBuilder dateOf(final AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new FormatBuilder() {
@Override
public DateToString toString(String format) {
Assert.notNull(format, "Format must not be null!");
return new DateToString(argumentMap(expression, format));
}
};
}
private static java.util.Map<String, Object> argumentMap(Object date, String format) {
java.util.Map<String, Object> args = new LinkedHashMap<String, Object>(2);
args.put("format", format);
args.put("date", date);
return args;
}
public interface FormatBuilder {
/**
* Creates new {@link DateToString} with all previously added arguments appending the given one.
*
* @param format must not be {@literal null}.
* @return
*/
DateToString toString(String format);
}
}
/**
* {@link AggregationExpression} for {@code $isoDayOfWeek}.
*
* @author Christoph Strobl
*/
public static class IsoDayOfWeek extends AbstractAggregationExpression {
private IsoDayOfWeek(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$isoDayOfWeek";
}
/**
* Creates new {@link IsoDayOfWeek}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static IsoDayOfWeek isoDayOfWeek(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IsoDayOfWeek(Fields.field(fieldReference));
}
/**
* Creates new {@link IsoDayOfWeek}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static IsoDayOfWeek isoDayOfWeek(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new IsoDayOfWeek(expression);
}
}
/**
* {@link AggregationExpression} for {@code $isoWeek}.
*
* @author Christoph Strobl
*/
public static class IsoWeek extends AbstractAggregationExpression {
private IsoWeek(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$isoWeek";
}
/**
* Creates new {@link IsoWeek}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static IsoWeek isoWeekOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IsoWeek(Fields.field(fieldReference));
}
/**
* Creates new {@link IsoWeek}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static IsoWeek isoWeekOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new IsoWeek(expression);
}
}
/**
* {@link AggregationExpression} for {@code $isoWeekYear}.
*
* @author Christoph Strobl
*/
public static class IsoWeekYear extends AbstractAggregationExpression {
private IsoWeekYear(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$isoWeekYear";
}
/**
* Creates new {@link IsoWeekYear}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static IsoWeekYear isoWeekYearOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new IsoWeekYear(Fields.field(fieldReference));
}
/**
* Creates new {@link IsoWeekYear}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static IsoWeekYear isoWeekYearOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new IsoWeekYear(expression);
}
}
}
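A quick usage sketch for the date expression classes above. This is an illustrative assumption rather than documented usage: it presumes these are the nested types of the DateOperators class and that ProjectionOperation#and(AggregationExpression) is available, as elsewhere in the 1.10 aggregation API.

ProjectionOperation projection = Aggregation.project()
    .and(DateToString.dateOf("joinDate").toString("%Y-%m-%d")).as("joined")  // { $dateToString : { format : "%Y-%m-%d", date : "$joinDate" } }
    .and(IsoDayOfWeek.isoDayOfWeek("joinDate")).as("isoDay")                 // { $isoDayOfWeek : "$joinDate" }
    .and(Second.secondOf("joinDate")).as("second");                          // { $second : "$joinDate" }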

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,14 +22,17 @@ import java.util.Iterator;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.util.Assert;
import org.springframework.util.CompositeIterator;
import org.springframework.util.ObjectUtils;
/**
* Value object to capture the fields exposed by an {@link AggregationOperation}.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Mark Paluch
* @since 1.3
*/
public final class ExposedFields implements Iterable<ExposedField> {
@@ -88,7 +91,7 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
/**
* Creates a new {@link ExposedFields} instance for the given fields in either sythetic or non-synthetic way.
* Creates a new {@link ExposedFields} instance for the given fields in either synthetic or non-synthetic way.
*
* @param fields must not be {@literal null}.
* @param synthetic
@@ -103,11 +106,11 @@ public final class ExposedFields implements Iterable<ExposedField> {
result.add(new ExposedField(field, synthetic));
}
return ExposedFields.from(result);
return from(result);
}
/**
* Creates a new {@link ExposedFields} with the given orignals and synthetics.
* Creates a new {@link ExposedFields} with the given originals and synthetics.
*
* @param originals must not be {@literal null}.
* @param synthetic must not be {@literal null}.
@@ -203,8 +206,13 @@ public final class ExposedFields implements Iterable<ExposedField> {
public Iterator<ExposedField> iterator() {
CompositeIterator<ExposedField> iterator = new CompositeIterator<ExposedField>();
iterator.add(syntheticFields.iterator());
iterator.add(originalFields.iterator());
if (!syntheticFields.isEmpty()) {
iterator.add(syntheticFields.iterator());
}
if (!originalFields.isEmpty()) {
iterator.add(originalFields.iterator());
}
return iterator;
}
@@ -330,12 +338,36 @@ public final class ExposedFields implements Iterable<ExposedField> {
}
}
/**
* A reference to an {@link ExposedField}.
*
* @author Christoph Strobl
* @since 1.10
*/
interface FieldReference {
/**
* Returns the raw, unqualified reference, i.e. the field reference without a {@literal $} prefix.
*
* @return
*/
String getRaw();
/**
* Returns the reference value for the given field reference. Will return 1 for a synthetic, unaliased field or the
* raw rendering of the reference otherwise.
*
* @return
*/
Object getReferenceValue();
}
/**
* A reference to an {@link ExposedField}.
*
* @author Oliver Gierke
*/
static class FieldReference {
static class DirectFieldReference implements FieldReference {
private final ExposedField field;
@@ -344,17 +376,16 @@ public final class ExposedFields implements Iterable<ExposedField> {
*
* @param field must not be {@literal null}.
*/
public FieldReference(ExposedField field) {
public DirectFieldReference(ExposedField field) {
Assert.notNull(field, "ExposedField must not be null!");
this.field = field;
}
/**
* Returns the raw, unqualified reference, i.e. the field reference without a {@literal $} prefix.
*
* @return
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference#getRaw()
*/
public String getRaw() {
@@ -362,11 +393,9 @@ public final class ExposedFields implements Iterable<ExposedField> {
return field.synthetic ? target : String.format("%s.%s", Fields.UNDERSCORE_ID, target);
}
/**
* Returns the referenve value for the given field reference. Will return 1 for a synthetic, unaliased field or the
* raw rendering of the reference otherwise.
*
* @return
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference#getReferenceValue()
*/
public Object getReferenceValue() {
return field.synthetic && !field.isAliased() ? 1 : toString();
@@ -378,6 +407,11 @@ public final class ExposedFields implements Iterable<ExposedField> {
*/
@Override
public String toString() {
if(getRaw().startsWith("$")) {
return getRaw();
}
return String.format("$%s", getRaw());
}
@@ -392,11 +426,11 @@ public final class ExposedFields implements Iterable<ExposedField> {
return true;
}
if (!(obj instanceof FieldReference)) {
if (!(obj instanceof DirectFieldReference)) {
return false;
}
FieldReference that = (FieldReference) obj;
DirectFieldReference that = (DirectFieldReference) obj;
return this.field.equals(that.field);
}
@@ -410,4 +444,78 @@ public final class ExposedFields implements Iterable<ExposedField> {
return field.hashCode();
}
}
/**
* A {@link FieldReference} to a {@link Field} used within a nested {@link AggregationExpression}.
*
* @author Christoph Strobl
* @since 1.10
*/
static class ExpressionFieldReference implements FieldReference {
private FieldReference delegate;
/**
* Creates a new {@link FieldReference} for the given {@link ExposedField}.
*
* @param field must not be {@literal null}.
*/
public ExpressionFieldReference(FieldReference field) {
delegate = field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference#getRaw()
*/
@Override
public String getRaw() {
return delegate.getRaw();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference#getReferenceValue()
*/
@Override
public Object getReferenceValue() {
return delegate.getReferenceValue();
}
@Override
public String toString() {
String fieldRef = delegate.toString();
if (fieldRef.startsWith("$$")) {
return fieldRef;
}
if (fieldRef.startsWith("$")) {
return "$" + fieldRef;
}
return fieldRef;
}
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (!(obj instanceof ExpressionFieldReference)) {
return false;
}
ExpressionFieldReference that = (ExpressionFieldReference) obj;
return ObjectUtils.nullSafeEquals(this.delegate, that.delegate);
}
@Override
public int hashCode() {
return delegate.hashCode();
}
}
}
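A rough illustration of how the two FieldReference implementations above render, derived purely from the toString() logic shown. Both types are package-private, so this is a sketch of internal behavior rather than public API:

FieldReference direct = new DirectFieldReference(new ExposedField("total", true));
direct.toString();    // "$total"

FieldReference nested = new ExpressionFieldReference(direct);
nested.toString();    // "$$total" – a second "$" marks an inner/expression variable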

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;
@@ -24,9 +25,10 @@ import com.mongodb.DBObject;
/**
* {@link AggregationOperationContext} that combines the available field references from a given
* {@code AggregationOperationContext} and an {@link FieldsExposingAggregationOperation}.
*
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @since 1.4
*/
class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
@@ -37,11 +39,12 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
/**
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
* {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
*
*
* @param exposedFields must not be {@literal null}.
* @param rootContext must not be {@literal null}.
*/
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields, AggregationOperationContext rootContext) {
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields,
AggregationOperationContext rootContext) {
Assert.notNull(exposedFields, "ExposedFields must not be null!");
Assert.notNull(rootContext, "RootContext must not be null!");
@@ -79,7 +82,7 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
/**
* Returns a {@link FieldReference} to the given {@link Field} with the given {@code name}.
*
*
* @param field may be {@literal null}
* @param name must not be {@literal null}
* @return
@@ -88,16 +91,32 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
Assert.notNull(name, "Name must not be null!");
FieldReference exposedField = resolveExposedField(field, name);
if (exposedField != null) {
return exposedField;
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
}
/**
* Resolves a {@link Field}/{@literal name} for a {@link FieldReference} if possible.
*
* @param field may be {@literal null}
* @param name must not be {@literal null}
* @return the resolved reference or {@literal null}
*/
protected FieldReference resolveExposedField(Field field, String name) {
ExposedField exposedField = exposedFields.getField(name);
if (exposedField != null) {
if (field != null) {
// we return a FieldReference to the given field directly to make sure that we reference the proper alias here.
return new FieldReference(new ExposedField(field, exposedField.isSynthetic()));
return new DirectFieldReference(new ExposedField(field, exposedField.isSynthetic()));
}
return new FieldReference(exposedField);
return new DirectFieldReference(exposedField);
}
if (name.contains(".")) {
@@ -108,10 +127,9 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
if (rootField != null) {
// We have to set synthetic to true, in order to render the field-name as is.
return new FieldReference(new ExposedField(name, true));
return new DirectFieldReference(new ExposedField(name, true));
}
}
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
return null;
}
}

View File

@@ -0,0 +1,228 @@
/*
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.BucketOperationSupport.Output;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $facet}-operation. <br />
* Facet of {@link AggregationOperation}s to be used in an {@link Aggregation}. Processes multiple
* {@link AggregationOperation} pipelines within a single stage on the same set of input documents. Each sub-pipeline
* has its own field in the output document where its results are stored as an array of documents.
* {@link FacetOperation} enables various aggregations on the same set of input documents, without needing to retrieve
* the input documents multiple times. <br />
* As of MongoDB 3.4, {@link FacetOperation} cannot be used with nested pipelines containing {@link GeoNearOperation},
* {@link OutOperation} and {@link FacetOperation}. <br />
* We recommend to use the static factory method {@link Aggregation#facet()} instead of creating instances of this class
* directly.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
* @see <a href="https://docs.mongodb.org/manual/reference/aggregation/facet/">MongoDB Aggregation Framework: $facet</a>
*/
public class FacetOperation implements FieldsExposingAggregationOperation {
/**
* Empty (initial) {@link FacetOperation}.
*/
public static final FacetOperation EMPTY = new FacetOperation();
private final Facets facets;
/**
* Creates a new {@link FacetOperation}.
*/
public FacetOperation() {
this(Facets.EMPTY);
}
private FacetOperation(Facets facets) {
this.facets = facets;
}
/**
* Creates a new {@link FacetOperationBuilder} to append a new facet using {@literal operations}. <br />
* {@link FacetOperationBuilder} takes a pipeline of {@link AggregationOperation} to categorize documents into a
* single facet.
*
* @param operations must not be {@literal null} or empty.
* @return
*/
public FacetOperationBuilder and(AggregationOperation... operations) {
Assert.notNull(operations, "AggregationOperations must not be null!");
Assert.notEmpty(operations, "AggregationOperations must not be empty!");
return new FacetOperationBuilder(facets, Arrays.asList(operations));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$facet", facets.toDBObject(context));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return facets.asExposedFields();
}
/**
* Builder for {@link FacetOperation} by adding existing and the new pipeline of {@link AggregationOperation} to the
* new {@link FacetOperation}.
*
* @author Mark Paluch
*/
public static class FacetOperationBuilder {
private final Facets current;
private final List<AggregationOperation> operations;
private FacetOperationBuilder(Facets current, List<AggregationOperation> operations) {
this.current = current;
this.operations = operations;
}
/**
* Creates a new {@link FacetOperation} that contains the configured pipeline of {@link AggregationOperation}
* exposed as {@literal fieldName} in the resulting facet document.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
public FacetOperation as(String fieldName) {
Assert.hasText(fieldName, "FieldName must not be null or empty!");
return new FacetOperation(current.and(fieldName, operations));
}
}
/**
* Encapsulates multiple {@link Facet}s
*
* @author Mark Paluch
*/
private static class Facets {
private static final Facets EMPTY = new Facets(Collections.<Facet> emptyList());
private List<Facet> facets;
/**
* Creates a new {@link Facets} given {@link List} of {@link Facet}.
*
* @param facets
*/
private Facets(List<Facet> facets) {
this.facets = facets;
}
/**
* @return the {@link ExposedFields} derived from {@link Output}.
*/
ExposedFields asExposedFields() {
ExposedFields fields = ExposedFields.from();
for (Facet facet : facets) {
fields = fields.and(facet.getExposedField());
}
return fields;
}
DBObject toDBObject(AggregationOperationContext context) {
DBObject dbObject = new BasicDBObject(facets.size());
for (Facet facet : facets) {
dbObject.put(facet.getExposedField().getName(), facet.toDBObjects(context));
}
return dbObject;
}
/**
* Adds a facet to this {@link Facets}.
*
* @param fieldName must not be {@literal null}.
* @param operations must not be {@literal null}.
* @return the new {@link Facets}.
*/
Facets and(String fieldName, List<AggregationOperation> operations) {
Assert.hasText(fieldName, "FieldName must not be null or empty!");
Assert.notNull(operations, "AggregationOperations must not be null!");
List<Facet> facets = new ArrayList<Facet>(this.facets.size() + 1);
facets.addAll(this.facets);
facets.add(new Facet(new ExposedField(fieldName, true), operations));
return new Facets(facets);
}
}
/**
* A single facet with a {@link ExposedField} and its {@link AggregationOperation} pipeline.
*
* @author Mark Paluch
*/
private static class Facet {
private final ExposedField exposedField;
private final List<AggregationOperation> operations;
/**
* Creates a new {@link Facet} given {@link ExposedField} and {@link AggregationOperation} pipeline.
*
* @param exposedField must not be {@literal null}.
* @param operations must not be {@literal null}.
*/
Facet(ExposedField exposedField, List<AggregationOperation> operations) {
Assert.notNull(exposedField, "ExposedField must not be null!");
Assert.notNull(operations, "AggregationOperations must not be null!");
this.exposedField = exposedField;
this.operations = operations;
}
ExposedField getExposedField() {
return exposedField;
}
List<DBObject> toDBObjects(AggregationOperationContext context) {
return AggregationOperationRenderer.toDBObject(operations, context);
}
}
}
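A minimal sketch of building a $facet stage with the builder above, assuming the Aggregation.facet() factory referenced in the class Javadoc together with the existing match/sort/limit factories:

FacetOperation facet = Aggregation.facet()
    .and(Aggregation.match(Criteria.where("year").is(2017)), Aggregation.limit(5)).as("recent")
    .and(Aggregation.sort(Sort.Direction.DESC, "price"), Aggregation.limit(3)).as("mostExpensive");

// Renders roughly as:
// { "$facet" : { "recent"        : [ { "$match" : { "year" : 2017 } }, { "$limit" : 5 } ],
//                "mostExpensive" : [ { "$sort" : { "price" : -1 } },   { "$limit" : 3 } ] } }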

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
@@ -185,6 +186,14 @@ public final class Fields implements Iterable<Field> {
return fields.iterator();
}
/**
* @return
* @since 1.10
*/
public List<Field> asList() {
return Collections.unmodifiableList(fields);
}
/**
* Value object to encapsulate a field in an aggregation operation.
*
@@ -192,6 +201,7 @@ public final class Fields implements Iterable<Field> {
*/
static class AggregationField implements Field {
private final String raw;
private final String name;
private final String target;
@@ -216,6 +226,7 @@ public final class Fields implements Iterable<Field> {
*/
public AggregationField(String name, String target) {
raw = name;
String nameToSet = cleanUp(name);
String targetToSet = cleanUp(target);
@@ -257,6 +268,11 @@ public final class Fields implements Iterable<Field> {
* @see org.springframework.data.mongodb.core.aggregation.Field#getAlias()
*/
public String getTarget() {
if (isLocalVar()) {
return this.getRaw();
}
return StringUtils.hasText(this.target) ? this.target : this.name;
}
@@ -269,6 +285,22 @@ public final class Fields implements Iterable<Field> {
return !getName().equals(getTarget());
}
/**
* @return {@literal true} in case the field name starts with {@code $$}.
* @since 1.10
*/
public boolean isLocalVar() {
return raw.startsWith("$$") && !raw.startsWith("$$$");
}
/**
* @return
* @since 1.10
*/
public String getRaw() {
return raw;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()

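A small sketch of the new local-variable handling in AggregationField, following the isLocalVar()/getTarget() logic above. This is an assumption-laden illustration; whether Fields.field(...) is the intended entry point for $$-prefixed names is not stated here:

Field variable = Fields.field("$$value");   // references an aggregation variable
variable.getTarget();                        // "$$value" – raw form kept because isLocalVar() is true

Field regular = Fields.field("amount");
regular.getTarget();                         // "amount"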
View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,17 +16,28 @@
package org.springframework.data.mongodb.core.aggregation;
/**
* {@link AggregationOperation} that exposes new {@link ExposedFields} that can be used for later aggregation pipeline
* {@code AggregationOperation}s.
*
* {@link AggregationOperation} that exposes {@link ExposedFields} that can be used for later aggregation pipeline
* {@code AggregationOperation}s. A {@link FieldsExposingAggregationOperation} implementing the
* {@link InheritsFieldsAggregationOperation} will expose fields from its parent operations. Not implementing
* {@link InheritsFieldsAggregationOperation} will replace existing exposed fields.
*
* @author Thomas Darimont
* @author Mark Paluch
*/
public interface FieldsExposingAggregationOperation extends AggregationOperation {
/**
* Returns the fields exposed by the {@link AggregationOperation}.
*
*
* @return will never be {@literal null}.
*/
ExposedFields getFields();
/**
* Marker interface for {@link AggregationOperation} that inherits fields from previous operations.
*/
static interface InheritsFieldsAggregationOperation extends FieldsExposingAggregationOperation {
}
}

View File

@@ -0,0 +1,411 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $graphLookup}-operation. <br />
* Performs a recursive search on a collection, with options for restricting the search by recursion depth and query
* filter. <br />
* We recommend to use the static factory method {@link Aggregation#graphLookup(String)} instead of creating instances
* of this class directly.
*
* @see <a href=
* "https://docs.mongodb.org/manual/reference/aggregation/graphLookup/">https://docs.mongodb.org/manual/reference/aggregation/graphLookup/</a>
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
*/
public class GraphLookupOperation implements InheritsFieldsAggregationOperation {
private static final Set<Class<?>> ALLOWED_START_TYPES = new HashSet<Class<?>>(
Arrays.<Class<?>> asList(AggregationExpression.class, String.class, Field.class, DBObject.class));
private final String from;
private final List<Object> startWith;
private final Field connectFrom;
private final Field connectTo;
private final Field as;
private final Long maxDepth;
private final Field depthField;
private final CriteriaDefinition restrictSearchWithMatch;
private GraphLookupOperation(String from, List<Object> startWith, Field connectFrom, Field connectTo, Field as,
Long maxDepth, Field depthField, CriteriaDefinition restrictSearchWithMatch) {
this.from = from;
this.startWith = startWith;
this.connectFrom = connectFrom;
this.connectTo = connectTo;
this.as = as;
this.maxDepth = maxDepth;
this.depthField = depthField;
this.restrictSearchWithMatch = restrictSearchWithMatch;
}
/**
* Creates a new {@link FromBuilder} to build {@link GraphLookupOperation}.
*
* @return a new {@link FromBuilder}.
*/
public static FromBuilder builder() {
return new GraphLookupOperationFromBuilder();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
DBObject graphLookup = new BasicDBObject();
graphLookup.put("from", from);
List<Object> mappedStartWith = new ArrayList<Object>(startWith.size());
for (Object startWithElement : startWith) {
if (startWithElement instanceof AggregationExpression) {
mappedStartWith.add(((AggregationExpression) startWithElement).toDbObject(context));
} else if (startWithElement instanceof Field) {
mappedStartWith.add(context.getReference((Field) startWithElement).toString());
} else {
mappedStartWith.add(startWithElement);
}
}
graphLookup.put("startWith", mappedStartWith.size() == 1 ? mappedStartWith.iterator().next() : mappedStartWith);
graphLookup.put("connectFromField", connectFrom.getName());
graphLookup.put("connectToField", connectTo.getName());
graphLookup.put("as", as.getName());
if (maxDepth != null) {
graphLookup.put("maxDepth", maxDepth);
}
if (depthField != null) {
graphLookup.put("depthField", depthField.getName());
}
if (restrictSearchWithMatch != null) {
graphLookup.put("restrictSearchWithMatch", context.getMappedObject(restrictSearchWithMatch.getCriteriaObject()));
}
return new BasicDBObject("$graphLookup", graphLookup);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return ExposedFields.from(new ExposedField(as, true));
}
/**
* @author Mark Paluch
*/
public interface FromBuilder {
/**
* Set the {@literal collectionName} to apply the {@code $graphLookup} to.
*
* @param collectionName must not be {@literal null} or empty.
* @return
*/
StartWithBuilder from(String collectionName);
}
/**
* @author Mark Paluch
* @author Christoph Strobl
*/
public interface StartWithBuilder {
/**
* Set the startWith {@literal fieldReferences} to apply the {@code $graphLookup} to.
*
* @param fieldReferences must not be {@literal null}.
* @return
*/
ConnectFromBuilder startWith(String... fieldReferences);
/**
* Set the startWith {@literal expressions} to apply the {@code $graphLookup} to.
*
* @param expressions must not be {@literal null}.
* @return
*/
ConnectFromBuilder startWith(AggregationExpression... expressions);
/**
* Set the startWith as either {@literal fieldReferences}, {@link Fields}, {@link DBObject} or
* {@link AggregationExpression} to apply the {@code $graphLookup} to.
*
* @param expressions must not be {@literal null}.
* @return
* @throws IllegalArgumentException
*/
ConnectFromBuilder startWith(Object... expressions);
}
/**
* @author Mark Paluch
*/
public interface ConnectFromBuilder {
/**
* Set the connectFrom {@literal fieldName} to apply the {@code $graphLookup} to.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
ConnectToBuilder connectFrom(String fieldName);
}
/**
* @author Mark Paluch
*/
public interface ConnectToBuilder {
/**
* Set the connectTo {@literal fieldName} to apply the {@code $graphLookup} to.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
GraphLookupOperationBuilder connectTo(String fieldName);
}
/**
* Builder to build the initial {@link GraphLookupOperationBuilder} that configures the initial mandatory set of
* {@link GraphLookupOperation} properties.
*
* @author Mark Paluch
*/
static final class GraphLookupOperationFromBuilder
implements FromBuilder, StartWithBuilder, ConnectFromBuilder, ConnectToBuilder {
private String from;
private List<? extends Object> startWith;
private String connectFrom;
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.FromBuilder#from(java.lang.String)
*/
@Override
public StartWithBuilder from(String collectionName) {
Assert.hasText(collectionName, "CollectionName must not be null or empty!");
this.from = collectionName;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.StartWithBuilder#startWith(java.lang.String[])
*/
@Override
public ConnectFromBuilder startWith(String... fieldReferences) {
Assert.notNull(fieldReferences, "FieldReferences must not be null!");
Assert.noNullElements(fieldReferences, "FieldReferences must not contain null elements!");
List<Object> fields = new ArrayList<Object>(fieldReferences.length);
for (String fieldReference : fieldReferences) {
fields.add(Fields.field(fieldReference));
}
this.startWith = fields;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.StartWithBuilder#startWith(org.springframework.data.mongodb.core.aggregation.AggregationExpression[])
*/
@Override
public ConnectFromBuilder startWith(AggregationExpression... expressions) {
Assert.notNull(expressions, "AggregationExpressions must not be null!");
Assert.noNullElements(expressions, "AggregationExpressions must not contain null elements!");
this.startWith = Arrays.asList(expressions);
return this;
}
@Override
public ConnectFromBuilder startWith(Object... expressions) {
Assert.notNull(expressions, "Expressions must not be null!");
Assert.noNullElements(expressions, "Expressions must not contain null elements!");
this.startWith = verifyAndPotentiallyTransformStartsWithTypes(expressions);
return this;
}
private List<Object> verifyAndPotentiallyTransformStartsWithTypes(Object... expressions) {
List<Object> expressionsToUse = new ArrayList<Object>(expressions.length);
for (Object expression : expressions) {
assertStartWithType(expression);
if (expression instanceof String) {
expressionsToUse.add(Fields.field((String) expression));
} else {
expressionsToUse.add(expression);
}
}
return expressionsToUse;
}
private void assertStartWithType(Object expression) {
for (Class<?> type : ALLOWED_START_TYPES) {
if (ClassUtils.isAssignable(type, expression.getClass())) {
return;
}
}
throw new IllegalArgumentException(
String.format("Expression must be any of %s but was %s", ALLOWED_START_TYPES, expression.getClass()));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.ConnectFromBuilder#connectFrom(java.lang.String)
*/
@Override
public ConnectToBuilder connectFrom(String fieldName) {
Assert.hasText(fieldName, "ConnectFrom must not be null or empty!");
this.connectFrom = fieldName;
return this;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.ConnectToBuilder#connectTo(java.lang.String)
*/
@Override
public GraphLookupOperationBuilder connectTo(String fieldName) {
Assert.hasText(fieldName, "ConnectTo must not be null or empty!");
return new GraphLookupOperationBuilder(from, startWith, connectFrom, fieldName);
}
}
/**
* @author Mark Paluch
*/
static final class GraphLookupOperationBuilder {
private final String from;
private final List<Object> startWith;
private final Field connectFrom;
private final Field connectTo;
private Long maxDepth;
private Field depthField;
private CriteriaDefinition restrictSearchWithMatch;
protected GraphLookupOperationBuilder(String from, List<? extends Object> startWith, String connectFrom,
String connectTo) {
this.from = from;
this.startWith = new ArrayList<Object>(startWith);
this.connectFrom = Fields.field(connectFrom);
this.connectTo = Fields.field(connectTo);
}
/**
* Optionally limit the number of recursions.
*
* @param numberOfRecursions must be greater or equal to zero.
* @return
*/
public GraphLookupOperationBuilder maxDepth(long numberOfRecursions) {
Assert.isTrue(numberOfRecursions >= 0, "Max depth must be >= 0!");
this.maxDepth = numberOfRecursions;
return this;
}
/**
* Optionally add a depth field {@literal fieldName} to each traversed document in the search path.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
public GraphLookupOperationBuilder depthField(String fieldName) {
Assert.hasText(fieldName, "Depth field name must not be null or empty!");
this.depthField = Fields.field(fieldName);
return this;
}
/**
* Optionally add a query specifying conditions to the recursive search.
*
* @param criteriaDefinition must not be {@literal null}.
* @return
*/
public GraphLookupOperationBuilder restrict(CriteriaDefinition criteriaDefinition) {
Assert.notNull(criteriaDefinition, "CriteriaDefinition must not be null!");
this.restrictSearchWithMatch = criteriaDefinition;
return this;
}
/**
* Set the name of the array field added to each output document and return the final {@link GraphLookupOperation}.
* Contains the documents traversed in the {@literal $graphLookup} stage to reach the document.
*
* @param fieldName must not be {@literal null} or empty.
* @return the final {@link GraphLookupOperation}.
*/
public GraphLookupOperation as(String fieldName) {
Assert.hasText(fieldName, "As field name must not be null or empty!");
return new GraphLookupOperation(from, startWith, connectFrom, connectTo, Fields.field(fieldName), maxDepth,
depthField, restrictSearchWithMatch);
}
}
}
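A short usage sketch for the builder above, mirroring MongoDB's reporting-hierarchy example; it uses only the builder() entry point shown in this class:

GraphLookupOperation graphLookup = GraphLookupOperation.builder()
    .from("employees")
    .startWith("reportsTo")
    .connectFrom("reportsTo")
    .connectTo("name")
    .maxDepth(5)
    .depthField("depth")
    .as("reportingHierarchy");

// Renders roughly as:
// { "$graphLookup" : { "from" : "employees", "startWith" : "$reportsTo",
//                      "connectFromField" : "reportsTo", "connectToField" : "name",
//                      "maxDepth" : 5, "depthField" : "depth", "as" : "reportingHierarchy" } }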

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -35,11 +35,13 @@ import com.mongodb.DBObject;
* We recommend to use the static factory method {@link Aggregation#group(Fields)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/group/#stage._S_group
* @author Sebastian Herold
* @author Thomas Darimont
* @author Oliver Gierke
* @author Gustavo de Geus
* @author Christoph Strobl
* @since 1.3
* @see <a href="https://docs.mongodb.org/manual/reference/aggregation/group/">MongoDB Aggregation Framework: $group</a>
*/
public class GroupOperation implements FieldsExposingAggregationOperation {
@@ -307,6 +309,51 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.MAX, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $stdDevSamp}-expression for the given
* field-reference.
*
* @param reference must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public GroupOperationBuilder stdDevSamp(String reference) {
return newBuilder(GroupOps.STD_DEV_SAMP, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $stdDevSamp}-expression for the given {@link AggregationExpression}.
*
* @param expr must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public GroupOperationBuilder stdDevSamp(AggregationExpression expr) {
return newBuilder(GroupOps.STD_DEV_SAMP, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $stdDevPop}-expression for the given field-reference.
*
* @param reference must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public GroupOperationBuilder stdDevPop(String reference) {
return newBuilder(GroupOps.STD_DEV_POP, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $stdDevPop}-expression for the given {@link AggregationExpression}.
*
* @param expr must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public GroupOperationBuilder stdDevPop(AggregationExpression expr) {
return newBuilder(GroupOps.STD_DEV_POP, null, expr);
}
private GroupOperationBuilder newBuilder(Keyword keyword, String reference, Object value) {
return new GroupOperationBuilder(this, new Operation(keyword, null, reference, value));
}
@@ -371,21 +418,18 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
private static enum GroupOps implements Keyword {
SUM, LAST, FIRST, PUSH, AVG, MIN, MAX, ADD_TO_SET, COUNT;
SUM("$sum"), LAST("$last"), FIRST("$first"), PUSH("$push"), AVG("$avg"), MIN("$min"), MAX("$max"), ADD_TO_SET("$addToSet"), STD_DEV_POP("$stdDevPop"), STD_DEV_SAMP("$stdDevSamp");
private String mongoOperator;
GroupOps(String mongoOperator) {
this.mongoOperator = mongoOperator;
}
@Override
public String toString() {
String[] parts = name().split("_");
StringBuilder builder = new StringBuilder();
for (String part : parts) {
String lowerCase = part.toLowerCase(Locale.US);
builder.append(builder.length() == 0 ? lowerCase : StringUtils.capitalize(lowerCase));
}
return "$" + builder.toString();
return mongoOperator;
}
}
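The new $stdDevSamp / $stdDevPop builders combine with the existing group(...) factory like the other accumulators; a brief sketch:

GroupOperation group = Aggregation.group("category")
    .stdDevPop("price").as("priceStdDevPop")
    .stdDevSamp("price").as("priceStdDevSamp");

// Expected rendering:
// { "$group" : { "_id" : "$category",
//                "priceStdDevPop"  : { "$stdDevPop"  : "$price" },
//                "priceStdDevSamp" : { "$stdDevSamp" : "$price" } } }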

View File

@@ -0,0 +1,64 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
/**
* {@link ExposedFieldsAggregationOperationContext} that inherits fields from its parent
* {@link AggregationOperationContext}.
*
* @author Mark Paluch
* @since 1.9
*/
class InheritingExposedFieldsAggregationOperationContext extends ExposedFieldsAggregationOperationContext {
private final AggregationOperationContext previousContext;
/**
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}. Uses the given
* {@link AggregationOperationContext} to perform a mapping to mongo types if necessary.
*
* @param exposedFields must not be {@literal null}.
* @param previousContext must not be {@literal null}.
*/
public InheritingExposedFieldsAggregationOperationContext(ExposedFields exposedFields,
AggregationOperationContext previousContext) {
super(exposedFields, previousContext);
this.previousContext = previousContext;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFieldsAggregationOperationContext#resolveExposedField(org.springframework.data.mongodb.core.aggregation.Field, java.lang.String)
*/
@Override
protected FieldReference resolveExposedField(Field field, String name) {
FieldReference fieldReference = super.resolveExposedField(field, name);
if (fieldReference != null) {
return fieldReference;
}
if (field != null) {
return previousContext.getReference(field);
}
return previousContext.getReference(name);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,10 +26,10 @@ import com.mongodb.DBObject;
* We recommend to use the static factory method {@link Aggregation#limit(long)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/limit/
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
* @see <a href="https://docs.mongodb.org/manual/reference/aggregation/limit/">MongoDB Aggregation Framework: $limit</a>
*/
public class LimitOperation implements AggregationOperation {

View File

@@ -0,0 +1,96 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.util.Assert;
/**
* Gateway to {@literal literal} aggregation operations.
*
* @author Christoph Strobl
* @since 1.10
*/
public class LiteralOperators {
/**
* Take the value referenced by given {@literal value}.
*
* @param value must not be {@literal null}.
* @return
*/
public static LiteralOperatorFactory valueOf(Object value) {
Assert.notNull(value, "Value must not be null!");
return new LiteralOperatorFactory(value);
}
/**
* @author Christoph Strobl
*/
public static class LiteralOperatorFactory {
private final Object value;
/**
* Creates new {@link LiteralOperatorFactory} for given {@literal value}.
*
* @param value must not be {@literal null}.
*/
public LiteralOperatorFactory(Object value) {
Assert.notNull(value, "Value must not be null!");
this.value = value;
}
/**
* Creates new {@link Literal} that returns the associated value without parsing.
*
* @return
*/
public Literal asLiteral() {
return Literal.asLiteral(value);
}
}
/**
* {@link AggregationExpression} for {@code $literal}.
*
* @author Christoph Strobl
*/
public static class Literal extends AbstractAggregationExpression {
private Literal(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$literal";
}
/**
* Creates new {@link Literal}.
*
* @param value must not be {@literal null}.
* @return
*/
public static Literal asLiteral(Object value) {
Assert.notNull(value, "Value must not be null!");
return new Literal(value);
}
}
}
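A tiny sketch of the $literal operator above, useful when a value starting with $ must be passed through unparsed instead of being treated as a field reference:

AggregationExpression literal = LiteralOperators.valueOf("$mightLookLikeAFieldRef").asLiteral();
// Renders as: { "$literal" : "$mightLookLikeAFieldRef" }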

View File

@@ -0,0 +1,196 @@
/*
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $lookup}-operation. We recommend to use the static factory method
* {@link Aggregation#lookup(String, String, String, String)} instead of creating instances of this class directly.
*
* @author Alessio Fachechi
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.9
* @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/lookup/">MongoDB Aggregation Framework: $lookup</a>
*/
public class LookupOperation implements FieldsExposingAggregationOperation, InheritsFieldsAggregationOperation {
private Field from;
private Field localField;
private Field foreignField;
private ExposedField as;
/**
* Creates a new {@link LookupOperation} for the given {@link Field}s.
*
* @param from must not be {@literal null}.
* @param localField must not be {@literal null}.
* @param foreignField must not be {@literal null}.
* @param as must not be {@literal null}.
*/
public LookupOperation(Field from, Field localField, Field foreignField, Field as) {
Assert.notNull(from, "From must not be null!");
Assert.notNull(localField, "LocalField must not be null!");
Assert.notNull(foreignField, "ForeignField must not be null!");
Assert.notNull(as, "As must not be null!");
this.from = from;
this.localField = localField;
this.foreignField = foreignField;
this.as = new ExposedField(as, true);
}
private LookupOperation() {
// used by builder
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return ExposedFields.from(as);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
BasicDBObject lookupObject = new BasicDBObject();
lookupObject.append("from", from.getTarget());
lookupObject.append("localField", localField.getTarget());
lookupObject.append("foreignField", foreignField.getTarget());
lookupObject.append("as", as.getTarget());
return new BasicDBObject("$lookup", lookupObject);
}
/**
* Get a builder that allows creation of {@link LookupOperation}.
*
* @return
*/
public static FromBuilder newLookup() {
return new LookupOperationBuilder();
}
public static interface FromBuilder {
/**
* @param name the collection in the same database to perform the join with, must not be {@literal null} or empty.
* @return
*/
LocalFieldBuilder from(String name);
}
public static interface LocalFieldBuilder {
/**
* @param name the field from the documents input to the {@code $lookup} stage, must not be {@literal null} or
* empty.
* @return
*/
ForeignFieldBuilder localField(String name);
}
public static interface ForeignFieldBuilder {
/**
* @param name the field from the documents in the {@code from} collection, must not be {@literal null} or empty.
* @return
*/
AsBuilder foreignField(String name);
}
public static interface AsBuilder {
/**
* @param name the name of the new array field to add to the input documents, must not be {@literal null} or empty.
* @return
*/
LookupOperation as(String name);
}
/**
* Builder for fluent {@link LookupOperation} creation.
*
* @author Christoph Strobl
* @since 1.9
*/
public static final class LookupOperationBuilder
implements FromBuilder, LocalFieldBuilder, ForeignFieldBuilder, AsBuilder {
private final LookupOperation lookupOperation;
private LookupOperationBuilder() {
this.lookupOperation = new LookupOperation();
}
/**
* Creates new builder for {@link LookupOperation}.
*
* @return never {@literal null}.
*/
public static FromBuilder newBuilder() {
return new LookupOperationBuilder();
}
@Override
public LocalFieldBuilder from(String name) {
Assert.hasText(name, "'From' must not be null or empty!");
lookupOperation.from = Fields.field(name);
return this;
}
@Override
public LookupOperation as(String name) {
Assert.hasText(name, "'As' must not be null or empty!");
lookupOperation.as = new ExposedField(Fields.field(name), true);
return new LookupOperation(lookupOperation.from, lookupOperation.localField, lookupOperation.foreignField,
lookupOperation.as);
}
@Override
public AsBuilder foreignField(String name) {
Assert.hasText(name, "'ForeignField' must not be null or empty!");
lookupOperation.foreignField = Fields.field(name);
return this;
}
@Override
public ForeignFieldBuilder localField(String name) {
Assert.hasText(name, "'LocalField' must not be null or empty!");
lookupOperation.localField = Fields.field(name);
return this;
}
}
}
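Both construction styles above end up with the same stage; a brief sketch (the Aggregation.lookup(...) shorthand is the factory referenced in the class Javadoc):

LookupOperation lookup = LookupOperation.newLookup()
    .from("orders")
    .localField("_id")
    .foreignField("customerId")
    .as("customerOrders");

// Equivalent shorthand: Aggregation.lookup("orders", "_id", "customerId", "customerOrders")
// Renders as: { "$lookup" : { "from" : "orders", "localField" : "_id",
//                             "foreignField" : "customerId", "as" : "customerOrders" } }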

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,11 +28,11 @@ import com.mongodb.DBObject;
* {@link Aggregation#match(org.springframework.data.mongodb.core.query.Criteria)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/match/
* @author Sebastian Herold
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
* @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/match/">MongoDB Aggregation Framework: $match</a>
*/
public class MatchOperation implements AggregationOperation {

View File

@@ -0,0 +1,73 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExpressionFieldReference;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* {@link AggregationOperationContext} that delegates {@link FieldReference} resolution and mapping to a parent one, but
* assures {@link FieldReference} get converted into {@link ExpressionFieldReference} using {@code $$} to ref an inner
* variable.
*
* @author Christoph Strobl
* @since 1.10
*/
class NestedDelegatingExpressionAggregationOperationContext implements AggregationOperationContext {
private final AggregationOperationContext delegate;
/**
* Creates new {@link NestedDelegatingExpressionAggregationOperationContext}.
*
* @param referenceContext must not be {@literal null}.
*/
public NestedDelegatingExpressionAggregationOperationContext(AggregationOperationContext referenceContext) {
Assert.notNull(referenceContext, "Reference context must not be null!");
this.delegate = referenceContext;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return delegate.getMappedObject(dbObject);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.Field)
*/
@Override
public FieldReference getReference(Field field) {
return new ExpressionFieldReference(delegate.getReference(field));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return new ExpressionFieldReference(delegate.getReference(name));
}
}

View File

@@ -0,0 +1,51 @@
/*
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.util.Assert;
/**
* Encapsulates the {@code $out}-operation.
* <p>
* We recommend to use the static factory method {@link Aggregation#out(String)} instead of creating instances of this
* class directly.
*
* @author Nikolay Bogdanov
* @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/out/">MongoDB Aggregation Framework: $out</a>
*/
public class OutOperation implements AggregationOperation {
private final String collectionName;
/**
* @param outCollectionName Collection name to export the results. Must not be {@literal null}.
*/
public OutOperation(String outCollectionName) {
Assert.notNull(outCollectionName, "Collection name must not be null!");
this.collectionName = outCollectionName;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$out", collectionName);
}
}
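A minimal sketch of terminating a pipeline with $out, using the Aggregation.out(String) factory recommended in the Javadoc above (collection and field names are made up for illustration):

Aggregation aggregation = Aggregation.newAggregation(
    Aggregation.match(Criteria.where("status").is("ACTIVE")),
    Aggregation.out("activeUsers"));   // appends { "$out" : "activeUsers" } as the last stage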

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,11 +17,16 @@ package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.Cond;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators.IfNull;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder.FieldProjection;
import org.springframework.data.mongodb.core.aggregation.VariableOperators.Let.ExpressionVariable;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
@@ -36,11 +41,13 @@ import com.mongodb.DBObject;
* We recommend to use the static factory method {@link Aggregation#project(Fields)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/project/
* @author Tobias Trelle
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.3
* @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/project/">MongoDB Aggregation Framework: $project</a>
*/
public class ProjectionOperation implements FieldsExposingAggregationOperation {
@@ -102,8 +109,8 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*/
private ProjectionOperation andReplaceLastOneWith(Projection projection) {
List<Projection> projections = this.projections.isEmpty() ? Collections.<Projection> emptyList() : this.projections
.subList(0, this.projections.size() - 1);
List<Projection> projections = this.projections.isEmpty() ? Collections.<Projection> emptyList()
: this.projections.subList(0, this.projections.size() - 1);
return new ProjectionOperation(projections, Arrays.asList(projection));
}
@@ -238,6 +245,24 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @return
*/
public abstract ProjectionOperation as(String alias);
/**
* Apply a conditional projection using {@link Cond}.
*
* @param cond must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public abstract ProjectionOperation applyCondition(Cond cond);
/**
* Apply a conditional value replacement for {@literal null} values using {@link IfNull}.
*
* @param ifNull must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public abstract ProjectionOperation applyCondition(IfNull ifNull);
}
/**
@@ -339,7 +364,8 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return new BasicDBObject(getExposedField().getName(), toMongoExpression(context, expression, params));
}
protected static Object toMongoExpression(AggregationOperationContext context, String expression, Object[] params) {
protected static Object toMongoExpression(AggregationOperationContext context, String expression,
Object[] params) {
return TRANSFORMER.transform(expression, context, params);
}
}
@@ -350,6 +376,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public static class ProjectionOperationBuilder extends AbstractProjectionOperationBuilder {
@@ -367,7 +394,8 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
* @param operation must not be {@literal null}.
* @param previousProjection the previous operation projection, may be {@literal null}.
*/
public ProjectionOperationBuilder(String name, ProjectionOperation operation, OperationProjection previousProjection) {
public ProjectionOperationBuilder(String name, ProjectionOperation operation,
OperationProjection previousProjection) {
super(name, operation);
this.name = name;
@@ -416,7 +444,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Allows to specify an alias for the previous projection operation.
*
* @param string
* @param alias
* @return
*/
@Override
@@ -433,6 +461,28 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return this.operation.and(new FieldProjection(Fields.field(alias, name), null));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.AbstractProjectionOperationBuilder#transform(org.springframework.data.mongodb.core.aggregation.ConditionalOperator)
*/
@Override
public ProjectionOperation applyCondition(Cond cond) {
Assert.notNull(cond, "ConditionalOperator must not be null!");
return this.operation.and(new ExpressionProjection(Fields.field(name), cond));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.AbstractProjectionOperationBuilder#transform(org.springframework.data.mongodb.core.aggregation.IfNullOperator)
*/
@Override
public ProjectionOperation applyCondition(IfNull ifNull) {
Assert.notNull(ifNull, "IfNullOperator must not be null!");
return this.operation.and(new ExpressionProjection(Fields.field(name), ifNull));
}
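
A rough sketch of these conditional projections in use. The "item", "qty", "discount" and "description" fields and the threshold are made up, and it assumes the ConditionalOperators.when(...) and ifNull(...) factories shipped in the same release:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.ConditionalOperators;
import org.springframework.data.mongodb.core.query.Criteria;

// $cond: discount becomes 30 for qty >= 250, otherwise 20;
// $ifNull: a missing description is replaced with "Unspecified"
ProjectionOperation projection = project("item")
    .and("discount")
        .applyCondition(ConditionalOperators.when(Criteria.where("qty").gte(250)).then(30).otherwise(20))
    .and("description")
        .applyCondition(ConditionalOperators.ifNull("description").then("Unspecified"));
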
/**
* Generates an {@code $add} expression that adds the given number to the previously mentioned field.
*
@@ -482,6 +532,20 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return project("subtract", Fields.field(fieldReference));
}
/**
* Generates an {@code $subtract} expression that subtracts the result of the given {@link AggregationExpression}
* from the previously mentioned field.
*
* @param expression must not be {@literal null}.
* @return
* @since 1.10
*/
public ProjectionOperationBuilder minus(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return project("subtract", expression);
}
/**
* Generates an {@code $multiply} expression that multiplies the given number with the previously mentioned field.
*
@@ -507,6 +571,20 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return project("multiply", Fields.field(fieldReference));
}
/**
* Generates a {@code $multiply} expression that multiplies the previously mentioned field with the result of the
* given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
* @since 1.10
*/
public ProjectionOperationBuilder multiply(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return project("multiply", expression);
}
/**
* Generates an {@code $divide} expression that divides the previously mentioned field by the given number.
*
@@ -533,6 +611,20 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return project("divide", Fields.field(fieldReference));
}
/**
* Generates a {@code $divide} expression that divides the value of the previously mentioned field by the result of the
* {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
* @since 1.10
*/
public ProjectionOperationBuilder divide(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return project("divide", expression);
}
/**
* Generates an {@code $mod} expression that divides the previously mentioned field by the given number and returns
* the remainder.
@@ -550,7 +642,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
/**
* Generates an {@code $mod} expression that divides the value of the given field by the previously mentioned field
* and returns the remainder.
*
* @param fieldReference
* @return
*/
@@ -560,11 +652,581 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return project("mod", Fields.field(fieldReference));
}
/**
* Generates an {@code $mod} expression that divides the value of the previously mentioned field by the result of
* the {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
* @since 1.10
*/
public ProjectionOperationBuilder mod(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return project("mod", expression);
}
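
The expression-based overloads allow nesting another operator as the right-hand operand. A minimal sketch, with made-up "gross", "net" and "taxRate" fields, assuming the ArithmeticOperators.valueOf(...) factory from the same release:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.ArithmeticOperators;

// renders to: { $subtract: [ "$gross", { $multiply: [ "$net", "$taxRate" ] } ] }
ProjectionOperation projection = project()
    .and("gross").minus(ArithmeticOperators.valueOf("net").multiplyBy("taxRate")).as("tax");
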
/**
* Generates a {@code $size} expression that returns the size of the array held by the given field. <br />
*
* @return never {@literal null}.
* @since 1.7
*/
public ProjectionOperationBuilder size() {
return project("size");
}
/*
/**
* Generates a {@code $cmp} expression (compare to) that compares the value of the field to a given value or field.
*
* @param compareValue compare value or a {@link Field} object.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder cmp(Object compareValue) {
return project("cmp", compareValue);
}
/**
* Generates a {@code $eq} expression (equal) that compares the value of the field to a given value or field.
*
* @param compareValue compare value or a {@link Field} object.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder eq(Object compareValue) {
return project("eq", compareValue);
}
/**
* Generates a {@code $gt} expression (greater than) that compares the value of the field to a given value or field.
*
* @param compareValue compare value or a {@link Field} object.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder gt(Object compareValue) {
return project("gt", compareValue);
}
/**
* Generates a {@code $gte} expression (greater than equal) that compares the value of the field to a given value or
* field.
*
* @param compareValue compare value or a {@link Field} object.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder gte(Object compareValue) {
return project("gte", compareValue);
}
/**
* Generates a {@code $lt} expression (less than) that compares the value of the field to a given value or field.
*
* @param compareValue compare value or a {@link Field} object.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder lt(Object compareValue) {
return project("lt", compareValue);
}
/**
* Generates a {@code $lte} expression (less than equal) that compares the value of the field to a given value or
* field.
*
* @param compareValue the compare value or a {@link Field} object.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder lte(Object compareValue) {
return project("lte", compareValue);
}
/**
* Generates a {@code $ne} expression (not equal) that compares the value of the field to a given value or field.
*
* @param compareValue compare value or a {@link Field} object.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder ne(Object compareValue) {
return project("ne", compareValue);
}
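
A short sketch of the comparison projections; the "item" and "qty" fields and the value 250 are made up:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

// { $gte: [ "$qty", 250 ] } and { $cmp: [ "$qty", 250 ] }
ProjectionOperation projection = project("item")
    .and("qty").gte(250).as("enoughInStock")
    .and("qty").cmp(250).as("comparedToTarget");
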
/**
* Generates a {@code $slice} expression that returns a subset of the array held by the given field. <br />
* If {@literal n} is positive, $slice returns up to the first n elements in the array. <br />
* If {@literal n} is negative, $slice returns up to the last n elements in the array.
*
* @param count max number of elements.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder slice(int count) {
return project("slice", count);
}
/**
* Generates a {@code $slice} expression that returns a subset of the array held by the given field. <br />
*
* @param count max number of elements. Must not be negative.
* @param offset the offset within the array to start from.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder slice(int count, int offset) {
return project("slice", offset, count);
}
/**
* Generates a {@code $filter} expression that returns a subset of the array held by the given field.
*
* @param as The variable name for the element in the input array. Must not be {@literal null}.
* @param condition The {@link AggregationExpression} that determines whether to include the element in the
* resulting array. Must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder filter(String as, AggregationExpression condition) {
return this.operation.and(ArrayOperators.Filter.filter(name).as(as).by(condition));
}
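
A sketch combining $slice and $filter. The "comments" and "scores" fields, the counts and the threshold are made up, and the condition assumes the ComparisonOperators.Gte factory from the same release (the "score" variable is resolved by the $filter context):

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.ComparisonOperators;

// first five comments, plus only the scores greater than or equal to 60
ProjectionOperation projection = project()
    .and("comments").slice(5).as("latestComments")
    .and("scores")
        .filter("score", ComparisonOperators.Gte.valueOf("score").greaterThanEqualToValue(60))
        .as("passingScores");
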
// SET OPERATORS
/**
* Generates a {@code $setEquals} expression that compares the previously mentioned field to one or more arrays and
* returns {@literal true} if they have the same distinct elements and {@literal false} otherwise.
*
* @param arrays must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder equalsArrays(String... arrays) {
Assert.notEmpty(arrays, "Arrays must not be null or empty!");
return project("setEquals", Fields.fields(arrays));
}
/**
* Generates a {@code $setIntersection} expression that takes array of the previously mentioned field and one or
* more arrays and returns an array that contains the elements that appear in every one of them.
*
* @param arrays must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder intersectsArrays(String... arrays) {
Assert.notEmpty(arrays, "Arrays must not be null or empty!");
return project("setIntersection", Fields.fields(arrays));
}
/**
* Generates a {@code $setUnion} expression that takes array of the previously mentioned field and one or more
* arrays and returns an array that contains the elements that appear in any of those.
*
* @param arrays must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder unionArrays(String... arrays) {
Assert.notEmpty(arrays, "Arrays must not be null or empty!");
return project("setUnion", Fields.fields(arrays));
}
/**
* Generates a {@code $setDifference} expression that takes array of the previously mentioned field and returns an
* array containing the elements that do not exist in the given {@literal array}.
*
* @param array must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder differenceToArray(String array) {
Assert.hasText(array, "Array must not be null or empty!");
return project("setDifference", Fields.fields(array));
}
/**
* Generates a {@code $setIsSubset} expression that takes array of the previously mentioned field and returns
* {@literal true} if it is a subset of the given {@literal array}.
*
* @param array must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder subsetOfArray(String array) {
Assert.hasText(array, "Array must not be null or empty!");
return project("setIsSubset", Fields.fields(array));
}
/**
* Generates an {@code $anyElementTrue} expression that takes the array of the previously mentioned field and returns
* {@literal true} if any of the elements are {@literal true} and {@literal false} otherwise.
*
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder anyElementInArrayTrue() {
return project("anyElementTrue");
}
/**
* Generates an {@code $allElementsTrue} expression that takes the array of the previously mentioned field and returns
* {@literal true} if no element is {@literal false}.
*
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder allElementsInArrayTrue() {
return project("allElementsTrue");
}
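
A sketch of the set-operator projections; the array fields "a", "b" and "flags" are made up:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

// $setEquals, $setIntersection, $setDifference and $anyElementTrue
ProjectionOperation projection = project()
    .and("a").equalsArrays("b").as("sameElements")
    .and("a").intersectsArrays("b").as("common")
    .and("a").differenceToArray("b").as("onlyInA")
    .and("flags").anyElementInArrayTrue().as("anyFlagSet");
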
/**
* Generates a {@code $abs} expression that takes the number of the previously mentioned field and returns the
* absolute value of it.
*
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder absoluteValue() {
return this.operation.and(ArithmeticOperators.Abs.absoluteValueOf(name));
}
/**
* Generates a {@code $ceil} expression that takes the number of the previously mentioned field and returns the
* smallest integer greater than or equal to the specified number.
*
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder ceil() {
return this.operation.and(ArithmeticOperators.Ceil.ceilValueOf(name));
}
/**
* Generates a {@code $exp} expression that takes the number of the previously mentioned field and raises Euler's
* number (i.e. {@literal e}) to the power of it.
*
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder exp() {
return this.operation.and(ArithmeticOperators.Exp.expValueOf(name));
}
/**
* Generates a {@code $floor} expression that takes the number of the previously mentioned field and returns the
* largest integer less than or equal to it.
*
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder floor() {
return this.operation.and(ArithmeticOperators.Floor.floorValueOf(name));
}
/**
* Generates a {@code $ln} expression that takes the number of the previously mentioned field and calculates the
* natural logarithm {@literal ln} (i.e. log base {@literal e}) of it.
*
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder ln() {
return this.operation.and(ArithmeticOperators.Ln.lnValueOf(name));
}
/**
* Generates a {@code $log} expression that takes the number of the previously mentioned field and calculates the
* log of the associated number in the specified base.
*
* @param baseFieldRef must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder log(String baseFieldRef) {
return this.operation.and(ArithmeticOperators.Log.valueOf(name).log(baseFieldRef));
}
/**
* Generates a {@code $log} expression that takes the number of the previously mentioned field and calculates the
* log of the associated number in the specified base.
*
* @param base must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder log(Number base) {
return this.operation.and(ArithmeticOperators.Log.valueOf(name).log(base));
}
/**
* Generates a {@code $log} expression that takes the number of the previously mentioned field and calculates the
* log of the associated number in the specified base.
*
* @param base must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder log(AggregationExpression base) {
return this.operation.and(ArithmeticOperators.Log.valueOf(name).log(base));
}
/**
* Generates a {@code $log10} expression that takes the number of the previously mentioned field and calculates the
* log base 10.
*
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder log10() {
return this.operation.and(ArithmeticOperators.Log10.log10ValueOf(name));
}
/**
* Generates a {@code $pow} expression that takes the number of the previously mentioned field and raises it by the
* specified exponent.
*
* @param exponentFieldRef must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder pow(String exponentFieldRef) {
return this.operation.and(ArithmeticOperators.Pow.valueOf(name).pow(exponentFieldRef));
}
/**
* Generates a {@code $pow} expression that takes the number of the previously mentioned field and raises it by the
* specified exponent.
*
* @param exponent must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder pow(Number exponent) {
return this.operation.and(ArithmeticOperators.Pow.valueOf(name).pow(exponent));
}
/**
* Generates a {@code $pow} expression that takes the number of the previously mentioned field and raises it by the
* specified exponent.
*
* @param exponentExpression must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder pow(AggregationExpression exponentExpression) {
return this.operation.and(ArithmeticOperators.Pow.valueOf(name).pow(exponentExpression));
}
/**
* Generates a {@code $sqrt} expression that takes the number of the previously mentioned field and calculates the
* square root.
*
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder sqrt() {
return this.operation.and(ArithmeticOperators.Sqrt.sqrtOf(name));
}
/**
* Generates a {@code $trunc} expression that takes the number of the previously mentioned field and truncates it to
* its integer value.
*
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder trunc() {
return this.operation.and(ArithmeticOperators.Trunc.truncValueOf(name));
}
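
A sketch of the arithmetic projections above; all field names are made up:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

// $abs, $ceil, $log, $pow and $sqrt applied to the previously mentioned fields
ProjectionOperation projection = project()
    .and("delta").absoluteValue().as("distance")
    .and("price").ceil().as("roundedUp")
    .and("hits").log(2).as("log2Hits")
    .and("edge").pow(3).as("volume")
    .and("area").sqrt().as("sideLength");
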
/**
* Generates a {@code $concat} expression that takes the string representation of the previously mentioned field and
* concats given values to it.
*
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder concat(Object... values) {
return project("concat", values);
}
/**
* Generates a {@code $substr} expression that takes the string representation of the previously mentioned field and
* returns a substring starting at a specified index position.
*
* @param start
* @return
* @since 1.10
*/
public ProjectionOperationBuilder substring(int start) {
return substring(start, -1);
}
/**
* Generates a {@code $substr} expression that takes the string representation of the previously mentioned field and
* returns a substring starting at a specified index position including the specified number of characters.
*
* @param start
* @param nrOfChars
* @return
* @since 1.10
*/
public ProjectionOperationBuilder substring(int start, int nrOfChars) {
return project("substr", start, nrOfChars);
}
/**
* Generates a {@code $toLower} expression that takes the string representation of the previously mentioned field
* and lowers it.
*
* @return
* @since 1.10
*/
public ProjectionOperationBuilder toLower() {
return this.operation.and(StringOperators.ToLower.lowerValueOf(name));
}
/**
* Generates a {@code $toUpper} expression that takes the string representation of the previously mentioned field
* and uppers it.
*
* @return
* @since 1.10
*/
public ProjectionOperationBuilder toUpper() {
return this.operation.and(StringOperators.ToUpper.upperValueOf(name));
}
/**
* Generates a {@code $strcasecmp} expression that takes the string representation of the previously mentioned field
* and performs case-insensitive comparison to the given {@literal value}.
*
* @param value must not be {@literal null}.
* @return
* @since 1.10
*/
public ProjectionOperationBuilder strCaseCmp(String value) {
return project("strcasecmp", value);
}
/**
* Generates a {@code $strcasecmp} expression that takes the string representation of the previously mentioned field
* and performs case-insensitive comparison to the referenced {@literal fieldRef}.
*
* @param fieldRef must not be {@literal null}.
* @return
* @since 1.10
*/
public ProjectionOperationBuilder strCaseCmpValueOf(String fieldRef) {
return project("strcasecmp", fieldRef);
}
/**
* Generates a {@code $strcasecmp} expression that takes the string representation of the previously mentioned field
* and performs case-insensitive comparison to the result of the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
* @since 1.10
*/
public ProjectionOperationBuilder strCaseCmp(AggregationExpression expression) {
return project("strcasecmp", expression);
}
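
A sketch of the string projections; field names and the comparison value are made up:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Fields;

// $toUpper, $substr, $concat and $strcasecmp on the previously mentioned fields
ProjectionOperation projection = project()
    .and("lastName").toUpper().as("upperLastName")
    .and("description").substring(0, 140).as("teaser")
    .and("firstName").concat(" ", Fields.field("lastName")).as("fullName")
    .and("couponCode").strCaseCmp("WELCOME").as("isWelcomeCoupon");
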
/**
* Generates a {@code $arrayElemAt} expression that takes the array of the previously mentioned field and returns the
* element at the specified array {@literal position}.
*
* @param position
* @return
* @since 1.10
*/
public ProjectionOperationBuilder arrayElementAt(int position) {
return project("arrayElemAt", position);
}
/**
* Generates a {@code $concatArrays} expression that takes the array of the previously mentioned field and
* concatenates it with the arrays from the referenced {@literal fields}.
*
* @param fields must not be {@literal null}.
* @return
* @since 1.10
*/
public ProjectionOperationBuilder concatArrays(String... fields) {
return project("concatArrays", Fields.fields(fields));
}
/**
* Generates a {@code $isArray} expression that takes the previously mentioned field and checks whether it is an
* array.
*
* @return
* @since 1.10
*/
public ProjectionOperationBuilder isArray() {
return this.operation.and(ArrayOperators.IsArray.isArray(name));
}
/**
* Generates a {@code $literal} expression that takes the value of the previously mentioned field and uses it as a
* literal.
*
* @return
* @since 1.10
*/
public ProjectionOperationBuilder asLiteral() {
return this.operation.and(LiteralOperators.Literal.asLiteral(name));
}
/**
* Generates a {@code $dateToString} expression that takes the date representation of the previously mentioned field
* and applies given {@literal format} to it.
*
* @param format must not be {@literal null}.
* @return
* @since 1.10
*/
public ProjectionOperationBuilder dateAsFormattedString(String format) {
return this.operation.and(DateOperators.DateToString.dateOf(name).toString(format));
}
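
A sketch of the array, literal and date projections; field names and the date format are made up:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

// $arrayElemAt, $isArray, $literal and $dateToString
ProjectionOperation projection = project()
    .and("scores").arrayElementAt(0).as("firstScore")
    .and("scores").isArray().as("hasScoreArray")
    .and("status").asLiteral().as("rawStatus")
    .and("createdAt").dateAsFormattedString("%Y-%m-%d").as("creationDay");
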
/**
* Generates a {@code $let} expression that binds variables for use in the specified expression, and returns the
* result of the expression.
*
* @param valueExpression The {@link AggregationExpression} bound to {@literal variableName}.
* @param variableName The variable name to be used in the {@literal in} {@link AggregationExpression}.
* @param in The {@link AggregationExpression} to evaluate.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder let(AggregationExpression valueExpression, String variableName,
AggregationExpression in) {
return this.operation.and(VariableOperators.Let
.define(ExpressionVariable.newVariable(variableName).forExpression(valueExpression)).andApply(in));
}
/**
* Generates a {@code $let} expression that binds variables for use in the specified expression, and returns the
* result of the expression.
*
* @param variables The bound {@link ExpressionVariable}s.
* @param in The {@link AggregationExpression} to evaluate.
* @return never {@literal null}.
* @since 1.10
*/
public ProjectionOperationBuilder let(Collection<ExpressionVariable> variables, AggregationExpression in) {
return this.operation.and(VariableOperators.Let.define(variables).andApply(in));
}
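
A sketch of a $let projection that binds two variables and applies them in a $multiply. Field names and the discount factor are made up, and it assumes the ArithmeticOperators and ConditionalOperators factories from the same release:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import java.util.Arrays;

import org.springframework.data.mongodb.core.aggregation.ArithmeticOperators;
import org.springframework.data.mongodb.core.aggregation.ConditionalOperators;
import org.springframework.data.mongodb.core.aggregation.VariableOperators.Let.ExpressionVariable;
import org.springframework.data.mongodb.core.query.Criteria;

ExpressionVariable total = ExpressionVariable.newVariable("total")
    .forExpression(ArithmeticOperators.valueOf("price").multiplyBy("quantity"));
ExpressionVariable discounted = ExpressionVariable.newVariable("discounted")
    .forExpression(ConditionalOperators.when(Criteria.where("applyDiscount").is(true)).then(0.9).otherwise(1));

// "total" and "discounted" are referenced as $$total and $$discounted inside the in-expression
ProjectionOperation projection = project()
    .and("price")
    .let(Arrays.asList(total, discounted),
        ArithmeticOperators.valueOf("total").multiplyBy("discounted"))
    .as("finalTotal");
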
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@@ -620,6 +1282,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Mark Paluch
*/
static class FieldProjection extends Projection {
@@ -638,7 +1301,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
private FieldProjection(Field field, Object value) {
super(field);
super(new ExposedField(field.getName(), true));
this.field = field;
this.value = value;
@@ -730,7 +1393,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
this.values = Arrays.asList(values);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@@ -748,7 +1411,18 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
result.add(context.getReference(getField().getName()).toString());
for (Object element : values) {
result.add(element instanceof Field ? context.getReference((Field) element).toString() : element);
if (element instanceof Field) {
result.add(context.getReference((Field) element).toString());
} else if (element instanceof Fields) {
for (Field field : (Fields) element) {
result.add(context.getReference(field).toString());
}
} else if (element instanceof AggregationExpression) {
result.add(((AggregationExpression) element).toDbObject(context));
} else {
result.add(element);
}
}
return result;
@@ -763,6 +1437,20 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return field;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ProjectionOperation.Projection#getExposedField()
*/
@Override
public ExposedField getExposedField() {
if (!getField().isAliased()) {
return super.getExposedField();
}
return new ExposedField(new AggregationField(getField().getName()), true);
}
/**
* Creates a new instance of this {@link OperationProjection} with the given alias.
*

View File

@@ -0,0 +1,573 @@
/*
* Copyright 2016-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.expression.spel.ast.Projection;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Encapsulates the aggregation framework {@code $replaceRoot}-operation. <br />
* We recommend to use the static factory method {@link Aggregation#replaceRoot(String)} instead of creating instances
* of this class directly.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
* @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/replaceRoot/">MongoDB Aggregation Framework: $replaceRoot</a>
*/
public class ReplaceRootOperation implements FieldsExposingAggregationOperation {
private final Replacement replacement;
/**
* Creates a new {@link ReplaceRootOperation} given the {@link Field} field name.
*
* @param field must not be {@literal null} or empty.
*/
public ReplaceRootOperation(Field field) {
this(new FieldReplacement(field));
}
/**
* Creates a new {@link ReplaceRootOperation} given the {@link AggregationExpression} pointing to a document.
*
* @param aggregationExpression must not be {@literal null}.
*/
public ReplaceRootOperation(AggregationExpression aggregationExpression) {
this(new AggregationExpressionReplacement(aggregationExpression));
}
/**
* Creates a new {@link ReplaceRootOperation} given the {@link Replacement}.
*
* @param replacement must not be {@literal null}.
*/
public ReplaceRootOperation(Replacement replacement) {
Assert.notNull(replacement, "Replacement must not be null!");
this.replacement = replacement;
}
/**
* Creates a new {@link ReplaceRootDocumentOperationBuilder}.
*
* @return a new {@link ReplaceRootDocumentOperationBuilder}.
*/
public static ReplaceRootOperationBuilder builder() {
return new ReplaceRootOperationBuilder();
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$replaceRoot", new BasicDBObject("newRoot", replacement.toDocumentExpression(context)));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return ExposedFields.from();
}
/**
* Builder for {@link ReplaceRootOperation}.
*
* @author Mark Paluch
*/
public static class ReplaceRootOperationBuilder {
/**
* Defines a root document replacement based on a {@literal fieldName} that resolves to a document.
*
* @param fieldName must not be {@literal null} or empty.
* @return the final {@link ReplaceRootOperation}.
*/
public ReplaceRootOperation withValueOf(String fieldName) {
return new ReplaceRootOperation(Fields.field(fieldName));
}
/**
* Defines a root document replacement based on a {@link AggregationExpression} that resolves to a document.
*
* @param aggregationExpression must not be {@literal null}.
* @return the final {@link ReplaceRootOperation}.
*/
public ReplaceRootOperation withValueOf(AggregationExpression aggregationExpression) {
return new ReplaceRootOperation(aggregationExpression);
}
/**
* Defines a root document replacement based on a composable document that is empty initially. <br />
* {@link ReplaceRootOperation} can be populated with individual entries and derive its values from other, existing
* documents.
*
* @return the {@link ReplaceRootDocumentOperation}.
*/
public ReplaceRootDocumentOperation withDocument() {
return new ReplaceRootDocumentOperation();
}
/**
* Defines a root document replacement based on a composable document given {@literal dbObject}. <br />
* {@link ReplaceRootOperation} can be populated with individual entries and derive its values from other, existing
* documents.
*
* @param dbObject must not be {@literal null}.
* @return the final {@link ReplaceRootOperation}.
*/
public ReplaceRootOperation withDocument(DBObject dbObject) {
Assert.notNull(dbObject, "DBObject must not be null!");
return new ReplaceRootDocumentOperation().andValuesOf(dbObject);
}
}
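
A sketch of the builder in use. The "domain", "price" and "quantity" fields and the "label" value are made up; the last line uses the Aggregation.replaceRoot(...) shortcut referenced in the class Javadoc and assumes the ArithmeticOperators factory from the same release:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.ArithmeticOperators;

// { $replaceRoot: { newRoot: "$domain" } }
ReplaceRootOperation replaceWithField = ReplaceRootOperation.builder().withValueOf("domain");

// compose a new root document from a fixed value and an expression
ReplaceRootOperation replaceWithDocument = ReplaceRootOperation.builder().withDocument()
    .andValue("fixed value").as("label")
    .and(ArithmeticOperators.valueOf("price").multiplyBy("quantity")).as("total");

Aggregation aggregation = newAggregation(replaceRoot("domain"));
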
/**
* Encapsulates the aggregation framework {@code $replaceRoot}-operation to result in a composable replacement
* document. <br />
* Instances of {@link ReplaceRootDocumentOperation} are empty upon construction and can be populated with single
* values and documents.
*
* @author Mark Paluch
*/
static class ReplaceRootDocumentOperation extends ReplaceRootOperation {
private final static ReplacementDocument EMPTY = new ReplacementDocument();
private final ReplacementDocument current;
/**
* Creates an empty {@link ReplaceRootDocumentOperation}.
*/
public ReplaceRootDocumentOperation() {
this(EMPTY);
}
private ReplaceRootDocumentOperation(ReplacementDocument replacementDocument) {
super(replacementDocument);
current = replacementDocument;
}
/**
* Creates an extended {@link ReplaceRootDocumentOperation} that combines {@link ReplacementDocument}s from the
* {@literal currentOperation} and {@literal extension} operation.
*
* @param currentOperation must not be {@literal null}.
* @param extension must not be {@literal null}.
*/
protected ReplaceRootDocumentOperation(ReplaceRootDocumentOperation currentOperation,
ReplacementDocument extension) {
this(currentOperation.current.extendWith(extension));
}
/**
* Creates a new {@link ReplaceRootDocumentOperationBuilder} to define a field for the
* {@link AggregationExpression}.
*
* @param aggregationExpression must not be {@literal null}.
* @return the {@link ReplaceRootDocumentOperationBuilder}.
*/
public ReplaceRootDocumentOperationBuilder and(AggregationExpression aggregationExpression) {
return new ReplaceRootDocumentOperationBuilder(this, aggregationExpression);
}
/**
* Creates a new {@link ReplaceRootDocumentOperationBuilder} to define a field for the {@literal value}.
*
* @param value must not be {@literal null}.
* @return the {@link ReplaceRootDocumentOperationBuilder}.
*/
public ReplaceRootDocumentOperationBuilder andValue(Object value) {
return new ReplaceRootDocumentOperationBuilder(this, value);
}
/**
* Creates a new {@link ReplaceRootDocumentOperation} that merges all existing replacement values with values from
* {@literal value}. Existing replacement values are overwritten.
*
* @param value must not be {@literal null}.
* @return the {@link ReplaceRootDocumentOperation}.
*/
public ReplaceRootDocumentOperation andValuesOf(Object value) {
return new ReplaceRootDocumentOperation(this, ReplacementDocument.valueOf(value));
}
}
/**
* Builder for {@link ReplaceRootDocumentOperation} to populate {@link ReplacementDocument}
*
* @author Mark Paluch
*/
public static class ReplaceRootDocumentOperationBuilder {
private final ReplaceRootDocumentOperation currentOperation;
private final Object value;
protected ReplaceRootDocumentOperationBuilder(ReplaceRootDocumentOperation currentOperation, Object value) {
Assert.notNull(currentOperation, "Current ReplaceRootDocumentOperation must not be null!");
Assert.notNull(value, "Value must not be null!");
this.currentOperation = currentOperation;
this.value = value;
}
public ReplaceRootDocumentOperation as(String fieldName) {
if (value instanceof AggregationExpression) {
return new ReplaceRootDocumentOperation(currentOperation,
ReplacementDocument.forExpression(fieldName, (AggregationExpression) value));
}
return new ReplaceRootDocumentOperation(currentOperation, ReplacementDocument.forSingleValue(fieldName, value));
}
}
/**
* Replacement object that results in a replacement document or an expression that results in a document.
*
* @author Mark Paluch
* @author Christoph Strobl
*/
public interface Replacement {
/**
* Renders the current {@link Replacement} into its MongoDB representation based on the given
* {@link AggregationOperationContext}.
*
* @param context will never be {@literal null}.
* @return a replacement document or an expression that results in a document.
*/
Object toDocumentExpression(AggregationOperationContext context);
}
/**
* {@link Replacement} that uses a {@link AggregationExpression} that results in a replacement document.
*
* @author Mark Paluch
*/
private static class AggregationExpressionReplacement implements Replacement {
private final AggregationExpression aggregationExpression;
protected AggregationExpressionReplacement(AggregationExpression aggregationExpression) {
Assert.notNull(aggregationExpression, "AggregationExpression must not be null!");
this.aggregationExpression = aggregationExpression;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.Replacement#toObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDocumentExpression(AggregationOperationContext context) {
return aggregationExpression.toDbObject(context);
}
}
/**
* {@link Replacement} that references a {@link Field} inside the current aggregation pipeline.
*
* @author Mark Paluch
*/
private static class FieldReplacement implements Replacement {
private final Field field;
/**
* Creates {@link FieldReplacement} given {@link Field}.
*/
protected FieldReplacement(Field field) {
Assert.notNull(field, "Field must not be null!");
this.field = field;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.Replacement#toObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Object toDocumentExpression(AggregationOperationContext context) {
return context.getReference(field).toString();
}
}
/**
* Replacement document consisting of multiple {@link ReplacementContributor}s.
*
* @author Mark Paluch
*/
private static class ReplacementDocument implements Replacement {
private final Collection<ReplacementContributor> replacements;
/**
* Creates an empty {@link ReplacementDocument}.
*/
protected ReplacementDocument() {
replacements = new ArrayList<ReplacementContributor>();
}
/**
* Creates a {@link ReplacementDocument} given {@link ReplacementContributor}.
*
* @param contributor
*/
protected ReplacementDocument(ReplacementContributor contributor) {
Assert.notNull(contributor, "ReplacementContributor must not be null!");
replacements = Collections.singleton(contributor);
}
private ReplacementDocument(Collection<ReplacementContributor> replacements) {
this.replacements = replacements;
}
/**
* Creates a {@link ReplacementDocument} given a {@literal value}.
*
* @param value must not be {@literal null}.
* @return
*/
public static ReplacementDocument valueOf(Object value) {
return new ReplacementDocument(new DocumentContributor(value));
}
/**
* Creates a {@link ReplacementDocument} given a single {@literal field} and {@link AggregationExpression}.
*
* @param aggregationExpression must not be {@literal null}.
* @return
*/
public static ReplacementDocument forExpression(String field, AggregationExpression aggregationExpression) {
return new ReplacementDocument(new ExpressionFieldContributor(Fields.field(field), aggregationExpression));
}
/**
* Creates a {@link ReplacementDocument} given a single {@literal field} and {@literal value}.
*
* @param value must not be {@literal null}.
* @return
*/
public static ReplacementDocument forSingleValue(String field, Object value) {
return new ReplacementDocument(new ValueFieldContributor(Fields.field(field), value));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.Replacement#toObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDocumentExpression(AggregationOperationContext context) {
DBObject dbObject = new BasicDBObject();
for (ReplacementContributor replacement : replacements) {
dbObject.putAll(replacement.toDbObject(context));
}
return dbObject;
}
/**
* Creates a new {@link ReplacementDocument} that merges the {@link ReplacementContributor}s of {@code this} document
* with those of the given {@literal extension}.
*
* @param extension must not be {@literal null}.
* @return the new, extended {@link ReplacementDocument}
*/
public ReplacementDocument extendWith(ReplacementDocument extension) {
Assert.notNull(extension, "ReplacementDocument must not be null");
ReplacementDocument replacementDocument = new ReplacementDocument();
List<ReplacementContributor> replacements = new ArrayList<ReplacementContributor>(
this.replacements.size() + extension.replacements.size());
replacements.addAll(this.replacements);
replacements.addAll(extension.replacements);
return new ReplacementDocument(replacements);
}
}
/**
* Partial {@link DBObject} contributor for document replacement.
*
* @author Mark Paluch
*/
private interface ReplacementContributor extends AggregationExpression {
/**
* Renders the current {@link ReplacementContributor} into a {@link DBObject} based on the given
* {@link AggregationOperationContext}.
*
* @param context will never be {@literal null}.
* @return
*/
DBObject toDbObject(AggregationOperationContext context);
}
/**
* {@link ReplacementContributor} to contribute multiple fields based on the input {@literal value}. <br />
* The value object is mapped into a MongoDB {@link DBObject}.
*
* @author Mark Paluch
* @author Christoph Strobl
*/
private static class DocumentContributor implements ReplacementContributor {
private final Object value;
/**
* Creates a new {@link DocumentContributor} for the given {@literal value}.
*
* @param value must not be {@literal null}.
*/
public DocumentContributor(Object value) {
Assert.notNull(value, "Value must not be null!");
this.value = value;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.ReplacementContributor#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
if (value instanceof DBObject) {
return (DBObject) value;
}
return (DBObject) context.getMappedObject(new BasicDBObject("$set", value)).get("$set");
}
}
/**
* Base class for {@link ReplacementContributor} implementations that contribute a single {@literal field}. Typically
* used to construct a composite document that should contain the resulting key-value pair.
*
* @author Mark Paluch
*/
private abstract static class FieldContributorSupport implements ReplacementContributor {
private final ExposedField field;
/**
* Creates new {@link FieldContributorSupport} for the given {@link Field}.
*
* @param field must not be {@literal null}.
*/
public FieldContributorSupport(Field field) {
Assert.notNull(field, "Field must not be null!");
this.field = new ExposedField(field, true);
}
/**
* @return the {@link ExposedField}.
*/
public ExposedField getField() {
return field;
}
}
/**
* {@link ReplacementContributor} to contribute a single {@literal field} and {@literal value}. The {@literal value}
* is mapped to a MongoDB {@link DBObject} and can be a singular value, a list or subdocument.
*
* @author Mark Paluch
*/
private static class ValueFieldContributor extends FieldContributorSupport {
private final Object value;
/**
* Creates a new {@link ValueFieldContributor} for the given {@link Field} and {@literal value}.
*
* @param field must not be {@literal null}.
* @param value must not be {@literal null}.
*/
public ValueFieldContributor(Field field, Object value) {
super(field);
Assert.notNull(value, "Value must not be null!");
this.value = value;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.ReplacementContributor#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
Object mappedValue = value instanceof BasicDBObject ? value
: context.getMappedObject(new BasicDBObject("$set", value)).get("$set");
return new BasicDBObject(getField().getTarget(), mappedValue);
}
}
/**
* {@link ReplacementContributor} to contribute a single {@literal field} and value based on a
* {@link AggregationExpression}.
*
* @author Mark Paluch
*/
private static class ExpressionFieldContributor extends FieldContributorSupport {
private final AggregationExpression aggregationExpression;
/**
* Creates a new {@link ExpressionFieldContributor} for the given {@link Field} and {@link AggregationExpression}.
*
* @param field must not be {@literal null}.
* @param aggregationExpression must not be {@literal null}.
*/
public ExpressionFieldContributor(Field field, AggregationExpression aggregationExpression) {
super(field);
Assert.notNull(aggregationExpression, "AggregationExpression must not be null!");
this.aggregationExpression = aggregationExpression;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.ReplacementContributor#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return new BasicDBObject(getField().getTarget(), aggregationExpression.toDbObject(context));
}
}
}

View File

@@ -0,0 +1,666 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.Sum;
import org.springframework.util.Assert;
/**
* Gateway to {@literal Set expressions} which perform {@literal set} operations on arrays, treating arrays as sets.
*
* @author Christoph Strobl
* @since 1.10
*/
public class SetOperators {
/**
* Take the array referenced by given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static SetOperatorFactory arrayAsSet(String fieldReference) {
return new SetOperatorFactory(fieldReference);
}
/**
* Take the array resulting from the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static SetOperatorFactory arrayAsSet(AggregationExpression expression) {
return new SetOperatorFactory(expression);
}
/**
* @author Christoph Strobl
*/
public static class SetOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
/**
* Creates new {@link SetOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public SetOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
}
/**
* Creates new {@link SetOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public SetOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
}
/**
* Creates new {@link AggregationExpression} that compares the previously mentioned field to one or more arrays and
* returns {@literal true} if they have the same distinct elements and {@literal false} otherwise.
*
* @param arrayReferences must not be {@literal null}.
* @return
*/
public SetEquals isEqualTo(String... arrayReferences) {
return createSetEquals().isEqualTo(arrayReferences);
}
/**
* Creates new {@link AggregationExpression} that compares the previously mentioned field to one or more arrays and
* returns {@literal true} if they have the same distinct elements and {@literal false} otherwise.
*
* @param expressions must not be {@literal null}.
* @return
*/
public SetEquals isEqualTo(AggregationExpression... expressions) {
return createSetEquals().isEqualTo(expressions);
}
private SetEquals createSetEquals() {
return usesFieldRef() ? SetEquals.arrayAsSet(fieldReference) : SetEquals.arrayAsSet(expression);
}
/**
* Creates new {@link AggregationExpression} that takes array of the previously mentioned field and one or more
* arrays and returns an array that contains the elements that appear in every one of them.
*
* @param arrayReferences must not be {@literal null}.
* @return
*/
public SetIntersection intersects(String... arrayReferences) {
return createSetIntersection().intersects(arrayReferences);
}
/**
* Creates new {@link AggregationExpression} that takes array of the previously mentioned field and one or more
* arrays and returns an array that contains the elements that appear in every one of them.
*
* @param expressions must not be {@literal null}.
* @return
*/
public SetIntersection intersects(AggregationExpression... expressions) {
return createSetIntersection().intersects(expressions);
}
private SetIntersection createSetIntersection() {
return usesFieldRef() ? SetIntersection.arrayAsSet(fieldReference) : SetIntersection.arrayAsSet(expression);
}
/**
* Creates new {@link AggregationExpression} that takes array of the previously mentioned field and one or more
* arrays and returns an array that contains the elements that appear in any of those.
*
* @param arrayReferences must not be {@literal null}.
* @return
*/
public SetUnion union(String... arrayReferences) {
return createSetUnion().union(arrayReferences);
}
/**
* Creates new {@link AggregationExpression} that takes array of the previously mentioned field and one or more
* arrays and returns an array that contains the elements that appear in any of those.
*
* @param expressions must not be {@literal null}.
* @return
*/
public SetUnion union(AggregationExpression... expressions) {
return createSetUnion().union(expressions);
}
private SetUnion createSetUnion() {
return usesFieldRef() ? SetUnion.arrayAsSet(fieldReference) : SetUnion.arrayAsSet(expression);
}
/**
* Creates new {@link AggregationExpression} that takes array of the previously mentioned field and returns an array
* containing the elements that do not exist in the given {@literal arrayReference}.
*
* @param arrayReference must not be {@literal null}.
* @return
*/
public SetDifference differenceTo(String arrayReference) {
return createSetDifference().differenceTo(arrayReference);
}
/**
* Creates new {@link AggregationExpression} that takes array of the previously mentioned field and returns an array
* containing the elements that do not exist in the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public SetDifference differenceTo(AggregationExpression expression) {
return createSetDifference().differenceTo(expression);
}
private SetDifference createSetDifference() {
return usesFieldRef() ? SetDifference.arrayAsSet(fieldReference) : SetDifference.arrayAsSet(expression);
}
/**
* Creates new {@link AggregationExpression} that takes array of the previously mentioned field and returns
* {@literal true} if it is a subset of the given {@literal arrayReference}.
*
* @param arrayReference must not be {@literal null}.
* @return
*/
public SetIsSubset isSubsetOf(String arrayReference) {
return createSetIsSubset().isSubsetOf(arrayReference);
}
/**
* Creates new {@link AggregationExpression} that takes array of the previously mentioned field and returns
* {@literal true} if it is a subset of the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public SetIsSubset isSubsetOf(AggregationExpression expression) {
return createSetIsSubset().isSubsetOf(expression);
}
private SetIsSubset createSetIsSubset() {
return usesFieldRef() ? SetIsSubset.arrayAsSet(fieldReference) : SetIsSubset.arrayAsSet(expression);
}
/**
* Creates new {@link AggregationExpression} that takes array of the previously mentioned field and returns
* {@literal true} if any of the elements are {@literal true} and {@literal false} otherwise.
*
* @return
*/
public AnyElementTrue anyElementTrue() {
return usesFieldRef() ? AnyElementTrue.arrayAsSet(fieldReference) : AnyElementTrue.arrayAsSet(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the array of the previously mentioned field and returns
* {@literal true} if no element is {@literal false}.
*
* @return
*/
public AllElementsTrue allElementsTrue() {
return usesFieldRef() ? AllElementsTrue.arrayAsSet(fieldReference) : AllElementsTrue.arrayAsSet(expression);
}
private boolean usesFieldRef() {
return this.fieldReference != null;
}
}
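
A sketch of the factory used inside a projection; the array fields "A" and "B" are made up:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.SetOperators;

// $setUnion, $setIntersection and $setIsSubset as projected expressions
ProjectionOperation projection = project()
    .and(SetOperators.arrayAsSet("A").union("B")).as("allValues")
    .and(SetOperators.arrayAsSet("A").intersects("B")).as("commonToBoth")
    .and(SetOperators.arrayAsSet("A").isSubsetOf("B")).as("aIsSubsetOfB");
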
/**
* {@link AggregationExpression} for {@code $setEquals}.
*
* @author Christoph Strobl
*/
public static class SetEquals extends AbstractAggregationExpression {
private SetEquals(List<?> arrays) {
super(arrays);
}
@Override
protected String getMongoMethod() {
return "$setEquals";
}
/**
* Create new {@link SetEquals}.
*
* @param arrayReference must not be {@literal null}.
* @return
*/
public static SetEquals arrayAsSet(String arrayReference) {
Assert.notNull(arrayReference, "ArrayReference must not be null!");
return new SetEquals(asFields(arrayReference));
}
/**
* Create new {@link SetEquals}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static SetEquals arrayAsSet(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new SetEquals(Collections.singletonList(expression));
}
/**
* Creates new {@link SetEquals} with all previously added arguments appending the given one.
*
* @param arrayReferences must not be {@literal null}.
* @return
*/
public SetEquals isEqualTo(String... arrayReferences) {
Assert.notNull(arrayReferences, "ArrayReferences must not be null!");
return new SetEquals(append(Fields.fields(arrayReferences).asList()));
}
/**
* Creates new {@link SetEquals} with all previously added arguments appending the given one.
*
* @param expressions must not be {@literal null}.
* @return
*/
public SetEquals isEqualTo(AggregationExpression... expressions) {
Assert.notNull(expressions, "Expressions must not be null!");
return new SetEquals(append(Arrays.asList(expressions)));
}
/**
* Creates new {@link SetEquals} with all previously added arguments appending the given one.
*
* @param array must not be {@literal null}.
* @return
*/
public SetEquals isEqualTo(Object[] array) {
Assert.notNull(array, "Array must not be null!");
return new SetEquals(append(array));
}
}
/**
* {@link AggregationExpression} for {@code $setIntersection}.
*
* @author Christoph Strobl
*/
public static class SetIntersection extends AbstractAggregationExpression {
private SetIntersection(List<?> arrays) {
super(arrays);
}
@Override
protected String getMongoMethod() {
return "$setIntersection";
}
/**
* Creates new {@link SetIntersection}
*
* @param arrayReference must not be {@literal null}.
* @return
*/
public static SetIntersection arrayAsSet(String arrayReference) {
Assert.notNull(arrayReference, "ArrayReference must not be null!");
return new SetIntersection(asFields(arrayReference));
}
/**
* Creates new {@link SetIntersection}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static SetIntersection arrayAsSet(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new SetIntersection(Collections.singletonList(expression));
}
/**
* Creates new {@link SetIntersection} with all previously added arguments appending the given one.
*
* @param arrayReferences must not be {@literal null}.
* @return
*/
public SetIntersection intersects(String... arrayReferences) {
Assert.notNull(arrayReferences, "ArrayReferences must not be null!");
return new SetIntersection(append(asFields(arrayReferences)));
}
/**
* Creates new {@link SetIntersection} with all previously added arguments appending the given one.
*
* @param expressions must not be {@literal null}.
* @return
*/
public SetIntersection intersects(AggregationExpression... expressions) {
Assert.notNull(expressions, "Expressions must not be null!");
return new SetIntersection(append(Arrays.asList(expressions)));
}
}
/**
* {@link AggregationExpression} for {@code $setUnion}.
*
* @author Christoph Strobl
*/
public static class SetUnion extends AbstractAggregationExpression {
private SetUnion(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$setUnion";
}
/**
* Creates new {@link SetUnion}.
*
* @param arrayReference must not be {@literal null}.
* @return
*/
public static SetUnion arrayAsSet(String arrayReference) {
Assert.notNull(arrayReference, "ArrayReference must not be null!");
return new SetUnion(asFields(arrayReference));
}
/**
* Creates new {@link SetUnion}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static SetUnion arrayAsSet(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new SetUnion(Collections.singletonList(expression));
}
/**
* Creates new {@link SetUnion} with all previously added arguments appending the given one.
*
* @param arrayReferences must not be {@literal null}.
* @return
*/
public SetUnion union(String... arrayReferences) {
Assert.notNull(arrayReferences, "ArrayReferences must not be null!");
return new SetUnion(append(asFields(arrayReferences)));
}
/**
* Creates new {@link SetUnion} with all previously added arguments appending the given one.
*
* @param expressions must not be {@literal null}.
* @return
*/
public SetUnion union(AggregationExpression... expressions) {
Assert.notNull(expressions, "Expressions must not be null!");
return new SetUnion(append(Arrays.asList(expressions)));
}
}
/**
* {@link AggregationExpression} for {@code $setDifference}.
*
* @author Christoph Strobl
*/
public static class SetDifference extends AbstractAggregationExpression {
private SetDifference(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$setDifference";
}
/**
* Creates new {@link SetDifference}.
*
* @param arrayReference must not be {@literal null}.
* @return
*/
public static SetDifference arrayAsSet(String arrayReference) {
Assert.notNull(arrayReference, "ArrayReference must not be null!");
return new SetDifference(asFields(arrayReference));
}
/**
* Creates new {@link SetDifference}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static SetDifference arrayAsSet(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new SetDifference(Collections.singletonList(expression));
}
/**
* Creates new {@link SetDifference} with all previously added arguments appending the given one.
*
* @param arrayReference must not be {@literal null}.
* @return
*/
public SetDifference differenceTo(String arrayReference) {
Assert.notNull(arrayReference, "ArrayReference must not be null!");
return new SetDifference(append(Fields.field(arrayReference)));
}
/**
* Creates new {@link SetDifference} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public SetDifference differenceTo(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new SetDifference(append(expression));
}
}
/**
* {@link AggregationExpression} for {@code $setIsSubset}.
*
* @author Christoph Strobl
*/
public static class SetIsSubset extends AbstractAggregationExpression {
private SetIsSubset(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$setIsSubset";
}
/**
* Creates new {@link SetIsSubset}.
*
* @param arrayReference must not be {@literal null}.
* @return
*/
public static SetIsSubset arrayAsSet(String arrayReference) {
Assert.notNull(arrayReference, "ArrayReference must not be null!");
return new SetIsSubset(asFields(arrayReference));
}
/**
* Creates new {@link SetIsSubset}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static SetIsSubset arrayAsSet(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new SetIsSubset(Collections.singletonList(expression));
}
/**
* Creates new {@link SetIsSubset} with all previously added arguments appending the given one.
*
* @param arrayReference must not be {@literal null}.
* @return
*/
public SetIsSubset isSubsetOf(String arrayReference) {
Assert.notNull(arrayReference, "ArrayReference must not be null!");
return new SetIsSubset(append(Fields.field(arrayReference)));
}
/**
* Creates new {@link SetIsSubset} with all previously added arguments appending the given one.
*
* @param expression must not be {@literal null}.
* @return
*/
public SetIsSubset isSubsetOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new SetIsSubset(append(expression));
}
}
/**
* {@link AggregationExpression} for {@code $anyElementTrue}.
*
* @author Christoph Strobl
*/
public static class AnyElementTrue extends AbstractAggregationExpression {
private AnyElementTrue(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$anyElementTrue";
}
/**
* Creates new {@link AnyElementTrue}.
*
* @param arrayReference must not be {@literal null}.
* @return
*/
public static AnyElementTrue arrayAsSet(String arrayReference) {
Assert.notNull(arrayReference, "ArrayReference must not be null!");
return new AnyElementTrue(asFields(arrayReference));
}
/**
* Creates new {@link AnyElementTrue}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static AnyElementTrue arrayAsSet(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new AnyElementTrue(Collections.singletonList(expression));
}
public AnyElementTrue anyElementTrue() {
return this;
}
}
/**
* {@link AggregationExpression} for {@code $allElementsTrue}.
*
* @author Christoph Strobl
*/
public static class AllElementsTrue extends AbstractAggregationExpression {
private AllElementsTrue(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$allElementsTrue";
}
/**
* Creates new {@link AllElementsTrue}.
*
* @param arrayReference must not be {@literal null}.
* @return
*/
public static AllElementsTrue arrayAsSet(String arrayReference) {
Assert.notNull(arrayReference, "ArrayReference must not be null!");
return new AllElementsTrue(asFields(arrayReference));
}
/**
* Creates new {@link AllElementsTrue}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static AllElementsTrue arrayAsSet(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new AllElementsTrue(Collections.singletonList(expression));
}
public AllElementsTrue allElementsTrue() {
return this;
}
}
}
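
The builders above yield AggregationExpression instances that can be handed to a projection stage. A minimal usage sketch, not part of the diff: the enclosing gateway class is omitted here, the nested SetUnion/SetIntersection types are assumed to be imported, the field names "a" and "b" are illustrative, and ProjectionOperation is assumed to accept expressions via and(..) as it does in 1.10.

AggregationExpression union = SetUnion.arrayAsSet("a").union("b");              // roughly { "$setUnion": [ "$a", "$b" ] }
AggregationExpression common = SetIntersection.arrayAsSet("a").intersects("b"); // roughly { "$setIntersection": [ "$a", "$b" ] }

Aggregation aggregation = Aggregation.newAggregation(
    Aggregation.project("a", "b")
        .and(union).as("allValues")
        .and(common).as("commonToBoth"));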

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,10 +26,10 @@ import com.mongodb.DBObject;
* We recommend using the static factory method {@link Aggregation#skip(int)} instead of creating instances of this
* class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/skip/
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
* @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/skip/">MongoDB Aggregation Framework: $skip</a>
*/
public class SkipOperation implements AggregationOperation {
@@ -38,7 +38,7 @@ public class SkipOperation implements AggregationOperation {
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
* @param skipCount number of documents to skip.
* @param skipCount number of documents to skip, must not be less than zero.
*/
public SkipOperation(long skipCount) {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -30,10 +30,10 @@ import com.mongodb.DBObject;
* We recommend using the static factory method {@link Aggregation#sort(Direction, String...)} instead of creating
* instances of this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/sort/#pipe._S_sort
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
* @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/sort/">MongoDB Aggregation Framework: $sort</a>
*/
public class SortOperation implements AggregationOperation {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -26,19 +26,26 @@ import org.springframework.data.mongodb.core.spel.ExpressionNode;
import org.springframework.data.mongodb.core.spel.ExpressionTransformationContextSupport;
import org.springframework.data.mongodb.core.spel.LiteralNode;
import org.springframework.data.mongodb.core.spel.MethodReferenceNode;
import org.springframework.data.mongodb.core.spel.MethodReferenceNode.AggregationMethodReference;
import org.springframework.data.mongodb.core.spel.MethodReferenceNode.AggregationMethodReference.ArgumentType;
import org.springframework.data.mongodb.core.spel.NotOperatorNode;
import org.springframework.data.mongodb.core.spel.OperatorNode;
import org.springframework.expression.spel.ExpressionState;
import org.springframework.expression.spel.SpelNode;
import org.springframework.expression.spel.SpelParserConfiguration;
import org.springframework.expression.spel.ast.CompoundExpression;
import org.springframework.expression.spel.ast.ConstructorReference;
import org.springframework.expression.spel.ast.Indexer;
import org.springframework.expression.spel.ast.InlineList;
import org.springframework.expression.spel.ast.InlineMap;
import org.springframework.expression.spel.ast.OperatorNot;
import org.springframework.expression.spel.ast.PropertyOrFieldReference;
import org.springframework.expression.spel.standard.SpelExpression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -48,6 +55,7 @@ import com.mongodb.DBObject;
* Renders the AST of a SpEL expression as a MongoDB Aggregation Framework projection expression.
*
* @author Thomas Darimont
* @author Christoph Strobl
*/
class SpelExpressionTransformer implements AggregationExpressionTransformer {
@@ -69,6 +77,8 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
conversions.add(new PropertyOrFieldReferenceNodeConversion(this));
conversions.add(new CompoundExpressionNodeConversion(this));
conversions.add(new MethodReferenceNodeConversion(this));
conversions.add(new NotOperatorNodeConversion(this));
conversions.add(new ValueRetrievingNodeConversion(this));
this.conversions = Collections.unmodifiableList(conversions);
}
@@ -131,8 +141,8 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
* @author Thomas Darimont
* @author Oliver Gierke
*/
private static abstract class ExpressionNodeConversion<T extends ExpressionNode> implements
AggregationExpressionTransformer {
private static abstract class ExpressionNodeConversion<T extends ExpressionNode>
implements AggregationExpressionTransformer {
private final AggregationExpressionTransformer transformer;
private final Class<? extends ExpressionNode> nodeType;
@@ -235,8 +245,17 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
protected Object convert(AggregationExpressionTransformationContext<OperatorNode> context) {
OperatorNode currentNode = context.getCurrentNode();
DBObject operationObject = createOperationObjectAndAddToPreviousArgumentsIfNecessary(context, currentNode);
if (currentNode.isLogicalOperator()) {
for (ExpressionNode expressionNode : currentNode) {
transform(expressionNode, currentNode, operationObject, context);
}
return operationObject;
}
Object leftResult = transform(currentNode.getLeft(), currentNode, operationObject, context);
if (currentNode.isUnaryMinus()) {
@@ -271,7 +290,8 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
return nextDbObject;
}
private Object convertUnaryMinusOp(ExpressionTransformationContextSupport<OperatorNode> context, Object leftResult) {
private Object convertUnaryMinusOp(ExpressionTransformationContextSupport<OperatorNode> context,
Object leftResult) {
Object result = leftResult instanceof Number ? leftResult
: new BasicDBObject("$multiply", dbList(-1, leftResult));
@@ -289,7 +309,7 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
*/
@Override
protected boolean supports(ExpressionNode node) {
return node.isMathematicalOperation();
return node.isMathematicalOperation() || node.isLogicalOperator();
}
}
@@ -462,13 +482,33 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
protected Object convert(AggregationExpressionTransformationContext<MethodReferenceNode> context) {
MethodReferenceNode node = context.getCurrentNode();
List<Object> args = new ArrayList<Object>();
AggregationMethodReference methodReference = node.getMethodReference();
for (ExpressionNode childNode : node) {
args.add(transform(childNode, context));
Object args = null;
if (ObjectUtils.nullSafeEquals(methodReference.getArgumentType(), ArgumentType.SINGLE)) {
args = transform(node.getChild(0), context);
} else if (ObjectUtils.nullSafeEquals(methodReference.getArgumentType(), ArgumentType.MAP)) {
DBObject dbo = new BasicDBObject();
int i = 0;
for(ExpressionNode child : node) {
dbo.put(methodReference.getArgumentMap()[i++], transform(child, context));
}
args = dbo;
} else {
List<Object> argList = new ArrayList<Object>();
for (ExpressionNode childNode : node) {
argList.add(transform(childNode, context));
}
args = dbList(argList.toArray());
}
return context.addToPreviousOrReturn(new BasicDBObject(node.getMethodName(), dbList(args.toArray())));
return context.addToPreviousOrReturn(new BasicDBObject(methodReference.getMongoOperator(), args));
}
}
@@ -510,4 +550,81 @@ class SpelExpressionTransformer implements AggregationExpressionTransformer {
return node.isOfType(CompoundExpression.class);
}
}
/**
* @author Christoph Strobl
* @since 1.10
*/
static class NotOperatorNodeConversion extends ExpressionNodeConversion<NotOperatorNode> {
/**
* Creates a new {@link ExpressionNodeConversion}.
*
* @param transformer must not be {@literal null}.
*/
public NotOperatorNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
*/
@Override
protected Object convert(AggregationExpressionTransformationContext<NotOperatorNode> context) {
NotOperatorNode node = context.getCurrentNode();
List<Object> args = new ArrayList<Object>();
for (ExpressionNode childNode : node) {
args.add(transform(childNode, context));
}
return context.addToPreviousOrReturn(new BasicDBObject(node.getMongoOperator(), dbList(args.toArray())));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
*/
@Override
protected boolean supports(ExpressionNode node) {
return node.isOfType(OperatorNot.class);
}
}
/**
* @author Christoph Strobl
* @since 1.10
*/
static class ValueRetrievingNodeConversion extends ExpressionNodeConversion<ExpressionNode> {
/**
* Creates a new {@link ExpressionNodeConversion}.
*
* @param transformer must not be {@literal null}.
*/
public ValueRetrievingNodeConversion(AggregationExpressionTransformer transformer) {
super(transformer);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.SpelNodeWrapper#convertSpelNodeToMongoObjectExpression(org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.ExpressionConversionContext)
*/
@Override
protected Object convert(AggregationExpressionTransformationContext<ExpressionNode> context) {
return context.getCurrentNode().getValue();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.SpelExpressionTransformer.NodeConversion#supports(org.springframework.data.mongodb.core.spel.ExpressionNode)
*/
@Override
protected boolean supports(ExpressionNode node) {
return node.isOfType(InlineMap.class) || node.isOfType(InlineList.class)
|| node.isOfType(ConstructorReference.class);
}
}
}
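
The added conversions come into play when a SpEL expression is rendered into an aggregation projection. A hedged sketch, assuming ProjectionOperation.andExpression(..) delegates to this transformer; the field names are illustrative:

Aggregation aggregation = Aggregation.newAggregation(
    Aggregation.project()
        // logical operators are now rendered via $and / $or, negation via $not
        .andExpression("active && !archived").as("visible"));

// roughly renders:
// { "$project": { "visible": { "$and": [ "$active", { "$not": [ "$archived" ] } ] } } }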

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,7 @@ import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -34,6 +35,7 @@ import com.mongodb.DBObject;
* property references into document field names.
*
* @author Oliver Gierke
* @author Mark Paluch
* @since 1.3
*/
public class TypeBasedAggregationOperationContext implements AggregationOperationContext {
@@ -95,9 +97,9 @@ public class TypeBasedAggregationOperationContext implements AggregationOperatio
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(
field.getTarget(), type);
Field mappedField = field(propertyPath.getLeafProperty().getName(),
Field mappedField = field(field.getName(),
propertyPath.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE));
return new FieldReference(new ExposedField(mappedField, true));
return new DirectFieldReference(new ExposedField(mappedField, true));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,32 +27,241 @@ import com.mongodb.DBObject;
* We recommend using the static factory method {@link Aggregation#unwind(String)} instead of creating instances of
* this class directly.
*
* @see http://docs.mongodb.org/manual/reference/aggregation/unwind/#pipe._S_unwind
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.3
* @see <a href="https://docs.mongodb.com/manual/reference/operator/aggregation/unwind/">MongoDB Aggregation Framework: $unwind</a>
*/
public class UnwindOperation implements AggregationOperation {
public class UnwindOperation
implements AggregationOperation, FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation {
private final ExposedField field;
private final ExposedField arrayIndex;
private final boolean preserveNullAndEmptyArrays;
/**
* Creates a new {@link UnwindOperation} for the given {@link Field}.
*
*
* @param field must not be {@literal null}.
*/
public UnwindOperation(Field field) {
Assert.notNull(field);
this.field = new ExposedField(field, true);
this(new ExposedField(field, true), false);
}
/*
/**
* Creates a new {@link UnwindOperation} using Mongo 3.2 syntax.
*
* @param field must not be {@literal null}.
* @param preserveNullAndEmptyArrays {@literal true} to output the document if the path is {@literal null}, missing, or the
* array is empty.
* @since 1.10
*/
public UnwindOperation(Field field, boolean preserveNullAndEmptyArrays) {
Assert.notNull(field, "Field must not be null!");
this.field = new ExposedField(field, true);
this.arrayIndex = null;
this.preserveNullAndEmptyArrays = preserveNullAndEmptyArrays;
}
/**
* Creates a new {@link UnwindOperation} using Mongo 3.2 syntax.
*
* @param field must not be {@literal null}.
* @param arrayIndex optional field name to expose the field array index, must not be {@literal null}.
* @param preserveNullAndEmptyArrays {@literal true} to output the document if the path is {@literal null}, missing, or the
* array is empty.
* @since 1.10
*/
public UnwindOperation(Field field, Field arrayIndex, boolean preserveNullAndEmptyArrays) {
Assert.notNull(field, "Field must not be null!");
Assert.notNull(arrayIndex, "ArrayIndex must not be null!");
this.field = new ExposedField(field, true);
this.arrayIndex = new ExposedField(arrayIndex, true);
this.preserveNullAndEmptyArrays = preserveNullAndEmptyArrays;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject("$unwind", context.getReference(field).toString());
String path = context.getReference(field).toString();
if (!preserveNullAndEmptyArrays && arrayIndex == null) {
return new BasicDBObject("$unwind", path);
}
DBObject unwindArgs = new BasicDBObject();
unwindArgs.put("path", path);
if (arrayIndex != null) {
unwindArgs.put("includeArrayIndex", arrayIndex.getName());
}
unwindArgs.put("preserveNullAndEmptyArrays", preserveNullAndEmptyArrays);
return new BasicDBObject("$unwind", unwindArgs);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
public ExposedFields getFields() {
return arrayIndex != null ? ExposedFields.from(arrayIndex) : ExposedFields.from();
}
/**
* Get a builder that allows creation of {@link UnwindOperation}.
*
* @return
* @since 1.10
*/
public static PathBuilder newUnwind() {
return UnwindOperationBuilder.newBuilder();
}
/**
* @author Mark Paluch
* @since 1.10
*/
public static interface PathBuilder {
/**
* @param path the path to unwind, must not be {@literal null} or empty.
* @return
*/
IndexBuilder path(String path);
}
/**
* @author Mark Paluch
* @since 1.10
*/
public static interface IndexBuilder {
/**
* Exposes the array index as {@code field}.
*
* @param field field name to expose the field array index, must not be {@literal null} or empty.
* @return
*/
EmptyArraysBuilder arrayIndex(String field);
/**
* Do not expose the array index.
*
* @return
*/
EmptyArraysBuilder noArrayIndex();
}
public static interface EmptyArraysBuilder {
/**
* Output documents if the array is null or empty.
*
* @return
*/
UnwindOperation preserveNullAndEmptyArrays();
/**
* Do not output documents if the array is null or empty.
*
* @return
*/
UnwindOperation skipNullAndEmptyArrays();
}
/**
* Builder for fluent {@link UnwindOperation} creation.
*
* @author Mark Paluch
* @since 1.10
*/
public static final class UnwindOperationBuilder implements PathBuilder, IndexBuilder, EmptyArraysBuilder {
private Field field;
private Field arrayIndex;
private UnwindOperationBuilder() {}
/**
* Creates new builder for {@link UnwindOperation}.
*
* @return never {@literal null}.
*/
public static PathBuilder newBuilder() {
return new UnwindOperationBuilder();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.UnwindOperation.EmptyArraysBuilder#preserveNullAndEmptyArrays()
*/
@Override
public UnwindOperation preserveNullAndEmptyArrays() {
if (arrayIndex != null) {
return new UnwindOperation(field, arrayIndex, true);
}
return new UnwindOperation(field, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.UnwindOperation.EmptyArraysBuilder#skipNullAndEmptyArrays()
*/
@Override
public UnwindOperation skipNullAndEmptyArrays() {
if (arrayIndex != null) {
return new UnwindOperation(field, arrayIndex, false);
}
return new UnwindOperation(field, false);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.UnwindOperation.IndexBuilder#arrayIndex(java.lang.String)
*/
@Override
public EmptyArraysBuilder arrayIndex(String field) {
Assert.hasText(field, "'ArrayIndex' must not be null or empty!");
arrayIndex = Fields.field(field);
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.UnwindOperation.IndexBuilder#noArrayIndex()
*/
@Override
public EmptyArraysBuilder noArrayIndex() {
arrayIndex = null;
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.UnwindOperation.PathBuilder#path(java.lang.String)
*/
@Override
public UnwindOperationBuilder path(String path) {
Assert.hasText(path, "'Path' must not be null or empty!");
field = Fields.field(path);
return this;
}
}
}
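
A short usage sketch of the fluent builder introduced above; the field names are illustrative:

UnwindOperation unwind = UnwindOperation.newUnwind()
    .path("items")                    // array field to unwind
    .arrayIndex("itemIndex")          // expose the element position as "itemIndex"
    .preserveNullAndEmptyArrays();    // keep documents whose array is null, missing or empty

// roughly renders:
// { "$unwind": { "path": "$items", "includeArrayIndex": "itemIndex", "preserveNullAndEmptyArrays": true } }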

View File

@@ -0,0 +1,391 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.VariableOperators.Let.ExpressionVariable;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Gateway to {@literal variable} aggregation operations.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.10
*/
public class VariableOperators {
/**
* Starts building new {@link Map} that applies an {@link AggregationExpression} to each item of a referenced array
* and returns an array with the applied results.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Map.AsBuilder mapItemsOf(String fieldReference) {
return Map.itemsOf(fieldReference);
}
/**
* Starts building new {@link Map} that applies an {@link AggregationExpression} to each item of a referenced array
* and returns an array with the applied results.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Map.AsBuilder mapItemsOf(AggregationExpression expression) {
return Map.itemsOf(expression);
}
/**
* Start creating new {@link Let} that allows definition of {@link ExpressionVariable} that can be used within a
* nested {@link AggregationExpression}.
*
* @param variables must not be {@literal null}.
* @return
*/
public static Let.LetBuilder define(ExpressionVariable... variables) {
return Let.define(variables);
}
/**
* Start creating new {@link Let} that allows definition of {@link ExpressionVariable} that can be used within a
* nested {@link AggregationExpression}.
*
* @param variables must not be {@literal null}.
* @return
*/
public static Let.LetBuilder define(Collection<ExpressionVariable> variables) {
return Let.define(variables);
}
/**
* {@link AggregationExpression} for {@code $map}.
*/
public static class Map implements AggregationExpression {
private Object sourceArray;
private String itemVariableName;
private AggregationExpression functionToApply;
private Map(Object sourceArray, String itemVariableName, AggregationExpression functionToApply) {
Assert.notNull(sourceArray, "SourceArray must not be null!");
Assert.notNull(itemVariableName, "ItemVariableName must not be null!");
Assert.notNull(functionToApply, "FunctionToApply must not be null!");
this.sourceArray = sourceArray;
this.itemVariableName = itemVariableName;
this.functionToApply = functionToApply;
}
/**
* Starts building new {@link Map} that applies an {@link AggregationExpression} to each item of a referenced array
* and returns an array with the applied results.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static AsBuilder itemsOf(final String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new AsBuilder() {
@Override
public FunctionBuilder as(final String variableName) {
Assert.notNull(variableName, "VariableName must not be null!");
return new FunctionBuilder() {
@Override
public Map andApply(final AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression must not be null!");
return new Map(Fields.field(fieldReference), variableName, expression);
}
};
}
};
}
/**
* Starts building new {@link Map} that applies an {@link AggregationExpression} to each item of a referenced array
* and returns an array with the applied results.
*
* @param source must not be {@literal null}.
* @return
*/
public static AsBuilder itemsOf(final AggregationExpression source) {
Assert.notNull(source, "AggregationExpression must not be null!");
return new AsBuilder() {
@Override
public FunctionBuilder as(final String variableName) {
Assert.notNull(variableName, "VariableName must not be null!");
return new FunctionBuilder() {
@Override
public Map andApply(final AggregationExpression expression) {
Assert.notNull(expression, "AggregationExpression must not be null!");
return new Map(source, variableName, expression);
}
};
}
};
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(final AggregationOperationContext context) {
return toMap(ExposedFields.synthetic(Fields.fields(itemVariableName)), context);
}
private DBObject toMap(ExposedFields exposedFields, AggregationOperationContext context) {
BasicDBObject map = new BasicDBObject();
InheritingExposedFieldsAggregationOperationContext operationContext = new InheritingExposedFieldsAggregationOperationContext(
exposedFields, context);
BasicDBObject input;
if (sourceArray instanceof Field) {
input = new BasicDBObject("input", context.getReference((Field) sourceArray).toString());
} else {
input = new BasicDBObject("input", ((AggregationExpression) sourceArray).toDbObject(context));
}
map.putAll(context.getMappedObject(input));
map.put("as", itemVariableName);
map.put("in",
functionToApply.toDbObject(new NestedDelegatingExpressionAggregationOperationContext(operationContext)));
return new BasicDBObject("$map", map);
}
public interface AsBuilder {
/**
* Define the {@literal variableName} for addressing items within the array.
*
* @param variableName must not be {@literal null}.
* @return
*/
FunctionBuilder as(String variableName);
}
public interface FunctionBuilder {
/**
* Creates new {@link Map} that applies the given {@link AggregationExpression} to each item of the referenced
* array and returns an array with the applied results.
*
* @param expression must not be {@literal null}.
* @return
*/
Map andApply(AggregationExpression expression);
}
}
/**
* {@link AggregationExpression} for {@code $let} that binds {@link AggregationExpression} to variables for use in the
* specified {@code in} expression, and returns the result of the expression.
*
* @author Christoph Strobl
* @since 1.10
*/
public static class Let implements AggregationExpression {
private final List<ExpressionVariable> vars;
private final AggregationExpression expression;
private Let(List<ExpressionVariable> vars, AggregationExpression expression) {
this.vars = vars;
this.expression = expression;
}
/**
* Start creating new {@link Let} by defining the variables for {@code $vars}.
*
* @param variables must not be {@literal null}.
* @return
*/
public static LetBuilder define(final Collection<ExpressionVariable> variables) {
Assert.notNull(variables, "Variables must not be null!");
return new LetBuilder() {
@Override
public Let andApply(final AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Let(new ArrayList<ExpressionVariable>(variables), expression);
}
};
}
/**
* Start creating new {@link Let} by defining the variables for {@code $vars}.
*
* @param variables must not be {@literal null}.
* @return
*/
public static LetBuilder define(final ExpressionVariable... variables) {
Assert.notNull(variables, "Variables must not be null!");
return new LetBuilder() {
@Override
public Let andApply(final AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Let(Arrays.asList(variables), expression);
}
};
}
public interface LetBuilder {
/**
* Define the {@link AggregationExpression} to evaluate.
*
* @param expression must not be {@literal null}.
* @return
*/
Let andApply(AggregationExpression expression);
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(final AggregationOperationContext context) {
return toLet(ExposedFields.synthetic(Fields.fields(getVariableNames())), context);
}
private String[] getVariableNames() {
String[] varNames = new String[this.vars.size()];
for (int i = 0; i < this.vars.size(); i++) {
varNames[i] = this.vars.get(i).variableName;
}
return varNames;
}
private DBObject toLet(ExposedFields exposedFields, AggregationOperationContext context) {
DBObject letExpression = new BasicDBObject();
DBObject mappedVars = new BasicDBObject();
InheritingExposedFieldsAggregationOperationContext operationContext = new InheritingExposedFieldsAggregationOperationContext(
exposedFields, context);
for (ExpressionVariable var : this.vars) {
mappedVars.putAll(getMappedVariable(var, context));
}
letExpression.put("vars", mappedVars);
letExpression.put("in", getMappedIn(operationContext));
return new BasicDBObject("$let", letExpression);
}
private DBObject getMappedVariable(ExpressionVariable var, AggregationOperationContext context) {
return new BasicDBObject(var.variableName, var.expression instanceof AggregationExpression
? ((AggregationExpression) var.expression).toDbObject(context) : var.expression);
}
private Object getMappedIn(AggregationOperationContext context) {
return expression.toDbObject(new NestedDelegatingExpressionAggregationOperationContext(context));
}
/**
* @author Christoph Strobl
*/
public static class ExpressionVariable {
private final String variableName;
private final Object expression;
/**
* Creates new {@link ExpressionVariable}.
*
* @param variableName can be {@literal null}.
* @param expression can be {@literal null}.
*/
private ExpressionVariable(String variableName, Object expression) {
this.variableName = variableName;
this.expression = expression;
}
/**
* Create a new {@link ExpressionVariable} with given name.
*
* @param variableName must not be {@literal null}.
* @return never {@literal null}.
*/
public static ExpressionVariable newVariable(String variableName) {
Assert.notNull(variableName, "VariableName must not be null!");
return new ExpressionVariable(variableName, null);
}
/**
* Create a new {@link ExpressionVariable} with current name and given {@literal expression}.
*
* @param expression must not be {@literal null}.
* @return never {@literal null}.
*/
public ExpressionVariable forExpression(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new ExpressionVariable(variableName, expression);
}
/**
* Create a new {@link ExpressionVariable} with current name and given {@literal expressionObject}.
*
* @param expressionObject must not be {@literal null}.
* @return never {@literal null}.
*/
public ExpressionVariable forExpression(DBObject expressionObject) {
Assert.notNull(expressionObject, "Expression must not be null!");
return new ExpressionVariable(variableName, expressionObject);
}
}
}
}
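
A minimal sketch of the $map builder above; the field name "quizzes" and the variable name "quiz" are illustrative, and the applied expression is written as an anonymous AggregationExpression so the snippet does not rely on any other operator class (imports from java.util, com.mongodb and this package are assumed):

VariableOperators.Map timesTen = VariableOperators.mapItemsOf("quizzes")
    .as("quiz")
    .andApply(new AggregationExpression() {
      @Override
      public DBObject toDbObject(AggregationOperationContext context) {
        // "$$quiz" addresses the current array element inside the $map
        return new BasicDBObject("$multiply", Arrays.asList("$$quiz", 10));
      }
    });

// roughly renders:
// { "$map": { "input": "$quizzes", "as": "quiz", "in": { "$multiply": [ "$$quiz", 10 ] } } }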

View File

@@ -75,15 +75,13 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
*/
private void initializeConverters() {
if (!conversionService.canConvert(ObjectId.class, String.class)) {
conversionService.addConverter(ObjectIdToStringConverter.INSTANCE);
}
if (!conversionService.canConvert(String.class, ObjectId.class)) {
conversionService.addConverter(StringToObjectIdConverter.INSTANCE);
}
conversionService.addConverter(ObjectIdToStringConverter.INSTANCE);
conversionService.addConverter(StringToObjectIdConverter.INSTANCE);
if (!conversionService.canConvert(ObjectId.class, BigInteger.class)) {
conversionService.addConverter(ObjectIdToBigIntegerConverter.INSTANCE);
}
if (!conversionService.canConvert(BigInteger.class, ObjectId.class)) {
conversionService.addConverter(BigIntegerToObjectIdConverter.INSTANCE);
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -42,16 +42,6 @@ import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.ThreeTenBackPortConverters;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.NamedMongoScriptToDBObjectConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToNamedMongoScriptCoverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.TermToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.URLToStringConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.util.CacheValue;
import org.springframework.util.Assert;
@@ -66,6 +56,7 @@ import org.springframework.util.Assert;
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
*/
public class CustomConversions {
@@ -112,17 +103,7 @@ public class CustomConversions {
// Add user provided converters to make sure they can override the defaults
toRegister.addAll(converters);
toRegister.add(CustomToStringConverter.INSTANCE);
toRegister.add(BigDecimalToStringConverter.INSTANCE);
toRegister.add(StringToBigDecimalConverter.INSTANCE);
toRegister.add(BigIntegerToStringConverter.INSTANCE);
toRegister.add(StringToBigIntegerConverter.INSTANCE);
toRegister.add(URLToStringConverter.INSTANCE);
toRegister.add(StringToURLConverter.INSTANCE);
toRegister.add(DBObjectToStringConverter.INSTANCE);
toRegister.add(TermToStringConverter.INSTANCE);
toRegister.add(NamedMongoScriptToDBObjectConverter.INSTANCE);
toRegister.add(DBObjectToNamedMongoScriptCoverter.INSTANCE);
toRegister.addAll(MongoConverters.getConvertersToRegister());
toRegister.addAll(JodaTimeConverters.getConvertersToRegister());
toRegister.addAll(GeoConverters.getConvertersToRegister());
toRegister.addAll(Jsr310Converters.getConvertersToRegister());
@@ -186,14 +167,15 @@ public class CustomConversions {
}
if (!added) {
throw new IllegalArgumentException("Given set contains element that is neither Converter nor ConverterFactory!");
throw new IllegalArgumentException(
"Given set contains element that is neither Converter nor ConverterFactory!");
}
}
}
/**
* Registers a conversion for the given converter. Inspects either generics or the {@link ConvertiblePair}s returned
* by a {@link GenericConverter}.
* Registers a conversion for the given converter. Inspects either generics of {@link Converter} and
* {@link ConverterFactory} or the {@link ConvertiblePair}s returned by a {@link GenericConverter}.
*
* @param converter
*/
@@ -208,6 +190,10 @@ public class CustomConversions {
for (ConvertiblePair pair : genericConverter.getConvertibleTypes()) {
register(new ConverterRegistration(pair, isReading, isWriting));
}
} else if (converter instanceof ConverterFactory) {
Class<?>[] arguments = GenericTypeResolver.resolveTypeArguments(converter.getClass(), ConverterFactory.class);
register(new ConverterRegistration(arguments[0], arguments[1], isReading, isWriting));
} else if (converter instanceof Converter) {
Class<?>[] arguments = GenericTypeResolver.resolveTypeArguments(converter.getClass(), Converter.class);
register(new ConverterRegistration(arguments[0], arguments[1], isReading, isWriting));
@@ -412,6 +398,10 @@ public class CustomConversions {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.GenericConverter#getConvertibleTypes()
*/
public Set<ConvertiblePair> getConvertibleTypes() {
ConvertiblePair localeToString = new ConvertiblePair(Locale.class, String.class);
@@ -420,6 +410,10 @@ public class CustomConversions {
return new HashSet<ConvertiblePair>(Arrays.asList(localeToString, booleanToString));
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.GenericConverter#convert(java.lang.Object, org.springframework.core.convert.TypeDescriptor, org.springframework.core.convert.TypeDescriptor)
*/
public Object convert(Object source, TypeDescriptor sourceType, TypeDescriptor targetType) {
return source.toString();
}
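
With ConverterFactory now handled explicitly, a factory registered through CustomConversions has its source and target types resolved from its generics, just like a plain Converter. A hedged sketch; the factory below mirrors Spring's String-to-Number pattern and is purely illustrative:

class StringToNumberConverterFactory implements ConverterFactory<String, Number> {

  public <T extends Number> Converter<String, T> getConverter(final Class<T> targetType) {
    return new Converter<String, T>() {
      public T convert(String source) {
        // NumberUtils.parseNumber handles Integer, Long, BigDecimal, ...
        return NumberUtils.parseNumber(source, targetType);
      }
    };
  }
}

CustomConversions conversions =
    new CustomConversions(Collections.singletonList(new StringToNumberConverterFactory()));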

View File

@@ -75,9 +75,7 @@ class DBObjectAccessor {
String part = parts.next();
if (parts.hasNext()) {
BasicDBObject nestedDbObject = new BasicDBObject();
dbObject.put(part, nestedDbObject);
dbObject = nestedDbObject;
dbObject = getOrCreateNestedDbObject(part, dbObject);
} else {
dbObject.put(part, value);
}
@@ -116,8 +114,48 @@ class DBObjectAccessor {
return result;
}
/**
* Returns whether the underlying {@link DBObject} has a value ({@literal null} or non-{@literal null}) for the given
* {@link MongoPersistentProperty}.
*
* @param property must not be {@literal null}.
* @return
*/
public boolean hasValue(MongoPersistentProperty property) {
Assert.notNull(property, "Property must not be null!");
String fieldName = property.getFieldName();
if (!fieldName.contains(".")) {
return this.dbObject.containsField(fieldName);
}
String[] parts = fieldName.split("\\.");
Map<String, Object> source = this.dbObject;
Object result = null;
for (int i = 1; i < parts.length; i++) {
result = source.get(parts[i - 1]);
source = getAsMap(result);
if (source == null) {
return false;
}
}
return source.containsKey(parts[parts.length - 1]);
}
/**
* Returns the given source object as a map, i.e. {@link BasicDBObject}s and maps are returned as-is, {@literal null} otherwise.
*
* @param source can be {@literal null}.
* @return
*/
@SuppressWarnings("unchecked")
private Map<String, Object> getAsMap(Object source) {
private static Map<String, Object> getAsMap(Object source) {
if (source instanceof BasicDBObject) {
return (BasicDBObject) source;
@@ -129,4 +167,26 @@ class DBObjectAccessor {
return null;
}
/**
* Returns the {@link DBObject} which either already exists in the given source under the given key, or creates a new
* nested one, registers it with the source and returns it.
*
* @param key must not be {@literal null} or empty.
* @param source must not be {@literal null}.
* @return
*/
private static DBObject getOrCreateNestedDbObject(String key, DBObject source) {
Object existing = source.get(key);
if (existing instanceof BasicDBObject) {
return (BasicDBObject) existing;
}
DBObject nested = new BasicDBObject();
source.put(key, nested);
return nested;
}
}
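
The effect of the new hasValue(..) and getOrCreateNestedDbObject(..) is easiest to see with a dotted field name. A conceptual sketch (DBObjectAccessor itself is package-scoped; the field path "address.city" is illustrative):

DBObject source = new BasicDBObject("address", new BasicDBObject("city", "Linz"));

// hasValue(..) walks the dotted path segment by segment:
//   property mapped to "address.city"  -> true
//   property mapped to "address.zip"   -> false (leaf key missing)
//   property mapped to "contact.email" -> false (intermediate object missing)

// On the write side, putting "address.city" and then "address.zip" now reuses the existing
// nested object instead of replacing it, yielding { "address" : { "city" : "Linz", "zip" : "4020" } }.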

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core.convert;
import java.util.List;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -64,4 +67,16 @@ public interface DbRefResolver {
* @since 1.7
*/
DBObject fetch(DBRef dbRef);
/**
* Loads a given {@link List} of {@link DBRef}s from the data source in one batch. The resulting {@link List} of
* {@link DBObject}s reflects the ordering of the {@link DBRef}s passed in.<br />
* The {@link DBRef} elements in the list must not reference different collections.
*
* @param dbRefs must not be {@literal null}.
* @return never {@literal null}.
* @throws InvalidDataAccessApiUsageException in case not all {@link DBRef}s target the same collection.
* @since 1.10
*/
List<DBObject> bulkFetch(List<DBRef> dbRefs);
}
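
A hedged usage sketch of bulkFetch; dbFactory is assumed to be an existing MongoDbFactory, the collection name and ids are illustrative, and the two-argument DBRef constructor of current driver versions is assumed:

DbRefResolver dbRefResolver = new DefaultDbRefResolver(dbFactory);

List<DBRef> refs = Arrays.asList(
    new DBRef("person", "id-1"),
    new DBRef("person", "id-2"));

// one $in query against the "person" collection; result order follows the order of refs
List<DBObject> persons = dbRefResolver.bulkFetch(refs);

// mixing collections is rejected with an InvalidDataAccessApiUsageException:
// dbRefResolver.bulkFetch(Arrays.asList(new DBRef("person", "id-1"), new DBRef("pet", "id-9")));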

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,10 @@ import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
import org.aopalliance.intercept.MethodInterceptor;
import org.aopalliance.intercept.MethodInvocation;
@@ -31,6 +35,7 @@ import org.springframework.cglib.proxy.Enhancer;
import org.springframework.cglib.proxy.Factory;
import org.springframework.cglib.proxy.MethodProxy;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.LazyLoadingException;
import org.springframework.data.mongodb.MongoDbFactory;
@@ -40,6 +45,9 @@ import org.springframework.objenesis.ObjenesisStd;
import org.springframework.util.Assert;
import org.springframework.util.ReflectionUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DB;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
@@ -109,6 +117,40 @@ public class DefaultDbRefResolver implements DbRefResolver {
return ReflectiveDBRefResolver.fetch(mongoDbFactory, dbRef);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DbRefResolver#bulkFetch(java.util.List)
*/
@Override
public List<DBObject> bulkFetch(List<DBRef> refs) {
Assert.notNull(mongoDbFactory, "Factory must not be null!");
Assert.notNull(refs, "DBRef to fetch must not be null!");
if (refs.isEmpty()) {
return Collections.emptyList();
}
String collection = refs.iterator().next().getCollectionName();
List<Object> ids = new ArrayList<Object>(refs.size());
for (DBRef ref : refs) {
if (!collection.equals(ref.getCollectionName())) {
throw new InvalidDataAccessApiUsageException(
"DBRefs must all target the same collection for bulk fetch operation.");
}
ids.add(ref.getId());
}
DB db = mongoDbFactory.getDb();
List<DBObject> result = db.getCollection(collection)
.find(new BasicDBObjectBuilder().add("_id", new BasicDBObject("$in", ids)).get()).toArray();
Collections.sort(result, new DbRefByReferencePositionComparator(ids));
return result;
}
/**
* Creates a proxy for the given {@link MongoPersistentProperty} using the given {@link DbRefResolverCallback} to
* eventually resolve the value of the property.
@@ -180,8 +222,8 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @author Oliver Gierke
* @author Christoph Strobl
*/
static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
Serializable {
static class LazyLoadingInterceptor
implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor, Serializable {
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD, FINALIZE_METHOD;
@@ -387,11 +429,45 @@ public class DefaultDbRefResolver implements DbRefResolver {
} catch (RuntimeException ex) {
DataAccessException translatedException = this.exceptionTranslator.translateExceptionIfPossible(ex);
throw new LazyLoadingException("Unable to lazily resolve DBRef!", translatedException);
throw new LazyLoadingException("Unable to lazily resolve DBRef!",
translatedException != null ? translatedException : ex);
}
}
return result;
}
}
/**
* {@link Comparator} for sorting {@link DBObject} that have been loaded in random order by a predefined list of
* reference identifiers.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.10
*/
private static class DbRefByReferencePositionComparator implements Comparator<DBObject> {
private final List<Object> reference;
/**
* Creates a new {@link DbRefByReferencePositionComparator} for the given list of reference identifiers.
*
* @param referenceIds must not be {@literal null}.
*/
public DbRefByReferencePositionComparator(List<Object> referenceIds) {
Assert.notNull(referenceIds, "Reference identifiers must not be null!");
this.reference = new ArrayList<Object>(referenceIds);
}
/*
* (non-Javadoc)
* @see java.util.Comparator#compare(java.lang.Object, java.lang.Object)
*/
@Override
public int compare(DBObject o1, DBObject o2) {
return Integer.compare(reference.indexOf(o1.get("_id")), reference.indexOf(o2.get("_id")));
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -45,9 +45,9 @@ import com.mongodb.DBObject;
public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implements MongoTypeMapper {
public static final String DEFAULT_TYPE_KEY = "_class";
@SuppressWarnings("rawtypes")//
@SuppressWarnings("rawtypes") //
private static final TypeInformation<List> LIST_TYPE_INFO = ClassTypeInformation.from(List.class);
@SuppressWarnings("rawtypes")//
@SuppressWarnings("rawtypes") //
private static final TypeInformation<Map> MAP_TYPE_INFO = ClassTypeInformation.from(Map.class);
private final TypeAliasAccessor<DBObject> accessor;
@@ -58,12 +58,12 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
}
public DefaultMongoTypeMapper(String typeKey) {
this(typeKey, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
this(typeKey, Arrays.asList(new SimpleTypeInformationMapper()));
}
public DefaultMongoTypeMapper(String typeKey, MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext) {
this(typeKey, new DBObjectTypeAliasAccessor(typeKey), mappingContext, Arrays
.asList(SimpleTypeInformationMapper.INSTANCE));
this(typeKey, new DBObjectTypeAliasAccessor(typeKey), mappingContext,
Arrays.asList(new SimpleTypeInformationMapper()));
}
public DefaultMongoTypeMapper(String typeKey, List<? extends TypeInformationMapper> mappers) {
@@ -71,7 +71,8 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
}
private DefaultMongoTypeMapper(String typeKey, TypeAliasAccessor<DBObject> accessor,
MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext, List<? extends TypeInformationMapper> mappers) {
MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext,
List<? extends TypeInformationMapper> mappers) {
super(accessor, mappingContext, mappers);

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014-2015 the original author or authors.
* Copyright 2014-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -111,9 +111,17 @@ abstract class GeoConverters {
@Override
public Point convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(source.keySet().size() == 2, "Source must contain 2 elements");
return source == null ? null : new Point((Double) source.get("x"), (Double) source.get("y"));
if (source.containsField("type")) {
return DbObjectToGeoJsonPointConverter.INSTANCE.convert(source);
}
return new Point((Double) source.get("x"), (Double) source.get("y"));
}
}
@@ -591,7 +599,7 @@ abstract class GeoConverters {
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "Point"),
String.format("Cannot convert type '%s' to Point.", source.get("type")));
List<Double> dbl = (List<Double>) source.get("coordinates");
List<Number> dbl = (List<Number>) source.get("coordinates");
return new GeoJsonPoint(dbl.get(0).doubleValue(), dbl.get(1).doubleValue());
}
}
@@ -824,7 +832,7 @@ abstract class GeoConverters {
Assert.isInstanceOf(List.class, point);
List<Double> coordinatesList = (List<Double>) point;
List<Number> coordinatesList = (List<Number>) point;
points.add(new GeoJsonPoint(coordinatesList.get(0).doubleValue(), coordinatesList.get(1).doubleValue()));
}
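
Relaxing the element type to Number means GeoJSON coordinates stored as integers now convert as well. A conceptual sketch of the accepted document shape; the values are illustrative:

// { "type" : "Point", "coordinates" : [ 40, 5 ] } -- integer coordinates would previously fail at runtime
DBObject geoJsonPoint = new BasicDBObject("type", "Point")
    .append("coordinates", Arrays.asList(40, 5));

// now converts via Number.doubleValue(), yielding new GeoJsonPoint(40.0, 5.0)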

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 by the original author(s).
* Copyright 2011-2017 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,9 +20,11 @@ import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -53,6 +55,9 @@ import org.springframework.data.mapping.model.SpELExpressionParameterValueProvid
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.event.AfterConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.AfterLoadEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.spel.standard.SpelExpressionParser;
@@ -74,6 +79,7 @@ import com.mongodb.DBRef;
* @author Patrik Wasik
* @author Thomas Darimont
* @author Christoph Strobl
* @author Jordi Llach
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware, ValueResolver {
@@ -95,7 +101,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
/**
* Creates a new {@link MappingMongoConverter} given the new {@link DbRefResolver} and {@link MappingContext}.
*
* @param mongoDbFactory must not be {@literal null}.
* @param dbRefResolver must not be {@literal null}.
* @param mappingContext must not be {@literal null}.
*/
public MappingMongoConverter(DbRefResolver dbRefResolver,
@@ -258,13 +264,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
// make sure id property is set before all other properties
Object idValue = null;
final DBObjectAccessor dbObjectAccessor = new DBObjectAccessor(dbo);
if (idProperty != null) {
if (idProperty != null && dbObjectAccessor.hasValue(idProperty)) {
idValue = getValueInternal(idProperty, dbo, evaluator, path);
accessor.setProperty(idProperty, idValue);
}
final ObjectPath currentPath = path.push(result, entity, idValue);
final ObjectPath currentPath = path.push(result, entity, idValue != null ? dbObjectAccessor.get(idProperty) : null);
// Set properties not already set in the constructor
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@@ -275,7 +282,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return;
}
if (!dbo.containsField(prop.getFieldName()) || entity.isConstructorArgument(prop)) {
if (entity.isConstructorArgument(prop) || !dbObjectAccessor.hasValue(prop)) {
return;
}
@@ -288,9 +295,9 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
public void doWithAssociation(Association<MongoPersistentProperty> association) {
final MongoPersistentProperty property = association.getInverse();
Object value = dbo.get(property.getFieldName());
Object value = dbObjectAccessor.get(property);
if (value == null) {
if (value == null || entity.isConstructorArgument(property)) {
return;
}
@@ -312,28 +319,28 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#toDBRef(java.lang.Object, org.springframework.data.mongodb.core.mapping.MongoPersistentProperty)
*/
public DBRef toDBRef(Object object, MongoPersistentProperty referingProperty) {
public DBRef toDBRef(Object object, MongoPersistentProperty referringProperty) {
org.springframework.data.mongodb.core.mapping.DBRef annotation = null;
if (referingProperty != null) {
annotation = referingProperty.getDBRef();
if (referringProperty != null) {
annotation = referringProperty.getDBRef();
Assert.isTrue(annotation != null, "The referenced property has to be mapped with @DBRef!");
}
// @see DATAMONGO-913
// DATAMONGO-913
if (object instanceof LazyLoadingProxy) {
return ((LazyLoadingProxy) object).toDBRef();
}
return createDBRef(object, referingProperty);
return createDBRef(object, referringProperty);
}
/**
* Root entry method into write conversion. Adds a type discriminator to the {@link DBObject}. Shouldn't be called for
* nested conversions.
*
* @see org.springframework.data.mongodb.core.core.convert.MongoWriter#write(java.lang.Object, com.mongodb.DBObject)
* @see org.springframework.data.mongodb.core.convert.MongoWriter#write(java.lang.Object, com.mongodb.DBObject)
*/
public void write(final Object obj, final DBObject dbo) {
@@ -877,6 +884,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param path must not be {@literal null}.
* @return the converted {@link Collection} or array, will never be {@literal null}.
*/
@SuppressWarnings({ "rawtypes", "unchecked" })
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue, ObjectPath path) {
Assert.notNull(targetType, "Target type must not be null!");
@@ -884,10 +892,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Class<?> collectionType = targetType.getType();
if (sourceValue.isEmpty()) {
return getPotentiallyConvertedSimpleRead(new HashSet<Object>(), collectionType);
}
TypeInformation<?> componentType = targetType.getComponentType();
Class<?> rawComponentType = componentType == null ? null : componentType.getType();
@@ -895,13 +899,20 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>()
: CollectionFactory.createCollection(collectionType, rawComponentType, sourceValue.size());
for (int i = 0; i < sourceValue.size(); i++) {
if (sourceValue.isEmpty()) {
return getPotentiallyConvertedSimpleRead(items, collectionType);
}
Object dbObjItem = sourceValue.get(i);
if (!DBRef.class.equals(rawComponentType) && isCollectionOfDbRefWhereBulkFetchIsPossible(sourceValue)) {
return bulkReadAndConvertDBRefs((List<DBRef>) (List) (sourceValue), componentType, path, rawComponentType);
}
for (Object dbObjItem : sourceValue) {
if (dbObjItem instanceof DBRef) {
items.add(
DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem), path));
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem
: readAndConvertDBRef((DBRef) dbObjItem, componentType, path, rawComponentType));
} else if (dbObjItem instanceof DBObject) {
items.add(read(componentType, (DBObject) dbObjItem, path));
} else {
@@ -937,6 +948,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Map<Object, Object> map = CollectionFactory.createMap(mapType, rawKeyType, dbObject.keySet().size());
Map<String, Object> sourceMap = dbObject.toMap();
if (!DBRef.class.equals(rawValueType) && isCollectionOfDbRefWhereBulkFetchIsPossible(sourceMap.values())) {
bulkReadAndConvertDBRefMapIntoTarget(valueType, rawValueType, sourceMap, map);
return map;
}
for (Entry<String, Object> entry : sourceMap.entrySet()) {
if (typeMapper.isTypeKey(entry.getKey())) {
continue;
@@ -953,7 +969,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value, path));
} else if (value instanceof DBRef) {
map.put(key, DBRef.class.equals(rawValueType) ? value : read(valueType, readRef((DBRef) value)));
map.put(key, DBRef.class.equals(rawValueType) ? value
: readAndConvertDBRef((DBRef) value, valueType, ObjectPath.ROOT, rawValueType));
} else {
Class<?> valueClass = valueType == null ? null : valueType.getType();
map.put(key, getPotentiallyConvertedSimpleRead(value, valueClass));
@@ -991,20 +1008,32 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (obj instanceof DBObject) {
DBObject newValueDbo = new BasicDBObject();
for (String vk : ((DBObject) obj).keySet()) {
Object o = ((DBObject) obj).get(vk);
newValueDbo.put(vk, convertToMongoType(o, typeHint));
}
return newValueDbo;
}
if (obj instanceof Map) {
DBObject result = new BasicDBObject();
for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue(), typeHint));
Map<Object, Object> converted = new LinkedHashMap<Object, Object>();
for (Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
TypeInformation<? extends Object> valueTypeHint = typeHint != null && typeHint.getMapValueType() != null
? typeHint.getMapValueType() : typeHint;
converted.put(getPotentiallyConvertedSimpleWrite(entry.getKey()).toString(),
convertToMongoType(entry.getValue(), valueTypeHint));
}
return result;
return new BasicDBObject(converted);
}
if (obj.getClass().isArray()) {
@@ -1196,12 +1225,70 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
Object object = dbref == null ? null : path.getPathItem(dbref.getId(), dbref.getCollectionName());
return (T) (object != null ? object : readAndConvertDBRef(dbref, type, path, rawType));
}
if (object != null) {
return (T) object;
private <T> T readAndConvertDBRef(DBRef dbref, TypeInformation<?> type, ObjectPath path, final Class<?> rawType) {
List<T> result = bulkReadAndConvertDBRefs(Collections.singletonList(dbref), type, path, rawType);
return CollectionUtils.isEmpty(result) ? null : result.iterator().next();
}
@SuppressWarnings({ "unchecked", "rawtypes" })
private void bulkReadAndConvertDBRefMapIntoTarget(TypeInformation<?> valueType, Class<?> rawValueType,
Map<String, Object> sourceMap, Map<Object, Object> targetMap) {
LinkedHashMap<String, Object> referenceMap = new LinkedHashMap<String, Object>(sourceMap);
List<Object> convertedObjects = bulkReadAndConvertDBRefs((List<DBRef>) new ArrayList(referenceMap.values()),
valueType, ObjectPath.ROOT, rawValueType);
int index = 0;
for (String key : referenceMap.keySet()) {
targetMap.put(key, convertedObjects.get(index));
index++;
}
}
@SuppressWarnings("unchecked")
private <T> List<T> bulkReadAndConvertDBRefs(List<DBRef> dbrefs, TypeInformation<?> type, ObjectPath path,
final Class<?> rawType) {
if (CollectionUtils.isEmpty(dbrefs)) {
return Collections.emptyList();
}
return (T) (object != null ? object : read(type, readRef(dbref), path));
List<DBObject> referencedRawDocuments = dbrefs.size() == 1
? Collections.singletonList(readRef(dbrefs.iterator().next())) : bulkReadRefs(dbrefs);
String collectionName = dbrefs.iterator().next().getCollectionName();
List<T> targetList = new ArrayList<T>(dbrefs.size());
for (DBObject document : referencedRawDocuments) {
if (document != null) {
maybeEmitEvent(new AfterLoadEvent<T>(document, (Class<T>) rawType, collectionName));
}
final T target = (T) read(type, document, path);
targetList.add(target);
if (target != null) {
maybeEmitEvent(new AfterConvertEvent<T>(document, target, collectionName));
}
}
return targetList;
}
private void maybeEmitEvent(MongoMappingEvent<?> event) {
if (canPublishEvent()) {
this.applicationContext.publishEvent(event);
}
}
private boolean canPublishEvent() {
return this.applicationContext != null;
}
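Because the bulk-resolved DBRef targets are routed through maybeEmitEvent, they take part in the same lifecycle callbacks as regular reads: an AfterLoadEvent per raw document and an AfterConvertEvent per converted object, provided an ApplicationContext is set. A minimal listener sketch, assuming a hypothetical Book target type:

class BookReadListener extends AbstractMongoEventListener<Book> {

  @Override
  public void onAfterLoad(AfterLoadEvent<Book> event) {
    // raw DBObject of each bulk-fetched reference, before conversion
  }

  @Override
  public void onAfterConvert(AfterConvertEvent<Book> event) {
    // fully converted Book instance for each resolved DBRef
  }
}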
/**
@@ -1214,6 +1301,45 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return dbRefResolver.fetch(ref);
}
/**
* Performs a bulk fetch operation for the given {@link DBRef}s.
*
* @param references must not be {@literal null}.
* @return never {@literal null}.
* @since 1.10
*/
List<DBObject> bulkReadRefs(List<DBRef> references) {
return dbRefResolver.bulkFetch(references);
}
/**
* Returns whether the given {@link Iterable} contains {@link DBRef} instances all pointing to the same collection.
*
* @param source must not be {@literal null}.
* @return
*/
private static boolean isCollectionOfDbRefWhereBulkFetchIsPossible(Iterable<Object> source) {
Assert.notNull(source, "Iterable of DBRefs must not be null!");
Set<String> collectionsFound = new HashSet<String>();
for (Object dbObjItem : source) {
if (!(dbObjItem instanceof DBRef)) {
return false;
}
collectionsFound.add(((DBRef) dbObjItem).getCollectionName());
if (collectionsFound.size() > 1) {
return false;
}
}
return true;
}
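With this check in place, a collection or map of @DBRef values that all point to the same MongoDB collection is resolved through a single bulk fetch (dbRefResolver.bulkFetch) instead of one round trip per reference. A sketch of the effect, using illustrative Library/Book types that are not part of this diff:

class Library {

  @Id String id;
  @DBRef List<Book> books;   // every reference targets the "book" collection
}

Library library = mongoTemplate.findById("lib-1", Library.class);
// all book DBRefs are fetched with one query rather than books.size() queries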
/**
* Marker class used to indicate we have a non root document object here that might be used within an update - so we
* need to preserve type hints for potential nested elements but need to remove it on top level.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,16 +19,26 @@ import java.math.BigDecimal;
import java.math.BigInteger;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Currency;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;
import org.bson.types.Code;
import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionFailedException;
import org.springframework.core.convert.TypeDescriptor;
import org.springframework.core.convert.converter.ConditionalConverter;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.ConverterFactory;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.query.Term;
import org.springframework.data.mongodb.core.script.NamedMongoScript;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
@@ -49,6 +59,36 @@ abstract class MongoConverters {
*/
private MongoConverters() {}
/**
* Returns the converters to be registered.
*
* @return
* @since 1.9
*/
public static Collection<Object> getConvertersToRegister() {
List<Object> converters = new ArrayList<Object>();
converters.add(BigDecimalToStringConverter.INSTANCE);
converters.add(StringToBigDecimalConverter.INSTANCE);
converters.add(BigIntegerToStringConverter.INSTANCE);
converters.add(StringToBigIntegerConverter.INSTANCE);
converters.add(URLToStringConverter.INSTANCE);
converters.add(StringToURLConverter.INSTANCE);
converters.add(DBObjectToStringConverter.INSTANCE);
converters.add(TermToStringConverter.INSTANCE);
converters.add(NamedMongoScriptToDBObjectConverter.INSTANCE);
converters.add(DBObjectToNamedMongoScriptCoverter.INSTANCE);
converters.add(CurrencyToStringConverter.INSTANCE);
converters.add(StringToCurrencyConverter.INSTANCE);
converters.add(AtomicIntegerToIntegerConverter.INSTANCE);
converters.add(AtomicLongToLongConverter.INSTANCE);
converters.add(LongToAtomicLongConverter.INSTANCE);
converters.add(IntegerToAtomicIntegerConverter.INSTANCE);
return converters;
}
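These defaults are registered automatically; application-specific converters can still be contributed on top of them. A sketch, assuming a user-defined LocaleToStringConverter that is not part of this change set:

CustomConversions conversions = new CustomConversions(
    Collections.singletonList(LocaleToStringConverter.INSTANCE));

MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mappingContext);
converter.setCustomConversions(conversions);
converter.afterPropertiesSet();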
/**
* Simple singleton to convert {@link ObjectId}s to their {@link String} representation.
*
@@ -228,4 +268,177 @@ abstract class MongoConverters {
return builder.get();
}
}
/**
* {@link Converter} implementation converting {@link Currency} into its ISO 4217 {@link String} representation.
*
* @author Christoph Strobl
* @since 1.9
*/
@WritingConverter
public static enum CurrencyToStringConverter implements Converter<Currency, String> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(Currency source) {
return source == null ? null : source.getCurrencyCode();
}
}
/**
* {@link Converter} implementation converting ISO 4217 {@link String} into {@link Currency}.
*
* @author Christoph Strobl
* @since 1.9
*/
@ReadingConverter
public static enum StringToCurrencyConverter implements Converter<String, Currency> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public Currency convert(String source) {
return StringUtils.hasText(source) ? Currency.getInstance(source) : null;
}
}
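With the two converters above, a java.util.Currency property round-trips as its ISO 4217 code. Illustrative sketch (the Price class is an assumption, not part of this diff):

class Price {

  @Id String id;
  Currency currency = Currency.getInstance("EUR");
}

mongoTemplate.save(new Price());

is stored as:

{ "_id" : ..., "currency" : "EUR" }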
/**
* {@link ConverterFactory} implementation using {@link NumberUtils} for number conversion and parsing. Additionally
* deals with {@link AtomicInteger} and {@link AtomicLong} by calling {@code get()} before performing the actual
* conversion.
*
* @author Christoph Strobl
* @since 1.9
*/
@WritingConverter
public static enum NumberToNumberConverterFactory implements ConverterFactory<Number, Number>,ConditionalConverter {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.ConverterFactory#getConverter(java.lang.Class)
*/
@Override
public <T extends Number> Converter<Number, T> getConverter(Class<T> targetType) {
return new NumberToNumberConverter<T>(targetType);
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.ConditionalConverter#matches(org.springframework.core.convert.TypeDescriptor, org.springframework.core.convert.TypeDescriptor)
*/
@Override
public boolean matches(TypeDescriptor sourceType, TypeDescriptor targetType) {
return !sourceType.equals(targetType);
}
private final static class NumberToNumberConverter<T extends Number> implements Converter<Number, T> {
private final Class<T> targetType;
/**
* Creates a new {@link NumberToNumberConverter} for the given target type.
*
* @param targetType must not be {@literal null}.
*/
public NumberToNumberConverter(Class<T> targetType) {
Assert.notNull(targetType, "Target type must not be null!");
this.targetType = targetType;
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public T convert(Number source) {
if (source instanceof AtomicInteger) {
return NumberUtils.convertNumberToTargetClass(((AtomicInteger) source).get(), this.targetType);
}
if (source instanceof AtomicLong) {
return NumberUtils.convertNumberToTargetClass(((AtomicLong) source).get(), this.targetType);
}
return NumberUtils.convertNumberToTargetClass(source, this.targetType);
}
}
}
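The factory unwraps the atomic types via get() before delegating to NumberUtils, so any Number source can be converted to any Number target. For example (values illustrative):

Converter<Number, Long> toLong = NumberToNumberConverterFactory.INSTANCE.getConverter(Long.class);
Long value = toLong.convert(new AtomicInteger(42)); // 42L, converted via AtomicInteger.get()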
/**
* {@link ConverterFactory} implementation converting {@link AtomicLong} into {@link Long}.
*
* @author Christoph Strobl
* @since 1.10
*/
@WritingConverter
public static enum AtomicLongToLongConverter implements Converter<AtomicLong, Long> {
INSTANCE;
@Override
public Long convert(AtomicLong source) {
return NumberUtils.convertNumberToTargetClass(source, Long.class);
}
}
/**
* {@link ConverterFactory} implementation converting {@link AtomicInteger} into {@link Integer}.
*
* @author Christoph Strobl
* @since 1.10
*/
@WritingConverter
public static enum AtomicIntegerToIntegerConverter implements Converter<AtomicInteger, Integer> {
INSTANCE;
@Override
public Integer convert(AtomicInteger source) {
return NumberUtils.convertNumberToTargetClass(source, Integer.class);
}
}
/**
* {@link ConverterFactory} implementation converting {@link Long} into {@link AtomicLong}.
*
* @author Christoph Strobl
* @since 1.10
*/
@ReadingConverter
public static enum LongToAtomicLongConverter implements Converter<Long, AtomicLong> {
INSTANCE;
@Override
public AtomicLong convert(Long source) {
return source != null ? new AtomicLong(source) : null;
}
}
/**
* {@link ConverterFactory} implementation converting {@link Integer} into {@link AtomicInteger}.
*
* @author Christoph Strobl
* @since 1.10
*/
@ReadingConverter
public static enum IntegerToAtomicIntegerConverter implements Converter<Integer, AtomicInteger> {
INSTANCE;
@Override
public AtomicInteger convert(Integer source) {
return source != null ? new AtomicInteger(source) : null;
}
}
}
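Taken together, the atomic converters let an entity expose AtomicInteger/AtomicLong fields while plain numbers are persisted. Illustrative sketch (the Counter class is an assumption, not part of this diff):

class Counter {

  @Id String id;
  AtomicLong hits = new AtomicLong(5);
}

mongoTemplate.save(new Counter());

is stored as:

{ "_id" : ..., "hits" : NumberLong(5) }

and reading the document back converts the stored number into an AtomicLong again.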

View File

@@ -0,0 +1,275 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;
import java.util.Stack;
import java.util.regex.Pattern;
import org.springframework.data.domain.Example;
import org.springframework.data.domain.ExampleMatcher.NullHandler;
import org.springframework.data.domain.ExampleMatcher.PropertyValueTransformer;
import org.springframework.data.domain.ExampleMatcher.StringMatcher;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.MongoRegexCreator;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.data.repository.core.support.ExampleMatcherAccessor;
import org.springframework.data.repository.query.parser.Part.Type;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Christoph Strobl
* @author Mark Paluch
* @since 1.8
*/
public class MongoExampleMapper {
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final MongoConverter converter;
private final Map<StringMatcher, Type> stringMatcherPartMapping = new HashMap<StringMatcher, Type>();
public MongoExampleMapper(MongoConverter converter) {
this.converter = converter;
this.mappingContext = converter.getMappingContext();
stringMatcherPartMapping.put(StringMatcher.EXACT, Type.SIMPLE_PROPERTY);
stringMatcherPartMapping.put(StringMatcher.CONTAINING, Type.CONTAINING);
stringMatcherPartMapping.put(StringMatcher.STARTING, Type.STARTING_WITH);
stringMatcherPartMapping.put(StringMatcher.ENDING, Type.ENDING_WITH);
stringMatcherPartMapping.put(StringMatcher.REGEX, Type.REGEX);
}
/**
* Returns the given {@link Example} as {@link DBObject} holding matching values extracted from
* {@link Example#getProbe()}.
*
* @param example must not be {@literal null}.
* @return
*/
public DBObject getMappedExample(Example<?> example) {
Assert.notNull(example, "Example must not be null!");
return getMappedExample(example, mappingContext.getPersistentEntity(example.getProbeType()));
}
/**
* Returns the given {@link Example} as {@link DBObject} holding matching values extracted from
* {@link Example#getProbe()}.
*
* @param example must not be {@literal null}.
* @param entity must not be {@literal null}.
* @return
*/
public DBObject getMappedExample(Example<?> example, MongoPersistentEntity<?> entity) {
Assert.notNull(example, "Example must not be null!");
Assert.notNull(entity, "MongoPersistentEntity must not be null!");
DBObject reference = (DBObject) converter.convertToMongoType(example.getProbe());
if (entity.hasIdProperty() && entity.getIdentifierAccessor(example.getProbe()).getIdentifier() == null) {
reference.removeField(entity.getIdProperty().getFieldName());
}
ExampleMatcherAccessor matcherAccessor = new ExampleMatcherAccessor(example.getMatcher());
applyPropertySpecs("", reference, example.getProbeType(), matcherAccessor);
DBObject flattened = ObjectUtils.nullSafeEquals(NullHandler.INCLUDE, matcherAccessor.getNullHandler()) ? reference
: new BasicDBObject(SerializationUtils.flattenMap(reference));
DBObject result = example.getMatcher().isAllMatching() ? flattened : orConcatenate(flattened);
this.converter.getTypeMapper().writeTypeRestrictions(result, getTypesToMatch(example));
return result;
}
private static DBObject orConcatenate(DBObject source) {
List<DBObject> orExpressions = new ArrayList<DBObject>(source.keySet().size());
for (String key : source.keySet()) {
orExpressions.add(new BasicDBObject(key, source.get(key)));
}
return new BasicDBObject("$or", orExpressions);
}
private Set<Class<?>> getTypesToMatch(Example<?> example) {
Set<Class<?>> types = new HashSet<Class<?>>();
for (TypeInformation<?> reference : mappingContext.getManagedTypes()) {
if (example.getProbeType().isAssignableFrom(reference.getType())) {
types.add(reference.getType());
}
}
return types;
}
private String getMappedPropertyPath(String path, Class<?> probeType) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(probeType);
Iterator<String> parts = Arrays.asList(path.split("\\.")).iterator();
final Stack<MongoPersistentProperty> stack = new Stack<MongoPersistentProperty>();
List<String> resultParts = new ArrayList<String>();
while (parts.hasNext()) {
final String part = parts.next();
MongoPersistentProperty prop = entity.getPersistentProperty(part);
if (prop == null) {
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@Override
public void doWithPersistentProperty(MongoPersistentProperty property) {
if (property.getFieldName().equals(part)) {
stack.push(property);
}
}
});
if (stack.isEmpty()) {
return "";
}
prop = stack.pop();
}
resultParts.add(prop.getName());
if (prop.isEntity() && mappingContext.hasPersistentEntityFor(prop.getActualType())) {
entity = mappingContext.getPersistentEntity(prop.getActualType());
} else {
break;
}
}
return StringUtils.collectionToDelimitedString(resultParts, ".");
}
private void applyPropertySpecs(String path, DBObject source, Class<?> probeType,
ExampleMatcherAccessor exampleSpecAccessor) {
if (!(source instanceof BasicDBObject)) {
return;
}
Iterator<Map.Entry<String, Object>> iter = ((BasicDBObject) source).entrySet().iterator();
while (iter.hasNext()) {
Map.Entry<String, Object> entry = iter.next();
String propertyPath = StringUtils.hasText(path) ? path + "." + entry.getKey() : entry.getKey();
String mappedPropertyPath = getMappedPropertyPath(propertyPath, probeType);
if (isEmptyIdProperty(entry)) {
iter.remove();
continue;
}
if (exampleSpecAccessor.isIgnoredPath(propertyPath) || exampleSpecAccessor.isIgnoredPath(mappedPropertyPath)) {
iter.remove();
continue;
}
StringMatcher stringMatcher = exampleSpecAccessor.getDefaultStringMatcher();
Object value = entry.getValue();
boolean ignoreCase = exampleSpecAccessor.isIgnoreCaseEnabled();
if (exampleSpecAccessor.hasPropertySpecifiers()) {
mappedPropertyPath = exampleSpecAccessor.hasPropertySpecifier(propertyPath) ? propertyPath
: getMappedPropertyPath(propertyPath, probeType);
stringMatcher = exampleSpecAccessor.getStringMatcherForPath(mappedPropertyPath);
ignoreCase = exampleSpecAccessor.isIgnoreCaseForPath(mappedPropertyPath);
}
// TODO: should a PropertySpecifier take precedence over the string matching applied later on?
if (exampleSpecAccessor.hasPropertySpecifier(mappedPropertyPath)) {
PropertyValueTransformer valueTransformer = exampleSpecAccessor.getValueTransformerForPath(mappedPropertyPath);
value = valueTransformer.convert(value);
if (value == null) {
iter.remove();
continue;
}
entry.setValue(value);
}
if (entry.getValue() instanceof String) {
applyStringMatcher(entry, stringMatcher, ignoreCase);
} else if (entry.getValue() instanceof BasicDBObject) {
applyPropertySpecs(propertyPath, (BasicDBObject) entry.getValue(), probeType, exampleSpecAccessor);
}
}
}
private boolean isEmptyIdProperty(Entry<String, Object> entry) {
return entry.getKey().equals("_id") && entry.getValue() == null;
}
private void applyStringMatcher(Map.Entry<String, Object> entry, StringMatcher stringMatcher, boolean ignoreCase) {
BasicDBObject dbo = new BasicDBObject();
if (ObjectUtils.nullSafeEquals(StringMatcher.DEFAULT, stringMatcher)) {
if (ignoreCase) {
dbo.put("$regex", Pattern.quote((String) entry.getValue()));
entry.setValue(dbo);
}
} else {
Type type = stringMatcherPartMapping.get(stringMatcher);
String expression = MongoRegexCreator.INSTANCE.toRegularExpression((String) entry.getValue(), type);
dbo.put("$regex", expression);
entry.setValue(dbo);
}
if (ignoreCase) {
dbo.put("$options", "i");
}
}
}
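MongoExampleMapper backs the Query by Example support: the probe is converted with the regular MongoConverter, an unset id is dropped, the result is flattened unless null handling is INCLUDE, string matchers become $regex criteria, and type restrictions for the probe type are written. A usage sketch, assuming an illustrative Person probe:

Person probe = new Person();
probe.firstname = "Dave";

Example<Person> example = Example.of(probe,
    ExampleMatcher.matching().withStringMatcher(StringMatcher.STARTING).withIgnoreCase());

DBObject mappedExample = new MongoExampleMapper(converter).getMappedExample(example);

// roughly: { "firstname" : { "$regex" : "^Dave", "$options" : "i" }, "_class" : { "$in" : [ ... ] } }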

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,10 +27,12 @@ import org.bson.types.ObjectId;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.domain.Example;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.InvalidPersistentPropertyPath;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mapping.model.MappingException;
@@ -56,6 +58,7 @@ import com.mongodb.DBRef;
* @author Patryk Wasik
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
*/
public class QueryMapper {
@@ -64,12 +67,13 @@ public class QueryMapper {
static final ClassTypeInformation<?> NESTED_DOCUMENT = ClassTypeInformation.from(NestedDocument.class);
private enum MetaMapping {
FORCE, WHEN_PRESENT, IGNORE;
FORCE, WHEN_PRESENT, IGNORE
}
private final ConversionService conversionService;
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final MongoExampleMapper exampleMapper;
/**
* Creates a new {@link QueryMapper} with the given {@link MongoConverter}.
@@ -83,6 +87,7 @@ public class QueryMapper {
this.conversionService = converter.getConversionService();
this.converter = converter;
this.mappingContext = converter.getMappingContext();
this.exampleMapper = new MongoExampleMapper(converter);
}
/**
@@ -119,10 +124,20 @@ public class QueryMapper {
continue;
}
Field field = createPropertyField(entity, key, mappingContext);
Entry<String, Object> entry = getMappedObjectForField(field, query.get(key));
try {
result.put(entry.getKey(), entry.getValue());
Field field = createPropertyField(entity, key, mappingContext);
Entry<String, Object> entry = getMappedObjectForField(field, query.get(key));
result.put(entry.getKey(), entry.getValue());
} catch (InvalidPersistentPropertyPath invalidPathException) {
// in case the object has not already been mapped
if (!(query.get(key) instanceof DBObject)) {
throw invalidPathException;
}
result.put(key, query.get(key));
}
}
return result;
@@ -239,6 +254,10 @@ public class QueryMapper {
return new BasicDBObject(keyword.getKey(), newConditions);
}
if (keyword.isSample()) {
return exampleMapper.getMappedExample(keyword.<Example<?>> getValue(), entity);
}
return new BasicDBObject(keyword.getKey(), convertSimpleOrDBObject(keyword.getValue(), entity));
}
@@ -298,7 +317,7 @@ public class QueryMapper {
}
if (isNestedKeyword(value)) {
return getMappedKeyword(new Keyword((DBObject) value), null);
return getMappedKeyword(new Keyword((DBObject) value), documentField.getPropertyEntity());
}
if (isAssociationConversionNecessary(documentField, value)) {
@@ -566,6 +585,16 @@ public class QueryMapper {
return "$geometry".equalsIgnoreCase(key);
}
/**
* Returns whether the current keyword indicates a sample object.
*
* @return
* @since 1.8
*/
public boolean isSample() {
return "$sample".equalsIgnoreCase(key);
}
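A $sample keyword carries the Example probe placed there by Criteria.alike(…) and is handed off to the MongoExampleMapper. A sketch reusing the illustrative Person probe, assuming the usual static import of Query.query:

Person probe = new Person();
probe.lastname = "Matthews";

List<Person> result = mongoTemplate.find(
    query(new Criteria().alike(Example.of(probe))), Person.class);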
public boolean hasIterableValue() {
return value instanceof Iterable;
}
@@ -867,7 +896,7 @@ public class QueryMapper {
* @return
*/
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
return PropertyToFieldNameConverter.INSTANCE;
return new PositionParameterRetainingPropertyKeyConverter(name);
}
/**
@@ -881,6 +910,28 @@ public class QueryMapper {
return new AssociationConverter(getAssociation());
}
/**
* @author Christoph Strobl
* @since 1.8
*/
static class PositionParameterRetainingPropertyKeyConverter implements Converter<MongoPersistentProperty, String> {
private final KeyMapper keyMapper;
public PositionParameterRetainingPropertyKeyConverter(String rawKey) {
this.keyMapper = new KeyMapper(rawKey);
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty source) {
return keyMapper.mapPropertyName(source);
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getTypeHint()
@@ -901,6 +952,61 @@ public class QueryMapper {
return NESTED_DOCUMENT;
}
/**
* @author Christoph Strobl
* @since 1.8
*/
static class KeyMapper {
private final Iterator<String> iterator;
public KeyMapper(String key) {
this.iterator = Arrays.asList(key.split("\\.")).iterator();
this.iterator.next();
}
/**
* Maps the property name while retaining potential positional operator {@literal $}.
*
* @param property
* @return
*/
protected String mapPropertyName(MongoPersistentProperty property) {
StringBuilder mappedName = new StringBuilder(PropertyToFieldNameConverter.INSTANCE.convert(property));
boolean inspect = iterator.hasNext();
while (inspect) {
String partial = iterator.next();
boolean isPositional = (isPositionalParameter(partial) && (property.isMap() || property.isCollectionLike()));
if (isPositional) {
mappedName.append(".").append(partial);
}
inspect = isPositional && iterator.hasNext();
}
return mappedName.toString();
}
private static boolean isPositionalParameter(String partial) {
if ("$".equals(partial)) {
return true;
}
try {
Long.valueOf(partial);
return true;
} catch (NumberFormatException e) {
return false;
}
}
}
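KeyMapper translates each dot-separated segment of a raw key to its mapped field name while retaining the positional operator $ and numeric indices for map or collection properties. The effect, sketched with an illustrative Order aggregate whose price property is mapped to the shorter field name "p":

// "items.$.price" is rewritten to "items.$.p"; the positional segment survives the mapping
Query query = query(where("items.quantity").is(2));
Update update = new Update().set("items.$.price", 9.99);

mongoTemplate.updateFirst(query, update, Order.class);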
}
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,22 +15,20 @@
*/
package org.springframework.data.mongodb.core.convert;
import java.util.Arrays;
import java.util.Iterator;
import java.util.Map.Entry;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update.Modifier;
import org.springframework.data.mongodb.core.query.Update.Modifiers;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -41,6 +39,7 @@ import com.mongodb.DBObject;
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
*/
public class UpdateMapper extends QueryMapper {
@@ -66,8 +65,8 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return entity == null ? super.delegateConvertToMongoType(source, null)
: converter.convertToMongoType(source, getTypeHintForEntity(entity));
return converter.convertToMongoType(source,
entity == null ? ClassTypeInformation.OBJECT : getTypeHintForEntity(source, entity));
}
/*
@@ -90,7 +89,7 @@ public class UpdateMapper extends QueryMapper {
return getMappedUpdateModifier(field, rawValue);
}
return super.getMappedObjectForField(field, getMappedValue(field, rawValue));
return super.getMappedObjectForField(field, rawValue);
}
private Entry<String, Object> getMappedUpdateModifier(Field field, Object rawValue) {
@@ -134,25 +133,50 @@ public class UpdateMapper extends QueryMapper {
}
private DBObject getMappedValue(Field field, Modifier modifier) {
return new BasicDBObject(modifier.getKey(), getMappedModifier(field, modifier));
}
private Object getMappedModifier(Field field, Modifier modifier) {
Object value = modifier.getValue();
if (value instanceof Sort) {
DBObject sortObject = getSortObject((Sort) value);
return field == null || field.getPropertyEntity() == null ? sortObject
: getMappedSort(sortObject, field.getPropertyEntity());
}
TypeInformation<?> typeHint = field == null ? ClassTypeInformation.OBJECT : field.getTypeHint();
Object value = converter.convertToMongoType(modifier.getValue(), typeHint);
return new BasicDBObject(modifier.getKey(), value);
return converter.convertToMongoType(value, typeHint);
}
private TypeInformation<?> getTypeHintForEntity(MongoPersistentEntity<?> entity) {
return processTypeHintForNestedDocuments(entity.getTypeInformation());
}
private TypeInformation<?> processTypeHintForNestedDocuments(TypeInformation<?> info) {
private TypeInformation<?> getTypeHintForEntity(Object source, MongoPersistentEntity<?> entity) {
TypeInformation<?> info = entity.getTypeInformation();
Class<?> type = info.getActualType().getType();
if (type.isInterface() || java.lang.reflect.Modifier.isAbstract(type.getModifiers())) {
if (source == null || type.isInterface() || java.lang.reflect.Modifier.isAbstract(type.getModifiers())) {
return info;
}
return NESTED_DOCUMENT;
if (!type.equals(source.getClass())) {
return info;
}
return NESTED_DOCUMENT;
}
public DBObject getSortObject(Sort sort) {
DBObject dbo = new BasicDBObject();
for (Order order : sort) {
dbo.put(order.getProperty(), order.isAscending() ? 1 : -1);
}
return dbo;
}
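getMappedModifier now also maps Sort-based modifiers, so field names inside a $push ... $sort document are translated like any other property reference. A sketch, assuming illustrative Player/Score types where the points property is stored under a shorter field name:

Update update = new Update().push("scores")
    .sort(new Sort(Direction.DESC, "points"))   // "points" is mapped to its field name
    .each(new Score(42));

mongoTemplate.updateFirst(query(where("id").is("player-1")), update, Player.class);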
/*
@@ -211,7 +235,7 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
protected Converter<MongoPersistentProperty, String> getPropertyConverter() {
return new UpdatePropertyConverter(key);
return new PositionParameterRetainingPropertyKeyConverter(key);
}
/*
@@ -223,99 +247,6 @@ public class UpdateMapper extends QueryMapper {
return new UpdateAssociationConverter(getAssociation(), key);
}
/**
* Special mapper handling positional parameter {@literal $} within property names.
*
* @author Christoph Strobl
* @since 1.7
*/
private static class UpdateKeyMapper {
private final Iterator<String> iterator;
protected UpdateKeyMapper(String rawKey) {
Assert.hasText(rawKey, "Key must not be null or empty!");
this.iterator = Arrays.asList(rawKey.split("\\.")).iterator();
this.iterator.next();
}
/**
* Maps the property name while retaining potential positional operator {@literal $}.
*
* @param property
* @return
*/
protected String mapPropertyName(MongoPersistentProperty property) {
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
boolean inspect = iterator.hasNext();
while (inspect) {
String partial = iterator.next();
boolean isPositional = isPositionalParameter(partial);
if (isPositional) {
mappedName += "." + partial;
}
inspect = isPositional && iterator.hasNext();
}
return mappedName;
}
boolean isPositionalParameter(String partial) {
if (partial.equals("$")) {
return true;
}
try {
Long.valueOf(partial);
return true;
} catch (NumberFormatException e) {
return false;
}
}
}
/**
* Special {@link Converter} for {@link MongoPersistentProperty} instances that will concatenate the {@literal $}
* contained in the source update key.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
private static class UpdatePropertyConverter implements Converter<MongoPersistentProperty, String> {
private final UpdateKeyMapper mapper;
/**
* Creates a new {@link UpdatePropertyConverter} with the given update key.
*
* @param updateKey must not be {@literal null} or empty.
*/
public UpdatePropertyConverter(String updateKey) {
Assert.hasText(updateKey, "Update key must not be null or empty!");
this.mapper = new UpdateKeyMapper(updateKey);
}
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Override
public String convert(MongoPersistentProperty property) {
return mapper.mapPropertyName(property);
}
}
/**
* {@link Converter} retaining positional parameter {@literal $} for {@link Association}s.
*
@@ -323,7 +254,7 @@ public class UpdateMapper extends QueryMapper {
*/
protected static class UpdateAssociationConverter extends AssociationConverter {
private final UpdateKeyMapper mapper;
private final KeyMapper mapper;
/**
* Creates a new {@link AssociationConverter} for the given {@link Association}.
@@ -333,7 +264,7 @@ public class UpdateMapper extends QueryMapper {
public UpdateAssociationConverter(Association<MongoPersistentProperty> association, String key) {
super(association);
this.mapper = new UpdateKeyMapper(key);
this.mapper = new KeyMapper(key);
}
/*

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,7 +27,7 @@ public interface GeoJson<T extends Iterable<?>> {
* String value representing the type of the {@link GeoJson} object.
*
* @return will never be {@literal null}.
* @see http://geojson.org/geojson-spec.html#geojson-objects
* @see <a href="http://geojson.org/geojson-spec.html#geojson-objects">http://geojson.org/geojson-spec.html#geojson-objects</a>
*/
String getType();
@@ -36,7 +36,7 @@ public interface GeoJson<T extends Iterable<?>> {
* determined by {@link #getType()} of geometry.
*
* @return will never be {@literal null}.
* @see http://geojson.org/geojson-spec.html#geometry-objects
* @see <a href="http://geojson.org/geojson-spec.html#geometry-objects">http://geojson.org/geojson-spec.html#geometry-objects</a>
*/
T getCoordinates();
}
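Each GeoJson implementation reports its GeoJSON type name and coordinate structure, for example:

GeoJsonPoint point = new GeoJsonPoint(8.6821, 50.1109);

point.getType();        // "Point"
point.getCoordinates(); // [8.6821, 50.1109]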

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,7 +27,7 @@ import org.springframework.util.ObjectUtils;
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#geometry-collection
* @see <a href="http://geojson.org/geojson-spec.html#geometry-collection">http://geojson.org/geojson-spec.html#geometry-collection</a>
*/
public class GeoJsonGeometryCollection implements GeoJson<Iterable<GeoJson<?>>> {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -24,7 +24,7 @@ import org.springframework.data.geo.Point;
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#linestring
* @see <a href="http://geojson.org/geojson-spec.html#linestring">http://geojson.org/geojson-spec.html#linestring</a>
*/
public class GeoJsonLineString extends GeoJsonMultiPoint {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,7 +28,7 @@ import org.springframework.util.ObjectUtils;
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#multilinestring
* @see <a href="http://geojson.org/geojson-spec.html#multilinestring">http://geojson.org/geojson-spec.html#multilinestring</a>
*/
public class GeoJsonMultiLineString implements GeoJson<Iterable<GeoJsonLineString>> {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,7 +29,7 @@ import org.springframework.util.ObjectUtils;
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#multipoint
* @see <a href="http://geojson.org/geojson-spec.html#multipoint">http://geojson.org/geojson-spec.html#multipoint</a>
*/
public class GeoJsonMultiPoint implements GeoJson<Iterable<Point>> {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2017 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -25,7 +25,7 @@ import org.springframework.data.geo.Point;
*
* @author Christoph Strobl
* @since 1.7
* @see http://geojson.org/geojson-spec.html#point
* @see <a href="http://geojson.org/geojson-spec.html#point">http://geojson.org/geojson-spec.html#point</a>
*/
public class GeoJsonPoint extends Point implements GeoJson<List<Double>> {

Some files were not shown because too many files have changed in this diff.