Compare commits

..

99 Commits

Author SHA1 Message Date
Spring Operator
4eb88ecfbf DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (e.g. if using a URL shortener).

# Fixed URLs

## Fixed Success
These URLs were switched to an https URL with a 2xx status. While the status was successful, your review is still recommended.

* [ ] http://www.apache.org/licenses/ with 1 occurrence migrated to:
  https://www.apache.org/licenses/ ([https](https://www.apache.org/licenses/) result 200).
* [ ] http://www.apache.org/licenses/LICENSE-2.0 with 503 occurrences migrated to:
  https://www.apache.org/licenses/LICENSE-2.0 ([https](https://www.apache.org/licenses/LICENSE-2.0) result 200).

Original Pull Request: #697
2019-03-22 09:39:47 +01:00
Spring Operator
af9fc126d5 DATAMONGO-2231 - URL Cleanup.
This commit updates URLs to prefer the https protocol. Redirects are not followed to avoid accidentally expanding intentionally shortened URLs (e.g. if using a URL shortener).

# Fixed URLs

## Fixed Success
These URLs were switched to an https URL with a 2xx status. While the status was successful, your review is still recommended.

* http://maven.apache.org/xsd/maven-4.0.0.xsd with 3 occurrences migrated to:
  https://maven.apache.org/xsd/maven-4.0.0.xsd ([https](https://maven.apache.org/xsd/maven-4.0.0.xsd) result 200).
* http://www.gopivotal.com (302) with 6 occurrences migrated to:
  https://pivotal.io ([https](https://www.gopivotal.com) result 200).
* http://maven.apache.org/maven-v4_0_0.xsd with 2 occurrences migrated to:
  https://maven.apache.org/maven-v4_0_0.xsd ([https](https://maven.apache.org/maven-v4_0_0.xsd) result 301).
* http://projects.spring.io/spring-data-mongodb with 1 occurrence migrated to:
  https://projects.spring.io/spring-data-mongodb ([https](https://projects.spring.io/spring-data-mongodb) result 301).

# Ignored
These URLs were intentionally ignored.

* http://maven.apache.org/POM/4.0.0 with 10 occurrences
* http://www.w3.org/2001/XMLSchema-instance with 5 occurrences

Original Pull Request: #663
2019-03-19 12:53:39 +01:00
Oliver Gierke
7c61134b67 DATAMONGO-1592 - Adapt AuditingEventListenerUnitTests to changes in core auditing.
The core auditing implementation now skips the invocation of auditing in case the candidate aggregate doesn't need any auditing in the first place. We needed to adapt the sample class we use to actually carry the necessary auditing annotations.

Related ticket: DATACMNS-957.
2017-01-20 16:36:40 +01:00
Oliver Gierke
7c4b5a1812 DATAMONGO-1522 - Updated changelog. 2016-12-21 19:35:34 +01:00
Oliver Gierke
e7bef411df DATAMONGO-1469 - Updated changelog. 2016-12-21 18:42:58 +01:00
Oliver Gierke
97f8374be0 DATAMONGO-1525 - Improved creation of empty collections, esp. EnumSet.
We now use more type information to create a better empty collection in the first place. The previous algorithm always used an empty HashSet plus a subsequent conversion using the raw collection type. Especially the latter caused problems for EnumSets as the conversion into one requires the presence of component type information.

We now use Spring's collection factory and more available type information to create a proper collection in the first place and only rely on a subsequent conversion for arrays.
2016-12-01 20:18:20 +01:00
Oliver Gierke
d09417dbf6 DATAMONGO-1527 - Updated changelog. 2016-11-23 13:52:44 +01:00
Oliver Gierke
57650f79d8 DATAMONGO-1502 - Updated changelog. 2016-11-03 18:56:46 +01:00
Oliver Gierke
4875f07a92 DATAMONGO-1521 - Added Aggregation.skip(…) overload to support longs.
Deprecated the one taking an int.
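
A minimal sketch of the new overload in use (aggregate setup and stage values are illustrative):

import org.springframework.data.mongodb.core.aggregation.Aggregation;

// skip(…) now accepts a long, so offsets beyond Integer.MAX_VALUE need no narrowing cast.
Aggregation aggregation = Aggregation.newAggregation(
    Aggregation.skip(5_000_000_000L), // previously only skip(int) was available
    Aggregation.limit(100));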
2016-11-03 15:04:22 +01:00
Oliver Gierke
e974187a85 DATAMONGO-1513 - Fixed identifier population for event listener generated, non-ObjectId on batch inserts.
The methods in MongoTemplate inserting a batch of documents previously only returned database-generated identifiers, especially ObjectId ones. This caused non-ObjectId identifiers potentially generated by other parties — i.e. an event listener reacting to a BeforeSaveEvent — not to be considered for source object identifier population.

This commit adds a workaround augmenting the list of database generated identifiers with the ones actually present in the documents to be inserted. A follow-up ticket DATAMONGO-1519 was created to track the removal of the workaround in favor of a proper fix unfortunately requiring a change in public API (so a 2.0 candidate only).

Related tickets: DATAMONGO-1519.
2016-11-02 09:49:16 +01:00
Oliver Gierke
2089b32e6c DATAMONGO-1514 - Polishing.
Extended license years in copyright header.

Original pull request: #401.
2016-10-27 14:34:16 +02:00
Martin Macko
140f23c7db DATAMONGO-1514 - SpringDataMongodbQuery needs to be public.
SpringDataMongodbQuery is exposed publicly in QuerydslRepositorySupport, which is why we have to make it public to ensure that calls to the exposed methods from outside the package actually compile.

Original pull request: #401.
2016-10-27 14:33:53 +02:00
Oliver Gierke
3cc99b9ea7 DATAMONGO-1495 - Updated changelog. 2016-09-29 14:42:10 +02:00
Oliver Gierke
240c2edd30 DATAMONGO-1499 - After release cleanups. 2016-09-29 10:40:41 +02:00
Spring Buildmaster
a5c3e46e67 DATAMONGO-1499 - Prepare next development iteration. 2016-09-29 08:13:29 +00:00
Spring Buildmaster
5ff2277a6c DATAMONGO-1499 - Release version 1.8.6 (Gosling SR6). 2016-09-29 08:13:29 +00:00
Oliver Gierke
691b66b79e DATAMONGO-1499 - Prepare 1.8.6 (Gosling SR6). 2016-09-29 09:33:54 +02:00
Oliver Gierke
2787021477 DATAMONGO-1499 - Updated changelog. 2016-09-29 09:33:49 +02:00
Oliver Gierke
054a8794fe DATAMONGO-1494 - After release cleanups. 2016-09-29 09:12:35 +02:00
Oliver Gierke
a9d22825aa DATAMONGO-1497 - MappingMongoConverter now consistently uses DBObjectAccessor.
We now use DBObjectAccessor also for preliminary inspections of the source DBObject (e.g. whether a value is present at all). Previously we operated on the DBObject directly, which caused issues with properties mapped to nested fields as the keys weren't exploded correctly and thus the check always failed.
2016-09-22 17:56:18 +02:00
Spring Buildmaster
9395ae4587 DATAMONGO-1494 - Prepare next development iteration. 2016-09-20 14:01:57 +00:00
Spring Buildmaster
1cbce6f093 DATAMONGO-1494 - Release version 1.8.5.RELEASE. 2016-09-20 14:01:57 +00:00
Oliver Gierke
2b145e252f DATAMONGO-1494 - Prepare 1.8.5 (Gosling SR5). 2016-09-20 15:29:20 +02:00
Oliver Gierke
6532e0d804 DATAMONGO-1494 - Updated changelog. 2016-09-20 15:29:11 +02:00
Oliver Gierke
d348878f5b DATAMONGO-1450 - Updated changelog. 2016-09-20 11:50:44 +02:00
Oliver Gierke
6b6d2aa525 DATAMONGO-1445 - Prevent eager conversion of query parameters.
We removed all eager query parameter conversions during query creation to make sure that the query mapping later on sees all property names and is able to derive the proper conversions.
2016-09-20 09:22:35 +02:00
Christoph Strobl
b0389845b3 DATAMONGO-1492 - Make o.s.d.m.core.aggregation.AggregationExpression public.
By turning `AggregationExpression` public we allow adding custom expressions without workarounds. It is now possible to create e.g. a `ProjectionOperation` like:

ProjectionOperation agg = Aggregation.project()
      .and(new AggregationExpression() {

        @Override
        public DBObject toDbObject(AggregationOperationContext context) {

          DBObject filterExpression = new BasicDBObject();
          filterExpression.put("input", "$x");
          filterExpression.put("as", "y");
          filterExpression.put("cond", new BasicDBObject("$eq", Arrays.<Object> asList("$$y.z", 2)));

          return new BasicDBObject("$filter", filterExpression);
        }
      }).as("profile");

Original pull request: #392.
2016-09-19 16:13:50 +02:00
Christoph Strobl
e7220a0f12 DATAMONGO-1485 - Consider potential custom conversion for enums in Querydsl paths.
We now take potential registered converters for enums into account when serializing path expressions via SpringDataMongodbSerializer.

Original pull request: #388.
2016-09-19 07:01:37 +02:00
Mark Paluch
f7ab448786 DATAMONGO-1406 - Propagate PersistentEntity when mapping query criteria for nested keywords.
We now propagate the PersistentEntity when mapping nested keywords so that the criteria mapping chain for nested keywords and properties now has access to the PersistentEntity and can use configured field names.

Previously the plain property names have been used as field names and potential customizations via @Field have been ignored.
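
A hedged illustration of the effect, assuming a document type with a customized field name:

import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

// Hypothetical document type mapping "firstname" to the short field name "fn".
class Person {
  @Field("fn") String firstname;
}

// Criteria nested below the $or keyword are now mapped against "fn" instead of "firstname".
Query query = new Query(new Criteria().orOperator(
    Criteria.where("firstname").is("Dave"),
    Criteria.where("firstname").is("Oliver")));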

Original Pull Request: #384
2016-08-25 13:37:22 +02:00
Mark Paluch
2026097be8 DATAMONGO-1465 - Polishing.
Replace boolean flag in convertAndJoinScriptArgs with literal. Joined args are rendered to JavaScript and always require string quotation.

Original pull request: #383.
2016-08-23 15:05:56 +02:00
Christoph Strobl
2006804133 DATAMONGO-1465 - Fix String quotation in DefaultScriptOperations.execute().
This change prevents Strings from being quoted prior to sending them as args of a script.

Original pull request: #383.
2016-08-23 15:05:56 +02:00
Oliver Gierke
be549b3d23 DATAMONGO-1471 - Converter only applies identifier values if actually available.
Setting the value for the identifier property is an explicit step in MappingMongoConverter and always executed if the type to be created has an identifier property. If the source document doesn't contain an _id field (e.g. because it has been excluded explicitly) that previously caused null to be set on the identifier. This caused an exception if the identifier property is a primitive type.

We now explicitly check whether the field backing the identifier property is actually present in the source document and only explicitly set the value if so.
2016-08-17 16:54:50 +02:00
Oliver Gierke
16bb07d8ae DATAMONGO-1409 - Updated changelog. 2016-07-28 08:58:24 +02:00
Oliver Gierke
c17c3c60a6 DATAMONGO-1410 - Updated changelog. 2016-07-28 08:58:16 +02:00
Christoph Strobl
d4dba035dc DATAMONGO-1453 - Fix GeoJson conversion when coordinates are Integers.
We now use Number instead of Double for reading "coordinates" from GeoJSON representations.

Original pull request: #369.
2016-06-24 14:05:44 +02:00
Kevin Dosey
bd41123535 DATAMONGO-1449 - Switched to foreach loop in collection handling of MappingMongoConverter.
This should result in a minor to moderate performance improvement for iteration on Collections/Arrays during DBObject-to-object mapping.

Original pull request: #368.
2016-06-11 18:34:01 +02:00
Oliver Gierke
185b85b9dd DATAMONGO-1423 - Polishing.
Original pull request: #365.
2016-05-25 17:31:44 +02:00
Christoph Strobl
2d4082af02 DATAMONGO-1423 - Map keys now get registered conversions applied for Updates.
We now pipe map keys through the potentially registered conversions when mapping Updates.

Original pull request: #365.
2016-05-25 17:31:42 +02:00
Mark Paluch
b2cd7bbfd6 DATAMONGO-1425 - Polishing.
Add NotContaining to documentation. Add integration test for Containing/NotContaining on collection properties.

Original pull request: #363.
2016-05-09 11:12:22 +02:00
Christoph Strobl
0aff94b5ce DATAMONGO-1425 - Fix query derivation for notContaining on String properties.
We now correctly build up the criteria for derived queries using the notContaining keyword on String properties.
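
A hedged example of the kind of derived query affected (repository and domain type are assumed):

import java.util.List;
import org.springframework.data.mongodb.repository.MongoRepository;

// Hypothetical repository; firstname is a String property of Person.
interface PersonRepository extends MongoRepository<Person, String> {

  // Derived query using the notContaining keyword, whose criteria was previously built incorrectly.
  List<Person> findByFirstnameNotContaining(String part);
}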

Original pull request: #363.
2016-05-09 11:12:17 +02:00
Mark Paluch
87657aaec8 DATAMONGO-1412 - Fix backticks and code element highlighting.
Fixed broken highlighting caused by backticks followed by chars/single quotes. Converted single-quote emphasis of id to backtick code fences. Added missing spaces between words and backticks.

Original Pull Request: #359
2016-05-02 14:20:21 +02:00
Mark Paluch
034c90fb97 DATAMONGO-1412 - Document mapping rules for Java types to MongoDB representation.
Original Pull Request: #359
Related pull request: #353
Related ticket: DATAMONGO-1404
2016-05-02 14:20:15 +02:00
Oliver Gierke
628e89cc78 DATAMONGO-1408 - Updated changelog. 2016-04-06 23:14:26 +02:00
Oliver Gierke
cb297d0890 DATAMONGO-1405 - Updated changelog. 2016-04-06 18:44:37 +02:00
Christoph Strobl
e4eaf6f2d3 DATAMONGO-1401 - Fix error when updating entity with both GeoJsonPoint and Version property.
We now ignore property reference exceptions when resolving field values that have already been mapped, e.g. in case of an already mapped update extracted from an actual domain type instance.

Original pull request: #351.
2016-03-31 09:35:12 +02:00
Christoph Strobl
6e453864fd DATAMONGO-1387 - Polishing.
Added a few more tests and appended values if present on the Query.

Original Pull Request: #345
2016-03-18 19:26:36 +01:00
John Willemin
f36baf2e37 DATAMONGO-1387 - Fix BasicQuery getFieldsObject() inconsistency.
We changed BasicQuery to consider its parent getFieldsObject() when not given an explicit fields DBObject.

Original Pull Request: #345
CLA: 165520160303021604 (John Willemin)
2016-03-18 19:26:24 +01:00
Oliver Gierke
e5a29ab1b5 DATAMONGO-1392 - Updated changelog. 2016-03-18 13:19:25 +01:00
Oliver Gierke
4e37cfedcc DATAMONGO-1397 - Polishing.
Switched to Slf4J-native placeholder replacement in debug logging for MongoTemplate.

Original pull request: #348.
2016-03-16 17:55:55 +01:00
Mark Paluch
de0feed565 DATAMONGO-1397 - Log command, entity and collection name in MongoTemplate.geoNear(…).
Original pull request: #348.
2016-03-16 17:55:13 +01:00
Oliver Gierke
bb2271e3fa DATAMONGO-1381 - After release cleanups. 2016-02-23 14:18:16 +01:00
Spring Buildmaster
d6aa9a0ea4 DATAMONGO-1381 - Prepare next development iteration. 2016-02-23 04:47:26 -08:00
Spring Buildmaster
1e6632846f DATAMONGO-1381 - Release version 1.8.4 (Gosling SR4). 2016-02-23 04:47:26 -08:00
Oliver Gierke
179b246173 DATAMONGO-1381 - Prepare 1.8.4 (Gosling SR4). 2016-02-23 12:56:55 +01:00
Oliver Gierke
8f29600eb4 DATAMONGO-1381 - Updated changelog. 2016-02-23 12:56:45 +01:00
Oliver Gierke
717c6100eb DATAMONGO-1366 - Updated changelog. 2016-02-12 22:10:04 +01:00
Uxío Fuentefría
38dade7c7a DATAMONGO-1378 - Update reference documentation: Change Query.sort() to Query.with(Sort sort).
sort() is not a method of Query; to sort a query you have to use with(…).
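
For example (property names assumed), sorting is attached via with(…):

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

// Query has no sort(…) method; a Sort is attached via with(…).
Query query = new Query(Criteria.where("lastname").is("Matthews"))
    .with(new Sort(Sort.Direction.ASC, "firstname"));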

Original pull request: #320.
CLA: 162620160211060822 (Uxío Fuentefría)
2016-02-11 20:24:35 +01:00
Oliver Gierke
4a4c53766e DATAMONGO-1381 - Tweaked version numbers for Gosling SR4. 2016-02-11 16:09:02 +01:00
Mark Paluch
7c5d15edb5 DATAMONGO-1380 - Polishing.
Add credits, use message formatting instead of string concatenation.

Original pull request: #317.
2016-02-11 12:04:41 +01:00
Alex Vengrovsk
8c2af8f0fc DATAMONGO-1380 - Improve logging in MongoChangeSetPersister.
Add a check for whether debug logging is enabled in the getPersistentId method.

Original pull request: #317.
2016-02-11 12:04:40 +01:00
Timo Kockert
419d0a4f92 DATAMONGO-1270 - Update documentation to reflect deprecation of MongoFactoryBean.
Original pull request: #315.
2016-02-10 15:58:09 +01:00
Thomas Dudouet
ce919e57c6 DATAMONGO-1377 - Update JavaDoc: Use @EnableMongoRepositories instead of @EnableJpaRepositories.
The JavaDoc description references the EnableJpaRepositories annotation instead of the EnableMongoRepositories annotation.

Original pull request: #340.
2016-02-10 14:52:40 +01:00
Oliver Gierke
781de00ab1 DATAMONGO-1376 - Moved away from SimpleTypeInformationMapper.INSTANCE.
Related tickets: DATACMNS-815.
2016-02-09 14:37:23 +01:00
Martin Macko
e06d352a71 DATAMONGO-1375 - Fix typo in MongoOperations JavaDoc.
Original pull request: #343.
2016-02-09 11:30:37 +01:00
Oliver Gierke
294990891c DATAMONGO-1361 - Guard command result statistics evaluation against changes in MongoDB 3.2.
MongoDB 3.2 RC1 decided to remove fields from statistics JSON documents returned in case no result was found for a geo near query. The avgDistance field is unfortunately missing as of that version.

Introduced a value object to encapsulate the mitigation behavior and make client code unaware of that.
2016-01-21 12:45:21 +01:00
Oliver Gierke
530f7396fa DATAMONGO-1360 - Query instances contained in a Near Query now get mapped during geoNear(…) execution.
A Query instance which might be part of a NearQuery definition is now passed through the QueryMapper to make sure complex types contained in it, or more generally types that have custom conversions registered, are mapped correctly before the near command is actually executed.
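
A hedged sketch of the scenario (coordinates, field name, and the custom DayOfWeek value type are assumed):

import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;

// The Query attached to the NearQuery is now run through the QueryMapper, so criteria values
// with registered custom conversions (here a hypothetical DayOfWeek type) are converted
// before the geoNear command executes.
NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73))
    .query(new Query(Criteria.where("openOn").is(new DayOfWeek("MONDAY"))));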
2016-01-20 13:12:33 +01:00
Oliver Gierke
79aabfbbde DATAMONGO-1355 - After release cleanups. 2015-12-18 10:26:35 +01:00
Spring Buildmaster
6af3729fb3 DATAMONGO-1355 - Prepare next development iteration. 2015-12-18 00:24:12 -08:00
Spring Buildmaster
437a48ff4a DATAMONGO-1355 - Release version 1.8.2.RELEASE (Gosling SR2). 2015-12-18 00:24:12 -08:00
Oliver Gierke
583339641d DATAMONGO-1355 - Prepare 1.8.2.RELEASE (Gosling SR2). 2015-12-18 08:30:37 +01:00
Oliver Gierke
8af4bef772 DATAMONGO-1355 - Updated changelog. 2015-12-18 08:30:32 +01:00
Christoph Strobl
b0336c27a9 DATAMONGO-1334 - Map-reduce operations now honor MapReduceOptions.limit.
We now also consider the limit set via MapReduceOptions when executing mapReduce operations via MongoTemplate.mapReduce(…).

MapReduceOptions.limit(…) supersedes a potential limit set via the Query itself. This change also allows defining a limit even when no explicit Query is used.
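
A sketch of supplying the limit via the options (an existing MongoTemplate, the collection, the map/reduce resources, and the result type are assumed):

import static org.springframework.data.mongodb.core.mapreduce.MapReduceOptions.options;

import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;

// The limit configured on the options is honored and supersedes a limit set on a Query.
MapReduceResults<ValueObject> results = mongoTemplate.mapReduce(
    "jmr1",                                    // input collection (assumed name)
    "classpath:map.js", "classpath:reduce.js", // map/reduce functions (assumed resources)
    options().limit(100).outputTypeInline(),
    ValueObject.class);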

Original pull request: #338.
2015-12-16 11:57:55 +01:00
Christoph Strobl
0b492b6c55 DATAMONGO-1317 - Assert compatibility with mongo-java-driver 3.2.
We now do a defensive check against the actual w object of WriteConcern to avoid the IllegalStateException raised by the new java-driver in case _w is null or not an Integer. This allows us to run against the recent 2.13, 2.14, 3.0, 3.1 drivers and the latest 3.2.0.

Original pull request: #337.
2015-12-16 11:49:08 +01:00
Oliver Gierke
b3116a523b DATAMONGO-1289 - Polishing.
Some additional JavaDoc and comment removal.

Original pull request: #333.
2015-12-16 11:46:12 +01:00
Christoph Strobl
2ba9e5f403 DATAMONGO-1289 - MappingMongoEntityInformation now uses a fallback identifier type derived from the repository declaration.
We now use RepositoryMetadata.getIdType() to provide a fallback identifier type in case the entity information does not hold an id property, which is perfectly valid for MongoDB.

Original pull request: #333.
2015-12-16 11:46:11 +01:00
Oliver Gierke
8b1805a145 DATAMONGO-1346 - Update.pullAll(…) now registers multiple invocations correctly.
Previously calling the method multiple times overrode the result of previous calls. We now use addMultiFieldOperation(…) to make sure already existing values are kept.
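
For example (field names assumed), both $pullAll entries are now retained:

import org.springframework.data.mongodb.core.query.Update;

// Previously the second pullAll(…) silently replaced the first one.
Update update = new Update()
    .pullAll("authors", new Object[] { "Dave", "Carter" })
    .pullAll("editors", new Object[] { "Oliver" });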
2015-12-10 15:40:23 +01:00
Oliver Gierke
5832055840 DATAMONGO-1337 - Another round of polishes on SonarQube complaints. 2015-11-26 12:28:54 +01:00
Oliver Gierke
a13e7b8b24 DATAMONGO-1337 - Reverted making some of the loggers static.
The logger instance in AbstractMonitor is supposed to pick up the type of the actual implementation class and thus cannot be static.

Related pull request: #336.
2015-11-26 12:03:25 +01:00
Christian Ivan
a152aa3ce8 DATAMONGO-1337 - General code quality improvements.
A round of code polish regarding the PMD and Squid rules referred to in the ticket.

Original pull request: #336.
2015-11-26 11:53:16 +01:00
Oliver Gierke
ca56ea4aea DATAMONGO-1342 - Fixed potential NullPointerException in MongoQueryCreator.
MongoQueryCreator.nextAsArray(…) now returns a single element object array in case null is handed to the method. It previously failed with a NullPointerException.
2015-11-25 17:23:29 +01:00
Oliver Gierke
284e2f462d DATAMONGO-1335 - DBObjectAccessor now writes all nested fields correctly.
Previously, DBObjectAccessor always reset the in-between values when traversing nested properties. This caused previously written values to be erased when subsequent values were written. We now reuse an already existing BasicDBObject if present.
2015-11-25 16:07:02 +01:00
Oliver Gierke
257bc891dd DATAMONGO-1287 - Optimizations in reading associations as constructor arguments.
As per the discussion on the ticket, we now omit looking up the value for an association used as a constructor argument, since the simple check whether the currently handled property is a constructor argument is sufficient to potentially skip handling the value.

Related pull requests: #335, #322.
2015-11-23 11:18:23 +01:00
Christoph Strobl
d26db17bf0 DATAMONGO-1287 - Fix double fetching for lazy DbRefs used in entity constructor.
We now check whether properties are used as constructor arguments, and thus might already have been resolved, before setting the actual value. This prevents turning already eagerly fetched DBRefs back into LazyLoadingProxies.

Original pull request: #335.
Related pull request: #322.
2015-11-20 13:39:06 +01:00
Christoph Strobl
d760d9cc11 DATAMONGO-1290 - Convert byte[] parameter in @Query to $binary representation.
We now convert non-quoted binary parameters to the $binary format. This allows using them along with the @Query annotation.
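
A hedged example of an annotated query that now works with a binary parameter (repository, document type, and field name are assumed):

import java.util.List;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

// Hypothetical repository; the non-quoted ?0 placeholder bound to a byte[] is rendered as $binary.
interface SampleRepository extends MongoRepository<SampleDocument, String> {

  @Query("{ 'payload' : ?0 }")
  List<SampleDocument> findByPayload(byte[] payload);
}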

Original pull request: #332.
2015-11-20 13:06:30 +01:00
Christoph Strobl
be65970710 DATAMONGO-1204 - ObjectPath now uses raw id values to track resolved objects.
We now use the native id within ObjectPath for checking if a DBRef has already been resolved. This is required as the MongoDB Java driver 3 generation changed ObjectId.equals(…), which now performs a type check.

Original pull request: #334.
Related pull request: #288.
2015-11-20 12:50:51 +01:00
Oliver Gierke
418a4f8b8c DATAMONGO-1324 - Register ObjectId converters unconditionally to make sure they really get used.
The presence of ObjectToObjectConverter in a DefaultConversionService causes the guard trying to register converters for ObjectIds in AbstractMongoConverter to not trigger the registration. This in turn caused ObjectId conversions to be executed via reflection instead of the straightforward method calls and thus a drop in performance for such operations.

We now unconditionally register the converters to make sure they really get applied.

Related tickets: SPR-13703.
2015-11-19 12:24:23 +01:00
Oliver Gierke
fa5f93aad5 DATAMONGO-1316 - After release cleanups. 2015-11-19 12:24:09 +01:00
Spring Buildmaster
504e14d4a3 DATAMONGO-1316 - Prepare next development iteration. 2015-11-15 05:56:05 -08:00
Spring Buildmaster
f68effe155 DATAMONGO-1316 - Release version 1.8.1.RELEASE (Gosling SR1). 2015-11-15 05:55:54 -08:00
Oliver Gierke
6fc80f287e DATAMONGO-1316 - Prepare 1.8.1.RELEASE (Gosling SR1). 2015-11-15 14:06:53 +01:00
Oliver Gierke
02aed56fd1 DATAMONGO-1316 - Updated changelog. 2015-11-15 14:06:43 +01:00
Christoph Strobl
f1771504f6 DATAMONGO-1297 - Allow @Indexed annotation on DBRef.
We now also treat references as a source of a potential index. This enforces index creation for objects like:

@Document
class WithDbRef {

  @Indexed
  @DBRef
  ReferencedObject reference;
}

Combining @TextIndexed or @GeoSpatialIndexed with a DBRef will lead to a MappingException.

Original pull request: #329.
2015-11-13 17:55:00 +01:00
Christoph Strobl
741a27edae DATAMONGO-1302 - Allow ConverterFactory to be registered in CustomConversions.
We now allow registration of a ConverterFactory within CustomConversions by inspecting the generic type arguments to determine the conversion source and target types.
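
A sketch of such a registration, modeled on Spring's well-known String-to-enum factory (names and raw-type shortcuts are illustrative):

import java.util.Collections;

import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.ConverterFactory;
import org.springframework.data.mongodb.core.convert.CustomConversions;

// Hypothetical factory converting Strings into arbitrary enum types.
@SuppressWarnings({ "rawtypes", "unchecked" })
class StringToEnumConverterFactory implements ConverterFactory<String, Enum> {

  @Override
  public <T extends Enum> Converter<String, T> getConverter(final Class<T> targetType) {
    return new Converter<String, T>() {
      @Override
      public T convert(String source) {
        return (T) Enum.valueOf(targetType, source.trim());
      }
    };
  }
}

// CustomConversions derives the conversion's source and target types from the factory's generics.
CustomConversions conversions = new CustomConversions(
    Collections.singletonList(new StringToEnumConverterFactory()));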

Original pull request: #330.
2015-11-10 14:52:42 +01:00
Christoph Strobl
e1869abf3f DATAMONGO-1293 - Polishing.
Move configuration parsing error into method actually responsible for reading uri/client-uri attributes.

Original Pull Request: #328
2015-10-29 12:49:25 +01:00
Viktor Khoroshko
c7be5bfcaa DATAMONGO-1293 - Allowed id attribute in addition to client-uri attribute in MongoDbFactoryParser.
We now allow write-concern and id to be configured along with the uri or client-uri attribute of <mongo:db-factory />.

Original Pull Request: #328
CLA: 140120150929074128 (Viktor Khoroshko)
2015-10-29 12:49:19 +01:00
Oliver Gierke
9968b752e7 DATAMONGO-1276 - Fixed potential NullPointerExceptions in MongoTemplate.
Triggering data access exception translation could lead to a NullPointerException in cases where the PersistenceExceptionTranslator returned null because the original exception couldn't be translated and the result was directly used from a throw clause.

This is now fixed by consistently using the potentiallyConvertRuntimeException(…) method, which was made static to be able to refer to it from nested static classes.

Refactored Scanner usage to actually close the Scanner instance to prevent a resource leak.
2015-10-21 15:10:15 +02:00
Oliver Gierke
e001c6bf89 DATAMONGO-1304 - Updated changelog. 2015-10-14 13:46:05 +02:00
Oliver Gierke
913d383b99 DATAMONGO-1282 - After release cleanups. 2015-09-03 18:34:11 +02:00
Spring Buildmaster
f446d7e29f DATAMONGO-1282 - Prepare next development iteration. 2015-09-03 18:33:40 +02:00
613 changed files with 7601 additions and 33404 deletions

View File

@@ -1,12 +0,0 @@
<!--
Thank you for proposing a pull request. This template will guide you through the essential steps necessary for a pull request.
Make sure that:
-->
- [ ] You have read the [Spring Data contribution guidelines](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc).
- [ ] There is a ticket in the bug tracker for the project in our [JIRA](https://jira.spring.io/browse/DATAMONGO).
- [ ] You use the code formatters provided [here](https://github.com/spring-projects/spring-data-build/tree/master/etc/ide) and have them applied to your changes. Don't submit any formatting related changes.
- [ ] You submit test cases (unit or integration tests) that back your changes.
- [ ] You added yourself as author in the headers of the classes you touched. Amend the date range in the Apache license header if needed. For new types, add the license header (copy from another file and set the current year only).

View File

@@ -3,31 +3,13 @@ language: java
jdk:
- oraclejdk8
before_script:
- mongod --version
services:
- mongodb
env:
matrix:
- PROFILE=ci
- PROFILE=mongo-next
- PROFILE=mongo3
- PROFILE=mongo3-next
- PROFILE=mongo31
- PROFILE=mongo32
- PROFILE=mongo33
- PROFILE=mongo34
- PROFILE=mongo34-next
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script
addons:
apt:
sources:
- mongodb-3.4-precise
packages:
- mongodb-org-server
- mongodb-org-shell
- oracle-java8-installer
sudo: false

View File

@@ -1,27 +0,0 @@
= Contributor Code of Conduct
As contributors and maintainers of this project, and in the interest of fostering an open and welcoming community, we pledge to respect all people who contribute through reporting issues, posting feature requests, updating documentation, submitting pull requests or patches, and other activities.
We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, or nationality.
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery
* Personal attacks
* Trolling or insulting/derogatory comments
* Public or private harassment
* Publishing other's private information, such as physical or electronic addresses,
without explicit permission
* Other unethical or unprofessional conduct
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
By adopting this Code of Conduct, project maintainers commit themselves to fairly and consistently applying these principles to every aspect of managing this project. Project maintainers who do not follow or enforce the Code of Conduct may be permanently removed from the project team.
This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community.
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting a project maintainer at spring-code-of-conduct@pivotal.io.
All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances.
Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident.
This Code of Conduct is adapted from the http://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at http://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].

1
CONTRIBUTING.MD Normal file
View File

@@ -0,0 +1 @@
You find the contribution guidelines for Spring Data projects [here](https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.md).

View File

@@ -1,3 +0,0 @@
= Spring Data contribution guidelines
You find the contribution guidelines for Spring Data projects https://github.com/spring-projects/spring-data-build/blob/master/CONTRIBUTING.adoc[here].

View File

@@ -1,6 +1,3 @@
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/ga.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
[![Spring Data MongoDB](https://spring.io/badges/spring-data-mongodb/snapshot.svg)](http://projects.spring.io/spring-data-mongodb#quick-start)
# Spring Data MongoDB
The primary goal of the [Spring Data](http://projects.spring.io/spring-data) project is to make it easier to build Spring-powered applications that use new data access technologies such as non-relational databases, map-reduce frameworks, and cloud based data services.
@@ -29,7 +26,7 @@ Add the Maven dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.RELEASE</version>
<version>1.5.0.RELEASE</version>
</dependency>
```
@@ -39,7 +36,7 @@ If you'd rather like the latest snapshots of the upcoming major version, use our
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>${version}.BUILD-SNAPSHOT</version>
<version>1.6.0.BUILD-SNAPSHOT</version>
</dependency>
<repository>
@@ -143,8 +140,8 @@ public class MyService {
Here are some ways for you to get involved in the community:
* Get involved with the Spring community on Stackoverflow and help out on the [spring-data-mongodb](http://stackoverflow.com/questions/tagged/spring-data-mongodb) tag by responding to questions and joining the debate.
* Create [JIRA](https://jira.spring.io/browse/DATAMONGO) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Create [JIRA](https://jira.springframework.org/browse/DATADOC) tickets for bugs and new features and comment and vote on the ones that you are interested in.
* Github is for social coding: if you want to write code, we encourage contributions through pull requests from [forks of this repository](http://help.github.com/forking/). If you want to contribute code this way, please reference a JIRA ticket as well covering the specific issue you are addressing.
* Watch for upcoming articles on Spring by [subscribing](http://spring.io/blog) to spring.io.
Before we accept a non-trivial patch or pull request we will need you to [sign the Contributor License Agreement](https://cla.pivotal.io/sign/spring). Signing the contributors agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. If you forget to do so, you'll be reminded when you submit a pull request. Active contributors might be asked to join the core team, and given the ability to merge pull requests.
Before we accept a non-trivial patch or pull request we will need you to sign the [contributor's agreement](https://support.springsource.com/spring_committer_signup). Signing the contributor's agreement does not grant anyone commit rights to the main repository, but it does mean that we can accept your contributions, and you will get an author credit if we do. Active contributors might be asked to join the core team, and given the ability to merge pull requests.

107
pom.xml
View File

@@ -1,21 +1,21 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.3.RELEASE</version>
<version>1.8.7.BUILD-SNAPSHOT</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
<description>MongoDB support for Spring Data</description>
<url>http://projects.spring.io/spring-data-mongodb</url>
<url>https://projects.spring.io/spring-data-mongodb</url>
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.9.3.RELEASE</version>
<version>1.7.7.BUILD-SNAPSHOT</version>
</parent>
<modules>
@@ -28,8 +28,8 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.13.3.RELEASE</springdata.commons>
<mongo>2.14.3</mongo>
<springdata.commons>1.11.7.BUILD-SNAPSHOT</springdata.commons>
<mongo>2.13.0</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
</properties>
@@ -39,7 +39,7 @@
<name>Oliver Gierke</name>
<email>ogierke at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Project Lead</role>
</roles>
@@ -50,7 +50,7 @@
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -61,7 +61,7 @@
<name>Mark Pollack</name>
<email>mpollack at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -72,7 +72,7 @@
<name>Jon Brisbin</name>
<email>jbrisbin at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -83,7 +83,7 @@
<name>Thomas Darimont</name>
<email>tdarimont at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -94,7 +94,7 @@
<name>Christoph Strobl</name>
<email>cstrobl at gopivotal.com</email>
<organization>Pivotal</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<organizationUrl>https://pivotal.io</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -107,7 +107,7 @@
<id>mongo-next</id>
<properties>
<mongo>2.15.0-SNAPSHOT</mongo>
<mongo>2.14.0-SNAPSHOT</mongo>
</properties>
<repositories>
@@ -123,7 +123,7 @@
<id>mongo3</id>
<properties>
<mongo>3.0.4</mongo>
<mongo>3.0.2</mongo>
</properties>
</profile>
@@ -132,7 +132,7 @@
<id>mongo3-next</id>
<properties>
<mongo>3.0.5-SNAPSHOT</mongo>
<mongo>3.0.0-SNAPSHOT</mongo>
</properties>
<repositories>
@@ -143,79 +143,6 @@
</repositories>
</profile>
<profile>
<id>mongo31</id>
<properties>
<mongo>3.1.1</mongo>
</properties>
</profile>
<profile>
<id>mongo32</id>
<properties>
<mongo>3.2.2</mongo>
</properties>
</profile>
<profile>
<id>mongo33</id>
<properties>
<mongo>3.3.0</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>mongo34</id>
<properties>
<mongo>3.4.0</mongo>
</properties>
</profile>
<profile>
<id>mongo34-next</id>
<properties>
<mongo>3.4.1-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.jfrog.buildinfo</groupId>
<artifactId>artifactory-maven-plugin</artifactId>
<inherited>false</inherited>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<dependencies>
@@ -229,8 +156,8 @@
<repositories>
<repository>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
</repository>
</repositories>

View File

@@ -1,12 +1,12 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.3.RELEASE</version>
<version>1.8.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.10.3.RELEASE</version>
<version>1.8.7.BUILD-SNAPSHOT</version>
</dependency>
<dependency>
@@ -81,7 +81,7 @@
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>${validation}</version>
<version>1.0.0.GA</version>
<scope>test</scope>
</dependency>
<dependency>

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.3.RELEASE</version>
<version>1.8.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.3.RELEASE</version>
<version>1.8.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2016 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -18,8 +18,6 @@ package org.springframework.data.mongodb.log4j;
import java.net.UnknownHostException;
import java.util.Arrays;
import java.util.Calendar;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import org.apache.log4j.AppenderSkeleton;
@@ -32,18 +30,13 @@ import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import com.mongodb.WriteConcern;
/**
* Log4j appender writing log entries into a MongoDB instance.
*
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Christoph Strobl
* @author Ricardo Espirito Santo
*/
public class MongoLog4jAppender extends AppenderSkeleton {
@@ -61,19 +54,17 @@ public class MongoLog4jAppender extends AppenderSkeleton {
protected String host = "localhost";
protected int port = 27017;
protected String username;
protected String password;
protected String authenticationDatabase;
protected String database = "logs";
protected String collectionPattern = "%c";
protected PatternLayout collectionLayout = new PatternLayout(collectionPattern);
protected String applicationId = System.getProperty("APPLICATION_ID", null);
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.ACKNOWLEDGED;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.UNACKNOWLEDGED;
protected WriteConcern warnOrHigherWriteConcern = WriteConcern.SAFE;
protected WriteConcern infoOrLowerWriteConcern = WriteConcern.NORMAL;
protected Mongo mongo;
protected DB db;
public MongoLog4jAppender() {}
public MongoLog4jAppender() {
}
public MongoLog4jAppender(boolean isActive) {
super(isActive);
@@ -95,53 +86,6 @@ public class MongoLog4jAppender extends AppenderSkeleton {
this.port = port;
}
/**
* @return
* @since 1.10
*/
public String getUsername() {
return username;
}
/**
* @param username may be {@literal null} for unauthenticated access.
* @since 1.10
*/
public void setUsername(String username) {
this.username = username;
}
/**
* @return
* @since 1.10
*/
public String getPassword() {
return password;
}
/**
* @param password may be {@literal null} for unauthenticated access.
* @since 1.10
*/
public void setPassword(String password) {
this.password = password;
}
/**
* @return
*/
public String getAuthenticationDatabase() {
return authenticationDatabase;
}
/**
* @param authenticationDatabase may be {@literal null} to use {@link #getDatabase()} as authentication database.
* @since 1.10
*/
public void setAuthenticationDatabase(String authenticationDatabase) {
this.authenticationDatabase = authenticationDatabase;
}
public String getDatabase() {
return database;
}
@@ -167,14 +111,14 @@ public class MongoLog4jAppender extends AppenderSkeleton {
this.applicationId = applicationId;
}
public String getWarnOrHigherWriteConcern() {
return warnOrHigherWriteConcern.toString();
}
public void setWarnOrHigherWriteConcern(String wc) {
this.warnOrHigherWriteConcern = WriteConcern.valueOf(wc);
}
public String getWarnOrHigherWriteConcern() {
return warnOrHigherWriteConcern.toString();
}
public String getInfoOrLowerWriteConcern() {
return infoOrLowerWriteConcern.toString();
}
@@ -184,26 +128,10 @@ public class MongoLog4jAppender extends AppenderSkeleton {
}
protected void connectToMongo() throws UnknownHostException {
this.mongo = createMongoClient();
this.mongo = new Mongo(host, port);
this.db = mongo.getDB(database);
}
private MongoClient createMongoClient() throws UnknownHostException {
ServerAddress serverAddress = new ServerAddress(host, port);
if (null == password || null == username) {
return new MongoClient(serverAddress);
}
String authenticationDatabaseToUse = authenticationDatabase == null ? this.database : authenticationDatabase;
MongoCredential mongoCredential = MongoCredential.createCredential(username,
authenticationDatabaseToUse, password.toCharArray());
List<MongoCredential> credentials = Collections.singletonList(mongoCredential);
return new MongoClient(serverAddress, credentials);
}
/*
* (non-Javadoc)
* @see org.apache.log4j.AppenderSkeleton#append(org.apache.log4j.spi.LoggingEvent)

View File

@@ -1,114 +0,0 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.log4j;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.Calendar;
import java.util.Collections;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.apache.log4j.PropertyConfigurator;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.MongoClient;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
/**
* Integration tests for {@link MongoLog4jAppender} using authentication.
*
* @author Mark Paluch
*/
public class MongoLog4jAppenderAuthenticationIntegrationTests {
private final static String username = "admin";
private final static String password = "test";
private final static String authenticationDatabase = "logs";
MongoClient mongo;
DB db;
String collection;
ServerAddress serverLocation;
Logger log;
@Before
public void setUp() throws Exception {
serverLocation = new ServerAddress("localhost", 27017);
mongo = new MongoClient(serverLocation);
db = mongo.getDB("logs");
BasicDBList roles = new BasicDBList();
roles.add("dbOwner");
db.command(new BasicDBObjectBuilder().add("createUser", username).add("pwd", password).add("roles", roles).get());
mongo.close();
mongo = new MongoClient(serverLocation, Collections
.singletonList(MongoCredential.createCredential(username, authenticationDatabase, password.toCharArray())));
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
LogManager.resetConfiguration();
PropertyConfigurator.configure(getClass().getResource("/log4j-with-authentication.properties"));
log = Logger.getLogger(MongoLog4jAppenderIntegrationTests.class.getName());
}
@After
public void tearDown() {
if (db != null) {
db.getCollection(collection).remove(new BasicDBObject());
db.command(new BasicDBObject("dropUser", username));
}
LogManager.resetConfiguration();
PropertyConfigurator.configure(getClass().getResource("/log4j.properties"));
}
@Test
public void testLogging() {
log.debug("DEBUG message");
log.info("INFO message");
log.warn("WARN message");
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2016 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -20,51 +20,39 @@ import static org.junit.Assert.*;
import java.util.Calendar;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.log4j.MDC;
import org.apache.log4j.PropertyConfigurator;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCursor;
import com.mongodb.MongoClient;
import com.mongodb.ServerAddress;
import com.mongodb.Mongo;
/**
* Integration tests for {@link MongoLog4jAppender}.
*
*
* @author Jon Brisbin
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MongoLog4jAppenderIntegrationTests {
MongoClient mongo;
static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName();
private static final Logger log = Logger.getLogger(NAME);
Mongo mongo;
DB db;
String collection;
ServerAddress serverLocation;
Logger log;
@Before
public void setUp() throws Exception {
serverLocation = new ServerAddress("localhost", 27017);
mongo = new MongoClient(serverLocation);
mongo = new Mongo("localhost", 27017);
db = mongo.getDB("logs");
Calendar now = Calendar.getInstance();
collection = String.valueOf(now.get(Calendar.YEAR)) + String.format("%1$02d", now.get(Calendar.MONTH) + 1);
log = Logger.getLogger(MongoLog4jAppenderIntegrationTests.class.getName());
}
@After
public void tearDown() {
db.getCollection(collection).remove(new BasicDBObject());
db.getCollection(collection).drop();
}
@Test
@@ -76,11 +64,13 @@ public class MongoLog4jAppenderIntegrationTests {
log.error("ERROR message");
DBCursor msgs = db.getCollection(collection).find();
assertThat(msgs.count(), is(4));
}
@Test
public void testProperties() {
MDC.put("property", "one");
log.debug("DEBUG message");
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2017 the original author or authors.
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -24,7 +24,10 @@ import org.junit.Test;
*/
public class MongoLog4jAppenderUnitTests {
@Test // DATAMONGO-641
/**
* @see DATAMONGO-641
*/
@Test
public void closesWithoutMongoInstancePresent() {
new MongoLog4jAppender().close();
}

View File

@@ -1,16 +0,0 @@
log4j.rootCategory=INFO, mongo
log4j.appender.mongo=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.mongo.layout=org.apache.log4j.PatternLayout
log4j.appender.mongo.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.mongo.host = localhost
log4j.appender.mongo.port = 27017
log4j.appender.mongo.database = logs
log4j.appender.mongo.username = admin
log4j.appender.mongo.password = test
log4j.appender.mongo.authenticationDatabase = logs
log4j.appender.mongo.collectionPattern = %X{year}%X{month}
log4j.appender.mongo.applicationId = my.application
log4j.appender.mongo.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.springframework.data.mongodb=DEBUG

View File

@@ -1,13 +1,13 @@
log4j.rootCategory=INFO, mongo
log4j.rootCategory=INFO, stdout
log4j.appender.mongo=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.mongo.layout=org.apache.log4j.PatternLayout
log4j.appender.mongo.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.mongo.host = localhost
log4j.appender.mongo.port = 27017
log4j.appender.mongo.database = logs
log4j.appender.mongo.collectionPattern = %X{year}%X{month}
log4j.appender.mongo.applicationId = my.application
log4j.appender.mongo.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.appender.stdout=org.springframework.data.mongodb.log4j.MongoLog4jAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.host = localhost
log4j.appender.stdout.port = 27017
log4j.appender.stdout.database = logs
log4j.appender.stdout.collectionPattern = %X{year}%X{month}
log4j.appender.stdout.applicationId = my.application
log4j.appender.stdout.warnOrHigherWriteConcern = FSYNC_SAFE
log4j.category.org.springframework.data.mongodb=DEBUG

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<context version="7.2.2.230">
<context version="7.1.10.209">
<scope type="Project" name="spring-data-mongodb">
<element type="TypeFilterReferenceOverridden" name="Filter">
<element type="IncludeTypePattern" name="org.springframework.data.mongodb.**"/>
@@ -35,12 +35,6 @@
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="CDI">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.cdi.**"/>
</element>
<stereotype name="Unrestricted"/>
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Config" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core" type="AllowedDependency"/>
</element>
@@ -82,11 +76,6 @@
</element>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Script">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.script.**"/>
</element>
</element>
<element type="Subsystem" name="Conversion">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.core.convert.**"/>
@@ -94,7 +83,6 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Script" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="SpEL">
<element type="TypeFilter" name="Assignment">
@@ -117,11 +105,6 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="MapReduce">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="**.mapreduce.**"/>
</element>
</element>
<element type="Subsystem" name="Core">
<element type="TypeFilter" name="Assignment">
<element type="WeakTypePattern" name="**.core.**"/>
@@ -130,10 +113,8 @@
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Conversion" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Geospatial" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Index" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|MapReduce" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Mapping" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Query" type="AllowedDependency"/>
<dependency toName="Project|spring-data-mongodb::Layer|Core::Subsystem|Script" type="AllowedDependency"/>
</element>
<element type="Subsystem" name="Util">
<element type="TypeFilter" name="Assignment">
@@ -188,32 +169,7 @@
</element>
<element type="Subsystem" name="Querydsl">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.querydsl.**"/>
</element>
</element>
<element type="Subsystem" name="Slf4j">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.slf4j.**"/>
</element>
</element>
<element type="Subsystem" name="Jackson">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.fasterxml.jackson.**"/>
</element>
</element>
<element type="Subsystem" name="DOM">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.w3c.dom.**"/>
</element>
</element>
<element type="Subsystem" name="AOP Alliance">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="org.aopalliance.**"/>
</element>
</element>
<element type="Subsystem" name="Guava">
<element type="TypeFilter" name="Assignment">
<element type="IncludeTypePattern" name="com.google.common.**"/>
<element type="IncludeTypePattern" name="com.mysema.query.**"/>
</element>
</element>
</architecture>

View File

@@ -1,5 +1,5 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
@@ -11,11 +11,12 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.10.3.RELEASE</version>
<version>1.8.7.BUILD-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<properties>
<validation>1.0.0.GA</validation>
<objenesis>1.3</objenesis>
<equalsverifier>1.5</equalsverifier>
</properties>
@@ -58,14 +59,14 @@
</dependency>
<dependency>
<groupId>com.querydsl</groupId>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-mongodb</artifactId>
<version>${querydsl}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.querydsl</groupId>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
<scope>provided</scope>
@@ -182,7 +183,7 @@
<version>${apt}</version>
<dependencies>
<dependency>
<groupId>com.querydsl</groupId>
<groupId>com.mysema.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl}</version>
</dependency>

View File

@@ -1,61 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import java.util.List;
import org.springframework.dao.DataAccessException;
import com.mongodb.BulkWriteError;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteResult;
/**
* Is thrown when errors occur during bulk operations.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
*/
public class BulkOperationException extends DataAccessException {
private static final long serialVersionUID = 73929601661154421L;
private final List<BulkWriteError> errors;
private final BulkWriteResult result;
/**
* Creates a new {@link BulkOperationException} with the given message and source {@link BulkWriteException}.
*
* @param message must not be {@literal null}.
* @param source must not be {@literal null}.
*/
public BulkOperationException(String message, BulkWriteException source) {
super(message, source);
this.errors = source.getWriteErrors();
this.result = source.getWriteResult();
}
public List<BulkWriteError> getErrors() {
return errors;
}
public BulkWriteResult getResult() {
return result;
}
}
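
BulkOperationException above (present only on the newer side of this comparison) exposes the individual BulkWriteErrors and the partial BulkWriteResult of a failed bulk. A hedged sketch of how calling code might inspect it; the BulkOperations instance is assumed to have been obtained elsewhere:

```java
import java.util.List;

import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.core.BulkOperations;

import com.mongodb.BulkWriteError;

// Sketch only: 'bulkOps' is an already prepared BulkOperations instance.
class BulkErrorReporting {

    static void executeReporting(BulkOperations bulkOps) {
        try {
            bulkOps.execute();
        } catch (BulkOperationException e) {
            // Each BulkWriteError carries the index of the failed request and the server message.
            List<BulkWriteError> errors = e.getErrors();
            for (BulkWriteError error : errors) {
                System.err.println("request " + error.getIndex() + " failed: " + error.getMessage());
            }
        }
    }
}
```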

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2016 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,7 +15,6 @@
*/
package org.springframework.data.mongodb.config;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
@@ -119,33 +118,17 @@ public abstract class AbstractMongoConfiguration {
* Return the base package to scan for mapped {@link Document}s. Will return the package name of the configuration
* class' (the concrete class, not this one here) by default. So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoConfiguration} the base package will be considered {@code com.acme} unless the method is
* overridden to implement alternate behavior.
* overriden to implement alternate behaviour.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Returns the base packages to scan for MongoDB mapped entities at startup. Will return the package name of the
* configuration class' (the concrete class, not this one here) by default. So if you have a
* {@code com.acme.AppConfig} extending {@link AbstractMongoConfiguration} the base package will be considered
* {@code com.acme} unless the method is overridden to implement alternate behavior.
*
* @return the base packages to scan for mapped {@link Document} classes or an empty collection to not enable scanning
* for entities.
* @since 1.10
*/
protected Collection<String> getMappingBasePackages() {
return Collections.singleton(getMappingBasePackage());
}
/**
* Return {@link UserCredentials} to be used when connecting to the MongoDB instance or {@literal null} if none shall
* be used.
@@ -221,52 +204,26 @@ public abstract class AbstractMongoConfiguration {
}
/**
* Scans the mapping base package for classes annotated with {@link Document}. By default, it scans for entities in
* all packages returned by {@link #getMappingBasePackages()}.
* Scans the mapping base package for classes annotated with {@link Document}.
*
* @see #getMappingBasePackages()
* @see #getMappingBasePackage()
* @return
* @throws ClassNotFoundException
*/
protected Set<Class<?>> getInitialEntitySet() throws ClassNotFoundException {
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
for (String basePackage : getMappingBasePackages()) {
initialEntitySet.addAll(scanForEntities(basePackage));
}
return initialEntitySet;
}
/**
* Scans the given base package for entities, i.e. MongoDB specific types annotated with {@link Document} and
* {@link Persistent}.
*
* @param basePackage must not be {@literal null}.
* @return
* @throws ClassNotFoundException
* @since 1.10
*/
protected Set<Class<?>> scanForEntities(String basePackage) throws ClassNotFoundException {
if (!StringUtils.hasText(basePackage)) {
return Collections.emptySet();
}
String basePackage = getMappingBasePackage();
Set<Class<?>> initialEntitySet = new HashSet<Class<?>>();
if (StringUtils.hasText(basePackage)) {
ClassPathScanningCandidateComponentProvider componentProvider = new ClassPathScanningCandidateComponentProvider(
false);
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Document.class));
componentProvider.addIncludeFilter(new AnnotationTypeFilter(Persistent.class));
for (BeanDefinition candidate : componentProvider.findCandidateComponents(basePackage)) {
initialEntitySet
.add(ClassUtils.forName(candidate.getBeanClassName(), AbstractMongoConfiguration.class.getClassLoader()));
initialEntitySet.add(ClassUtils.forName(candidate.getBeanClassName(),
AbstractMongoConfiguration.class.getClassLoader()));
}
}
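
The javadoc in the hunks above describes how the newer side derives the entity-scan packages from the concrete configuration class and lets getMappingBasePackages() be overridden. A hedged sketch of such a subclass; AppConfig, the database name, and the package names are placeholders:

```java
import java.util.Arrays;
import java.util.Collection;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;

import com.mongodb.Mongo;
import com.mongodb.MongoClient;

// Illustrative configuration: all values are placeholders, not taken from this repository.
@Configuration
public class AppConfig extends AbstractMongoConfiguration {

    @Override
    public String getDatabaseName() {
        return "demo";
    }

    @Override
    public Mongo mongo() throws Exception {
        return new MongoClient();
    }

    // Available on the newer (1.10) side of this comparison; the older side only
    // offers the single-package getMappingBasePackage().
    @Override
    protected Collection<String> getMappingBasePackages() {
        return Arrays.asList("com.acme.orders", "com.acme.customers");
    }
}
```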

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -361,9 +361,7 @@ public class MappingMongoConverterParser implements BeanDefinitionParser {
* @param filters
*/
public NegatingFilter(TypeFilter... filters) {
Assert.notNull(filters, "TypeFilters must not be null");
Assert.notNull(filters);
this.delegates = new HashSet<TypeFilter>(Arrays.asList(filters));
}

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2016 the original author or authors.
* Copyright 2013-2014 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,24 +15,24 @@
*/
package org.springframework.data.mongodb.config;
import static org.springframework.beans.factory.config.BeanDefinition.*;
import static org.springframework.data.mongodb.config.BeanNames.*;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.RootBeanDefinition;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.event.AuditingEventListener;
import org.springframework.data.support.IsNewStrategyFactory;
import org.springframework.util.Assert;
/**
@@ -71,6 +71,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(annotationMetadata, "AnnotationMetadata must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
defaultDependenciesIfNecessary(registry, annotationMetadata);
super.registerBeanDefinitions(annotationMetadata, registry);
}
@@ -84,11 +85,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(MongoMappingContextLookup.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
builder.addConstructorArgReference(MAPPING_CONTEXT_BEAN_NAME);
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
@@ -105,58 +102,29 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
BeanDefinitionBuilder listenerBeanDefinitionBuilder = BeanDefinitionBuilder
.rootBeanDefinition(AuditingEventListener.class);
listenerBeanDefinitionBuilder
.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
listenerBeanDefinitionBuilder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(
getAuditingHandlerBeanName(), registry));
registerInfrastructureBeanWithId(listenerBeanDefinitionBuilder.getBeanDefinition(),
AuditingEventListener.class.getName(), registry);
}
/**
* Simple helper to be able to wire the {@link MappingContext} from a {@link MappingMongoConverter} bean available in
* the application context.
*
* @author Oliver Gierke
* Register default bean definitions for a {@link MongoMappingContext} and an {@link IsNewStrategyFactory} in case we
* don't find beans with the assumed names in the registry.
*
* @param registry the {@link BeanDefinitionRegistry} to use to register the components into.
* @param source the source which the registered components shall be registered with
*/
static class MongoMappingContextLookup
implements FactoryBean<MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty>> {
private void defaultDependenciesIfNecessary(BeanDefinitionRegistry registry, Object source) {
private final MappingMongoConverter converter;
if (!registry.containsBeanDefinition(MAPPING_CONTEXT_BEAN_NAME)) {
/**
* Creates a new {@link MongoMappingContextLookup} for the given {@link MappingMongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public MongoMappingContextLookup(MappingMongoConverter converter) {
this.converter = converter;
}
RootBeanDefinition definition = new RootBeanDefinition(MongoMappingContext.class);
definition.setRole(ROLE_INFRASTRUCTURE);
definition.setSource(source);
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getObject() throws Exception {
return converter.getMappingContext();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return MappingContext.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
@Override
public boolean isSingleton() {
return true;
registry.registerBeanDefinition(MAPPING_CONTEXT_BEAN_NAME, definition);
}
}
}
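
MongoAuditingRegistrar backs the @EnableMongoAuditing annotation; the hunks above change how it looks up the MappingContext for the IsNewAwareAuditingHandler and how default dependencies are registered. A hedged sketch of the configuration and entity side that drives this registrar; the class names and fields are illustrative:

```java
import java.util.Date;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.mongodb.config.EnableMongoAuditing;
import org.springframework.data.mongodb.core.mapping.Document;

// Illustrative only: enabling auditing is what triggers the registrar shown above,
// which in turn wires the IsNewAwareAuditingHandler and AuditingEventListener.
@Configuration
@EnableMongoAuditing
class AuditingConfig {}

@Document
class Order {

    @CreatedDate Date createdAt;        // populated on first save
    @LastModifiedDate Date modifiedAt;  // refreshed on every save
}
```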

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -129,7 +129,6 @@ abstract class MongoParsingUtils {
setPropertyValue(clientOptionsDefBuilder, optionsElement, "heartbeat-socket-timeout", "heartbeatSocketTimeout");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "ssl", "ssl");
setPropertyReference(clientOptionsDefBuilder, optionsElement, "ssl-socket-factory-ref", "sslSocketFactory");
setPropertyValue(clientOptionsDefBuilder, optionsElement, "server-selection-timeout", "serverSelectionTimeout");
mongoClientBuilder.addPropertyValue("mongoClientOptions", clientOptionsDefBuilder.getBeanDefinition());

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,145 +0,0 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import com.mongodb.BulkWriteResult;
/**
* Bulk operations for insert/update/remove actions on a collection. These bulks operation are available since MongoDB
* 2.6 and make use of low level bulk commands on the protocol level. This interface defines a fluent API to add
* multiple single operations or list of similar operations in sequence which can then eventually be executed by calling
* {@link #execute()}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @since 1.9
*/
public interface BulkOperations {
/**
* Mode for bulk operation.
**/
public enum BulkMode {
/** Perform bulk operations in sequence. The first error will cancel processing. */
ORDERED,
/** Perform bulk operations in parallel. Processing will continue on errors. */
UNORDERED
};
/**
* Add a single insert to the bulk operation.
*
* @param documents the document to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
BulkOperations insert(Object documents);
/**
* Add a list of inserts to the bulk operation.
*
* @param documents List of documents to insert, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the insert added, will never be {@literal null}.
*/
BulkOperations insert(List<? extends Object> documents);
/**
* Add a single update to the bulk operation. For the update request, only the first matching document is updated.
*
* @param query update criteria, must not be {@literal null}.
* @param update {@link Update} operation to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(Query query, Update update);
/**
* Add a list of updates to the bulk operation. For each update request, only the first matching document is updated.
*
* @param updates Update operations to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateOne(List<Pair<Query, Update>> updates);
/**
* Add a single update to the bulk operation. For the update request, all matching documents are updated.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(Query query, Update update);
/**
* Add a list of updates to the bulk operation. For each update request, all matching documents are updated.
*
* @param updates Update operations to perform.
* @return The bulk operation.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations updateMulti(List<Pair<Query, Update>> updates);
/**
* Add a single upsert to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param query Update criteria.
* @param update Update operation to perform.
* @return The bulk operation.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(Query query, Update update);
/**
* Add a list of upserts to the bulk operation. An upsert is an update if the set of matching documents is not empty,
* else an insert.
*
* @param updates Updates/insert operations to perform.
* @return The bulk operation.
* @return the current {@link BulkOperations} instance with the update added, will never be {@literal null}.
*/
BulkOperations upsert(List<Pair<Query, Update>> updates);
/**
* Add a single remove operation to the bulk operation.
*
* @param remove the {@link Query} to select the documents to be removed, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
BulkOperations remove(Query remove);
/**
* Add a list of remove operations to the bulk operation.
*
* @param removes the remove operations to perform, must not be {@literal null}.
* @return the current {@link BulkOperations} instance with the removal added, will never be {@literal null}.
*/
BulkOperations remove(List<Query> removes);
/**
* Execute all bulk operations using the default write concern.
*
* @return Result of the bulk operation providing counters for inserts/updates etc.
* @throws {@link BulkOperationException} if an error occurred during bulk processing.
*/
BulkWriteResult execute();
}
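
The BulkOperations interface above (deleted on the older side, since bulk support only arrived with 1.9) queues operations and sends them in a single round trip on execute(). A hedged usage sketch; the MongoOperations instance is assumed, Person is a placeholder entity, and bulkOps(...) refers to the factory method the 1.9 line exposes on MongoOperations:

```java
import org.springframework.data.mongodb.core.BulkOperations;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

import com.mongodb.BulkWriteResult;

// Sketch only: 'template' is an assumed MongoOperations instance, Person a placeholder entity.
class BulkUsage {

    static class Person {
        String lastname;
        boolean active;
    }

    BulkWriteResult insertAndTouch(MongoOperations template, Person person) {
        BulkOperations bulk = template.bulkOps(BulkMode.ORDERED, Person.class);
        return bulk.insert(person)
                .updateOne(Query.query(Criteria.where("lastname").is("Smith")), new Update().set("active", true))
                .execute(); // one BulkWriteResult covering all queued operations
    }
}
```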

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,337 +0,0 @@
/*
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.Pair;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.BulkWriteException;
import com.mongodb.BulkWriteOperation;
import com.mongodb.BulkWriteRequestBuilder;
import com.mongodb.BulkWriteResult;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.WriteConcern;
/**
* Default implementation for {@link BulkOperations}.
*
* @author Tobias Trelle
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
private final MongoOperations mongoOperations;
private final BulkMode bulkMode;
private final String collectionName;
private final Class<?> entityType;
private PersistenceExceptionTranslator exceptionTranslator;
private WriteConcernResolver writeConcernResolver;
private WriteConcern defaultWriteConcern;
private BulkWriteOperation bulk;
/**
* Creates a new {@link DefaultBulkOperations} for the given {@link MongoOperations}, {@link BulkMode}, collection
* name and {@link WriteConcern}.
*
* @param mongoOperations The underlying {@link MongoOperations}, must not be {@literal null}.
* @param bulkMode must not be {@literal null}.
* @param collectionName Name of the collection to work on, must not be {@literal null} or empty.
* @param entityType the entity type, can be {@literal null}.
*/
DefaultBulkOperations(MongoOperations mongoOperations, BulkMode bulkMode, String collectionName,
Class<?> entityType) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.notNull(bulkMode, "BulkMode must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
this.mongoOperations = mongoOperations;
this.bulkMode = bulkMode;
this.collectionName = collectionName;
this.entityType = entityType;
this.exceptionTranslator = new MongoExceptionTranslator();
this.writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
this.bulk = initBulkOperation();
}
/**
* Configures the {@link PersistenceExceptionTranslator} to be used. Defaults to {@link MongoExceptionTranslator}.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? new MongoExceptionTranslator() : exceptionTranslator;
}
/**
* Configures the {@link WriteConcernResolver} to be used. Defaults to {@link DefaultWriteConcernResolver}.
*
* @param writeConcernResolver can be {@literal null}.
*/
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) {
this.writeConcernResolver = writeConcernResolver == null ? DefaultWriteConcernResolver.INSTANCE
: writeConcernResolver;
}
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
* @param defaultWriteConcern can be {@literal null}.
*/
public void setDefaultWriteConcern(WriteConcern defaultWriteConcern) {
this.defaultWriteConcern = defaultWriteConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.lang.Object)
*/
@Override
public BulkOperations insert(Object document) {
Assert.notNull(document, "Document must not be null!");
if (document instanceof DBObject) {
bulk.insert((DBObject) document);
return this;
}
DBObject sink = new BasicDBObject();
mongoOperations.getConverter().write(document, sink);
bulk.insert(sink);
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#insert(java.util.List)
*/
@Override
public BulkOperations insert(List<? extends Object> documents) {
Assert.notNull(documents, "Documents must not be null!");
for (Object document : documents) {
insert(document);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateOne(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateOne(Arrays.asList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateOne(java.util.List)
*/
@Override
public BulkOperations updateOne(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, false);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
@SuppressWarnings("unchecked")
public BulkOperations updateMulti(Query query, Update update) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
return updateMulti(Arrays.asList(Pair.of(query, update)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#updateMulti(java.util.List)
*/
@Override
public BulkOperations updateMulti(List<Pair<Query, Update>> updates) {
Assert.notNull(updates, "Updates must not be null!");
for (Pair<Query, Update> update : updates) {
update(update.getFirst(), update.getSecond(), false, true);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(org.springframework.data.mongodb.core.query.Query, org.springframework.data.mongodb.core.query.Update)
*/
@Override
public BulkOperations upsert(Query query, Update update) {
return update(query, update, true, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#upsert(java.util.List)
*/
@Override
public BulkOperations upsert(List<Pair<Query, Update>> updates) {
for (Pair<Query, Update> update : updates) {
upsert(update.getFirst(), update.getSecond());
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public BulkOperations remove(Query query) {
Assert.notNull(query, "Query must not be null!");
bulk.find(query.getQueryObject()).remove();
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#remove(java.util.List)
*/
@Override
public BulkOperations remove(List<Query> removes) {
Assert.notNull(removes, "Removals must not be null!");
for (Query query : removes) {
remove(query);
}
return this;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.BulkOperations#executeBulk()
*/
@Override
public BulkWriteResult execute() {
MongoAction action = new MongoAction(defaultWriteConcern, MongoActionOperation.BULK, collectionName, entityType,
null, null);
WriteConcern writeConcern = writeConcernResolver.resolve(action);
try {
return writeConcern == null ? bulk.execute() : bulk.execute(writeConcern);
} catch (BulkWriteException o_O) {
DataAccessException toThrow = exceptionTranslator.translateExceptionIfPossible(o_O);
throw toThrow == null ? o_O : toThrow;
} finally {
this.bulk = initBulkOperation();
}
}
/**
* Performs update and upsert bulk operations.
*
* @param query the {@link Query} to determine documents to update.
* @param update the {@link Update} to perform, must not be {@literal null}.
* @param upsert whether to upsert.
* @param multi whether to issue a multi-update.
* @return the {@link BulkOperations} with the update registered.
*/
private BulkOperations update(Query query, Update update, boolean upsert, boolean multi) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(update, "Update must not be null!");
BulkWriteRequestBuilder builder = bulk.find(query.getQueryObject());
if (upsert) {
if (multi) {
builder.upsert().update(update.getUpdateObject());
} else {
builder.upsert().updateOne(update.getUpdateObject());
}
} else {
if (multi) {
builder.update(update.getUpdateObject());
} else {
builder.updateOne(update.getUpdateObject());
}
}
return this;
}
private final BulkWriteOperation initBulkOperation() {
DBCollection collection = mongoOperations.getCollection(collectionName);
switch (bulkMode) {
case ORDERED:
return collection.initializeOrderedBulkOperation();
case UNORDERED:
return collection.initializeUnorderedBulkOperation();
}
throw new IllegalStateException("BulkMode was null!");
}
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -15,15 +15,17 @@
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.domain.Sort.Direction.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
@@ -40,12 +42,12 @@ import com.mongodb.MongoException;
*/
public class DefaultIndexOperations implements IndexOperations {
private static final String PARTIAL_FILTER_EXPRESSION_KEY = "partialFilterExpression";
private static final Double ONE = Double.valueOf(1);
private static final Double MINUS_ONE = Double.valueOf(-1);
private static final Collection<String> TWO_D_IDENTIFIERS = Arrays.asList("2d", "2dsphere");
private final MongoOperations mongoOperations;
private final String collectionName;
private final QueryMapper mapper;
private final Class<?> type;
/**
* Creates a new {@link DefaultIndexOperations}.
@@ -54,26 +56,12 @@ public class DefaultIndexOperations implements IndexOperations {
* @param collectionName must not be {@literal null}.
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName) {
this(mongoOperations, collectionName, null);
}
/**
* Creates a new {@link DefaultIndexOperations}.
*
* @param mongoOperations must not be {@literal null}.
* @param collectionName must not be {@literal null}.
* @param type Type used for mapping potential partial index filter expression. Can be {@literal null}.
* @since 1.10
*/
public DefaultIndexOperations(MongoOperations mongoOperations, String collectionName, Class<?> type) {
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
Assert.notNull(collectionName, "Collection name can not be null!");
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.mapper = new QueryMapper(mongoOperations.getConverter());
this.type = type;
}
/*
@@ -81,20 +69,9 @@ public class DefaultIndexOperations implements IndexOperations {
* @see org.springframework.data.mongodb.core.IndexOperations#ensureIndex(org.springframework.data.mongodb.core.index.IndexDefinition)
*/
public void ensureIndex(final IndexDefinition indexDefinition) {
mongoOperations.execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
DBObject indexOptions = indexDefinition.getIndexOptions();
if (indexOptions != null && indexOptions.containsField(PARTIAL_FILTER_EXPRESSION_KEY)) {
Assert.isInstanceOf(DBObject.class, indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY));
indexOptions.put(PARTIAL_FILTER_EXPRESSION_KEY,
mapper.getMappedObject((DBObject) indexOptions.get(PARTIAL_FILTER_EXPRESSION_KEY),
lookupPersistentEntity(type, collectionName)));
}
if (indexOptions != null) {
collection.createIndex(indexDefinition.getIndexKeys(), indexOptions);
} else {
@@ -102,24 +79,6 @@ public class DefaultIndexOperations implements IndexOperations {
}
return null;
}
private MongoPersistentEntity<?> lookupPersistentEntity(Class<?> entityType, String collection) {
if (entityType != null) {
return mongoOperations.getConverter().getMappingContext().getPersistentEntity(entityType);
}
Collection<? extends MongoPersistentEntity<?>> entities = mongoOperations.getConverter().getMappingContext()
.getPersistentEntities();
for (MongoPersistentEntity<?> entity : entities) {
if (entity.getCollection().equals(collection)) {
return entity;
}
}
return null;
}
});
}
@@ -167,9 +126,7 @@ public class DefaultIndexOperations implements IndexOperations {
public List<IndexInfo> getIndexInfo() {
return mongoOperations.execute(collectionName, new CollectionCallback<List<IndexInfo>>() {
public List<IndexInfo> doInCollection(DBCollection collection) throws MongoException, DataAccessException {
List<DBObject> dbObjectList = collection.getIndexInfo();
return getIndexData(dbObjectList);
}
@@ -179,7 +136,44 @@ public class DefaultIndexOperations implements IndexOperations {
List<IndexInfo> indexInfoList = new ArrayList<IndexInfo>();
for (DBObject ix : dbObjectList) {
indexInfoList.add(IndexInfo.indexInfoOf(ix));
DBObject keyDbObject = (DBObject) ix.get("key");
int numberOfElements = keyDbObject.keySet().size();
List<IndexField> indexFields = new ArrayList<IndexField>(numberOfElements);
for (String key : keyDbObject.keySet()) {
Object value = keyDbObject.get(key);
if (TWO_D_IDENTIFIERS.contains(value)) {
indexFields.add(IndexField.geo(key));
} else if ("text".equals(value)) {
DBObject weights = (DBObject) ix.get("weights");
for (String fieldName : weights.keySet()) {
indexFields.add(IndexField.text(fieldName, Float.valueOf(weights.get(fieldName).toString())));
}
} else {
Double keyValue = new Double(value.toString());
if (ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, ASC));
} else if (MINUS_ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, DESC));
}
}
}
String name = ix.get("name").toString();
boolean unique = ix.containsField("unique") ? (Boolean) ix.get("unique") : false;
boolean dropDuplicates = ix.containsField("dropDups") ? (Boolean) ix.get("dropDups") : false;
boolean sparse = ix.containsField("sparse") ? (Boolean) ix.get("sparse") : false;
String language = ix.containsField("default_language") ? (String) ix.get("default_language") : "";
indexInfoList.add(new IndexInfo(indexFields, name, unique, dropDuplicates, sparse, language));
}
return indexInfoList;
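
DefaultIndexOperations above is what backs the indexOps(...) methods on MongoTemplate; the newer side maps partial filter expressions and delegates index metadata parsing to IndexInfo.indexInfoOf(...). A hedged sketch of the calling side; the collection and field names are placeholders:

```java
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.IndexOperations;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.IndexInfo;

// Sketch only: 'template' and the "people"/"lastname" names are placeholders.
class IndexMaintenance {

    void ensureLastnameIndex(MongoOperations template) {
        IndexOperations indexOps = template.indexOps("people");
        indexOps.ensureIndex(new Index().on("lastname", Direction.ASC).unique());

        // Reads back what getIndexInfo() in the class above assembles.
        for (IndexInfo index : indexOps.getIndexInfo()) {
            System.out.println(index.getName());
        }
    }
}
```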

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,32 +0,0 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import com.mongodb.WriteConcern;
/**
* Default {@link WriteConcernResolver} resolving the {@link WriteConcern} from the given {@link MongoAction}.
*
* @author Oliver Gierke
*/
enum DefaultWriteConcernResolver implements WriteConcernResolver {
INSTANCE;
public WriteConcern resolve(MongoAction action) {
return action.getDefaultWriteConcern();
}
}
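
DefaultWriteConcernResolver above simply hands back the MongoAction's default write concern. To illustrate the extension point it implements, a hedged sketch of a custom WriteConcernResolver; the collection name and the chosen WriteConcern are arbitrary examples:

```java
import org.springframework.data.mongodb.core.MongoAction;
import org.springframework.data.mongodb.core.WriteConcernResolver;

import com.mongodb.WriteConcern;

// Sketch only: routes writes to an (assumed) "audit" collection through ACKNOWLEDGED,
// everything else through whatever default the MongoAction carries.
class CollectionAwareWriteConcernResolver implements WriteConcernResolver {

    public WriteConcern resolve(MongoAction action) {
        return "audit".equals(action.getCollectionName()) ? WriteConcern.ACKNOWLEDGED : action.getDefaultWriteConcern();
    }
}
```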

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2016-2017 the original author or authors.
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -62,7 +62,7 @@ class GeoCommandStatistics {
* didn't return any result introduced in MongoDB 3.2 RC1.
*
* @return
* @see <a href="https://jira.mongodb.org/browse/SERVER-21024">MongoDB Jira SERVER-21024</a>
* @see https://jira.mongodb.org/browse/SERVER-21024
*/
public double getAverageDistance() {

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2016 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -17,14 +17,15 @@ package org.springframework.data.mongodb.core;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.geo.GeoJsonModule;
import org.springframework.data.web.config.SpringDataJacksonModules;
import org.springframework.data.web.config.SpringDataWebConfigurationMixin;
/**
* Configuration class to expose {@link GeoJsonModule} as a Spring bean.
*
* @author Oliver Gierke
*/
public class GeoJsonConfiguration implements SpringDataJacksonModules {
@SpringDataWebConfigurationMixin
public class GeoJsonConfiguration {
@Bean
public GeoJsonModule geoJsonModule() {

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -25,5 +25,5 @@ package org.springframework.data.mongodb.core;
*/
public enum MongoActionOperation {
REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE, BULK;
REMOVE, UPDATE, INSERT, INSERT_LIST, SAVE
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2011-2017 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -27,8 +27,7 @@ import com.mongodb.Mongo;
* Mongo server administration exposed via JMX annotations
*
* @author Mark Pollack
* @author Thomas Darimont
* @author Mark Paluch
* @author Thomas Darimont
*/
@ManagedResource(description = "Mongo Admin Operations")
public class MongoAdmin implements MongoAdminOperations {
@@ -36,11 +35,10 @@ public class MongoAdmin implements MongoAdminOperations {
private final Mongo mongo;
private String username;
private String password;
private String authenticationDatabaseName;
private String authenticationDatabaseName;
public MongoAdmin(Mongo mongo) {
Assert.notNull(mongo, "Mongo must not be null!");
Assert.notNull(mongo);
this.mongo = mongo;
}
@@ -86,16 +84,16 @@ public class MongoAdmin implements MongoAdminOperations {
this.password = password;
}
/**
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
*
* @param authenticationDatabaseName The authenticationDatabaseName to use.
*/
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
this.authenticationDatabaseName = authenticationDatabaseName;
}
/**
* Sets the authenticationDatabaseName to use to authenticate with the Mongo database.
*
* @param authenticationDatabaseName The authenticationDatabaseName to use.
*/
public void setAuthenticationDatabaseName(String authenticationDatabaseName) {
this.authenticationDatabaseName = authenticationDatabaseName;
}
DB getDB(String databaseName) {
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
return MongoDbUtils.getDB(mongo, databaseName, new UserCredentials(username, password), authenticationDatabaseName);
}
}

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2017 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -18,10 +18,8 @@ package org.springframework.data.mongodb.core;
import javax.net.SocketFactory;
import javax.net.ssl.SSLSocketFactory;
import org.springframework.beans.DirectFieldAccessor;
import org.springframework.beans.factory.config.AbstractFactoryBean;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.util.MongoClientVersion;
import com.mongodb.DBDecoderFactory;
import com.mongodb.DBEncoderFactory;
@@ -32,7 +30,7 @@ import com.mongodb.WriteConcern;
/**
* A factory bean for construction of a {@link MongoClientOptions} instance.
*
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
@@ -64,14 +62,13 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
private int heartbeatConnectTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatConnectTimeout();
private int heartbeatSocketTimeout = DEFAULT_MONGO_OPTIONS.getHeartbeatSocketTimeout();
private String requiredReplicaSetName = DEFAULT_MONGO_OPTIONS.getRequiredReplicaSetName();
private int serverSelectionTimeout = Integer.MIN_VALUE;
private boolean ssl;
private SSLSocketFactory sslSocketFactory;
/**
* Set the {@link MongoClient} description.
*
*
* @param description
*/
public void setDescription(String description) {
@@ -80,7 +77,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the minimum number of connections per host.
*
*
* @param minConnectionsPerHost
*/
public void setMinConnectionsPerHost(int minConnectionsPerHost) {
@@ -90,7 +87,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the number of connections allowed per host. Callers block once the pool is exhausted. Default is 10. The system property
* {@code MONGO.POOLSIZE} can override this value.
*
*
* @param connectionsPerHost
*/
public void setConnectionsPerHost(int connectionsPerHost) {
@@ -101,7 +98,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
* Set the multiplier for connectionsPerHost for the number of threads that can block. Default is 5. If connectionsPerHost is
* 10 and threadsAllowedToBlockForConnectionMultiplier is 5, then up to 50 threads can block; any more and an
* exception will be thrown.
*
*
* @param threadsAllowedToBlockForConnectionMultiplier
*/
public void setThreadsAllowedToBlockForConnectionMultiplier(int threadsAllowedToBlockForConnectionMultiplier) {
@@ -110,7 +107,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the max wait time of a blocking thread for a connection. Default is 120000 ms (2 minutes)
*
*
* @param maxWaitTime
*/
public void setMaxWaitTime(int maxWaitTime) {
@@ -119,7 +116,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* The maximum idle time for a pooled connection.
*
*
* @param maxConnectionIdleTime
*/
public void setMaxConnectionIdleTime(int maxConnectionIdleTime) {
@@ -128,7 +125,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the maximum life time for a pooled connection.
*
*
* @param maxConnectionLifeTime
*/
public void setMaxConnectionLifeTime(int maxConnectionLifeTime) {
@@ -137,7 +134,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the connect timeout in milliseconds. 0 is default and infinite.
*
*
* @param connectTimeout
*/
public void setConnectTimeout(int connectTimeout) {
@@ -146,7 +143,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the socket timeout. 0 is default and infinite.
*
*
* @param socketTimeout
*/
public void setSocketTimeout(int socketTimeout) {
@@ -155,7 +152,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the keep alive flag, controls whether or not to have socket keep alive timeout. Defaults to false.
*
*
* @param socketKeepAlive
*/
public void setSocketKeepAlive(boolean socketKeepAlive) {
@@ -164,7 +161,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the {@link ReadPreference}.
*
*
* @param readPreference
*/
public void setReadPreference(ReadPreference readPreference) {
@@ -174,7 +171,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the {@link WriteConcern} that will be the default value used when asking the {@link MongoDbFactory} for a DB
* object.
*
*
* @param writeConcern
*/
public void setWriteConcern(WriteConcern writeConcern) {
@@ -190,7 +187,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the frequency that the driver will attempt to determine the current state of each server in the cluster.
*
*
* @param heartbeatFrequency
*/
public void setHeartbeatFrequency(int heartbeatFrequency) {
@@ -200,7 +197,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* In the event that the driver has to frequently re-check a server's availability, it will wait at least this long
* since the previous check to avoid wasted effort.
*
*
* @param minHeartbeatFrequency
*/
public void setMinHeartbeatFrequency(int minHeartbeatFrequency) {
@@ -209,7 +206,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the connect timeout for connections used for the cluster heartbeat.
*
*
* @param heartbeatConnectTimeout
*/
public void setHeartbeatConnectTimeout(int heartbeatConnectTimeout) {
@@ -218,7 +215,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the socket timeout for connections used for the cluster heartbeat.
*
*
* @param heartbeatSocketTimeout
*/
public void setHeartbeatSocketTimeout(int heartbeatSocketTimeout) {
@@ -227,7 +224,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Configures the name of the replica set.
*
*
* @param requiredReplicaSetName
*/
public void setRequiredReplicaSetName(String requiredReplicaSetName) {
@@ -236,7 +233,7 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* This controls whether the driver should use an SSL connection. Defaults to {@literal false}.
*
*
* @param ssl
*/
public void setSsl(boolean ssl) {
@@ -246,41 +243,24 @@ public class MongoClientOptionsFactoryBean extends AbstractFactoryBean<MongoClie
/**
* Set the {@link SSLSocketFactory} to use for the {@literal SSL} connection. If none is configured here,
* {@link SSLSocketFactory#getDefault()} will be used.
*
*
* @param sslSocketFactory
*/
public void setSslSocketFactory(SSLSocketFactory sslSocketFactory) {
this.sslSocketFactory = sslSocketFactory;
}
/**
* Set the {@literal server selection timeout} in msec for a 3.x MongoDB Java driver. If not set the default value of
* 30 sec will be used. A value of 0 means that it will timeout immediately if no server is available. A negative
* value means to wait indefinitely.
*
* @param serverSelectionTimeout in msec.
*/
public void setServerSelectionTimeout(int serverSelectionTimeout) {
this.serverSelectionTimeout = serverSelectionTimeout;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.config.AbstractFactoryBean#createInstance()
*/
@Override
protected MongoClientOptions createInstance() throws Exception {
SocketFactory socketFactoryToUse = ssl
? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory.getDefault()) : this.socketFactory;
SocketFactory socketFactoryToUse = ssl ? (sslSocketFactory != null ? sslSocketFactory : SSLSocketFactory
.getDefault()) : this.socketFactory;
MongoClientOptions.Builder builder = MongoClientOptions.builder();
if (MongoClientVersion.isMongo3Driver() && serverSelectionTimeout != Integer.MIN_VALUE) {
new DirectFieldAccessor(builder).setPropertyValue("serverSelectionTimeout", serverSelectionTimeout);
}
return builder //
return MongoClientOptions.builder() //
.alwaysUseMBeans(this.alwaysUseMBeans) //
.connectionsPerHost(this.connectionsPerHost) //
.connectTimeout(connectTimeout) //
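For reference (not part of this diff), a minimal programmatic sketch of this factory bean. Every setter used below appears in the hunks above, afterPropertiesSet()/getObject() come from AbstractFactoryBean, and setServerSelectionTimeout exists only on the newer side of this comparison; the values are illustrative. In a real application the container would declare the bean and invoke afterPropertiesSet() itself.

```java
import com.mongodb.MongoClientOptions;
import org.springframework.data.mongodb.core.MongoClientOptionsFactoryBean;

public class MongoClientOptionsExample {

	public static MongoClientOptions buildOptions() throws Exception {

		MongoClientOptionsFactoryBean factoryBean = new MongoClientOptionsFactoryBean();
		factoryBean.setConnectionsPerHost(50);        // pool size per host
		factoryBean.setMaxWaitTime(120000);           // ms a thread waits for a pooled connection
		factoryBean.setSocketTimeout(5000);           // ms
		factoryBean.setServerSelectionTimeout(10000); // applied via DirectFieldAccessor on a 3.x driver only
		factoryBean.setSsl(true);                     // without an explicit factory, SSLSocketFactory.getDefault() is used

		factoryBean.afterPropertiesSet();             // triggers createInstance()
		return factoryBean.getObject();
	}
}
```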

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Set;
@@ -27,12 +28,9 @@ import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.dao.PermissionDeniedDataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.data.mongodb.util.MongoDbErrorCodes;
import org.springframework.util.ClassUtils;
import com.mongodb.BulkWriteException;
import com.mongodb.MongoException;
/**
@@ -46,12 +44,12 @@ import com.mongodb.MongoException;
*/
public class MongoExceptionTranslator implements PersistenceExceptionTranslator {
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> DULICATE_KEY_EXCEPTIONS = new HashSet<String>(Arrays.asList(
"MongoException.DuplicateKey", "DuplicateKeyException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_FAILURE_EXCEPTIONS = new HashSet<String>(Arrays.asList(
"MongoException.Network", "MongoSocketException", "MongoException.CursorNotFound",
"MongoCursorNotFoundException", "MongoServerSelectionException", "MongoTimeoutException"));
private static final Set<String> RESOURCE_USAGE_EXCEPTIONS = new HashSet<String>(
Arrays.asList("MongoInternalException"));
@@ -85,10 +83,6 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
return new DataIntegrityViolationException(ex.getMessage(), ex);
}
if (ex instanceof BulkWriteException) {
return new BulkOperationException(ex.getMessage(), (BulkWriteException) ex);
}
// All other MongoExceptions
if (ex instanceof MongoException) {
@@ -112,4 +106,126 @@ public class MongoExceptionTranslator implements PersistenceExceptionTranslator
// that translation should not occur.
return null;
}
/**
* {@link MongoDbErrorCodes} holds MongoDB specific error codes outlined in {@literal mongo/base/error_codes.err}.
*
* @author Christoph Strobl
* @since 1.8
*/
public static final class MongoDbErrorCodes {
static HashMap<Integer, String> dataAccessResourceFailureCodes;
static HashMap<Integer, String> dataIntegrityViolationCodes;
static HashMap<Integer, String> duplicateKeyCodes;
static HashMap<Integer, String> invalidDataAccessApiUsageExeption;
static HashMap<Integer, String> permissionDeniedCodes;
static HashMap<Integer, String> errorCodes;
static {
dataAccessResourceFailureCodes = new HashMap<Integer, String>(10);
dataAccessResourceFailureCodes.put(6, "HostUnreachable");
dataAccessResourceFailureCodes.put(7, "HostNotFound");
dataAccessResourceFailureCodes.put(89, "NetworkTimeout");
dataAccessResourceFailureCodes.put(91, "ShutdownInProgress");
dataAccessResourceFailureCodes.put(12000, "SlaveDelayDifferential");
dataAccessResourceFailureCodes.put(10084, "CannotFindMapFile64Bit");
dataAccessResourceFailureCodes.put(10085, "CannotFindMapFile");
dataAccessResourceFailureCodes.put(10357, "ShutdownInProgress");
dataAccessResourceFailureCodes.put(10359, "Header==0");
dataAccessResourceFailureCodes.put(13440, "BadOffsetInFile");
dataAccessResourceFailureCodes.put(13441, "BadOffsetInFile");
dataAccessResourceFailureCodes.put(13640, "DataFileHeaderCorrupt");
dataIntegrityViolationCodes = new HashMap<Integer, String>(6);
dataIntegrityViolationCodes.put(67, "CannotCreateIndex");
dataIntegrityViolationCodes.put(68, "IndexAlreadyExists");
dataIntegrityViolationCodes.put(85, "IndexOptionsConflict");
dataIntegrityViolationCodes.put(86, "IndexKeySpecsConflict");
dataIntegrityViolationCodes.put(112, "WriteConflict");
dataIntegrityViolationCodes.put(117, "ConflictingOperationInProgress");
duplicateKeyCodes = new HashMap<Integer, String>(3);
duplicateKeyCodes.put(3, "OBSOLETE_DuplicateKey");
duplicateKeyCodes.put(84, "DuplicateKeyValue");
duplicateKeyCodes.put(11000, "DuplicateKey");
duplicateKeyCodes.put(11001, "DuplicateKey");
invalidDataAccessApiUsageExeption = new HashMap<Integer, String>();
invalidDataAccessApiUsageExeption.put(5, "GraphContainsCycle");
invalidDataAccessApiUsageExeption.put(9, "FailedToParse");
invalidDataAccessApiUsageExeption.put(14, "TypeMismatch");
invalidDataAccessApiUsageExeption.put(15, "Overflow");
invalidDataAccessApiUsageExeption.put(16, "InvalidLength");
invalidDataAccessApiUsageExeption.put(20, "IllegalOperation");
invalidDataAccessApiUsageExeption.put(21, "EmptyArrayOperation");
invalidDataAccessApiUsageExeption.put(22, "InvalidBSON");
invalidDataAccessApiUsageExeption.put(23, "AlreadyInitialized");
invalidDataAccessApiUsageExeption.put(29, "NonExistentPath");
invalidDataAccessApiUsageExeption.put(30, "InvalidPath");
invalidDataAccessApiUsageExeption.put(40, "ConflictingUpdateOperators");
invalidDataAccessApiUsageExeption.put(45, "UserDataInconsistent");
invalidDataAccessApiUsageExeption.put(30, "DollarPrefixedFieldName");
invalidDataAccessApiUsageExeption.put(52, "InvalidPath");
invalidDataAccessApiUsageExeption.put(53, "InvalidIdField");
invalidDataAccessApiUsageExeption.put(54, "NotSingleValueField");
invalidDataAccessApiUsageExeption.put(55, "InvalidDBRef");
invalidDataAccessApiUsageExeption.put(56, "EmptyFieldName");
invalidDataAccessApiUsageExeption.put(57, "DottedFieldName");
invalidDataAccessApiUsageExeption.put(59, "CommandNotFound");
invalidDataAccessApiUsageExeption.put(60, "DatabaseNotFound");
invalidDataAccessApiUsageExeption.put(61, "ShardKeyNotFound");
invalidDataAccessApiUsageExeption.put(62, "OplogOperationUnsupported");
invalidDataAccessApiUsageExeption.put(66, "ImmutableField");
invalidDataAccessApiUsageExeption.put(72, "InvalidOptions");
invalidDataAccessApiUsageExeption.put(115, "CommandNotSupported");
invalidDataAccessApiUsageExeption.put(116, "DocTooLargeForCapped");
invalidDataAccessApiUsageExeption.put(130, "SymbolNotFound");
invalidDataAccessApiUsageExeption.put(17280, "KeyTooLong");
invalidDataAccessApiUsageExeption.put(13334, "ShardKeyTooBig");
permissionDeniedCodes = new HashMap<Integer, String>();
permissionDeniedCodes.put(11, "UserNotFound");
permissionDeniedCodes.put(18, "AuthenticationFailed");
permissionDeniedCodes.put(31, "RoleNotFound");
permissionDeniedCodes.put(32, "RolesNotRelated");
permissionDeniedCodes.put(33, "PrvilegeNotFound");
permissionDeniedCodes.put(15847, "CannotAuthenticate");
permissionDeniedCodes.put(16704, "CannotAuthenticateToAdminDB");
permissionDeniedCodes.put(16705, "CannotAuthenticateToAdminDB");
errorCodes = new HashMap<Integer, String>();
errorCodes.putAll(dataAccessResourceFailureCodes);
errorCodes.putAll(dataIntegrityViolationCodes);
errorCodes.putAll(duplicateKeyCodes);
errorCodes.putAll(invalidDataAccessApiUsageExeption);
errorCodes.putAll(permissionDeniedCodes);
}
public static boolean isDataIntegrityViolationCode(Integer errorCode) {
return errorCode == null ? false : dataIntegrityViolationCodes.containsKey(errorCode);
}
public static boolean isDataAccessResourceFailureCode(Integer errorCode) {
return errorCode == null ? false : dataAccessResourceFailureCodes.containsKey(errorCode);
}
public static boolean isDuplicateKeyCode(Integer errorCode) {
return errorCode == null ? false : duplicateKeyCodes.containsKey(errorCode);
}
public static boolean isPermissionDeniedCode(Integer errorCode) {
return errorCode == null ? false : permissionDeniedCodes.containsKey(errorCode);
}
public static boolean isInvalidDataAccessApiUsageCode(Integer errorCode) {
return errorCode == null ? false : invalidDataAccessApiUsageExeption.containsKey(errorCode);
}
public static String getErrorDescription(Integer errorCode) {
return errorCode == null ? null : errorCodes.get(errorCode);
}
}
}
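For reference (not part of this diff), a small sketch exercising the translator and the static code table added above (the nested MongoDbErrorCodes exists on the older side of this comparison); the error code and message are illustrative.

```java
import com.mongodb.MongoException;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;

public class ExceptionTranslationExample {

	public static void main(String[] args) {

		MongoExceptionTranslator translator = new MongoExceptionTranslator();

		// PersistenceExceptionTranslator contract: returns null if the exception is not translatable.
		DataAccessException translated = translator
				.translateExceptionIfPossible(new MongoException(11000, "E11000 duplicate key error"));
		System.out.println(translated);

		// Direct lookups against the static tables shown above.
		System.out.println(MongoExceptionTranslator.MongoDbErrorCodes.isDuplicateKeyCode(11000)); // true
		System.out.println(MongoExceptionTranslator.MongoDbErrorCodes.getErrorDescription(18));   // "AuthenticationFailed"
	}
}
```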

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -20,7 +20,6 @@ import java.util.List;
import java.util.Set;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
@@ -175,28 +174,13 @@ public interface MongoOperations {
* Returns a {@link CloseableIterator} that wraps a Mongo DB {@link Cursor} that needs to be closed.
*
* @param <T> element return type
* @param query must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @return will never be {@literal null}.
* @param query
* @param entityType
* @return
* @since 1.7
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType);
/**
* Executes the given {@link Query} on the entity collection of the specified {@code entityType} and collection backed
* by a Mongo DB {@link Cursor}.
* <p>
* Returns a {@link CloseableIterator} that wraps a Mongo DB {@link Cursor} that needs to be closed.
*
* @param <T> element return type
* @param query must not be {@literal null}.
* @param entityType must not be {@literal null}.
* @param collectionName must not be {@literal null} or empty.
* @return will never be {@literal null}.
* @since 1.10
*/
<T> CloseableIterator<T> stream(Query query, Class<T> entityType, String collectionName);
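For reference (not part of this diff), a usage sketch for stream(): the returned CloseableIterator wraps a live cursor, so the caller has to close it. Person and the query fields are hypothetical.

```java
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.util.CloseableIterator;

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

public class StreamExample {

	/** Hypothetical mapped document class. */
	public static class Person {
		String name;
	}

	public static void printNames(MongoOperations operations) {

		CloseableIterator<Person> iterator = operations.stream(query(where("active").is(true)), Person.class);

		try {
			while (iterator.hasNext()) {
				System.out.println(iterator.next().name);
			}
		} finally {
			iterator.close(); // releases the underlying cursor
		}
	}
}
```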
/**
* Create an uncapped collection with a name based on the provided entity class.
*
@@ -308,34 +292,6 @@ public interface MongoOperations {
*/
ScriptOperations scriptOps();
/**
* Returns a new {@link BulkOperations} for the given collection.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
* @return {@link BulkOperations} on the named collection
*/
BulkOperations bulkOps(BulkMode mode, String collectionName);
/**
* Returns a new {@link BulkOperations} for the given entity type.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityType the name of the entity class, must not be {@literal null}.
* @return {@link BulkOperations} on the named collection associated of the given entity class.
*/
BulkOperations bulkOps(BulkMode mode, Class<?> entityType);
/**
* Returns a new {@link BulkOperations} for the given entity type and collection name.
*
* @param mode the {@link BulkMode} to use for bulk operations, must not be {@literal null}.
* @param entityClass the name of the entity class, must not be {@literal null}.
* @param collectionName the name of the collection to work on, must not be {@literal null} or empty.
* @return {@link BulkOperations} on the named collection associated with the given entity class.
*/
BulkOperations bulkOps(BulkMode mode, Class<?> entityType, String collectionName);
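For reference (not part of this diff), a sketch of the bulkOps(…) entry points removed above (they exist on the newer side of this comparison); insert(List) and execute() are the standard BulkOperations calls.

```java
import java.util.List;

import org.springframework.data.mongodb.core.BulkOperations;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoOperations;

import com.mongodb.BulkWriteResult;

public class BulkOpsExample {

	/** Queues all documents and sends them to the server in a single unordered bulk write. */
	public static <T> int insertAll(MongoOperations operations, Class<T> entityType, List<? extends T> documents) {

		BulkOperations bulk = operations.bulkOps(BulkMode.UNORDERED, entityType);
		bulk.insert(documents);

		BulkWriteResult result = bulk.execute();
		return result.getInsertedCount();
	}
}
```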
/**
* Query for a list of objects of type T from the collection used by the entity class.
* <p/>
@@ -644,8 +600,8 @@ public interface MongoOperations {
<T> T findById(Object id, Class<T> entityClass, String collectionName);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
@@ -656,8 +612,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query}.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
* fields specification.
@@ -669,8 +625,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, Class<T> entityClass, String collectionName);
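For reference (not part of this diff), a sketch of findAndModify with FindAndModifyOptions; Person and the field names are hypothetical.

```java
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.FindAndModifyOptions;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;

public class FindAndModifyExample {

	/** Hypothetical mapped document class. */
	public static class Person {
		String name;
		int visits;
	}

	public static Person incrementVisits(MongoOperations operations, String name) {

		Query query = query(where("name").is(name));
		Update update = new Update().inc("visits", 1);

		// returnNew(true) returns the document as it looks after the update has been applied.
		return operations.findAndModify(query, update, FindAndModifyOptions.options().returnNew(true), Person.class);
	}
}
```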
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -683,8 +639,8 @@ public interface MongoOperations {
<T> T findAndModify(Query query, Update update, FindAndModifyOptions options, Class<T> entityClass);
/**
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify
* <a/> to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* Triggers <a href="http://docs.mongodb.org/manual/reference/method/db.collection.findAndModify/">findAndModify<a/>
* to apply provided {@link Update} on documents matching {@link Criteria} of given {@link Query} taking
* {@link FindAndModifyOptions} into account.
*
* @param query the {@link Query} class that specifies the {@link Criteria} used to find a record and also an optional
@@ -772,9 +728,9 @@ public interface MongoOperations {
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" >
* Spring's Type Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion"</a> for more details.
* <p/>
* <p/>
* Insert is used to initially store the object into the database. To update an existing object use the save method.
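For reference (not part of this diff), a sketch of the id-population behaviour described above; Person is hypothetical and uses a String id so the generated ObjectId gets converted into it.

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;

public class InsertIdExample {

	/** Hypothetical mapped document class with a String id. */
	public static class Person {

		@Id String id; // null before the insert
		String name;

		public Person(String name) {
			this.name = name;
		}
	}

	public static void insertAndPrintId(MongoOperations operations) {

		Person person = new Person("Ada");
		operations.insert(person);

		// After insert(…) the generated ObjectId has been converted into the String id property.
		System.out.println(person.id);
	}
}
```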
@@ -829,9 +785,9 @@ public interface MongoOperations {
* <p/>
* If your object has an 'Id' property, it will be set with the generated Id from MongoDB. If your Id property is a
* String then MongoDB ObjectId will be used to populate that string. Otherwise, the conversion from ObjectId to your
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See
* <a href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert" >
* Spring's Type Conversion"</a> for more details.
* property type will be handled by Spring's BeanWrapper class that leverages Type Conversion API. See <a
* href="http://docs.spring.io/spring/docs/current/spring-framework-reference/html/validation.html#core-convert"
* >Spring's Type Conversion"</a> for more details.
*
* @param objectToSave the object to store in the collection
*/
@@ -1046,4 +1002,5 @@ public interface MongoOperations {
* @return
*/
MongoConverter getConverter();
}

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2010-2017 the original author or authors.
* Copyright 2010-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -60,7 +60,6 @@ import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
@@ -93,7 +92,6 @@ import org.springframework.data.mongodb.core.mapreduce.GroupByResults;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.mapreduce.MapReduceResults;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Meta;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
@@ -126,7 +124,7 @@ import com.mongodb.util.JSONParseException;
/**
* Primary implementation of {@link MongoOperations}.
*
*
* @author Thomas Risberg
* @author Graeme Rocher
* @author Mark Pollack
@@ -139,9 +137,7 @@ import com.mongodb.util.JSONParseException;
* @author Chuong Ngo
* @author Christoph Strobl
* @author Doménique Tilleuil
* @author Niko Schmuck
* @author Mark Paluch
* @author Laszlo Csontos
*/
@SuppressWarnings("deprecation")
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
@@ -178,7 +174,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Constructor used for a basic template configuration
*
*
* @param mongo must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
*/
@@ -189,7 +185,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Constructor used for a template configuration with user credentials in the form of
* {@link org.springframework.data.authentication.UserCredentials}
*
*
* @param mongo must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @param userCredentials
@@ -200,7 +196,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Constructor used for a basic template configuration.
*
*
* @param mongoDbFactory must not be {@literal null}.
*/
public MongoTemplate(MongoDbFactory mongoDbFactory) {
@@ -209,13 +205,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Constructor used for a basic template configuration.
*
*
* @param mongoDbFactory must not be {@literal null}.
* @param mongoConverter
*/
public MongoTemplate(MongoDbFactory mongoDbFactory, MongoConverter mongoConverter) {
Assert.notNull(mongoDbFactory, "MongoDbFactory must not be null!");
Assert.notNull(mongoDbFactory);
this.mongoDbFactory = mongoDbFactory;
this.exceptionTranslator = mongoDbFactory.getExceptionTranslator();
@@ -238,7 +234,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Configures the {@link WriteResultChecking} to be used with the template. Setting {@literal null} will reset the
* default of {@value #DEFAULT_WRITE_RESULT_CHECKING}.
*
*
* @param resultChecking
*/
public void setWriteResultChecking(WriteResultChecking resultChecking) {
@@ -249,7 +245,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Configures the {@link WriteConcern} to be used with the template. If none is configured the {@link WriteConcern}
* configured on the {@link MongoDbFactory} will apply. If you configured a {@link Mongo} instance no
* {@link WriteConcern} will be used.
*
*
* @param writeConcern
*/
public void setWriteConcern(WriteConcern writeConcern) {
@@ -258,7 +254,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Configures the {@link WriteConcernResolver} to be used with the template.
*
*
* @param writeConcernResolver
*/
public void setWriteConcernResolver(WriteConcernResolver writeConcernResolver) {
@@ -268,7 +264,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Used by {@link #prepareCollection(DBCollection)} to set the {@link ReadPreference} before any operations are
* performed.
*
*
* @param readPreference
*/
public void setReadPreference(ReadPreference readPreference) {
@@ -295,7 +291,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* they were registered for the current {@link MappingContext}. If no creator for the current {@link MappingContext}
* can be found we manually add the internally created one as {@link ApplicationListener} to make sure indexes get
* created appropriately for entity types persisted through this {@link MongoTemplate} instance.
*
*
* @param context must not be {@literal null}.
*/
private void prepareIndexCreator(ApplicationContext context) {
@@ -315,36 +311,22 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
/**
* Returns the default {@link org.springframework.data.mongodb.core.convert.MongoConverter}.
*
* Returns the default {@link org.springframework.data.mongodb.core.core.convert.MongoConverter}.
*
* @return
*/
public MongoConverter getConverter() {
return this.mongoConverter;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#executeAsStream(org.springframework.data.mongodb.core.query.Query, java.lang.Class)
*/
@Override
public <T> CloseableIterator<T> stream(final Query query, final Class<T> entityType) {
return stream(query, entityType, determineCollectionName(entityType));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#stream(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/
@Override
public <T> CloseableIterator<T> stream(final Query query, final Class<T> entityType, final String collectionName) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(entityType, "Entity type must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
return execute(collectionName, new CollectionCallback<CloseableIterator<T>>() {
return execute(entityType, new CollectionCallback<CloseableIterator<T>>() {
@Override
public CloseableIterator<T> doInCollection(DBCollection collection) throws MongoException, DataAccessException {
@@ -357,7 +339,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBCursor cursor = collection.find(mappedQuery, mappedFields);
QueryCursorPreparer cursorPreparer = new QueryCursorPreparer(query, entityType);
ReadDbObjectCallback<T> readCallback = new ReadDbObjectCallback<T>(mongoConverter, entityType, collectionName);
ReadDbObjectCallback<T> readCallback = new ReadDbObjectCallback<T>(mongoConverter, entityType, collection
.getName());
return new CloseableIterableCursorAdapter<T>(cursorPreparer.prepare(cursor), exceptionTranslator, readCallback);
}
@@ -390,8 +373,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
@Deprecated
public CommandResult executeCommand(final DBObject command, final int options) {
return executeCommand(command,
(options & Bytes.QUERYOPTION_SLAVEOK) != 0 ? ReadPreference.secondaryPreferred() : ReadPreference.primary());
return executeCommand(command, (options & Bytes.QUERYOPTION_SLAVEOK) != 0 ? ReadPreference.secondaryPreferred()
: ReadPreference.primary());
}
/*
@@ -429,7 +412,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a
* {@link DocumentCallbackHandler} using the provided CursorPreparer.
*
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification, must not be {@literal null}.
* @param collectionName name of the collection to retrieve the objects from
@@ -437,10 +420,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @param preparer allows for customization of the {@link DBCursor} used when iterating over the result set, (apply
* limits, skips and so on).
*/
protected void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch,
CursorPreparer preparer) {
protected void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch, CursorPreparer preparer) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(query);
DBObject queryObject = queryMapper.getMappedObject(query.getQueryObject(), null);
DBObject sortObject = query.getSortObject();
@@ -456,7 +438,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public <T> T execute(DbCallback<T> action) {
Assert.notNull(action, "DbCallbackmust not be null!");
Assert.notNull(action);
try {
DB db = this.getDb();
@@ -472,7 +454,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public <T> T execute(String collectionName, CollectionCallback<T> callback) {
Assert.notNull(callback, "CollectionCallback must not be null!");
Assert.notNull(callback);
try {
DBCollection collection = getAndPrepareCollection(getDb(), collectionName);
@@ -558,29 +540,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public IndexOperations indexOps(Class<?> entityClass) {
return new DefaultIndexOperations(this, determineCollectionName(entityClass), entityClass);
}
public BulkOperations bulkOps(BulkMode bulkMode, String collectionName) {
return bulkOps(bulkMode, null, collectionName);
}
public BulkOperations bulkOps(BulkMode bulkMode, Class<?> entityClass) {
return bulkOps(bulkMode, entityClass, determineCollectionName(entityClass));
}
public BulkOperations bulkOps(BulkMode mode, Class<?> entityType, String collectionName) {
Assert.notNull(mode, "BulkMode must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
DefaultBulkOperations operations = new DefaultBulkOperations(this, mode, collectionName, entityType);
operations.setExceptionTranslator(exceptionTranslator);
operations.setWriteConcernResolver(writeConcernResolver);
operations.setDefaultWriteConcern(writeConcern);
return operations;
return new DefaultIndexOperations(this, determineCollectionName(entityClass));
}
/*
@@ -688,8 +648,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
List<Object> results = (List<Object>) commandResult.get("results");
results = results == null ? Collections.emptyList() : results;
DbObjectCallback<GeoResult<T>> callback = new GeoNearResultDbObjectCallback<T>(
new ReadDbObjectCallback<T>(mongoConverter, entityClass, collectionName), near.getMetric());
DbObjectCallback<GeoResult<T>> callback = new GeoNearResultDbObjectCallback<T>(new ReadDbObjectCallback<T>(
mongoConverter, entityClass, collectionName), near.getMetric());
List<GeoResult<T>> result = new ArrayList<GeoResult<T>>(results.size());
int index = 0;
@@ -700,8 +660,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/*
* As MongoDB currently (2.4.4) doesn't support the skipping of elements in near queries
* we skip the elements ourselves to avoid at least the document 2 object mapping overhead.
*
* @see <a href="https://jira.mongodb.org/browse/SERVER-3925">MongoDB Jira: SERVER-3925</a>
*
* @see https://jira.mongodb.org/browse/SERVER-3925
*/
if (index >= elementsToSkip) {
result.add(callback.doWith((DBObject) element));
@@ -750,8 +710,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public long count(Query query, Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
Assert.notNull(entityClass);
return count(query, entityClass, determineCollectionName(entityClass));
}
@@ -765,11 +724,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
public long count(Query query, Class<?> entityClass, String collectionName) {
Assert.hasText(collectionName, "Collection name must not be null or empty!");
final DBObject dbObject = query == null ? null
: queryMapper.getMappedObject(query.getQueryObject(),
entityClass == null ? null : mappingContext.getPersistentEntity(entityClass));
Assert.hasText(collectionName);
final DBObject dbObject = query == null ? null : queryMapper.getMappedObject(query.getQueryObject(),
entityClass == null ? null : mappingContext.getPersistentEntity(entityClass));
return execute(collectionName, new CollectionCallback<Long>() {
public Long doInCollection(DBCollection collection) throws MongoException, DataAccessException {
@@ -807,7 +764,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Prepare the collection before any processing is done using it. This allows a convenient way to apply settings like
* slaveOk() etc. Can be overridden in sub-classes.
*
*
* @param collection
*/
protected void prepareCollection(DBCollection collection) {
@@ -821,7 +778,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* settings in sub-classes. <br />
* In case of using MongoDB Java driver version 3 the returned {@link WriteConcern} will be defaulted to
* {@link WriteConcern#ACKNOWLEDGED} when {@link WriteResultChecking} is set to {@link WriteResultChecking#EXCEPTION}.
*
*
* @param writeConcern any WriteConcern already configured or null
* @return The prepared WriteConcern or null
*/
@@ -845,10 +802,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected <T> void doInsert(String collectionName, T objectToSave, MongoWriter<T> writer) {
initializeVersionProperty(objectToSave);
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave, collectionName));
assertUpdateableIdIfNotSet(objectToSave);
initializeVersionProperty(objectToSave);
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave, collectionName));
DBObject dbDoc = toDbObject(objectToSave, writer);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc, collectionName));
@@ -935,16 +894,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected <T> void doInsertBatch(String collectionName, Collection<? extends T> batchToSave, MongoWriter<T> writer) {
Assert.notNull(writer, "MongoWriter must not be null!");
Assert.notNull(writer);
List<DBObject> dbObjectList = new ArrayList<DBObject>();
for (T o : batchToSave) {
initializeVersionProperty(o);
maybeEmitEvent(new BeforeConvertEvent<T>(o, collectionName));
BasicDBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(o, collectionName));
writer.write(o, dbDoc);
maybeEmitEvent(new BeforeSaveEvent<T>(o, dbDoc, collectionName));
@@ -965,14 +923,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public void save(Object objectToSave) {
Assert.notNull(objectToSave, "Object to save must not be null!");
Assert.notNull(objectToSave);
save(objectToSave, determineEntityCollectionName(objectToSave));
}
public void save(Object objectToSave, String collectionName) {
Assert.notNull(objectToSave, "Object to save must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
Assert.notNull(objectToSave);
Assert.hasText(collectionName);
MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(objectToSave.getClass());
@@ -1001,23 +959,23 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
doInsert(collectionName, objectToSave, this.mongoConverter);
} else {
// Bump version number
convertingAccessor.setProperty(versionProperty, versionNumber.longValue() + 1);
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave, collectionName));
assertUpdateableIdIfNotSet(objectToSave);
BasicDBObject dbObject = new BasicDBObject();
this.mongoConverter.write(objectToSave, dbObject);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbObject, collectionName));
Update update = Update.fromDBObject(dbObject, ID_FIELD);
// Create query for entity with the id and old version
Object id = convertingAccessor.getProperty(idProperty);
Query query = new Query(Criteria.where(idProperty.getName()).is(id).and(versionProperty.getName()).is(version));
// Bump version number
convertingAccessor.setProperty(versionProperty, versionNumber.longValue() + 1);
BasicDBObject dbObject = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave, collectionName));
this.mongoConverter.write(objectToSave, dbObject);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbObject, collectionName));
Update update = Update.fromDBObject(dbObject, ID_FIELD);
doUpdate(collectionName, query, update, objectToSave.getClass(), false, false);
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbObject, collectionName));
}
@@ -1025,9 +983,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected <T> void doSave(String collectionName, T objectToSave, MongoWriter<T> writer) {
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave, collectionName));
assertUpdateableIdIfNotSet(objectToSave);
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave, collectionName));
DBObject dbDoc = toDbObject(objectToSave, writer);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc, collectionName));
@@ -1048,8 +1007,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT, collectionName,
entityClass, dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult writeResult = writeConcernToUse == null ? collection.insert(dbDoc)
: collection.insert(dbDoc, writeConcernToUse);
WriteResult writeResult = writeConcernToUse == null ? collection.insert(dbDoc) : collection.insert(dbDoc,
writeConcernToUse);
handleAnyWriteResultErrors(writeResult, dbDoc, MongoActionOperation.INSERT);
return dbDoc.get(ID_FIELD);
}
@@ -1072,8 +1031,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT_LIST, collectionName, null,
null, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult writeResult = writeConcernToUse == null ? collection.insert(dbDocList)
: collection.insert(dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
WriteResult writeResult = writeConcernToUse == null ? collection.insert(dbDocList) : collection.insert(
dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
handleAnyWriteResultErrors(writeResult, null, MongoActionOperation.INSERT_LIST);
return null;
}
@@ -1103,8 +1062,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.SAVE, collectionName, entityClass,
dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult writeResult = writeConcernToUse == null ? collection.save(dbDoc)
: collection.save(dbDoc, writeConcernToUse);
WriteResult writeResult = writeConcernToUse == null ? collection.save(dbDoc) : collection.save(dbDoc,
writeConcernToUse);
handleAnyWriteResultErrors(writeResult, dbDoc, MongoActionOperation.SAVE);
return dbDoc.get(ID_FIELD);
}
@@ -1157,10 +1116,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
increaseVersionForUpdateIfNecessary(entity, update);
DBObject queryObj = query == null ? new BasicDBObject()
: queryMapper.getMappedObject(query.getQueryObject(), entity);
DBObject updateObj = update == null ? new BasicDBObject()
: updateMapper.getMappedObject(update.getUpdateObject(), entity);
DBObject queryObj = query == null ? new BasicDBObject() : queryMapper.getMappedObject(query.getQueryObject(),
entity);
DBObject updateObj = update == null ? new BasicDBObject() : updateMapper.getMappedObject(
update.getUpdateObject(), entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Calling update using query: {} and update: {} in collection: {}",
@@ -1217,7 +1176,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public WriteResult remove(Object object, String collection) {
Assert.hasText(collection, "Collection name must not be null or empty!");
Assert.hasText(collection);
if (object == null) {
return null;
@@ -1229,7 +1188,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Returns {@link Entry} containing the field name of the id property as {@link Entry#getKey()} and the {@link Id}s
* property value as its {@link Entry#getValue()}.
*
*
* @param object
* @return
*/
@@ -1256,7 +1215,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Returns a {@link Query} for the given entity by its id.
*
*
* @param object must not be {@literal null}.
* @return
*/
@@ -1268,7 +1227,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Returns a {@link Query} for the given entities by their ids.
*
*
* @param objects must not be {@literal null} or {@literal empty}.
* @return
*/
@@ -1301,9 +1260,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Object idValue = persistentEntity.getPropertyAccessor(entity).getProperty(idProperty);
if (idValue == null && !MongoSimpleTypes.AUTOGENERATED_ID_TYPES.contains(idProperty.getType())) {
throw new InvalidDataAccessApiUsageException(
String.format("Cannot autogenerate id of type %s for entity of type %s!", idProperty.getType().getName(),
entity.getClass().getName()));
throw new InvalidDataAccessApiUsageException(String.format(
"Cannot autogenerate id of type %s for entity of type %s!", idProperty.getType().getName(), entity.getClass()
.getName()));
}
}
@@ -1342,12 +1301,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.",
new Object[] { serializeToJsonSafely(dboq), collectionName });
LOGGER.debug("Remove using query: {} in collection: {}.", new Object[] { serializeToJsonSafely(dboq),
collection.getName() });
}
WriteResult wr = writeConcernToUse == null ? collection.remove(dboq)
: collection.remove(dboq, writeConcernToUse);
WriteResult wr = writeConcernToUse == null ? collection.remove(dboq) : collection.remove(dboq,
writeConcernToUse);
handleAnyWriteResultErrors(wr, dboq, MongoActionOperation.REMOVE);
@@ -1363,8 +1322,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public <T> List<T> findAll(Class<T> entityClass, String collectionName) {
return executeFindMultiInternal(new FindCallback(null), null,
new ReadDbObjectCallback<T>(mongoConverter, entityClass, collectionName), collectionName);
return executeFindMultiInternal(new FindCallback(null), null, new ReadDbObjectCallback<T>(mongoConverter,
entityClass, collectionName), collectionName);
}
public <T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction,
@@ -1380,8 +1339,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public <T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction,
String reduceFunction, Class<T> entityClass) {
return mapReduce(query, inputCollectionName, mapFunction, reduceFunction, new MapReduceOptions().outputTypeInline(),
entityClass);
return mapReduce(query, inputCollectionName, mapFunction, reduceFunction,
new MapReduceOptions().outputTypeInline(), entityClass);
}
public <T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction,
@@ -1392,9 +1351,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBCollection inputCollection = getCollection(inputCollectionName);
MapReduceCommand command = new MapReduceCommand(inputCollection, mapFunc, reduceFunc,
mapReduceOptions.getOutputCollection(), mapReduceOptions.getOutputType(),
query == null || query.getQueryObject() == null ? null
: queryMapper.getMappedObject(query.getQueryObject(), null));
mapReduceOptions.getOutputCollection(), mapReduceOptions.getOutputType(), query == null
|| query.getQueryObject() == null ? null : queryMapper.getMappedObject(query.getQueryObject(), null));
copyMapReduceOptionsToCommand(query, mapReduceOptions, command);
@@ -1539,7 +1497,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Retrieve and remove all documents matching the given {@code query} by calling {@link #find(Query, Class, String)}
* and {@link #remove(Query, Class, String)}, whereas the {@link Query} for {@link #remove(Query, Class, String)} is
* constructed out of the find result.
*
*
* @param collectionName
* @param query
* @param entityClass
@@ -1579,7 +1537,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Returns the potentially mapped results of the given {@code commandResult} if it contained some.
*
*
* @param outputType
* @param commandResult
* @return
@@ -1691,7 +1649,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Create the specified collection using the provided options
*
*
* @param collectionName
* @param collectionOptions
* @return the collection that was created
@@ -1712,7 +1670,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter.
* The query document is specified as a standard {@link DBObject} and so is the fields specification.
*
*
* @param collectionName name of the collection to retrieve the objects from.
* @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned.
@@ -1730,14 +1688,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
mappedFields, entityClass, collectionName);
}
return executeFindOneInternal(new FindOneCallback(mappedQuery, mappedFields),
new ReadDbObjectCallback<T>(this.mongoConverter, entityClass, collectionName), collectionName);
return executeFindOneInternal(new FindOneCallback(mappedQuery, mappedFields), new ReadDbObjectCallback<T>(
this.mongoConverter, entityClass, collectionName), collectionName);
}
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a List using the template's converter. The
* query document is specified as a standard DBObject and so is the fields specification.
*
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param fields the document that specifies the fields to be returned
@@ -1745,15 +1703,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @return the List of converted objects.
*/
protected <T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<T> entityClass) {
return doFind(collectionName, query, fields, entityClass, null,
new ReadDbObjectCallback<T>(this.mongoConverter, entityClass, collectionName));
return doFind(collectionName, query, fields, entityClass, null, new ReadDbObjectCallback<T>(this.mongoConverter,
entityClass, collectionName));
}
/**
* Map the results of an ad-hoc query on the default MongoDB collection to a List of the specified type. The object is
* converted from the MongoDB native representation using an instance of {@see MongoConverter}. The query document is
* specified as a standard DBObject and so is the fields specification.
*
*
* @param collectionName name of the collection to retrieve the objects from.
* @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned.
@@ -1764,8 +1722,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
protected <T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<T> entityClass,
CursorPreparer preparer) {
return doFind(collectionName, query, fields, entityClass, preparer,
new ReadDbObjectCallback<T>(mongoConverter, entityClass, collectionName));
return doFind(collectionName, query, fields, entityClass, preparer, new ReadDbObjectCallback<T>(mongoConverter,
entityClass, collectionName));
}
protected <S, T> List<T> doFind(String collectionName, DBObject query, DBObject fields, Class<S> entityClass,
@@ -1806,7 +1764,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* The first document that matches the query is returned and also removed from the collection in the database.
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
*
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param entityClass the parameterized type of the returned list.
@@ -1857,7 +1815,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Populates the id property of the saved object, if it's not set already.
*
*
* @param savedObject
* @param id
*/
@@ -1907,7 +1865,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* <li>Execute the given {@link ConnectionCallback} for a {@link DBObject}.</li>
* <li>Apply the given {@link DbObjectCallback} to each of the {@link DBObject}s to obtain the result.</li>
* <ol>
*
*
* @param <T>
* @param collectionCallback the callback to retrieve the {@link DBObject} with
* @param objectCallback the {@link DbObjectCallback} to transform {@link DBObject}s into the actual domain type
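The same two-step shape is available through the public execute(..) API. A sketch under the assumption of a "people" collection, a mapped Person class and a MongoTemplate named template:

Person person = template.execute("people", new CollectionCallback<Person>() {

    public Person doInCollection(DBCollection collection) throws MongoException, DataAccessException {
        // step 1: obtain the raw DBObject from the prepared collection
        DBObject raw = collection.findOne(new BasicDBObject("_id", "4711"));
        // step 2: convert it into the domain type (internally done by a DbObjectCallback)
        return raw != null ? template.getConverter().read(Person.class, raw) : null;
    }
});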
@@ -1918,8 +1876,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DbObjectCallback<T> objectCallback, String collectionName) {
try {
T result = objectCallback
.doWith(collectionCallback.doInCollection(getAndPrepareCollection(getDb(), collectionName)));
T result = objectCallback.doWith(collectionCallback.doInCollection(getAndPrepareCollection(getDb(),
collectionName)));
return result;
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
@@ -1936,7 +1894,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* <li>Iterate over the {@link DBCursor} and applies the given {@link DbObjectCallback} to each of the
* {@link DBObject}s collecting the actual result {@link List}.</li>
* <ol>
*
*
* @param <T>
* @param collectionCallback the callback to retrieve the {@link DBCursor} with
* @param preparer the {@link CursorPreparer} to potentially modify the {@link DBCursor} before iterating over it
@@ -1944,8 +1902,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @param collectionName the collection to be queried
* @return
*/
private <T> List<T> executeFindMultiInternal(CollectionCallback<DBCursor> collectionCallback, CursorPreparer preparer,
DbObjectCallback<T> objectCallback, String collectionName) {
private <T> List<T> executeFindMultiInternal(CollectionCallback<DBCursor> collectionCallback,
CursorPreparer preparer, DbObjectCallback<T> objectCallback, String collectionName) {
try {
@@ -2035,15 +1993,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
if (entity == null) {
throw new InvalidDataAccessApiUsageException(
"No Persistent Entity information found for the class " + entityClass.getName());
throw new InvalidDataAccessApiUsageException("No Persitent Entity information found for the class "
+ entityClass.getName());
}
return entity.getCollection();
}
/**
* Handles {@link WriteResult} errors based on the configured {@link WriteResultChecking}.
*
*
* @param writeResult
* @param query
* @param operation
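To make these checks fail loudly rather than silently, the template can be configured accordingly; a one-line sketch (template is an existing MongoTemplate):

// Raise an exception whenever a WriteResult reports an error for update/remove operations.
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);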
@@ -2087,7 +2045,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Inspects the given {@link CommandResult} for errors and potentially throws an
* {@link InvalidDataAccessApiUsageException} for that error.
*
*
* @param result must not be {@literal null}.
* @param source must not be {@literal null}.
*/
@@ -2100,8 +2058,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
String error = result.getErrorMessage();
error = error == null ? "NO MESSAGE" : error;
throw new InvalidDataAccessApiUsageException(
"Command execution failed: Error [" + error + "], Command = " + source, ex);
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ source, ex);
}
}
@@ -2125,7 +2083,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Tries to convert the given {@link RuntimeException} into a {@link DataAccessException} but returns the original
* exception if the conversion failed. This allows safe re-throwing of the return value.
*
*
* @param ex the exception to translate
* @param exceptionTranslator the {@link PersistenceExceptionTranslator} to be used for translation
* @return
@@ -2163,7 +2121,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Simple {@link CollectionCallback} that takes a query {@link DBObject} plus an optional fields specification
* {@link DBObject} and executes that against the {@link DBCollection}.
*
*
* @author Oliver Gierke
* @author Thomas Risberg
*/
@@ -2197,7 +2155,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Simple {@link CollectionCallback} that takes a query {@link DBObject} plus an optional fields specification
* {@link DBObject} and executes that against the {@link DBCollection}.
*
*
* @author Oliver Gierke
* @author Thomas Risberg
*/
@@ -2211,7 +2169,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
public FindCallback(DBObject query, DBObject fields) {
this.query = query == null ? new BasicDBObject() : query;
this.query = query;
this.fields = fields;
}
@@ -2228,7 +2186,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Simple {@link CollectionCallback} that takes a query {@link DBObject} plus an optional fields specification
* {@link DBObject} and executes that against the {@link DBCollection}.
*
*
* @author Thomas Risberg
*/
private static class FindAndRemoveCallback implements CollectionCallback<DBObject> {
@@ -2273,12 +2231,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Simple internal callback to allow operations on a {@link DBObject}.
*
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
interface DbObjectCallback<T> {
static interface DbObjectCallback<T> {
T doWith(DBObject object);
}
@@ -2286,7 +2244,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Simple {@link DbObjectCallback} that will transform {@link DBObject} into the given target type using the given
* {@link MongoReader}.
*
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
@@ -2298,9 +2256,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public ReadDbObjectCallback(EntityReader<? super T, DBObject> reader, Class<T> type, String collectionName) {
Assert.notNull(reader, "EntityReader must not be null!");
Assert.notNull(type, "Entity type must not be null!");
Assert.notNull(reader);
Assert.notNull(type);
this.reader = reader;
this.type = type;
this.collectionName = collectionName;
@@ -2320,8 +2277,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
class UnwrapAndReadDbObjectCallback<T> extends ReadDbObjectCallback<T> {
public UnwrapAndReadDbObjectCallback(EntityReader<? super T, DBObject> reader, Class<T> type,
String collectionName) {
public UnwrapAndReadDbObjectCallback(EntityReader<? super T, DBObject> reader, Class<T> type, String collectionName) {
super(reader, type, collectionName);
}
@@ -2348,6 +2304,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
private enum DefaultWriteConcernResolver implements WriteConcernResolver {
INSTANCE;
public WriteConcern resolve(MongoAction action) {
return action.getDefaultWriteConcern();
}
}
class QueryCursorPreparer implements CursorPreparer {
private final Query query;
@@ -2377,49 +2342,23 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBCursor cursorToUse = cursor.copy();
try {
if (query.getSkip() > 0) {
cursorToUse = cursorToUse.skip(query.getSkip());
}
if (query.getLimit() > 0) {
cursorToUse = cursorToUse.limit(query.getLimit());
}
if (query.getSortObject() != null) {
DBObject sortDbo = type != null ? getMappedSortObject(query, type) : query.getSortObject();
cursorToUse = cursorToUse.sort(sortDbo);
}
if (StringUtils.hasText(query.getHint())) {
cursorToUse = cursorToUse.hint(query.getHint());
}
if (query.getMeta().hasValues()) {
for (Entry<String, Object> entry : query.getMeta().values()) {
cursorToUse = cursorToUse.addSpecial(entry.getKey(), entry.getValue());
}
for (Meta.CursorOption option : query.getMeta().getFlags()) {
switch (option) {
case EXHAUST:
cursorToUse = cursorToUse.addOption(Bytes.QUERYOPTION_EXHAUST);
break;
case NO_TIMEOUT:
cursorToUse = cursorToUse.addOption(Bytes.QUERYOPTION_NOTIMEOUT);
break;
case PARTIAL:
cursorToUse = cursorToUse.addOption(Bytes.QUERYOPTION_PARTIAL);
break;
case SLAVE_OK:
cursorToUse = cursorToUse.addOption(Bytes.QUERYOPTION_SLAVEOK);
break;
default:
throw new IllegalArgumentException(String.format("%s is no supported flag.", option));
}
}
}
} catch (RuntimeException e) {
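For orientation, the options translated onto the DBCursor above originate from the Query API; a sketch with assumed field names and a mapped Person class:

import static org.springframework.data.mongodb.core.query.Criteria.where;

Query query = new Query(where("age").gte(18))
        .with(new Sort(Sort.Direction.ASC, "lastname"))  // cursor.sort(..)
        .skip(20)                                        // cursor.skip(20)
        .limit(10)                                       // cursor.limit(10)
        .withHint("age_1");                              // cursor.hint("age_1")

List<Person> page = template.find(query, Person.class);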
@@ -2433,7 +2372,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* {@link DbObjectCallback} that assumes a {@link GeoResult} to be created, delegates actual content unmarshalling to
* a delegate and creates a {@link GeoResult} from the result.
*
*
* @author Oliver Gierke
*/
static class GeoNearResultDbObjectCallback<T> implements DbObjectCallback<GeoResult<T>> {
@@ -2444,13 +2383,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Creates a new {@link GeoNearResultDbObjectCallback} using the given {@link DbObjectCallback} delegate for
* {@link GeoResult} content unmarshalling.
*
*
* @param delegate must not be {@literal null}.
*/
public GeoNearResultDbObjectCallback(DbObjectCallback<T> delegate, Metric metric) {
Assert.notNull(delegate, "DocumentCallback must not be null!");
Assert.notNull(delegate);
this.delegate = delegate;
this.metric = metric;
}
@@ -2468,7 +2405,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* A {@link CloseableIterator} that is backed by a MongoDB {@link Cursor}.
*
*
* @since 1.7
* @author Thomas Darimont
*/
@@ -2480,7 +2417,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Creates a new {@link CloseableIterableCursorAdapter} backed by the given {@link Cursor}.
*
*
* @param cursor
* @param exceptionTranslator
* @param objectReadCallback
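The adapter is what backs the stream(..) API introduced in 1.7; a usage sketch (Person and template are assumptions):

CloseableIterator<Person> iterator = template.stream(new Query(), Person.class);

try {
    while (iterator.hasNext()) {
        Person person = iterator.next();
        // documents are converted lazily while the underlying cursor is consumed
    }
} finally {
    iterator.close(); // closes the backing MongoDB cursor
}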

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,153 +0,0 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* @author Christoph Strobl
* @since 1.10
*/
abstract class AbstractAggregationExpression implements AggregationExpression {
private final Object value;
protected AbstractAggregationExpression(Object value) {
this.value = value;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationExpression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
return toDbObject(this.value, context);
}
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
Object valueToUse;
if (value instanceof List) {
List<Object> arguments = (List<Object>) value;
List<Object> args = new ArrayList<Object>(arguments.size());
for (Object val : arguments) {
args.add(unpack(val, context));
}
valueToUse = args;
} else if (value instanceof java.util.Map) {
DBObject dbo = new BasicDBObject();
for (java.util.Map.Entry<String, Object> entry : ((java.util.Map<String, Object>) value).entrySet()) {
dbo.put(entry.getKey(), unpack(entry.getValue(), context));
}
valueToUse = dbo;
} else {
valueToUse = unpack(value, context);
}
return new BasicDBObject(getMongoMethod(), valueToUse);
}
protected static List<Field> asFields(String... fieldRefs) {
if (ObjectUtils.isEmpty(fieldRefs)) {
return Collections.emptyList();
}
return Fields.fields(fieldRefs).asList();
}
@SuppressWarnings("unchecked")
private Object unpack(Object value, AggregationOperationContext context) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
if (value instanceof Field) {
return context.getReference((Field) value).toString();
}
if (value instanceof List) {
List<Object> sourceList = (List<Object>) value;
List<Object> mappedList = new ArrayList<Object>(sourceList.size());
for (Object item : sourceList) {
mappedList.add(unpack(item, context));
}
return mappedList;
}
return value;
}
protected List<Object> append(Object value) {
if (this.value instanceof List) {
List<Object> clone = new ArrayList<Object>((List) this.value);
if (value instanceof List) {
for (Object val : (List) value) {
clone.add(val);
}
} else {
clone.add(value);
}
return clone;
}
return Arrays.asList(this.value, value);
}
@SuppressWarnings("unchecked")
protected java.util.Map<String, Object> append(String key, Object value) {
if (!(this.value instanceof java.util.Map)) {
throw new IllegalArgumentException("o_O");
}
java.util.Map<String, Object> clone = new LinkedHashMap<String, Object>((java.util.Map<String, Object>) this.value);
clone.put(key, value);
return clone;
}
protected List<Object> values() {
if (value instanceof List) {
return new ArrayList<Object>((List) value);
}
if (value instanceof java.util.Map) {
return new ArrayList<Object>(((java.util.Map) value).values());
}
return new ArrayList<Object>(Collections.singletonList(value));
}
protected abstract String getMongoMethod();
}
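For illustration, a hypothetical subclass (it would have to live in the same package, since the base class is package-private) showing how the constructor value and getMongoMethod() combine into a DBObject such as { "$abs" : "$amount" }:

class Abs extends AbstractAggregationExpression {

    private Abs(Object value) {
        super(value);
    }

    // the field reference is rendered as "$amount" via the AggregationOperationContext
    static Abs absoluteValueOf(String fieldReference) {
        return new Abs(Fields.field(fieldReference));
    }

    @Override
    protected String getMongoMethod() {
        return "$abs";
    }
}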

View File

@@ -1,648 +0,0 @@
/*
* Copyright 2016. the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Collections;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Gateway to {@literal accumulator} aggregation operations.
*
* @author Christoph Strobl
* @since 1.10
* @soundtrack Rage Against The Machine - Killing In The Name
*/
public class AccumulatorOperators {
/**
* Take the numeric value referenced by given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static AccumulatorOperatorFactory valueOf(String fieldReference) {
return new AccumulatorOperatorFactory(fieldReference);
}
/**
* Take the numeric value resulting from the given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static AccumulatorOperatorFactory valueOf(AggregationExpression expression) {
return new AccumulatorOperatorFactory(expression);
}
/**
* @author Christoph Strobl
*/
public static class AccumulatorOperatorFactory {
private final String fieldReference;
private final AggregationExpression expression;
/**
* Creates new {@link AccumulatorOperatorFactory} for given {@literal fieldReference}.
*
* @param fieldReference must not be {@literal null}.
*/
public AccumulatorOperatorFactory(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
this.fieldReference = fieldReference;
this.expression = null;
}
/**
* Creates new {@link AccumulatorOperatorFactory} for given {@link AggregationExpression}.
*
* @param expression must not be {@literal null}.
*/
public AccumulatorOperatorFactory(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
this.fieldReference = null;
this.expression = expression;
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and calculates and
* returns the sum.
*
* @return
*/
public Sum sum() {
return usesFieldRef() ? Sum.sumOf(fieldReference) : Sum.sumOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and returns the
* average value.
*
* @return
*/
public Avg avg() {
return usesFieldRef() ? Avg.avgOf(fieldReference) : Avg.avgOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and returns the
* maximum value.
*
* @return
*/
public Max max() {
return usesFieldRef() ? Max.maxOf(fieldReference) : Max.maxOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and returns the
* minimum value.
*
* @return
*/
public Min min() {
return usesFieldRef() ? Min.minOf(fieldReference) : Min.minOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and calculates the
* population standard deviation of the input values.
*
* @return
*/
public StdDevPop stdDevPop() {
return usesFieldRef() ? StdDevPop.stdDevPopOf(fieldReference) : StdDevPop.stdDevPopOf(expression);
}
/**
* Creates new {@link AggregationExpression} that takes the associated numeric value expression and calculates the
* sample standard deviation of the input values.
*
* @return
*/
public StdDevSamp stdDevSamp() {
return usesFieldRef() ? StdDevSamp.stdDevSampOf(fieldReference) : StdDevSamp.stdDevSampOf(expression);
}
private boolean usesFieldRef() {
return fieldReference != null;
}
}
/**
* {@link AggregationExpression} for {@code $sum}.
*
* @author Christoph Strobl
*/
public static class Sum extends AbstractAggregationExpression {
private Sum(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$sum";
}
/**
* Creates new {@link Sum}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Sum sumOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Sum(asFields(fieldReference));
}
/**
* Creates new {@link Sum}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Sum sumOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Sum(Collections.singletonList(expression));
}
/**
* Creates new {@link Sum} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Sum and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Sum(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Sum} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Sum and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Sum(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $avg}.
*
* @author Christoph Strobl
*/
public static class Avg extends AbstractAggregationExpression {
private Avg(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$avg";
}
/**
* Creates new {@link Avg}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Avg avgOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Avg(asFields(fieldReference));
}
/**
* Creates new {@link Avg}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Avg avgOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Avg(Collections.singletonList(expression));
}
/**
* Creates new {@link Avg} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Avg and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Avg(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Avg} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Avg and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Avg(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $max}.
*
* @author Christoph Strobl
*/
public static class Max extends AbstractAggregationExpression {
private Max(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$max";
}
/**
* Creates new {@link Max}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Max maxOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Max(asFields(fieldReference));
}
/**
* Creates new {@link Max}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Max maxOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Max(Collections.singletonList(expression));
}
/**
* Creates new {@link Max} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Max and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Max(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Max} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Max and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Max(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $min}.
*
* @author Christoph Strobl
*/
public static class Min extends AbstractAggregationExpression {
private Min(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$min";
}
/**
* Creates new {@link Min}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static Min minOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Min(asFields(fieldReference));
}
/**
* Creates new {@link Min}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static Min minOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Min(Collections.singletonList(expression));
}
/**
* Creates new {@link Min} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public Min and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new Min(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link Min} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public Min and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new Min(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $stdDevPop}.
*
* @author Christoph Strobl
*/
public static class StdDevPop extends AbstractAggregationExpression {
private StdDevPop(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$stdDevPop";
}
/**
* Creates new {@link StdDevPop}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static StdDevPop stdDevPopOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevPop(asFields(fieldReference));
}
/**
* Creates new {@link StdDevPop}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static StdDevPop stdDevPopOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevPop(Collections.singletonList(expression));
}
/**
* Creates new {@link StdDevPop} with all previously added arguments appending the given one. <br/>
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public StdDevPop and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevPop(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link StdDevPop} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public StdDevPop and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevPop(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
/**
* {@link AggregationExpression} for {@code $stdDevSamp}.
*
* @author Christoph Strobl
*/
public static class StdDevSamp extends AbstractAggregationExpression {
private StdDevSamp(Object value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$stdDevSamp";
}
/**
* Creates new {@link StdDevSamp}.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public static StdDevSamp stdDevSampOf(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevSamp(asFields(fieldReference));
}
/**
* Creates new {@link StdDevSamp}.
*
* @param expression must not be {@literal null}.
* @return
*/
public static StdDevSamp stdDevSampOf(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevSamp(Collections.singletonList(expression));
}
/**
* Creates new {@link StdDevSamp} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param fieldReference must not be {@literal null}.
* @return
*/
public StdDevSamp and(String fieldReference) {
Assert.notNull(fieldReference, "FieldReference must not be null!");
return new StdDevSamp(append(Fields.field(fieldReference)));
}
/**
* Creates new {@link StdDevSamp} with all previously added arguments appending the given one. <br />
* <strong>NOTE:</strong> Only possible in {@code $project} stage.
*
* @param expression must not be {@literal null}.
* @return
*/
public StdDevSamp and(AggregationExpression expression) {
Assert.notNull(expression, "Expression must not be null!");
return new StdDevSamp(append(expression));
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AccumulatorOperators.AbstractAggregationExpression#toDbObject(java.lang.Object, org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@SuppressWarnings("unchecked")
public DBObject toDbObject(Object value, AggregationOperationContext context) {
if (value instanceof List) {
if (((List) value).size() == 1) {
return super.toDbObject(((List<Object>) value).iterator().next(), context);
}
}
return super.toDbObject(value, context);
}
}
}
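A usage sketch for the factory above, assuming a "students" collection with a numeric "scores" array and a MongoTemplate named template; inside a $project stage the expression renders as { $stdDevPop : "$scores" }:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

Aggregation agg = newAggregation(
        project("name")
                .and(AccumulatorOperators.valueOf("scores").stdDevPop()).as("scoreDeviation"));

AggregationResults<DBObject> results = template.aggregate(agg, "students", DBObject.class);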

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2013-2017 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -23,18 +23,10 @@ import java.util.List;
import org.springframework.data.domain.Sort;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.aggregation.CountOperation.CountOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.FacetOperation.FacetOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import org.springframework.data.mongodb.core.aggregation.GraphLookupOperation.StartWithBuilder;
import org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.ReplaceRootDocumentOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.ReplaceRootOperation.ReplaceRootOperationBuilder;
import org.springframework.data.mongodb.core.aggregation.Fields.*;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
@@ -45,14 +37,10 @@ import com.mongodb.DBObject;
/**
* An {@code Aggregation} is a representation of a list of aggregation steps to be performed by the MongoDB Aggregation
* Framework.
*
*
* @author Tobias Trelle
* @author Thomas Darimont
* @author Oliver Gierke
* @author Mark Paluch
* @author Alessio Fachechi
* @author Christoph Strobl
* @author Nikolay Bogdanov
* @since 1.3
*/
public class Aggregation {
@@ -69,7 +57,7 @@ public class Aggregation {
*/
public static final String CURRENT = SystemVariable.CURRENT.toString();
public static final AggregationOperationContext DEFAULT_CONTEXT = AggregationOperationRenderer.DEFAULT_CONTEXT;
public static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
public static final AggregationOptions DEFAULT_OPTIONS = newAggregationOptions().build();
protected final List<AggregationOperation> operations;
@@ -77,7 +65,7 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param operations must not be {@literal null} or empty.
*/
public static Aggregation newAggregation(List<? extends AggregationOperation> operations) {
@@ -86,7 +74,7 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param operations must not be {@literal null} or empty.
*/
public static Aggregation newAggregation(AggregationOperation... operations) {
@@ -96,7 +84,7 @@ public class Aggregation {
/**
* Returns a copy of this {@link Aggregation} with the given {@link AggregationOptions} set. Note that options are
* supported in MongoDB version 2.6+.
*
*
* @param options must not be {@literal null}.
* @return
* @since 1.6
@@ -109,7 +97,7 @@ public class Aggregation {
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
*
* @param type must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
@@ -119,7 +107,7 @@ public class Aggregation {
/**
* Creates a new {@link TypedAggregation} for the given type and {@link AggregationOperation}s.
*
*
* @param type must not be {@literal null}.
* @param operations must not be {@literal null} or empty.
*/
@@ -129,7 +117,7 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(AggregationOperation... aggregationOperations) {
@@ -149,7 +137,7 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param aggregationOperations must not be {@literal null} or empty.
*/
protected Aggregation(List<AggregationOperation> aggregationOperations) {
@@ -158,7 +146,7 @@ public class Aggregation {
/**
* Creates a new {@link Aggregation} from the given {@link AggregationOperation}s.
*
*
* @param aggregationOperations must not be {@literal null} or empty.
* @param options must not be {@literal null} or empty.
*/
@@ -168,24 +156,13 @@ public class Aggregation {
Assert.isTrue(!aggregationOperations.isEmpty(), "At least one AggregationOperation has to be provided");
Assert.notNull(options, "AggregationOptions must not be null!");
// check $out is the last operation if it exists
for (AggregationOperation aggregationOperation : aggregationOperations) {
if (aggregationOperation instanceof OutOperation && !isLast(aggregationOperation, aggregationOperations)) {
throw new IllegalArgumentException("The $out operator must be the last stage in the pipeline.");
}
}
this.operations = aggregationOperations;
this.options = options;
}
private boolean isLast(AggregationOperation aggregationOperation, List<AggregationOperation> aggregationOperations) {
return aggregationOperations.indexOf(aggregationOperation) == aggregationOperations.size() - 1;
}
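A pipeline that satisfies the check above, with $out as the final stage (collection and field names are assumptions):

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

Aggregation agg = newAggregation(
        match(Criteria.where("status").is("ACTIVE")),
        group("region").count().as("activeUsers"),
        out("activeUsersByRegion")); // moving this stage earlier triggers the IllegalArgumentException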
/**
* A pointer to the previous {@link AggregationOperation}.
*
*
* @return
*/
public static String previousOperation() {
@@ -194,7 +171,7 @@ public class Aggregation {
/**
* Creates a new {@link ProjectionOperation} including the given fields.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -203,8 +180,8 @@ public class Aggregation {
}
/**
* Creates a new {@link ProjectionOperation} including the given {@link Fields}.
*
* Creates a new {@link ProjectionOperation} includeing the given {@link Fields}.
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -214,95 +191,17 @@ public class Aggregation {
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name.
*
* @param field must not be {@literal null} or empty.
*
* @param fieldName must not be {@literal null} or empty.
* @return
*/
public static UnwindOperation unwind(String field) {
return new UnwindOperation(field(field));
}
/**
* Factory method to create a new {@link ReplaceRootOperation} for the field with the given name.
*
* @param fieldName must not be {@literal null} or empty.
* @return
* @since 1.10
*/
public static ReplaceRootOperation replaceRoot(String fieldName) {
return ReplaceRootOperation.builder().withValueOf(fieldName);
}
/**
* Factory method to create a new {@link ReplaceRootOperation} for the field with the given
* {@link AggregationExpression}.
*
* @param aggregationExpression must not be {@literal null}.
* @return
* @since 1.10
*/
public static ReplaceRootOperation replaceRoot(AggregationExpression aggregationExpression) {
return ReplaceRootOperation.builder().withValueOf(aggregationExpression);
}
/**
* Factory method to create a new {@link ReplaceRootDocumentOperationBuilder} to configure a
* {@link ReplaceRootOperation}.
*
* @return the {@literal ReplaceRootDocumentOperationBuilder}.
* @since 1.10
*/
public static ReplaceRootOperationBuilder replaceRoot() {
return ReplaceRootOperation.builder();
}
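A sketch of the common use case for these variants: promoting an embedded document to the root of each result (the "address" field is an assumption):

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

Aggregation agg = newAggregation(
        match(Criteria.where("address").exists(true)), // skip documents without the embedded field
        replaceRoot("address"));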
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name and
* {@code preserveNullAndEmptyArrays}. Note that extended unwind is supported in MongoDB version 3.2+.
*
* @param field must not be {@literal null} or empty.
* @param preserveNullAndEmptyArrays {@literal true} to output the document if path is {@literal null}, missing or
* array is empty.
* @return new {@link UnwindOperation}
* @since 1.10
*/
public static UnwindOperation unwind(String field, boolean preserveNullAndEmptyArrays) {
return new UnwindOperation(field(field), preserveNullAndEmptyArrays);
}
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name including the name of a
* new field to hold the array index of the element as {@code arrayIndex}. Note that extended unwind is supported in
* MongoDB version 3.2+.
*
* @param field must not be {@literal null} or empty.
* @param arrayIndex must not be {@literal null} or empty.
* @return new {@link UnwindOperation}
* @since 1.10
*/
public static UnwindOperation unwind(String field, String arrayIndex) {
return new UnwindOperation(field(field), field(arrayIndex), false);
}
/**
* Factory method to create a new {@link UnwindOperation} for the field with the given name, including the name of a
* new field to hold the array index of the element as {@code arrayIndex}, using {@code preserveNullAndEmptyArrays}.
* Note that extended unwind is supported in MongoDB version 3.2+.
*
* @param field must not be {@literal null} or empty.
* @param arrayIndex must not be {@literal null} or empty.
* @param preserveNullAndEmptyArrays {@literal true} to output the document if path is {@literal null}, missing or
* array is empty.
* @return new {@link UnwindOperation}
* @since 1.10
*/
public static UnwindOperation unwind(String field, String arrayIndex, boolean preserveNullAndEmptyArrays) {
return new UnwindOperation(field(field), field(arrayIndex), preserveNullAndEmptyArrays);
}
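A sketch of the extended $unwind variant (MongoDB 3.2+), with assumed field names:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

// keeps documents whose "items" array is null, missing or empty and exposes the
// element position in a new "itemIndex" field
Aggregation agg = newAggregation(unwind("items", "itemIndex", true));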
/**
* Creates a new {@link GroupOperation} for the given fields.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -312,7 +211,7 @@ public class Aggregation {
/**
* Creates a new {@link GroupOperation} for the given {@link Fields}.
*
*
* @param fields must not be {@literal null}.
* @return
*/
@@ -320,21 +219,9 @@ public class Aggregation {
return new GroupOperation(fields);
}
/**
* Creates a new {@link GraphLookupOperation.GraphLookupOperationFromBuilder} to construct a
* {@link GraphLookupOperation} given {@literal fromCollection}.
*
* @param fromCollection must not be {@literal null} or empty.
* @return
* @since 1.10
*/
public static StartWithBuilder graphLookup(String fromCollection) {
return GraphLookupOperation.builder().from(fromCollection);
}
/**
* Factory method to create a new {@link SortOperation} for the given {@link Sort}.
*
*
* @param sort must not be {@literal null}.
* @return
*/
@@ -344,7 +231,7 @@ public class Aggregation {
/**
* Factory method to create a new {@link SortOperation} for the given sort {@link Direction} and {@code fields}.
*
*
* @param direction must not be {@literal null}.
* @param fields must not be {@literal null}.
* @return
@@ -355,7 +242,7 @@ public class Aggregation {
/**
* Creates a new {@link SkipOperation} skipping the given number of elements.
*
*
* @param elementsToSkip must not be less than zero.
* @return
* @deprecated prepare to get this one removed in favor of {@link #skip(long)}.
@@ -376,7 +263,7 @@ public class Aggregation {
/**
* Creates a new {@link LimitOperation} limiting the result to the given number of elements.
*
*
* @param maxElements must not be less than zero.
* @return
*/
@@ -386,7 +273,7 @@ public class Aggregation {
/**
* Creates a new {@link MatchOperation} using the given {@link Criteria}.
*
*
* @param criteria must not be {@literal null}.
* @return
*/
@@ -394,142 +281,12 @@ public class Aggregation {
return new MatchOperation(criteria);
}
/**
* Creates a new {@link MatchOperation} using the given {@link CriteriaDefinition}.
*
* @param criteria must not be {@literal null}.
* @return
* @since 1.10
*/
public static MatchOperation match(CriteriaDefinition criteria) {
return new MatchOperation(criteria);
}
/**
* Creates a new {@link OutOperation} using the given collection name. This operation must be the last operation in
* the pipeline.
*
* @param outCollectionName collection name to export aggregation results. The {@link OutOperation} creates a new
* collection in the current database if one does not already exist. The collection is not visible until the
* aggregation completes. If the aggregation fails, MongoDB does not create the collection. Must not be
* {@literal null}.
* @return
*/
public static OutOperation out(String outCollectionName) {
return new OutOperation(outCollectionName);
}
/**
* Creates a new {@link BucketOperation} given {@literal groupByField}.
*
* @param groupByField must not be {@literal null} or empty.
* @return
* @since 1.10
*/
public static BucketOperation bucket(String groupByField) {
return new BucketOperation(field(groupByField));
}
/**
* Creates a new {@link BucketOperation} given {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
* @return
* @since 1.10
*/
public static BucketOperation bucket(AggregationExpression groupByExpression) {
return new BucketOperation(groupByExpression);
}
/**
* Creates a new {@link BucketAutoOperation} given {@literal groupByField}.
*
* @param groupByField must not be {@literal null} or empty.
* @param buckets number of buckets, must be a positive integer.
* @return
* @since 1.10
*/
public static BucketAutoOperation bucketAuto(String groupByField, int buckets) {
return new BucketAutoOperation(field(groupByField), buckets);
}
/**
* Creates a new {@link BucketAutoOperation} given {@link AggregationExpression group-by expression}.
*
* @param groupByExpression must not be {@literal null}.
* @param buckets number of buckets, must be a positive integer.
* @return
* @since 1.10
*/
public static BucketAutoOperation bucketAuto(AggregationExpression groupByExpression, int buckets) {
return new BucketAutoOperation(groupByExpression, buckets);
}
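A sketch of both grouping styles over an assumed numeric "price" field: explicit boundaries with $bucket, and an automatic five-way split with $bucketAuto:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

Aggregation byBoundaries = newAggregation(
        bucket("price")
                .withBoundaries(0, 150, 300)
                .withDefaultBucket("Other")
                .andOutputCount().as("count"));

Aggregation byAutoBuckets = newAggregation(bucketAuto("price", 5));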
/**
* Creates a new {@link FacetOperation}.
*
* @return
* @since 1.10
*/
public static FacetOperation facet() {
return FacetOperation.EMPTY;
}
/**
* Creates a new {@link FacetOperationBuilder} given {@link Aggregation}.
*
* @param aggregationOperations the sub-pipeline, must not be {@literal null}.
* @return
* @since 1.10
*/
public static FacetOperationBuilder facet(AggregationOperation... aggregationOperations) {
return facet().and(aggregationOperations);
}
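A sketch of a two-facet pipeline over an assumed "artwork" collection; each facet runs its own sub-pipeline and is exposed under its own output field:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

Aggregation agg = newAggregation(
        facet(match(Criteria.where("price").exists(true)), bucketAuto("price", 5)).as("byPrice")
                .and(sort(Sort.Direction.ASC, "year"), limit(5)).as("oldestFive"));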
/**
* Creates a new {@link LookupOperation}.
*
* @param from must not be {@literal null}.
* @param localField must not be {@literal null}.
* @param foreignField must not be {@literal null}.
* @param as must not be {@literal null}.
* @return never {@literal null}.
* @since 1.9
*/
public static LookupOperation lookup(String from, String localField, String foreignField, String as) {
return lookup(field(from), field(localField), field(foreignField), field(as));
}
/**
* Creates a new {@link LookupOperation} for the given {@link Fields}.
*
* @param from must not be {@literal null}.
* @param localField must not be {@literal null}.
* @param foreignField must not be {@literal null}.
* @param as must not be {@literal null}.
* @return never {@literal null}.
* @since 1.9
*/
public static LookupOperation lookup(Field from, Field localField, Field foreignField, Field as) {
return new LookupOperation(from, localField, foreignField, as);
}
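A sketch of the $lookup join (collection and field names are assumptions): matching "orders" documents are attached to each customer under "customerOrders":

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

Aggregation agg = newAggregation(
        lookup("orders", "_id", "customerId", "customerOrders"));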
/**
* Creates a new {@link CountOperationBuilder}.
*
* @return never {@literal null}.
* @since 1.10
*/
public static CountOperationBuilder count() {
return new CountOperationBuilder();
}
/**
* Creates a new {@link Fields} instance for the given field names.
*
*
* @see Fields#fields(String...)
* @param fields must not be {@literal null}.
* @return
* @see Fields#fields(String...)
*/
public static Fields fields(String... fields) {
return Fields.fields(fields);
@@ -537,7 +294,7 @@ public class Aggregation {
/**
* Creates a new {@link Fields} instance from the given field name and target reference.
*
*
* @param name must not be {@literal null} or empty.
* @param target must not be {@literal null} or empty.
* @return
@@ -549,7 +306,7 @@ public class Aggregation {
/**
* Creates a new {@link GeoNearOperation} instance from the given {@link NearQuery} and the {@code distanceField}. The
* {@code distanceField} defines the output field that contains the calculated distance.
*
*
* @param query must not be {@literal null}.
* @param distanceField must not be {@literal null} or empty.
* @return
@@ -561,7 +318,7 @@ public class Aggregation {
/**
* Returns a new {@link AggregationOptions.Builder}.
*
*
* @return
* @since 1.6
*/
@@ -571,13 +328,24 @@ public class Aggregation {
/**
* Converts this {@link Aggregation} specification to a {@link DBObject}.
*
*
* @param inputCollectionName the name of the input collection
* @return the {@code DBObject} representing this aggregation
*/
public DBObject toDbObject(String inputCollectionName, AggregationOperationContext rootContext) {
List<DBObject> operationDocuments = AggregationOperationRenderer.toDBObject(operations, rootContext);
AggregationOperationContext context = rootContext;
List<DBObject> operationDocuments = new ArrayList<DBObject>(operations.size());
for (AggregationOperation operation : operations) {
operationDocuments.add(operation.toDBObject(context));
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), rootContext);
}
}
DBObject command = new BasicDBObject("aggregate", inputCollectionName);
command.put("pipeline", operationDocuments);
@@ -593,14 +361,50 @@ public class Aggregation {
*/
@Override
public String toString() {
return SerializationUtils.serializeToJsonSafely(toDbObject("__collection__", DEFAULT_CONTEXT));
return SerializationUtils
.serializeToJsonSafely(toDbObject("__collection__", new NoOpAggregationOperationContext()));
}
/**
* Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
*
* @author Oliver Gierke
*/
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
return new FieldReference(new ExposedField(field, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return new FieldReference(new ExposedField(new AggregationField(name), true));
}
}
/**
* Describes the system variables available in MongoDB aggregation framework pipeline expressions.
*
*
* @author Thomas Darimont
* @see <a href="https://docs.mongodb.org/manual/reference/aggregation-variables">Aggregation Variables</a>
* @see http://docs.mongodb.org/manual/reference/aggregation-variables
*/
enum SystemVariable {
@@ -611,7 +415,7 @@ public class Aggregation {
/**
* Return {@literal true} if the given {@code fieldRef} denotes a well-known system variable, {@literal false}
* otherwise.
*
*
* @param fieldRef may be {@literal null}.
* @return
*/
@@ -633,7 +437,7 @@ public class Aggregation {
return false;
}
/*
/*
* (non-Javadoc)
* @see java.lang.Enum#toString()
*/

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,11 +1,11 @@
/*
* Copyright 2015-2016 the original author or authors.
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -29,14 +29,11 @@ import com.mongodb.DBObject;
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
* @since 1.7
* @deprecated since 1.10. Please use {@link ArithmeticOperators} and {@link ComparisonOperators} instead.
* @since 1.10
*/
@Deprecated
public enum AggregationFunctionExpressions {
SIZE, CMP, EQ, GT, GTE, LT, LTE, NE, SUBTRACT, ADD, MULTIPLY;
SIZE;
/**
* Returns an {@link AggregationExpression} built from the current {@link Enum} name and the given parameters.
@@ -55,7 +52,7 @@ public enum AggregationFunctionExpressions {
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.7
* @since 1.10
*/
static class FunctionExpression implements AggregationExpression {
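
On the AggregationFunctionExpressions hunks above: one side of this compare carries the @deprecated note pointing to ArithmeticOperators and ComparisonOperators, but SIZE remains the usual example for the of(..) factory described in the Javadoc (an AggregationExpression built from the enum name and the given parameters). A minimal sketch, not part of this changeset; the field names "title" and "comments" are invented:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
import static org.springframework.data.mongodb.core.aggregation.Fields.field;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationFunctionExpressions;

class SizeExpressionSketch {

    public static void main(String[] args) {

        // Projects the number of elements of the "comments" array into "commentCount";
        // SIZE.of(field("comments")) renders to { "$size" : [ "$comments" ] }.
        Aggregation aggregation = newAggregation(
                project("title")
                        .and(AggregationFunctionExpressions.SIZE.of(field("comments")))
                        .as("commentCount"));

        System.out.println(aggregation);
    }
}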

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,

View File

@@ -1,109 +0,0 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.List;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.DirectFieldReference;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.data.mongodb.core.aggregation.Fields.AggregationField;
import org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation.InheritsFieldsAggregationOperation;
import com.mongodb.DBObject;
/**
* Rendering support for {@link AggregationOperation} into a {@link List} of {@link com.mongodb.DBObject}.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 1.10
*/
class AggregationOperationRenderer {
static final AggregationOperationContext DEFAULT_CONTEXT = new NoOpAggregationOperationContext();
/**
* Render a {@link List} of {@link AggregationOperation} against the given {@link AggregationOperationContext} into
* their {@link DBObject} representation.
*
* @param operations must not be {@literal null}.
* @param rootContext must not be {@literal null}.
* @return the {@link List} of {@link DBObject}.
*/
static List<DBObject> toDBObject(List<AggregationOperation> operations, AggregationOperationContext rootContext) {
List<DBObject> operationDocuments = new ArrayList<DBObject>(operations.size());
AggregationOperationContext contextToUse = rootContext;
for (AggregationOperation operation : operations) {
operationDocuments.add(operation.toDBObject(contextToUse));
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
ExposedFields fields = exposedFieldsOperation.getFields();
if (operation instanceof InheritsFieldsAggregationOperation) {
contextToUse = new InheritingExposedFieldsAggregationOperationContext(fields, contextToUse);
} else {
contextToUse = fields.exposesNoFields() ? DEFAULT_CONTEXT
: new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields(), contextToUse);
}
}
}
return operationDocuments;
}
/**
* Simple {@link AggregationOperationContext} that just returns {@link FieldReference}s as is.
*
* @author Oliver Gierke
*/
private static class NoOpAggregationOperationContext implements AggregationOperationContext {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getMappedObject(com.mongodb.DBObject)
*/
@Override
public DBObject getMappedObject(DBObject dbObject) {
return dbObject;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
*/
@Override
public FieldReference getReference(Field field) {
return new DirectFieldReference(new ExposedField(field, true));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(java.lang.String)
*/
@Override
public FieldReference getReference(String name) {
return new DirectFieldReference(new ExposedField(new AggregationField(name), true));
}
}
}
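
The AggregationOperationRenderer shown above (present on only one side of this compare) centralizes the context hand-over between stages: an InheritsFieldsAggregationOperation keeps the previous context's fields, other field-exposing stages replace the context, and stages exposing nothing fall back to the no-op default. A sketch of how the renderer is driven, not part of this changeset — the class and its DEFAULT_CONTEXT are package-private, so this only compiles inside org.springframework.data.mongodb.core.aggregation, and all field names are invented:

package org.springframework.data.mongodb.core.aggregation;

import java.util.Arrays;
import java.util.List;

import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.query.Criteria;

import com.mongodb.DBObject;

class AggregationOperationRendererSketch {

    public static void main(String[] args) {

        // $group exposes "total" (plus the id), so the renderer switches contexts after it;
        // the trailing $sort is then resolved against those exposed fields, not the root context.
        List<AggregationOperation> operations = Arrays.<AggregationOperation> asList(
                Aggregation.match(Criteria.where("status").is("A")),
                Aggregation.group("customerId").sum("amount").as("total"),
                Aggregation.sort(Sort.Direction.DESC, "total"));

        List<DBObject> pipeline = AggregationOperationRenderer.toDBObject(operations,
                AggregationOperationRenderer.DEFAULT_CONTEXT);

        System.out.println(pipeline);
    }
}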

View File

@@ -5,7 +5,7 @@
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -21,7 +21,7 @@ import com.mongodb.DBObject;
/**
* Holds a set of configurable aggregation options that can be used within an aggregation pipeline. A list of supported
* aggregation options can be found in the MongoDB reference documentation
* https://docs.mongodb.org/manual/reference/command/aggregate/#aggregate
* http://docs.mongodb.org/manual/reference/command/aggregate/#aggregate
*
* @author Thomas Darimont
* @author Oliver Gierke
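
For AggregationOptions, whose Javadoc appears in the hunk above, the options are attached to a pipeline through the builder. A minimal sketch, not part of this changeset; the builder methods shown (allowDiskUse, explain) are the 1.x ones, and the criteria values are invented:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.query.Criteria;

class AggregationOptionsSketch {

    public static void main(String[] args) {

        // allowDiskUse lets the server spill large sort/group stages to disk instead of failing.
        AggregationOptions options = new AggregationOptions.Builder()
                .allowDiskUse(true)
                .explain(false)
                .build();

        Aggregation aggregation = newAggregation(match(Criteria.where("status").is("A")))
                .withOptions(options);

        System.out.println(aggregation);
    }
}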

Some files were not shown because too many files have changed in this diff.