Compare commits

...

78 Commits

Author SHA1 Message Date
Spring Buildmaster
1cbce6f093 DATAMONGO-1494 - Release version 1.8.5.RELEASE. 2016-09-20 14:01:57 +00:00
Oliver Gierke
2b145e252f DATAMONGO-1494 - Prepare 1.8.5 (Gosling SR5). 2016-09-20 15:29:20 +02:00
Oliver Gierke
6532e0d804 DATAMONGO-1494 - Updated changelog. 2016-09-20 15:29:11 +02:00
Oliver Gierke
d348878f5b DATAMONGO-1450 - Updated changelog. 2016-09-20 11:50:44 +02:00
Oliver Gierke
6b6d2aa525 DATAMONGO-1445 - Prevent eager conversion of query parameters.
We now remove all eager query parameter conversions during query creation to be sure that the query mapping later on sees all the property names and is able to derive proper conversions.
2016-09-20 09:22:35 +02:00
Christoph Strobl
b0389845b3 DATAMONGO-1492 - Make o.s.d.m.core.aggregation.AggregationExpression public.
By making `AggregationExpression` public we allow adding custom expressions without workarounds. It is now possible to create e.g. a `ProjectionOperation` like:

ProjectionOperation agg = Aggregation.project()
      .and(new AggregationExpression() {

        @Override
        public DBObject toDbObject(AggregationOperationContext context) {

          DBObject filterExpression = new BasicDBObject();
          filterExpression.put("input", "$x");
          filterExpression.put("as", "y");
          filterExpression.put("cond", new BasicDBObject("$eq", Arrays.<Object> asList("$$y.z", 2)));

          return new BasicDBObject("$filter", filterExpression);
        }
      }).as("profile");

Original pull request: #392.
2016-09-19 16:13:50 +02:00
Christoph Strobl
e7220a0f12 DATAMONGO-1485 - Consider potential custom conversion for enums in Querydsl paths.
We now take potentially registered converters for enums into account when serializing path expressions via SpringDataMongodbSerializer.

Original pull request: #388.
2016-09-19 07:01:37 +02:00
Mark Paluch
f7ab448786 DATAMONGO-1406 - Propagate PersistentEntity when mapping query criteria for nested keywords.
We now propagate the PersistentEntity when mapping nested keywords so that the criteria mapping chain for nested keywords and properties now has access to the PersistentEntity and can use the configured field names.

Previously, the plain property names were used as field names and potential customizations via @Field were ignored.
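
A minimal sketch of the effect (domain type and values are made up for illustration):

@Document
class Customer {

  @Field("fn")
  String firstname;
}

// criteria nested inside the $or keyword are now mapped to the configured
// field name, i.e. { "$or" : [ { "fn" : "Dave" }, { "fn" : "Carter" } ] }
Query query = Query.query(new Criteria().orOperator(
    Criteria.where("firstname").is("Dave"),
    Criteria.where("firstname").is("Carter")));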

Original Pull Request: #384
2016-08-25 13:37:22 +02:00
Mark Paluch
2026097be8 DATAMONGO-1465 - Polishing.
Replace the boolean flag in convertAndJoinScriptArgs with a literal. Joined args are rendered to JavaScript and always require string quotation.

Original pull request: #383.
2016-08-23 15:05:56 +02:00
Christoph Strobl
2006804133 DATAMONGO-1465 - Fix String quotation in DefaultScriptOperations.execute().
This change prevents Strings from being quoted prior to sending them as args of a script.
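
A minimal usage sketch (the script code is made up for illustration):

ScriptOperations scriptOps = mongoTemplate.scriptOps();

// the String argument is now handed to the script as-is instead of being wrapped in quotes first
Object result = scriptOps.execute(new ExecutableMongoScript("function(x) { return x; }"), "spring-data");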

Original pull request: #383.
2016-08-23 15:05:56 +02:00
Oliver Gierke
be549b3d23 DATAMONGO-1471 - Converter only applies identifier values if actually available.
Setting the value for the identifier property is an explicit step in MappingMongoConverter and is always executed if the type to be created has an identifier property. If the source document doesn't contain an _id field (e.g. because it has been excluded explicitly), this previously caused null to be set on the identifier, which in turn caused an exception if the identifier property is a primitive type.

We now explicitly check whether the field backing the identifier property is actually present in the source document and only explicitly set the value if so.
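
A minimal sketch of the scenario (type and property names are made up for illustration):

class Sample {

  @Id long id; // primitive identifier
  String name;
}

Query query = Query.query(Criteria.where("name").is("sample"));
query.fields().exclude("_id"); // _id is not part of the returned document

// previously failed trying to set null on the primitive id, now the id is simply left untouched
Sample result = mongoTemplate.findOne(query, Sample.class);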
2016-08-17 16:54:50 +02:00
Oliver Gierke
16bb07d8ae DATAMONGO-1409 - Updated changelog. 2016-07-28 08:58:24 +02:00
Oliver Gierke
c17c3c60a6 DATAMONGO-1410 - Updated changelog. 2016-07-28 08:58:16 +02:00
Christoph Strobl
d4dba035dc DATAMONGO-1453 - Fix GeoJson conversion when coordinates are Integers.
We now use Number instead of Double for reading "coordinates" from GeoJSON representations.

Original pull request: #369.
2016-06-24 14:05:44 +02:00
Kevin Dosey
bd41123535 DATAMONGO-1449 - Switched to foreach loop in collection handling of MappingMongoConverter.
This should result in a minor to moderate performance improvement for iteration over Collections/Arrays during DBObject-to-object mapping.

Original pull request: #368.
2016-06-11 18:34:01 +02:00
Oliver Gierke
185b85b9dd DATAMONGO-1423 - Polishing.
Original pull request: #365.
2016-05-25 17:31:44 +02:00
Christoph Strobl
2d4082af02 DATAMONGO-1423 - Map keys now get registered conversions applied for Updates.
We now pipe map keys through the potentially registered conversions when mapping Updates.
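
A minimal sketch, assuming a hypothetical CurrencyUnit key type with a registered Converter<CurrencyUnit, String> (Product and query are placeholders as well):

Map<CurrencyUnit, BigDecimal> prices = Collections.singletonMap(CurrencyUnit.of("EUR"), BigDecimal.ONE);

// the CurrencyUnit keys are now piped through the registered converter when the
// update is mapped instead of being rendered via a plain toString()
mongoTemplate.updateFirst(query, new Update().set("prices", prices), Product.class);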

Original pull request: #365.
2016-05-25 17:31:42 +02:00
Mark Paluch
b2cd7bbfd6 DATAMONGO-1425 - Polishing.
Add NotContaining to documentation. Add integration test for Containing/NotContaining on collection properties.

Original pull request: #363.
2016-05-09 11:12:22 +02:00
Christoph Strobl
0aff94b5ce DATAMONGO-1425 - Fix query derivation for notContaining on String properties.
We now correctly build up the criteria for derived queries using notContaining keyword on String properties.
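
A minimal sketch of a derived query using the keyword (repository and domain type are made up for illustration):

interface PersonRepository extends MongoRepository<Person, String> {

  // now builds a properly negated regex criteria on the String property
  List<Person> findByFirstnameNotContaining(String fragment);
}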

Original pull request: #363.
2016-05-09 11:12:17 +02:00
Mark Paluch
87657aaec8 DATAMONGO-1412 - Fix backticks and code element highlighting.
Fixed broken highlighting using backticks followed by chars/single quotes. Convert single quote emphasis of id to backtick code fences. Add missing spaces between words and backticks.

Original Pull Request: #359
2016-05-02 14:20:21 +02:00
Mark Paluch
034c90fb97 DATAMONGO-1412 - Document mapping rules for Java types to MongoDB representation.
Original Pull Request: #359
Related pull request: #353
Related ticket: DATAMONGO-1404
2016-05-02 14:20:15 +02:00
Oliver Gierke
628e89cc78 DATAMONGO-1408 - Updated changelog. 2016-04-06 23:14:26 +02:00
Oliver Gierke
cb297d0890 DATAMONGO-1405 - Updated changelog. 2016-04-06 18:44:37 +02:00
Christoph Strobl
e4eaf6f2d3 DATAMONGO-1401 - Fix error when updating entity with both GeoJsonPoint and Version property.
We now ignore property reference exceptions when resolving field values that have already been mapped, e.g. in case of an already mapped update extracted from an actual domain type instance.
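
A minimal sketch of the affected scenario (domain type is made up for illustration):

@Document
class Venue {

  @Id String id;

  @Version Long version;

  GeoJsonPoint location;
}

// saving an existing, versioned instance internally produces an already mapped update;
// resolving its field values no longer fails with a property reference exception
venue.location = new GeoJsonPoint(6.116, 51.492);
mongoTemplate.save(venue);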

Original pull request: #351.
2016-03-31 09:35:12 +02:00
Christoph Strobl
6e453864fd DATAMONGO-1387 - Polishing.
Added a few more tests and append values if present on Query.

Original Pull Request: #345
2016-03-18 19:26:36 +01:00
John Willemin
f36baf2e37 DATAMONGO-1387 - Fix BasicQuery getFieldsObject() inconsistency.
We changed BasicQuery to consider its parent getFieldsObject() when not given an explicit fields DBObject.
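
A minimal sketch of the changed behavior:

BasicQuery query = new BasicQuery("{ lastname : 'Matthews' }");
query.fields().include("firstname");

// now returns { "firstname" : 1 } instead of null, as the field spec registered via
// fields() is considered when no explicit fields DBObject was handed to the constructor
DBObject fields = query.getFieldsObject();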

Original Pull Request: #345
CLA: 165520160303021604 (John Willemin)
2016-03-18 19:26:24 +01:00
Oliver Gierke
e5a29ab1b5 DATAMONGO-1392 - Updated changelog. 2016-03-18 13:19:25 +01:00
Oliver Gierke
4e37cfedcc DATAMONGO-1397 - Polishing.
Switched to Slf4J-native placeholder replacement in debug logging for MongoTemplate.

Original pull request: #348.
2016-03-16 17:55:55 +01:00
Mark Paluch
de0feed565 DATAMONGO-1397 - Log command, entity and collection name in MongoTemplate.geoNear(…).
Original pull request: #348.
2016-03-16 17:55:13 +01:00
Oliver Gierke
bb2271e3fa DATAMONGO-1381 - After release cleanups. 2016-02-23 14:18:16 +01:00
Spring Buildmaster
d6aa9a0ea4 DATAMONGO-1381 - Prepare next development iteration. 2016-02-23 04:47:26 -08:00
Spring Buildmaster
1e6632846f DATAMONGO-1381 - Release version 1.8.4 (Gosling SR4). 2016-02-23 04:47:26 -08:00
Oliver Gierke
179b246173 DATAMONGO-1381 - Prepare 1.8.4 (Gosling SR4). 2016-02-23 12:56:55 +01:00
Oliver Gierke
8f29600eb4 DATAMONGO-1381 - Updated changelog. 2016-02-23 12:56:45 +01:00
Oliver Gierke
717c6100eb DATAMONGO-1366 - Updated changelog. 2016-02-12 22:10:04 +01:00
Uxío Fuentefría
38dade7c7a DATAMONGO-1378 - Update reference documentation: Change Query.sort() to Query.with(Sort sort).
sort() is not a method of Query; to sort a query you have to use with().
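
For example:

Query query = new Query(Criteria.where("age").gt(18));

// Query does not expose a sort(…) method; sorting is applied via with(…)
query.with(new Sort(Sort.Direction.DESC, "age"));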

Original pull request: #320.
CLA: 162620160211060822 (Uxío Fuentefría)
2016-02-11 20:24:35 +01:00
Oliver Gierke
4a4c53766e DATAMONGO-1381 - Tweaked version numbers for Gosling SR4. 2016-02-11 16:09:02 +01:00
Mark Paluch
7c5d15edb5 DATAMONGO-1380 - Polishing.
Add credits, use message formatting instead of string concatenation.

Original pull request: #317.
2016-02-11 12:04:41 +01:00
Alex Vengrovsk
8c2af8f0fc DATAMONGO-1380 - Improve logging in MongoChangeSetPersister.
Add a check for debug logging being enabled in the getPersistentId method.

Original pull request: #317.
2016-02-11 12:04:40 +01:00
Timo Kockert
419d0a4f92 DATAMONGO-1270 - Update documentation to reflect deprecation of MongoFactoryBean.
Original pull request: #315.
2016-02-10 15:58:09 +01:00
Thomas Dudouet
ce919e57c6 DATAMONGO-1377 - Update JavaDoc: Use @EnableMongoRepositories instead of @EnableJpaRepositories.
The JavaDoc description references the EnableJpaRepositories annotation instead of the EnableMongoRepositories annotation.

Original pull request: #340.
2016-02-10 14:52:40 +01:00
Oliver Gierke
781de00ab1 DATAMONGO-1376 - Moved away from SimpleTypeInformationMapper.INSTANCE.
Related tickets: DATACMNS-815.
2016-02-09 14:37:23 +01:00
Martin Macko
e06d352a71 DATAMONGO-1375 - Fix typo in MongoOperations JavaDoc.
Original pull request: #343.
2016-02-09 11:30:37 +01:00
Oliver Gierke
294990891c DATAMONGO-1361 - Guard command result statistics evaluation against changes in MongoDB 3.2.
MongoDB 3.2 RC1 decided to remove fields from statistics JSON documents returned in case no result was found for a geo near query. The avgDistance field is unfortunately missing as of that version.

Introduced a value object to encapsulate the mitigation behavior and keep client code unaware of it.
2016-01-21 12:45:21 +01:00
Oliver Gierke
530f7396fa DATAMONGO-1360 - Query instances contained in a Near Query now get mapped during geoNear(…) execution.
A Query instance that might be part of a NearQuery definition is now passed through the QueryMapper to make sure complex types contained in it, or even more general types that have custom conversions registered, are mapped correctly before the near command is actually executed.
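
A minimal sketch (domain type and the value carrying a custom conversion are made up for illustration):

NearQuery nearQuery = NearQuery.near(new Point(-73.99, 40.73), Metrics.MILES)
    .query(Query.query(Criteria.where("openingHours").is(customOpeningHours)));

// the inner query now runs through the QueryMapper before the geoNear command is issued
GeoResults<Restaurant> results = mongoTemplate.geoNear(nearQuery, Restaurant.class);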
2016-01-20 13:12:33 +01:00
Oliver Gierke
79aabfbbde DATAMONGO-1355 - After release cleanups. 2015-12-18 10:26:35 +01:00
Spring Buildmaster
6af3729fb3 DATAMONGO-1355 - Prepare next development iteration. 2015-12-18 00:24:12 -08:00
Spring Buildmaster
437a48ff4a DATAMONGO-1355 - Release version 1.8.2.RELEASE (Gosling SR2). 2015-12-18 00:24:12 -08:00
Oliver Gierke
583339641d DATAMONGO-1355 - Prepare 1.8.2.RELEASE (Gosling SR2). 2015-12-18 08:30:37 +01:00
Oliver Gierke
8af4bef772 DATAMONGO-1355 - Updated changelog. 2015-12-18 08:30:32 +01:00
Christoph Strobl
b0336c27a9 DATAMONGO-1334 - Map-reduce operations now honor MapReduceOptions.limit.
We now also consider the limit set via MapReduceOptions when executing mapReduce operations via MongoTemplate.mapReduce(…).

MapReduceOptions.limit(…) supersedes a potential limit set via the Query itself. This change also allows defining a limit even when no explicit Query is used.
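
A minimal usage sketch (collection name, script resources and result type are made up; limit(…) refers to the option mentioned above):

MapReduceResults<ValueObject> results = mongoTemplate.mapReduce("orders",
    "classpath:map.js", "classpath:reduce.js",
    MapReduceOptions.options().limit(100).outputTypeInline(), ValueObject.class);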

Original pull request: #338.
2015-12-16 11:57:55 +01:00
Christoph Strobl
0b492b6c55 DATAMONGO-1317 - Assert compatibility with mongo-java-driver 3.2.
We now do a defensive check against the actual WObject of WriteConcern to avoid the IllegalStateException raised by the new java-driver in case _w is null or not an Integer. This allows us to run against recent 2.13, 2.14, 3.0, 3.1 and the latest 3.2.0.

Original pull request: #337.
2015-12-16 11:49:08 +01:00
Oliver Gierke
b3116a523b DATAMONGO-1289 - Polishing.
Some additional JavaDoc and comment removal.

Original pull request: #333.
2015-12-16 11:46:12 +01:00
Christoph Strobl
2ba9e5f403 DATAMONGO-1289 - MappingMongoEntityInformation now uses a fallback identifier type derived from the repository declaration.
We now use RepositoryMetadata.getIdType() to provide a fallback identifier type in case the entity information does not hold an id property, which is perfectly valid for MongoDB.

Original pull request: #333.
2015-12-16 11:46:11 +01:00
Oliver Gierke
8b1805a145 DATAMONGO-1346 - Update.pullAll(…) now registers multiple invocations correctly.
Previously calling the method multiple times overrode the result of previous calls. We now use addMultiFieldOperation(…) to make sure already existing values are kept.
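
For example (field names are made up for illustration):

Update update = new Update()
    .pullAll("authors", new Object[] { "Dave" })
    .pullAll("editors", new Object[] { "Oliver" });

// both $pullAll clauses are now retained in the resulting update document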
2015-12-10 15:40:23 +01:00
Oliver Gierke
5832055840 DATAMONGO-1337 - Another round of polishes on SonarQube complaints. 2015-11-26 12:28:54 +01:00
Oliver Gierke
a13e7b8b24 DATAMONGO-1337 - Reverted making some of the loggers static.
The logger instance in AbstractMonitor is supposed to pick up the type of the actual implementation class and thus cannot be static.

Related pull request: #336.
2015-11-26 12:03:25 +01:00
Christian Ivan
a152aa3ce8 DATAMONGO-1337 - General code quality improvements.
A round of code polish regarding the PMD and Squid rules referred to in the ticket.

Original pull request: #336.
2015-11-26 11:53:16 +01:00
Oliver Gierke
ca56ea4aea DATAMONGO-1342 - Fixed potential NullPointerException in MongoQueryCreator.
MongoQueryCreator.nextAsArray(…) now returns a single element object array in case null is handed to the method. It previously failed with a NullPointerException.
2015-11-25 17:23:29 +01:00
Oliver Gierke
284e2f462d DATAMONGO-1335 - DBObjectAccessor now writes all nested fields correctly.
Previously, DBObjectAccessor always reset the intermediate values when traversing nested properties. This caused previously written values to be erased when subsequent values were written. We now reuse an already existing BasicDBObject if present.
2015-11-25 16:07:02 +01:00
Oliver Gierke
257bc891dd DATAMONGO-1287 - Optimizations in reading associations as constructor arguments.
As per the discussion on the ticket, we now omit looking up the value for an association used as a constructor argument, since the simple check whether the currently handled property is a constructor argument is sufficient to potentially skip handling the value.

Related pull requests: #335, #322.
2015-11-23 11:18:23 +01:00
Christoph Strobl
d26db17bf0 DATAMONGO-1287 - Fix double fetching for lazy DbRefs used in entity constructor.
We now check whether properties, which might already have been resolved, are used as constructor arguments before setting the actual value. This prevents turning already eagerly fetched DBRefs back into LazyLoadingProxies.

Original pull request: #335.
Related pull request: #322.
2015-11-20 13:39:06 +01:00
Christoph Strobl
d760d9cc11 DATAMONGO-1290 - Convert byte[] parameter in @Query to $binary representation.
We now convert non-quoted binary parameters to the $binary format. This allows using them along with the @Query annotation.
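
A minimal sketch (repository, domain type and field name are made up for illustration):

interface AttachmentRepository extends MongoRepository<Attachment, String> {

  // the byte[] parameter is now rendered as a $binary value, e.g. { "$binary" : "<base64>", "$type" : "00" }
  @Query("{ 'payload' : ?0 }")
  List<Attachment> findByPayload(byte[] payload);
}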

Original pull request: #332.
2015-11-20 13:06:30 +01:00
Christoph Strobl
be65970710 DATAMONGO-1204 - ObjectPath now uses raw id values to track resolved objects.
We now use the native id within ObjectPath to check whether a DBRef has already been resolved. This is required as the MongoDB Java driver 3 generation changed ObjectId.equals(…), which now performs a type check.

Original pull request: #334.
Related pull request: #288.
2015-11-20 12:50:51 +01:00
Oliver Gierke
418a4f8b8c DATAMONGO-1324 - Register ObjectId converters unconditionally to make sure they really get used.
The presence of ObjectToObjectConverter in a DefaultConversionService caused the guard trying to register converters for ObjectIds in AbstractMongoConverter to not trigger the registration. This in turn caused ObjectId conversions to be executed via reflection instead of the straightforward method calls and thus a drop in performance for such operations.

We now unconditionally register the converters to make sure they really get applied.

Related tickets: SPR-13703.
2015-11-19 12:24:23 +01:00
Oliver Gierke
fa5f93aad5 DATAMONGO-1316 - After release cleanups. 2015-11-19 12:24:09 +01:00
Spring Buildmaster
504e14d4a3 DATAMONGO-1316 - Prepare next development iteration. 2015-11-15 05:56:05 -08:00
Spring Buildmaster
f68effe155 DATAMONGO-1316 - Release version 1.8.1.RELEASE (Gosling SR1). 2015-11-15 05:55:54 -08:00
Oliver Gierke
6fc80f287e DATAMONGO-1316 - Prepare 1.8.1.RELEASE (Gosling SR1). 2015-11-15 14:06:53 +01:00
Oliver Gierke
02aed56fd1 DATAMONGO-1316 - Updated changelog. 2015-11-15 14:06:43 +01:00
Christoph Strobl
f1771504f6 DATAMONGO-1297 - Allow @Indexed annotation on DBRef.
We now also treat references as a source of a potential index. This enforces index creation for objects like:

@Document
class WithDbRef {

  @Indexed
  @DBRef
  ReferencedObject reference;
}

Combining @TextIndexed or @GeoSpatialIndexed with a DBRef will lead to a MappingException.

Original pull request: #329.
2015-11-13 17:55:00 +01:00
Christoph Strobl
741a27edae DATAMONGO-1302 - Allow ConverterFactory to be registered in CustomConversions.
We now allow registration of ConverterFactory within CustomConversions by inspecting the generic type arguments for determining the conversion source and target types.
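
A minimal sketch of registering a ConverterFactory (the factory shown is a simplified example, not the API's own):

@ReadingConverter
class StringToEnumConverterFactory implements ConverterFactory<String, Enum> {

  public <T extends Enum> Converter<String, T> getConverter(final Class<T> targetType) {

    return new Converter<String, T>() {

      @SuppressWarnings({ "unchecked", "rawtypes" })
      public T convert(String source) {
        return (T) Enum.valueOf((Class) targetType, source);
      }
    };
  }
}

// the factory's generic arguments (String -> Enum) are now inspected to determine source and target types
CustomConversions conversions = new CustomConversions(Collections.singletonList(new StringToEnumConverterFactory()));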

Original pull request: #330.
2015-11-10 14:52:42 +01:00
Christoph Strobl
e1869abf3f DATAMONGO-1293 - Polishing.
Move configuration parsing error into method actually responsible for reading uri/client-uri attributes.

Original Pull Request: #328
2015-10-29 12:49:25 +01:00
Viktor Khoroshko
c7be5bfcaa DATAMONGO-1293 - Allowed id attribute in addition to client-uri attribute in MongoDbFactoryParser.
We now allow write-concern and id to be configured along with the uri or client-uri attribute of <mongo:db-factory />.

Original Pull Request: #328
CLA: 140120150929074128 (Viktor Khoroshko)
2015-10-29 12:49:19 +01:00
Oliver Gierke
9968b752e7 DATAMONGO-1276 - Fixed potential NullPointerExceptions in MongoTemplate.
Triggering data access exception translation could lead to NullPointerException in cases where PersistenceExceptionTranslator returned null because the original exception couldn't be translated and the result was directly used from a throw clause.

This is now fixed by consistently using the potentiallyConvertRuntimeException(…) method, which was made static so it can be referred to from nested static classes.

Refactored Scanner usage to actually close the Scanner instance to prevent a resource leak.
2015-10-21 15:10:15 +02:00
Oliver Gierke
e001c6bf89 DATAMONGO-1304 - Updated changelog. 2015-10-14 13:46:05 +02:00
Oliver Gierke
913d383b99 DATAMONGO-1282 - After release cleanups. 2015-09-03 18:34:11 +02:00
Spring Buildmaster
f446d7e29f DATAMONGO-1282 - Prepare next development iteration. 2015-09-03 18:33:40 +02:00
78 changed files with 2398 additions and 404 deletions

View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.RELEASE</version>
<version>1.8.5.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.7.0.RELEASE</version>
<version>1.7.5.RELEASE</version>
</parent>
<modules>
@@ -28,7 +28,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.11.0.RELEASE</springdata.commons>
<springdata.commons>1.11.5.RELEASE</springdata.commons>
<mongo>2.13.0</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
</properties>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.RELEASE</version>
<version>1.8.5.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.8.0.RELEASE</version>
<version>1.8.5.RELEASE</version>
</dependency>
<dependency>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -37,6 +37,8 @@ import com.mongodb.MongoException;
/**
* @author Thomas Risberg
* @author Oliver Gierke
* @author Alex Vengrovsk
* @author Mark Paluch
*/
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
@@ -45,7 +47,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Logger log = LoggerFactory.getLogger(getClass());
private final Logger log = LoggerFactory.getLogger(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
@@ -76,25 +78,25 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for " + dbk);
log.debug("Loading MongoDB data for {}", dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: " + key);
log.debug("Processing key: {}", key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException("Unble to convert property " + key + ": Invalid metadata, "
+ ENTITY_FIELD_CLASS + " not available");
throw new DataIntegrityViolationException(
"Unble to convert property " + key + ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: " + key);
log.debug("Adding to ChangeSet: {}", key);
}
changeSet.set(key, value);
}
@@ -109,9 +111,9 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
* @see org.springframework.data.crossstore.ChangeSetPersister#getPersistentId(org.springframework.data.crossstore.ChangeSetBacked, org.springframework.data.crossstore.ChangeSet)
*/
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (log.isDebugEnabled()) {
log.debug("getPersistentId called on {}", entity);
}
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
@@ -130,7 +132,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: " + cs.getValues());
log.debug("Flush: changeset: {}", cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
@@ -152,7 +154,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: " + dbQuery);
log.debug("Flush: removing: {}", dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
@@ -164,7 +166,7 @@ public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: " + dbQuery);
log.debug("Flush: saving: {}", dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.RELEASE</version>
<version>1.8.5.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.RELEASE</version>
<version>1.8.5.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -160,7 +160,7 @@ public class MongoLog4jAppender extends AppenderSkeleton {
// Copy properties into document
Map<Object, Object> props = event.getProperties();
if (null != props && props.size() > 0) {
if (null != props && !props.isEmpty()) {
BasicDBObject propsDbo = new BasicDBObject();
for (Map.Entry<Object, Object> entry : props.entrySet()) {
propsDbo.put(entry.getKey().toString(), entry.getValue().toString());

View File

@@ -39,7 +39,7 @@ public class MongoLog4jAppenderIntegrationTests {
static final String NAME = MongoLog4jAppenderIntegrationTests.class.getName();
Logger log = Logger.getLogger(NAME);
private static final Logger log = Logger.getLogger(NAME);
Mongo mongo;
DB db;
String collection;

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.8.0.RELEASE</version>
<version>1.8.5.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -18,6 +18,10 @@ package org.springframework.data.mongodb.config;
import static org.springframework.data.config.ParsingUtils.*;
import static org.springframework.data.mongodb.config.MongoParsingUtils.*;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
@@ -44,9 +48,21 @@ import com.mongodb.MongoURI;
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author Viktor Khoroshko
*/
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
private static final Set<String> MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES;
static {
Set<String> mongoUriAllowedAdditionalAttributes = new HashSet<String>();
mongoUriAllowedAdditionalAttributes.add("id");
mongoUriAllowedAdditionalAttributes.add("write-concern");
MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES = Collections.unmodifiableSet(mongoUriAllowedAdditionalAttributes);
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.xml.AbstractBeanDefinitionParser#resolveId(org.w3c.dom.Element, org.springframework.beans.factory.support.AbstractBeanDefinition, org.springframework.beans.factory.xml.ParserContext)
@@ -70,13 +86,10 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
setPropertyValue(dbFactoryBuilder, element, "write-concern", "writeConcern");
BeanDefinition mongoUri = getMongoUri(element);
BeanDefinition mongoUri = getMongoUri(element, parserContext);
if (mongoUri != null) {
if (element.getAttributes().getLength() >= 2 && !element.hasAttribute("write-concern")) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!",
parserContext.extractSource(element));
}
dbFactoryBuilder.addConstructorArgValue(mongoUri);
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
@@ -149,12 +162,15 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
/**
* Creates a {@link BeanDefinition} for a {@link MongoURI} or {@link MongoClientURI} depending on configured
* attributes.
* attributes. <br />
* Errors when configured element contains {@literal uri} or {@literal client-uri} along with other attributes except
* {@literal write-concern} and/or {@literal id}.
*
* @param element must not be {@literal null}.
* @param parserContext
* @return {@literal null} in case no client-/uri defined.
*/
private BeanDefinition getMongoUri(Element element) {
private BeanDefinition getMongoUri(Element element, ParserContext parserContext) {
boolean hasClientUri = element.hasAttribute("client-uri");
@@ -162,6 +178,21 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
return null;
}
int allowedAttributesCount = 1;
for (String attribute : MONGO_URI_ALLOWED_ADDITIONAL_ATTRIBUTES) {
if (element.hasAttribute(attribute)) {
allowedAttributesCount++;
}
}
if (element.getAttributes().getLength() > allowedAttributesCount) {
parserContext.getReaderContext().error(
"Configure either " + (hasClientUri ? "Mongo Client URI" : "Mongo URI") + " or details individually!",
parserContext.extractSource(element));
}
Class<?> type = hasClientUri ? MongoClientURI.class : MongoURI.class;
String uri = hasClientUri ? element.getAttribute("client-uri") : element.getAttribute("uri");

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014-2015 the original author or authors.
* Copyright 2014-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -98,7 +98,7 @@ class DefaultScriptOperations implements ScriptOperations {
@Override
public Object doInDB(DB db) throws MongoException, DataAccessException {
return db.eval(script.getCode(), convertScriptArgs(args));
return db.eval(script.getCode(), convertScriptArgs(false, args));
}
});
}
@@ -155,7 +155,7 @@ class DefaultScriptOperations implements ScriptOperations {
return scriptNames;
}
private Object[] convertScriptArgs(Object... args) {
private Object[] convertScriptArgs(boolean quote, Object... args) {
if (ObjectUtils.isEmpty(args)) {
return args;
@@ -164,15 +164,15 @@ class DefaultScriptOperations implements ScriptOperations {
List<Object> convertedValues = new ArrayList<Object>(args.length);
for (Object arg : args) {
convertedValues.add(arg instanceof String ? String.format("'%s'", arg) : this.mongoOperations.getConverter()
.convertToMongoType(arg));
convertedValues.add(arg instanceof String && quote ? String.format("'%s'", arg)
: this.mongoOperations.getConverter().convertToMongoType(arg));
}
return convertedValues.toArray();
}
private String convertAndJoinScriptArgs(Object... args) {
return ObjectUtils.isEmpty(args) ? "" : StringUtils.arrayToCommaDelimitedString(convertScriptArgs(args));
return ObjectUtils.isEmpty(args) ? "" : StringUtils.arrayToCommaDelimitedString(convertScriptArgs(true, args));
}
/**

View File

@@ -0,0 +1,72 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Value object to mitigate different representations of geo command execution results in MongoDB.
*
* @author Oliver Gierke
* @soundtrack Fruitcake - Jeff Coffin (The Inside of the Outside)
*/
class GeoCommandStatistics {
private static final GeoCommandStatistics NONE = new GeoCommandStatistics(new BasicDBObject());
private final DBObject source;
/**
* Creates a new {@link GeoCommandStatistics} instance with the given source document.
*
* @param source must not be {@literal null}.
*/
private GeoCommandStatistics(DBObject source) {
Assert.notNull(source, "Source document must not be null!");
this.source = source;
}
/**
* Creates a new {@link GeoCommandStatistics} from the given command result extracting the statistics.
*
* @param commandResult must not be {@literal null}.
* @return
*/
public static GeoCommandStatistics from(DBObject commandResult) {
Assert.notNull(commandResult, "Command result must not be null!");
Object stats = commandResult.get("stats");
return stats == null ? NONE : new GeoCommandStatistics((DBObject) stats);
}
/**
* Returns the average distance reported by the command result. Mitigating a removal of the field in case the command
* didn't return any result introduced in MongoDB 3.2 RC1.
*
* @return
* @see https://jira.mongodb.org/browse/SERVER-21024
*/
public double getAverageDistance() {
Object averageDistance = source.get("avgDistance");
return averageDistance == null ? Double.NaN : (Double) averageDistance;
}
}

View File

@@ -190,7 +190,7 @@ public interface MongoOperations {
<T> DBCollection createCollection(Class<T> entityClass);
/**
* Create a collect with a name based on the provided entity class using the options.
* Create a collection with a name based on the provided entity class using the options.
*
* @param entityClass class that determines the collection to create
* @param collectionOptions options to use when creating the collection.
@@ -207,7 +207,7 @@ public interface MongoOperations {
DBCollection createCollection(String collectionName);
/**
* Create a collect with the provided name and options.
* Create a collection with the provided name and options.
*
* @param collectionName name of the collection
* @param collectionOptions options to use when creating the collection.

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2015 the original author or authors.
* Copyright 2010-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -137,6 +137,7 @@ import com.mongodb.util.JSONParseException;
* @author Chuong Ngo
* @author Christoph Strobl
* @author Doménique Tilleuil
* @author Mark Paluch
*/
@SuppressWarnings("deprecation")
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
@@ -341,7 +342,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
ReadDbObjectCallback<T> readCallback = new ReadDbObjectCallback<T>(mongoConverter, entityType, collection
.getName());
return new CloseableIterableCusorAdapter<T>(cursorPreparer.prepare(cursor), exceptionTranslator, readCallback);
return new CloseableIterableCursorAdapter<T>(cursorPreparer.prepare(cursor), exceptionTranslator, readCallback);
}
});
}
@@ -396,13 +397,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
protected void logCommandExecutionError(final DBObject command, CommandResult result) {
String error = result.getErrorMessage();
if (error != null) {
// TODO: DATADOC-204 allow configuration of logging level / throw
// throw new
// InvalidDataAccessApiUsageException("Command execution of " +
// command.toString() + " failed: " + error);
LOGGER.warn("Command execution of " + command.toString() + " failed: " + error);
LOGGER.warn("Command execution of {} failed: {}", command.toString(), error);
}
}
@@ -430,8 +429,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject fieldsObject = query.getFieldsObject();
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("Executing query: %s sort: %s fields: %s in collection: %s",
serializeToJsonSafely(queryObject), sortObject, fieldsObject, collectionName));
LOGGER.debug("Executing query: {} sort: {} fields: {} in collection: {}", serializeToJsonSafely(queryObject),
sortObject, fieldsObject, collectionName);
}
this.executeQueryInternal(new FindCallback(queryObject, fieldsObject), preparer, dch, collectionName);
@@ -445,7 +444,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DB db = this.getDb();
return action.doInDB(db);
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
}
}
@@ -461,7 +460,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBCollection collection = getAndPrepareCollection(getDb(), collectionName);
return callback.doInCollection(collection);
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
}
}
@@ -529,7 +528,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
collection.drop();
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Dropped collection [" + collection.getFullName() + "]");
LOGGER.debug("Dropped collection [{}]", collection.getFullName());
}
return null;
}
@@ -630,8 +629,20 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
String collection = StringUtils.hasText(collectionName) ? collectionName : determineCollectionName(entityClass);
DBObject nearDbObject = near.toDBObject();
BasicDBObject command = new BasicDBObject("geoNear", collection);
command.putAll(near.toDBObject());
command.putAll(nearDbObject);
if (nearDbObject.containsField("query")) {
DBObject query = (DBObject) nearDbObject.get("query");
command.put("query", queryMapper.getMappedObject(query, getPersistentEntity(entityClass)));
}
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing geoNear using: {} for class: {} in collection: {}", serializeToJsonSafely(command),
entityClass, collectionName);
}
CommandResult commandResult = executeCommand(command, this.readPreference);
List<Object> results = (List<Object>) commandResult.get("results");
@@ -663,9 +674,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return new GeoResults<T>(result, near.getMetric());
}
DBObject stats = (DBObject) commandResult.get("stats");
double averageDistance = stats == null ? 0 : (Double) stats.get("avgDistance");
return new GeoResults<T>(result, new Distance(averageDistance, near.getMetric()));
GeoCommandStatistics stats = GeoCommandStatistics.from(commandResult);
return new GeoResults<T>(result, new Distance(stats.getAverageDistance(), near.getMetric()));
}
public <T> T findAndModify(Query query, Update update, Class<T> entityClass) {
@@ -775,11 +785,17 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
protected WriteConcern prepareWriteConcern(MongoAction mongoAction) {
WriteConcern wc = writeConcernResolver.resolve(mongoAction);
return potentiallyForceAcknowledgedWrite(wc);
}
if (MongoClientVersion.isMongo3Driver()
&& ObjectUtils.nullSafeEquals(WriteResultChecking.EXCEPTION, writeResultChecking)
&& (wc == null || wc.getW() < 1)) {
return WriteConcern.ACKNOWLEDGED;
private WriteConcern potentiallyForceAcknowledgedWrite(WriteConcern wc) {
if (ObjectUtils.nullSafeEquals(WriteResultChecking.EXCEPTION, writeResultChecking)
&& MongoClientVersion.isMongo3Driver()) {
if (wc == null || wc.getWObject() == null
|| (wc.getWObject() instanceof Number && ((Number) wc.getWObject()).intValue() < 1)) {
return WriteConcern.ACKNOWLEDGED;
}
}
return wc;
}
@@ -979,9 +995,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
protected Object insertDBObject(final String collectionName, final DBObject dbDoc, final Class<?> entityClass) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Inserting DBObject containing fields: " + dbDoc.keySet() + " in collection: " + collectionName);
LOGGER.debug("Inserting DBObject containing fields: {} in collection: {}", dbDoc.keySet(), collectionName);
}
return execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT, collectionName,
@@ -1001,8 +1019,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Inserting list of DBObjects containing " + dbDocList.size() + " items");
LOGGER.debug("Inserting list of DBObjects containing {} items", dbDocList.size());
}
execute(collectionName, new CollectionCallback<Void>() {
public Void doInCollection(DBCollection collection) throws MongoException, DataAccessException {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT_LIST, collectionName, null,
@@ -1029,9 +1048,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
protected Object saveDBObject(final String collectionName, final DBObject dbDoc, final Class<?> entityClass) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Saving DBObject containing fields: " + dbDoc.keySet());
LOGGER.debug("Saving DBObject containing fields: {}", dbDoc.keySet());
}
return execute(collectionName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.SAVE, collectionName, entityClass,
@@ -1097,8 +1118,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
update.getUpdateObject(), entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("Calling update using query: %s and update: %s in collection: %s",
serializeToJsonSafely(queryObj), serializeToJsonSafely(updateObj), collectionName));
LOGGER.debug("Calling update using query: {} and update: {} in collection: {}",
serializeToJsonSafely(queryObj), serializeToJsonSafely(updateObj), collectionName);
}
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.UPDATE, collectionName,
@@ -1332,8 +1353,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
copyMapReduceOptionsToCommand(query, mapReduceOptions, command);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing MapReduce on collection [" + command.getInput() + "], mapFunction [" + mapFunc
+ "], reduceFunction [" + reduceFunc + "]");
LOGGER.debug("Executing MapReduce on collection [{}], mapFunction [{}], reduceFunction [{}]", command.getInput(),
mapFunc, reduceFunc);
}
MapReduceOutput mapReduceOutput = inputCollection.mapReduce(command);
@@ -1548,10 +1569,17 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
throw new InvalidDataAccessApiUsageException(String.format("Resource %s not found!", function));
}
Scanner scanner = null;
try {
return new Scanner(functionResource.getInputStream()).useDelimiter("\\A").next();
scanner = new Scanner(functionResource.getInputStream());
return scanner.useDelimiter("\\A").next();
} catch (IOException e) {
throw new InvalidDataAccessApiUsageException(String.format("Cannot read map-reduce file %s!", function), e);
} finally {
if (scanner != null) {
scanner.close();
}
}
}
@@ -1566,8 +1594,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
throw new InvalidDataAccessApiUsageException(
"Can not use skip or field specification with map reduce operations");
}
if (query.getLimit() > 0) {
if (query.getLimit() > 0 && mapReduceOptions.getLimit() == null) {
mapReduceCommand.setLimit(query.getLimit());
}
if (query.getSortObject() != null) {
@@ -1575,6 +1602,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
if (mapReduceOptions.getLimit() != null && mapReduceOptions.getLimit().intValue() > 0) {
mapReduceCommand.setLimit(mapReduceOptions.getLimit());
}
if (mapReduceOptions.getJavaScriptMode() != null) {
mapReduceCommand.setJsMode(true);
}
@@ -1649,8 +1680,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject mappedFields = fields == null ? null : queryMapper.getMappedObject(fields, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("findOne using query: %s fields: %s for class: %s in collection: %s",
serializeToJsonSafely(query), mappedFields, entityClass, collectionName));
LOGGER.debug("findOne using query: {} fields: {} for class: {} in collection: {}", serializeToJsonSafely(query),
mappedFields, entityClass, collectionName);
}
return executeFindOneInternal(new FindOneCallback(mappedQuery, mappedFields), new ReadDbObjectCallback<T>(
@@ -1700,8 +1731,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject mappedQuery = queryMapper.getMappedObject(query, entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("find using query: %s fields: %s for class: %s in collection: %s",
serializeToJsonSafely(mappedQuery), mappedFields, entityClass, collectionName));
LOGGER.debug("find using query: {} fields: {} for class: {} in collection: {}",
serializeToJsonSafely(mappedQuery), mappedFields, entityClass, collectionName);
}
return executeFindMultiInternal(new FindCallback(mappedQuery, mappedFields), preparer, objectCallback,
@@ -1737,12 +1768,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
protected <T> T doFindAndRemove(String collectionName, DBObject query, DBObject fields, DBObject sort,
Class<T> entityClass) {
EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("findAndRemove using query: %s fields: %s sort: %s for class: %s in collection: %s",
serializeToJsonSafely(query), fields, sort, entityClass, collectionName));
LOGGER.debug("findAndRemove using query: {} fields: {} sort: {} for class: {} in collection: {}",
serializeToJsonSafely(query), fields, sort, entityClass, collectionName);
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
return executeFindOneInternal(new FindAndRemoveCallback(queryMapper.getMappedObject(query, entity), fields, sort),
new ReadDbObjectCallback<T>(readerToUse, entityClass, collectionName), collectionName);
}
@@ -1764,9 +1799,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject mappedUpdate = updateMapper.getMappedObject(update.getUpdateObject(), entity);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("findAndModify using query: %s fields: %s sort: %s for class: %s and update: %s "
+ "in collection: %s", serializeToJsonSafely(mappedQuery), fields, sort, entityClass,
serializeToJsonSafely(mappedUpdate), collectionName));
LOGGER.debug(
"findAndModify using query: {} fields: {} sort: {} for class: {} and update: {} " + "in collection: {}",
serializeToJsonSafely(mappedQuery), fields, sort, entityClass, serializeToJsonSafely(mappedUpdate),
collectionName);
}
return executeFindOneInternal(new FindAndModifyCallback(mappedQuery, fields, sort, mappedUpdate, options),
@@ -1814,7 +1850,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
prepareCollection(collection);
return collection;
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
}
}
@@ -1840,7 +1876,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
collectionName)));
return result;
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
}
}
@@ -1893,7 +1929,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
}
}
@@ -1923,7 +1959,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
}
}
@@ -2002,18 +2038,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
}
/**
* Tries to convert the given {@link RuntimeException} into a {@link DataAccessException} but returns the original
* exception if the conversation failed. Thus allows safe rethrowing of the return value.
*
* @param ex
* @return
*/
private RuntimeException potentiallyConvertRuntimeException(RuntimeException ex) {
RuntimeException resolved = this.exceptionTranslator.translateExceptionIfPossible(ex);
return resolved == null ? ex : resolved;
}
/**
* Inspects the given {@link CommandResult} for erros and potentially throws an
* {@link InvalidDataAccessApiUsageException} for that error.
@@ -2052,6 +2076,20 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return queryMapper.getMappedSort(query.getSortObject(), mappingContext.getPersistentEntity(type));
}
/**
* Tries to convert the given {@link RuntimeException} into a {@link DataAccessException} but returns the original
* exception if the conversation failed. Thus allows safe re-throwing of the return value.
*
* @param ex the exception to translate
* @param exceptionTranslator the {@link PersistenceExceptionTranslator} to be used for translation
* @return
*/
private static RuntimeException potentiallyConvertRuntimeException(RuntimeException ex,
PersistenceExceptionTranslator exceptionTranslator) {
RuntimeException resolved = exceptionTranslator.translateExceptionIfPossible(ex);
return resolved == null ? ex : resolved;
}
// Callback implementations
/**
@@ -2074,14 +2112,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public DBObject doInCollection(DBCollection collection) throws MongoException, DataAccessException {
if (fields == null) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("findOne using query: %s in db.collection: %s", serializeToJsonSafely(query),
collection.getFullName()));
LOGGER.debug("findOne using query: {} in db.collection: {}", serializeToJsonSafely(query),
collection.getFullName());
}
return collection.findOne(query);
} else {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug(String.format("findOne using query: %s fields: %s in db.collection: %s",
serializeToJsonSafely(query), fields, collection.getFullName()));
LOGGER.debug("findOne using query: {} fields: {} in db.collection: {}", serializeToJsonSafely(query), fields,
collection.getFullName());
}
return collection.findOne(query, fields);
}
@@ -2298,7 +2336,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
} catch (RuntimeException e) {
throw potentiallyConvertRuntimeException(e);
throw potentiallyConvertRuntimeException(e, exceptionTranslator);
}
return cursorToUse;
@@ -2345,20 +2383,20 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @since 1.7
* @author Thomas Darimont
*/
static class CloseableIterableCusorAdapter<T> implements CloseableIterator<T> {
static class CloseableIterableCursorAdapter<T> implements CloseableIterator<T> {
private volatile Cursor cursor;
private PersistenceExceptionTranslator exceptionTranslator;
private DbObjectCallback<T> objectReadCallback;
/**
* Creates a new {@link CloseableIterableCusorAdapter} backed by the given {@link Cursor}.
* Creates a new {@link CloseableIterableCursorAdapter} backed by the given {@link Cursor}.
*
* @param cursor
* @param exceptionTranslator
* @param objectReadCallback
*/
public CloseableIterableCusorAdapter(Cursor cursor, PersistenceExceptionTranslator exceptionTranslator,
public CloseableIterableCursorAdapter(Cursor cursor, PersistenceExceptionTranslator exceptionTranslator,
DbObjectCallback<T> objectReadCallback) {
this.cursor = cursor;
@@ -2376,7 +2414,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
try {
return cursor.hasNext();
} catch (RuntimeException ex) {
throw exceptionTranslator.translateExceptionIfPossible(ex);
throw potentiallyConvertRuntimeException(ex, exceptionTranslator);
}
}
@@ -2392,7 +2430,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
T converted = objectReadCallback.doWith(item);
return converted;
} catch (RuntimeException ex) {
throw exceptionTranslator.translateExceptionIfPossible(ex);
throw potentiallyConvertRuntimeException(ex, exceptionTranslator);
}
}
@@ -2403,7 +2441,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
try {
c.close();
} catch (RuntimeException ex) {
throw exceptionTranslator.translateExceptionIfPossible(ex);
throw potentiallyConvertRuntimeException(ex, exceptionTranslator);
} finally {
cursor = null;
exceptionTranslator = null;

View File

@@ -153,7 +153,7 @@ public class Aggregation {
protected Aggregation(List<AggregationOperation> aggregationOperations, AggregationOptions options) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
Assert.isTrue(aggregationOperations.size() > 0, "At least one AggregationOperation has to be provided");
Assert.isTrue(!aggregationOperations.isEmpty(), "At least one AggregationOperation has to be provided");
Assert.notNull(options, "AggregationOptions must not be null!");
this.operations = aggregationOperations;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,16 +20,17 @@ import com.mongodb.DBObject;
/**
* An {@link AggregationExpression} can be used with field expressions in aggregation pipeline stages like
* {@code project} and {@code group}.
*
*
* @author Thomas Darimont
* @author Oliver Gierke
* @author Christoph Strobl
*/
interface AggregationExpression {
public interface AggregationExpression {
/**
* Turns the {@link AggregationExpression} into a {@link DBObject} within the given
* {@link AggregationOperationContext}.
*
*
* @param context
* @return
*/

View File

@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.springframework.util.Assert;
@@ -56,7 +57,7 @@ public enum AggregationFunctionExpressions {
static class FunctionExpression implements AggregationExpression {
private final String name;
private final Object[] values;
private final List<Object> values;
/**
* Creates a new {@link FunctionExpression} for the given name and values.
@@ -70,7 +71,7 @@ public enum AggregationFunctionExpressions {
Assert.notNull(values, "Values must not be null!");
this.name = name;
this.values = values;
this.values = Arrays.asList(values);
}
/*
@@ -80,10 +81,10 @@ public enum AggregationFunctionExpressions {
@Override
public DBObject toDbObject(AggregationOperationContext context) {
List<Object> args = new ArrayList<Object>(values.length);
List<Object> args = new ArrayList<Object>(values.size());
for (int i = 0; i < values.length; i++) {
args.add(unpack(values[i], context));
for (Object value : values) {
args.add(unpack(value, context));
}
return new BasicDBObject("$" + name, args);

View File

@@ -75,15 +75,13 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
*/
private void initializeConverters() {
if (!conversionService.canConvert(ObjectId.class, String.class)) {
conversionService.addConverter(ObjectIdToStringConverter.INSTANCE);
}
if (!conversionService.canConvert(String.class, ObjectId.class)) {
conversionService.addConverter(StringToObjectIdConverter.INSTANCE);
}
conversionService.addConverter(ObjectIdToStringConverter.INSTANCE);
conversionService.addConverter(StringToObjectIdConverter.INSTANCE);
if (!conversionService.canConvert(ObjectId.class, BigInteger.class)) {
conversionService.addConverter(ObjectIdToBigIntegerConverter.INSTANCE);
}
if (!conversionService.canConvert(BigInteger.class, ObjectId.class)) {
conversionService.addConverter(BigIntegerToObjectIdConverter.INSTANCE);
}

View File

@@ -44,9 +44,9 @@ import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.NamedMongoScriptToDBObjectConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToNamedMongoScriptCoverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.DBObjectToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.NamedMongoScriptToDBObjectConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToURLConverter;
@@ -192,8 +192,8 @@ public class CustomConversions {
}
/**
* Registers a conversion for the given converter. Inspects either generics or the {@link ConvertiblePair}s returned
* by a {@link GenericConverter}.
* Registers a conversion for the given converter. Inspects either generics of {@link Converter} and
* {@link ConverterFactory} or the {@link ConvertiblePair}s returned by a {@link GenericConverter}.
*
* @param converter
*/
@@ -208,6 +208,10 @@ public class CustomConversions {
for (ConvertiblePair pair : genericConverter.getConvertibleTypes()) {
register(new ConverterRegistration(pair, isReading, isWriting));
}
} else if (converter instanceof ConverterFactory) {
Class<?>[] arguments = GenericTypeResolver.resolveTypeArguments(converter.getClass(), ConverterFactory.class);
register(new ConverterRegistration(arguments[0], arguments[1], isReading, isWriting));
} else if (converter instanceof Converter) {
Class<?>[] arguments = GenericTypeResolver.resolveTypeArguments(converter.getClass(), Converter.class);
register(new ConverterRegistration(arguments[0], arguments[1], isReading, isWriting));
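With ConverterFactory generics now inspected, converter factories can be handed to CustomConversions just like plain Converters. A minimal sketch, assuming a hypothetical StringToEnumConverterFactory (reading side, since String is a store-native type):

class StringToEnumConverterFactory implements ConverterFactory<String, Enum> {

  @Override
  @SuppressWarnings({ "unchecked", "rawtypes" })
  public <T extends Enum> Converter<String, T> getConverter(Class<T> targetType) {

    final Class enumType = targetType;

    return new Converter<String, T>() {

      @Override
      public T convert(String source) {
        return (T) Enum.valueOf(enumType, source); // maps the stored name back to the enum constant
      }
    };
  }
}

CustomConversions conversions = new CustomConversions(Arrays.asList(new StringToEnumConverterFactory()));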

View File

@@ -75,9 +75,7 @@ class DBObjectAccessor {
String part = parts.next();
if (parts.hasNext()) {
BasicDBObject nestedDbObject = new BasicDBObject();
dbObject.put(part, nestedDbObject);
dbObject = nestedDbObject;
dbObject = getOrCreateNestedDbObject(part, dbObject);
} else {
dbObject.put(part, value);
}
@@ -116,8 +114,48 @@ class DBObjectAccessor {
return result;
}
/**
* Returns whether the underlying {@link DBObject} has a value ({@literal null} or non-{@literal null}) for the given
* {@link MongoPersistentProperty}.
*
* @param property must not be {@literal null}.
* @return
*/
public boolean hasValue(MongoPersistentProperty property) {
Assert.notNull(property, "Property must not be null!");
String fieldName = property.getFieldName();
if (!fieldName.contains(".")) {
return this.dbObject.containsField(fieldName);
}
String[] parts = fieldName.split("\\.");
Map<String, Object> source = this.dbObject;
Object result = null;
for (int i = 1; i < parts.length; i++) {
result = source.get(parts[i - 1]);
source = getAsMap(result);
if (source == null) {
return false;
}
}
return source.containsKey(parts[parts.length - 1]);
}
/**
* Returns the given source object as a map, i.e. {@link BasicDBObject}s and {@link Map}s as-is, or {@literal null} otherwise.
*
* @param source can be {@literal null}.
* @return
*/
@SuppressWarnings("unchecked")
private Map<String, Object> getAsMap(Object source) {
private static Map<String, Object> getAsMap(Object source) {
if (source instanceof BasicDBObject) {
return (BasicDBObject) source;
@@ -129,4 +167,26 @@ class DBObjectAccessor {
return null;
}
/**
* Returns the {@link DBObject} which either already exists in the given source under the given key, or creates a new
* nested one, registers it with the source and returns it.
*
* @param key must not be {@literal null} or empty.
* @param source must not be {@literal null}.
* @return
*/
private static DBObject getOrCreateNestedDbObject(String key, DBObject source) {
Object existing = source.get(key);
if (existing instanceof BasicDBObject) {
return (BasicDBObject) existing;
}
DBObject nested = new BasicDBObject();
source.put(key, nested);
return nested;
}
}
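DBObjectAccessor is package-private, so as illustration only, here is a standalone re-implementation of the dotted-path containment check that hasValue(...) performs for nested field names (the real code also walks plain java.util.Map values):

static boolean containsField(BasicDBObject source, String fieldName) {

  if (!fieldName.contains(".")) {
    return source.containsField(fieldName);
  }

  String[] parts = fieldName.split("\\.");
  Object current = source;

  for (int i = 0; i < parts.length - 1; i++) {

    current = ((BasicDBObject) current).get(parts[i]);

    if (!(current instanceof BasicDBObject)) {
      return false; // intermediate document missing, so the leaf cannot be present
    }
  }

  // containsField(...) is true even if the stored value is null, which is what
  // distinguishes "absent" from "explicitly null" for the identifier check.
  return ((BasicDBObject) current).containsField(parts[parts.length - 1]);
}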

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -45,9 +45,9 @@ import com.mongodb.DBObject;
public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implements MongoTypeMapper {
public static final String DEFAULT_TYPE_KEY = "_class";
@SuppressWarnings("rawtypes")//
@SuppressWarnings("rawtypes") //
private static final TypeInformation<List> LIST_TYPE_INFO = ClassTypeInformation.from(List.class);
@SuppressWarnings("rawtypes")//
@SuppressWarnings("rawtypes") //
private static final TypeInformation<Map> MAP_TYPE_INFO = ClassTypeInformation.from(Map.class);
private final TypeAliasAccessor<DBObject> accessor;
@@ -58,12 +58,12 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
}
public DefaultMongoTypeMapper(String typeKey) {
this(typeKey, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
this(typeKey, Arrays.asList(new SimpleTypeInformationMapper()));
}
public DefaultMongoTypeMapper(String typeKey, MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext) {
this(typeKey, new DBObjectTypeAliasAccessor(typeKey), mappingContext, Arrays
.asList(SimpleTypeInformationMapper.INSTANCE));
this(typeKey, new DBObjectTypeAliasAccessor(typeKey), mappingContext,
Arrays.asList(new SimpleTypeInformationMapper()));
}
public DefaultMongoTypeMapper(String typeKey, List<? extends TypeInformationMapper> mappers) {
@@ -71,7 +71,8 @@ public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implemen
}
private DefaultMongoTypeMapper(String typeKey, TypeAliasAccessor<DBObject> accessor,
MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext, List<? extends TypeInformationMapper> mappers) {
MappingContext<? extends PersistentEntity<?, ?>, ?> mappingContext,
List<? extends TypeInformationMapper> mappers) {
super(accessor, mappingContext, mappers);

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014-2015 the original author or authors.
* Copyright 2014-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -111,9 +111,13 @@ abstract class GeoConverters {
@Override
public Point convert(DBObject source) {
if (source == null) {
return null;
}
Assert.isTrue(source.keySet().size() == 2, "Source must contain 2 elements");
return source == null ? null : new Point((Double) source.get("x"), (Double) source.get("y"));
return new Point((Double) source.get("x"), (Double) source.get("y"));
}
}
@@ -591,7 +595,7 @@ abstract class GeoConverters {
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "Point"),
String.format("Cannot convert type '%s' to Point.", source.get("type")));
List<Double> dbl = (List<Double>) source.get("coordinates");
List<Number> dbl = (List<Number>) source.get("coordinates");
return new GeoJsonPoint(dbl.get(0).doubleValue(), dbl.get(1).doubleValue());
}
}
@@ -824,7 +828,7 @@ abstract class GeoConverters {
Assert.isInstanceOf(List.class, point);
List<Double> coordinatesList = (List<Double>) point;
List<Number> coordinatesList = (List<Number>) point;
points.add(new GeoJsonPoint(coordinatesList.get(0).doubleValue(), coordinatesList.get(1).doubleValue()));
}
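The move from List<Double> to List<Number> matters because the driver returns Integers for coordinates stored without a decimal point. A hedged sketch of a document shape that now converts cleanly:

// Integer coordinates could previously fail with a ClassCastException in the Double-typed list;
// they are now read as Numbers and widened via doubleValue().
DBObject geoJsonPoint = (DBObject) JSON.parse("{ 'type' : 'Point', 'coordinates' : [ 1, 2 ] }");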

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 by the original author(s).
* Copyright 2011-2016 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,6 +20,7 @@ import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
@@ -259,12 +260,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
// make sure id property is set before all other properties
Object idValue = null;
if (idProperty != null) {
if (idProperty != null && new DBObjectAccessor(dbo).hasValue(idProperty)) {
idValue = getValueInternal(idProperty, dbo, evaluator, path);
accessor.setProperty(idProperty, idValue);
}
final ObjectPath currentPath = path.push(result, entity, idValue);
final ObjectPath currentPath = path.push(result, entity,
idValue != null ? dbo.get(idProperty.getFieldName()) : null);
// Set properties not already set in the constructor
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@@ -290,7 +292,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
final MongoPersistentProperty property = association.getInverse();
Object value = dbo.get(property.getFieldName());
if (value == null) {
if (value == null || entity.isConstructorArgument(property)) {
return;
}
@@ -895,9 +897,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>()
: CollectionFactory.createCollection(collectionType, rawComponentType, sourceValue.size());
for (int i = 0; i < sourceValue.size(); i++) {
Object dbObjItem = sourceValue.get(i);
for (Object dbObjItem : sourceValue) {
if (dbObjItem instanceof DBRef) {
items.add(
@@ -991,20 +991,31 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (obj instanceof DBObject) {
DBObject newValueDbo = new BasicDBObject();
for (String vk : ((DBObject) obj).keySet()) {
Object o = ((DBObject) obj).get(vk);
newValueDbo.put(vk, convertToMongoType(o, typeHint));
}
return newValueDbo;
}
if (obj instanceof Map) {
DBObject result = new BasicDBObject();
for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue(), typeHint));
Map<Object, Object> converted = new LinkedHashMap<Object, Object>();
for (Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
TypeInformation<? extends Object> valueTypeHint = typeHint != null && typeHint.getMapValueType() != null
? typeHint.getMapValueType() : typeHint;
converted.put(convertToMongoType(entry.getKey()), convertToMongoType(entry.getValue(), valueTypeHint));
}
return result;
return new BasicDBObject(converted);
}
if (obj.getClass().isArray()) {
@@ -1197,10 +1208,6 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Object object = dbref == null ? null : path.getPathItem(dbref.getId(), dbref.getCollectionName());
if (object != null) {
return (T) object;
}
return (T) (object != null ? object : read(type, readRef(dbref), path));
}
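A hedged sketch of a read that exercises the new hasValue(...) guard on the identifier property; the entity, field names and the template instance are assumptions:

class Counter {
  @Id long id;   // primitive identifier
  String name;
}

Query query = new Query(Criteria.where("name").is("hits"));
query.fields().exclude("_id");   // _id is not part of the returned document

// Because hasValue(idProperty) is false, setProperty is skipped and the primitive id
// simply keeps its default (0) instead of a null value being applied to it.
Counter counter = template.findOne(query, Counter.class);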

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -31,6 +31,7 @@ import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.InvalidPersistentPropertyPath;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mapping.model.MappingException;
@@ -56,6 +57,7 @@ import com.mongodb.DBRef;
* @author Patryk Wasik
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
*/
public class QueryMapper {
@@ -64,7 +66,7 @@ public class QueryMapper {
static final ClassTypeInformation<?> NESTED_DOCUMENT = ClassTypeInformation.from(NestedDocument.class);
private enum MetaMapping {
FORCE, WHEN_PRESENT, IGNORE;
FORCE, WHEN_PRESENT, IGNORE
}
private final ConversionService conversionService;
@@ -119,10 +121,20 @@ public class QueryMapper {
continue;
}
Field field = createPropertyField(entity, key, mappingContext);
Entry<String, Object> entry = getMappedObjectForField(field, query.get(key));
try {
result.put(entry.getKey(), entry.getValue());
Field field = createPropertyField(entity, key, mappingContext);
Entry<String, Object> entry = getMappedObjectForField(field, query.get(key));
result.put(entry.getKey(), entry.getValue());
} catch (InvalidPersistentPropertyPath invalidPathException) {
// in case the object has not already been mapped
if (!(query.get(key) instanceof DBObject)) {
throw invalidPathException;
}
result.put(key, query.get(key));
}
}
return result;
@@ -298,7 +310,7 @@ public class QueryMapper {
}
if (isNestedKeyword(value)) {
return getMappedKeyword(new Keyword((DBObject) value), null);
return getMappedKeyword(new Keyword((DBObject) value), documentField.getPropertyEntity());
}
if (isAssociationConversionNecessary(documentField, value)) {
@@ -946,7 +958,7 @@ public class QueryMapper {
*/
protected String mapPropertyName(MongoPersistentProperty property) {
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
StringBuilder mappedName = new StringBuilder(PropertyToFieldNameConverter.INSTANCE.convert(property));
boolean inspect = iterator.hasNext();
while (inspect) {
@@ -955,18 +967,18 @@ public class QueryMapper {
boolean isPositional = (isPositionalParameter(partial) && (property.isMap() || property.isCollectionLike()));
if (isPositional) {
mappedName += "." + partial;
mappedName.append(".").append(partial);
}
inspect = isPositional && iterator.hasNext();
}
return mappedName;
return mappedName.toString();
}
private static boolean isPositionalParameter(String partial) {
if (partial.equals("$")) {
if ("$".equals(partial)) {
return true;
}
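A hedged sketch of the nested-keyword change: with documentField.getPropertyEntity() now passed into getMappedKeyword(...) instead of null, @Field names below keywords such as $elemMatch can be resolved. Entity, converter and mappingContext instances are assumptions:

class Address {
  @Field("st") String street;
}

class Person {
  List<Address> addresses;
}

Query query = new Query(Criteria.where("addresses").elemMatch(Criteria.where("street").is("Downing St")));

QueryMapper mapper = new QueryMapper(converter);

// Expected to map to { "addresses" : { "$elemMatch" : { "st" : "Downing St" } } };
// with null as the entity, the custom field name below the keyword could not be resolved.
DBObject mapped = mapper.getMappedObject(query.getQueryObject(),
    mappingContext.getPersistentEntity(Person.class));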

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core.index;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Map.Entry;
import java.util.concurrent.TimeUnit;
import org.springframework.data.domain.Sort.Direction;
@@ -44,7 +45,7 @@ public class Index implements IndexDefinition {
*
* @deprecated since 1.7.
*/
@Deprecated//
@Deprecated //
DROP
}
@@ -175,11 +176,18 @@ public class Index implements IndexDefinition {
return unique();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexDefinition#getIndexKeys()
*/
public DBObject getIndexKeys() {
DBObject dbo = new BasicDBObject();
for (String k : fieldSpec.keySet()) {
dbo.put(k, fieldSpec.get(k).equals(Direction.ASC) ? 1 : -1);
for (Entry<String, Direction> entry : fieldSpec.entrySet()) {
dbo.put(entry.getKey(), Direction.ASC.equals(entry.getValue()) ? 1 : -1);
}
return dbo;
}
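For reference, a small sketch of the fluent API that feeds the entrySet-based rendering above; field names are illustrative:

Index index = new Index().on("lastname", Direction.ASC).on("age", Direction.DESC).named("lastname_age");

// getIndexKeys() renders the field spec as { "lastname" : 1 , "age" : -1 }.
DBObject keys = index.getIndexKeys();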

View File

@@ -27,7 +27,10 @@ import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.domain.Sort;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.TextIndexIncludeOptions.IncludeStrategy;
import org.springframework.data.mongodb.core.index.TextIndexDefinition.TextIndexDefinitionBuilder;
@@ -123,6 +126,8 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
}
});
indexInformation.addAll(resolveIndexesForDbrefs("", root.getCollection(), root));
return indexInformation;
}
@@ -168,6 +173,8 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
}
});
indexInformation.addAll(resolveIndexesForDbrefs(path, collection, entity));
return indexInformation;
}
@@ -193,18 +200,19 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
return createCompoundIndexDefinitions(dotPath, collection, entity);
}
private Collection<? extends IndexDefinitionHolder> potentiallyCreateTextIndexDefinition(MongoPersistentEntity<?> root) {
private Collection<? extends IndexDefinitionHolder> potentiallyCreateTextIndexDefinition(
MongoPersistentEntity<?> root) {
TextIndexDefinitionBuilder indexDefinitionBuilder = new TextIndexDefinitionBuilder().named(root.getType()
.getSimpleName() + "_TextIndex");
TextIndexDefinitionBuilder indexDefinitionBuilder = new TextIndexDefinitionBuilder()
.named(root.getType().getSimpleName() + "_TextIndex");
if (StringUtils.hasText(root.getLanguage())) {
indexDefinitionBuilder.withDefaultLanguage(root.getLanguage());
}
try {
appendTextIndexInformation("", indexDefinitionBuilder, root,
new TextIndexIncludeOptions(IncludeStrategy.DEFAULT), new CycleGuard());
appendTextIndexInformation("", indexDefinitionBuilder, root, new TextIndexIncludeOptions(IncludeStrategy.DEFAULT),
new CycleGuard());
} catch (CyclicPropertyReferenceException e) {
LOGGER.info(e.getMessage());
}
@@ -220,9 +228,8 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
}
private void appendTextIndexInformation(final String dotPath,
final TextIndexDefinitionBuilder indexDefinitionBuilder, final MongoPersistentEntity<?> entity,
final TextIndexIncludeOptions includeOptions, final CycleGuard guard) {
private void appendTextIndexInformation(final String dotPath, final TextIndexDefinitionBuilder indexDefinitionBuilder,
final MongoPersistentEntity<?> entity, final TextIndexIncludeOptions includeOptions, final CycleGuard guard) {
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
@@ -249,8 +256,8 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
TextIndexIncludeOptions optionsForNestedType = includeOptions;
if (!IncludeStrategy.FORCE.equals(includeOptions.getStrategy()) && indexed != null) {
optionsForNestedType = new TextIndexIncludeOptions(IncludeStrategy.FORCE, new TextIndexedFieldSpec(
propertyDotPath, weight));
optionsForNestedType = new TextIndexIncludeOptions(IncludeStrategy.FORCE,
new TextIndexedFieldSpec(propertyDotPath, weight));
}
try {
@@ -259,9 +266,8 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
} catch (CyclicPropertyReferenceException e) {
LOGGER.info(e.getMessage(), e);
} catch (InvalidDataAccessApiUsageException e) {
LOGGER.info(
String.format("Potentially invalid index structure discovered. Breaking operation for %s.",
entity.getName()), e);
LOGGER.info(String.format("Potentially invalid index structure discovered. Breaking operation for %s.",
entity.getName()), e);
}
} else if (includeOptions.isForce() || indexed != null) {
indexDefinitionBuilder.onField(propertyDotPath, weight);
@@ -306,8 +312,8 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
protected IndexDefinitionHolder createCompoundIndexDefinition(String dotPath, String fallbackCollection,
CompoundIndex index, MongoPersistentEntity<?> entity) {
CompoundIndexDefinition indexDefinition = new CompoundIndexDefinition(resolveCompoundIndexKeyFromStringDefinition(
dotPath, index.def()));
CompoundIndexDefinition indexDefinition = new CompoundIndexDefinition(
resolveCompoundIndexKeyFromStringDefinition(dotPath, index.def()));
if (!index.useGeneratedName()) {
indexDefinition.named(pathAwareIndexName(index.name(), dotPath, null));
@@ -431,13 +437,45 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
if (StringUtils.hasText(dotPath)) {
nameToUse = StringUtils.hasText(nameToUse) ? (property != null ? dotPath.replace("." + property.getFieldName(),
"") : dotPath) + "." + nameToUse : dotPath;
nameToUse = StringUtils.hasText(nameToUse)
? (property != null ? dotPath.replace("." + property.getFieldName(), "") : dotPath) + "." + nameToUse
: dotPath;
}
return nameToUse;
}
private List<IndexDefinitionHolder> resolveIndexesForDbrefs(final String path, final String collection,
MongoPersistentEntity<?> entity) {
final List<IndexDefinitionHolder> indexes = new ArrayList<IndexDefinitionHolder>(0);
entity.doWithAssociations(new AssociationHandler<MongoPersistentProperty>() {
@Override
public void doWithAssociation(Association<MongoPersistentProperty> association) {
MongoPersistentProperty property = association.getInverse();
String propertyDotPath = (StringUtils.hasText(path) ? path + "." : "") + property.getFieldName();
if (property.isAnnotationPresent(GeoSpatialIndexed.class) || property.isAnnotationPresent(TextIndexed.class)) {
throw new MappingException(
String.format("Cannot create geospatial-/text- index on DBRef in collection '%s' for path '%s'.",
collection, propertyDotPath));
}
IndexDefinitionHolder indexDefinitionHolder = createIndexDefinitionHolderForProperty(propertyDotPath,
collection, property);
if (indexDefinitionHolder != null) {
indexes.add(indexDefinitionHolder);
}
}
});
return indexes;
}
/**
* {@link CycleGuard} holds information about properties and the paths for accessing those. This information is used
* to detect potential cycles within the references.

View File

@@ -45,6 +45,8 @@ public class MapReduceOptions {
private Boolean verbose = true;
private Integer limit;
private Map<String, Object> extraOptions = new HashMap<String, Object>();
/**
@@ -64,6 +66,8 @@ public class MapReduceOptions {
* @return MapReduceOptions so that methods can be chained in a fluent API style
*/
public MapReduceOptions limit(int limit) {
this.limit = limit;
return this;
}
@@ -247,6 +251,15 @@ public class MapReduceOptions {
return this.scopeVariables;
}
/**
* Get the maximum number of documents for the input into the map function.
*
* @return {@literal null} if not set.
*/
public Integer getLimit() {
return limit;
}
public DBObject getOptionsObject() {
BasicDBObject cmd = new BasicDBObject();
@@ -264,6 +277,10 @@ public class MapReduceOptions {
cmd.put("scope", scopeVariables);
}
if (limit != null) {
cmd.put("limit", limit);
}
if (!extraOptions.keySet().isEmpty()) {
cmd.putAll(extraOptions);
}
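A short usage sketch of the new limit option; output type and values are illustrative:

MapReduceOptions options = MapReduceOptions.options().limit(100).outputTypeInline();

// The rendered command options now contain { ..., "limit" : 100 }.
DBObject commandOptions = options.getOptionsObject();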

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,13 +17,10 @@ package org.springframework.data.mongodb.core.mapreduce;
public class MapReduceTiming {
private long mapTime;
private long emitLoopTime;
private long totalTime;
private long mapTime, emitLoopTime, totalTime;
public MapReduceTiming(long mapTime, long emitLoopTime, long totalTime) {
this.mapTime = mapTime;
this.emitLoopTime = emitLoopTime;
this.totalTime = totalTime;
@@ -41,37 +38,52 @@ public class MapReduceTiming {
return totalTime;
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return "MapReduceTiming [mapTime=" + mapTime + ", emitLoopTime=" + emitLoopTime + ", totalTime=" + totalTime + "]";
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
final int prime = 31;
int result = 1;
result = prime * result + (int) (emitLoopTime ^ (emitLoopTime >>> 32));
result = prime * result + (int) (mapTime ^ (mapTime >>> 32));
result = prime * result + (int) (totalTime ^ (totalTime >>> 32));
return result;
}
/*
*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj)
return true;
if (obj == null)
return false;
if (getClass() != obj.getClass())
return false;
MapReduceTiming other = (MapReduceTiming) obj;
if (emitLoopTime != other.emitLoopTime)
return false;
if (mapTime != other.mapTime)
return false;
if (totalTime != other.totalTime)
return false;
return true;
}
if (this == obj) {
return true;
}
if (!(obj instanceof MapReduceTiming)) {
return false;
}
MapReduceTiming that = (MapReduceTiming) obj;
return this.emitLoopTime == that.emitLoopTime && //
this.mapTime == that.mapTime && //
this.totalTime == that.totalTime;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -28,6 +28,7 @@ import com.mongodb.util.JSON;
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
* @author John Willemin
*/
public class BasicQuery extends Query {
@@ -70,6 +71,19 @@ public class BasicQuery extends Query {
@Override
public DBObject getFieldsObject() {
if (fieldsObject == null) {
return super.getFieldsObject();
}
if (super.getFieldsObject() != null) {
DBObject combinedFieldsObject = new BasicDBObject();
combinedFieldsObject.putAll(fieldsObject);
combinedFieldsObject.putAll(super.getFieldsObject());
return combinedFieldsObject;
}
return fieldsObject;
}
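A sketch of the merged projection: fields passed through the constructor and fields added via fields() now both end up in the result.

BasicQuery query = new BasicQuery("{ \"age\" : { \"$gt\" : 18 } }", "{ \"firstname\" : 1 }");
query.fields().include("lastname");

// Should contain both "firstname" (from the constructor) and "lastname" (from fields()).
DBObject fields = query.getFieldsObject();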

View File

@@ -22,6 +22,7 @@ import java.util.Arrays;
import java.util.Collection;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map.Entry;
import java.util.regex.Pattern;
import org.bson.BSON;
@@ -118,7 +119,7 @@ public class Criteria implements CriteriaDefinition {
}
private boolean lastOperatorWasNot() {
return this.criteria.size() > 0 && "$not".equals(this.criteria.keySet().toArray()[this.criteria.size() - 1]);
return !this.criteria.isEmpty() && "$not".equals(this.criteria.keySet().toArray()[this.criteria.size() - 1]);
}
/**
@@ -581,9 +582,10 @@ public class Criteria implements CriteriaDefinition {
DBObject dbo = new BasicDBObject();
boolean not = false;
for (String k : this.criteria.keySet()) {
for (Entry<String, Object> entry : criteria.entrySet()) {
Object value = this.criteria.get(k);
String key = entry.getKey();
Object value = entry.getValue();
if (requiresGeoJsonFormat(value)) {
value = new BasicDBObject("$geometry", value);
@@ -591,14 +593,14 @@ public class Criteria implements CriteriaDefinition {
if (not) {
DBObject notDbo = new BasicDBObject();
notDbo.put(k, value);
notDbo.put(key, value);
dbo.put("$not", notDbo);
not = false;
} else {
if ("$not".equals(k) && value == null) {
if ("$not".equals(key) && value == null) {
not = true;
} else {
dbo.put(k, value);
dbo.put(key, value);
}
}
}
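For orientation, the $not handling iterated above is what backs the fluent not() operator; a tiny sketch:

// Renders { "age" : { "$not" : { "$gt" : 18 } } }: the "$not" marker entry flags the
// following operator for negation while the criteria object is built.
DBObject criteriaObject = Criteria.where("age").not().gt(18).getCriteriaObject();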

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2013 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -83,14 +83,10 @@ public class Field {
public DBObject getFieldsObject() {
DBObject dbo = new BasicDBObject();
DBObject dbo = new BasicDBObject(criteria);
for (String k : criteria.keySet()) {
dbo.put(k, criteria.get(k));
}
for (String k : slices.keySet()) {
dbo.put(k, new BasicDBObject("$slice", slices.get(k)));
for (Entry<String, Object> entry : slices.entrySet()) {
dbo.put(entry.getKey(), new BasicDBObject("$slice", entry.getValue()));
}
for (Entry<String, Criteria> entry : elemMatchs.entrySet()) {
@@ -134,8 +130,8 @@ public class Field {
return false;
}
boolean samePositionKey = this.postionKey == null ? that.postionKey == null : this.postionKey
.equals(that.postionKey);
boolean samePositionKey = this.postionKey == null ? that.postionKey == null
: this.postionKey.equals(that.postionKey);
boolean samePositionValue = this.positionValue == that.positionValue;
return samePositionKey && samePositionValue;
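A short usage sketch of the slice projection rendered by the loop above; field name and size are illustrative:

Query query = new Query();
query.fields().slice("comments", 5);

// Renders { "comments" : { "$slice" : 5 } }.
DBObject fields = query.getFieldsObject();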

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -94,9 +94,9 @@ public class Query {
if (existing == null) {
this.criteria.put(key, criteriaDefinition);
} else {
throw new InvalidMongoDbApiUsageException("Due to limitations of the com.mongodb.BasicDBObject, "
+ "you can't add a second '" + key + "' criteria. " + "Query already contains '"
+ existing.getCriteriaObject() + "'.");
throw new InvalidMongoDbApiUsageException(
"Due to limitations of the com.mongodb.BasicDBObject, " + "you can't add a second '" + key + "' criteria. "
+ "Query already contains '" + existing.getCriteriaObject() + "'.");
}
return this;
@@ -221,10 +221,8 @@ public class Query {
DBObject dbo = new BasicDBObject();
for (String k : criteria.keySet()) {
CriteriaDefinition c = criteria.get(k);
DBObject cl = c.getCriteriaObject();
dbo.putAll(cl);
for (CriteriaDefinition definition : criteria.values()) {
dbo.putAll(definition.getCriteriaObject());
}
if (!restrictedTypes.isEmpty()) {
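A hedged sketch of the limitation described in the exception message, plus the usual workaround of combining criteria explicitly:

Query query = new Query(Criteria.where("name").is("Dave"));

// Throws InvalidMongoDbApiUsageException: a BasicDBObject cannot hold two entries for the key "name".
// query.addCriteria(Criteria.where("name").is("Oliver"));

// Combine the criteria explicitly instead:
Query combined = new Query(new Criteria().andOperator(
    Criteria.where("name").is("Dave"), Criteria.where("name").is("Oliver")));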

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -254,7 +254,7 @@ public class Update {
* @return
*/
public Update pullAll(String key, Object[] values) {
addFieldOperation("$pullAll", key, Arrays.copyOf(values, values.length));
addMultiFieldOperation("$pullAll", key, Arrays.copyOf(values, values.length));
return this;
}
@@ -327,17 +327,22 @@ public class Update {
}
public DBObject getUpdateObject() {
DBObject dbo = new BasicDBObject();
for (String k : modifierOps.keySet()) {
dbo.put(k, modifierOps.get(k));
}
return dbo;
return new BasicDBObject(modifierOps);
}
/**
* This method is not called anymore; override {@link #addMultiFieldOperation(String, String, Object)} instead.
*
* @param operator
* @param key
* @param value
* @deprecated Use {@link #addMultiFieldOperation(String, String, Object)} instead.
*/
@Deprecated
protected void addFieldOperation(String operator, String key, Object value) {
Assert.hasText(key, "Key/Path for update must not be null or blank.");
modifierOps.put(operator, new BasicDBObject(key, value));
this.keysToUpdate.add(key);
}
@@ -355,8 +360,8 @@ public class Update {
if (existingValue instanceof BasicDBObject) {
keyValueMap = (BasicDBObject) existingValue;
} else {
throw new InvalidDataAccessApiUsageException("Modifier Operations should be a LinkedHashMap but was "
+ existingValue.getClass());
throw new InvalidDataAccessApiUsageException(
"Modifier Operations should be a LinkedHashMap but was " + existingValue.getClass());
}
}
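The switch to addMultiFieldOperation means consecutive pullAll(...) calls accumulate under a single $pullAll document instead of the second call replacing the first. A small sketch; keys and values are illustrative:

Update update = new Update()
    .pullAll("tags", new Object[] { "spam" })
    .pullAll("categories", new Object[] { "obsolete" });

// Renders { "$pullAll" : { "tags" : [ "spam" ], "categories" : [ "obsolete" ] } }.
DBObject updateObject = update.getUpdateObject();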

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2012 the original author or authors.
* Copyright 2002-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,6 +29,7 @@ import com.mongodb.MongoException;
* Base class to encapsulate common configuration settings when connecting to a database
*
* @author Mark Pollack
* @author Oliver Gierke
*/
public abstract class AbstractMonitor {

View File

@@ -47,7 +47,7 @@ public @interface EnableMongoRepositories {
/**
* Alias for the {@link #basePackages()} attribute. Allows for more concise annotation declarations e.g.:
* {@code @EnableJpaRepositories("org.my.pkg")} instead of {@code @EnableJpaRepositories(basePackages="org.my.pkg")}.
* {@code @EnableMongoRepositories("org.my.pkg")} instead of {@code @EnableMongoRepositories(basePackages="org.my.pkg")}.
*/
String[] value() default {};
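The corrected example in context; the package name is illustrative:

@Configuration
@EnableMongoRepositories("com.example.repositories")
class ShortForm {}

// Equivalent to the explicit attribute form:
@Configuration
@EnableMongoRepositories(basePackages = "com.example.repositories")
class LongForm {}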

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.repository.query;
import java.util.Arrays;
import java.util.List;
import org.springframework.data.domain.Range;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Point;
@@ -34,7 +37,7 @@ import org.springframework.util.ClassUtils;
public class MongoParametersParameterAccessor extends ParametersParameterAccessor implements MongoParameterAccessor {
private final MongoQueryMethod method;
private final Object[] values;
private final List<Object> values;
/**
* Creates a new {@link MongoParametersParameterAccessor}.
@@ -43,9 +46,11 @@ public class MongoParametersParameterAccessor extends ParametersParameterAccesso
* @param values must not be {@literal null}.
*/
public MongoParametersParameterAccessor(MongoQueryMethod method, Object[] values) {
super(method.getParameters(), values);
this.method = method;
this.values = values;
this.values = Arrays.asList(values);
}
public Range<Distance> getDistanceRange() {
@@ -131,6 +136,6 @@ public class MongoParametersParameterAccessor extends ParametersParameterAccesso
*/
@Override
public Object[] getValues() {
return values;
return values.toArray();
}
}

View File

@@ -177,16 +177,16 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
switch (type) {
case AFTER:
case GREATER_THAN:
return criteria.gt(parameters.nextConverted(property));
return criteria.gt(parameters.next());
case GREATER_THAN_EQUAL:
return criteria.gte(parameters.nextConverted(property));
return criteria.gte(parameters.next());
case BEFORE:
case LESS_THAN:
return criteria.lt(parameters.nextConverted(property));
return criteria.lt(parameters.next());
case LESS_THAN_EQUAL:
return criteria.lte(parameters.nextConverted(property));
return criteria.lte(parameters.next());
case BETWEEN:
return criteria.gt(parameters.nextConverted(property)).lt(parameters.nextConverted(property));
return criteria.gt(parameters.next()).lt(parameters.next());
case IS_NOT_NULL:
return criteria.ne(null);
case IS_NULL:
@@ -201,7 +201,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
case CONTAINING:
return createContainingCriteria(part, property, criteria, parameters);
case NOT_CONTAINING:
return createContainingCriteria(part, property, criteria, parameters).not();
return createContainingCriteria(part, property, criteria.not(), parameters);
case REGEX:
return criteria.regex(parameters.next().toString());
case EXISTS:
@@ -241,12 +241,12 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
return criteria.within((Shape) parameter);
case SIMPLE_PROPERTY:
return isSimpleComparisionPossible(part) ? criteria.is(parameters.nextConverted(property))
return isSimpleComparisionPossible(part) ? criteria.is(parameters.next())
: createLikeRegexCriteriaOrThrow(part, property, criteria, parameters, false);
case NEGATING_SIMPLE_PROPERTY:
return isSimpleComparisionPossible(part) ? criteria.ne(parameters.nextConverted(property))
return isSimpleComparisionPossible(part) ? criteria.ne(parameters.next())
: createLikeRegexCriteriaOrThrow(part, property, criteria, parameters, true);
default:
throw new IllegalArgumentException("Unsupported keyword!");
@@ -382,7 +382,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
if (next instanceof Collection) {
return ((Collection<?>) next).toArray();
} else if (next.getClass().isArray()) {
} else if (next != null && next.getClass().isArray()) {
return (Object[]) next;
}
@@ -420,7 +420,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
return PUNCTATION_PATTERN.matcher(source).find() ? Pattern.quote(source) : source;
}
if (source.equals("*")) {
if ("*".equals(source)) {
return ".*";
}
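A hedged sketch of a derived query that exercises the changed NOT_CONTAINING branch; repository, entity and property names are assumptions:

interface PersonRepository extends CrudRepository<Person, String> {

  // Negation is now applied to the criteria before the containment expression is built,
  // so the query should take the shape { "firstname" : { "$not" : <containing-regex> } }.
  List<Person> findByFirstnameNotContaining(String fragment);
}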

View File

@@ -21,6 +21,9 @@ import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import javax.xml.bind.DatatypeConverter;
import org.bson.BSON;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.mongodb.core.MongoOperations;
@@ -224,6 +227,15 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
return (String) value;
}
if (value instanceof byte[]) {
String base64representation = DatatypeConverter.printBase64Binary((byte[]) value);
if (!binding.isQuoted()) {
return "{ '$binary' : '" + base64representation + "', '$type' : " + BSON.B_GENERAL + "}";
}
return base64representation;
}
return JSON.serialize(value);
}
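A hedged sketch of a string-based query with a byte[] parameter; unless the placeholder is quoted, the value is now bound as a $binary literal (names are illustrative):

public interface SampleRepository extends CrudRepository<SampleEntity, String> {

  // Binds as { 'payload' : { '$binary' : '<base64>', '$type' : 0 } } for the unquoted placeholder.
  @Query("{ 'payload' : ?0 }")
  List<SampleEntity> findByPayload(byte[] payload);
}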

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2015 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import org.bson.types.ObjectId;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.repository.query.MongoEntityInformation;
import org.springframework.data.repository.core.support.PersistentEntityInformation;
@@ -27,12 +28,14 @@ import org.springframework.data.repository.core.support.PersistentEntityInformat
* {@link MongoPersistentEntity} if given.
*
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MappingMongoEntityInformation<T, ID extends Serializable> extends PersistentEntityInformation<T, ID>
implements MongoEntityInformation<T, ID> {
private final MongoPersistentEntity<T> entityMetadata;
private final String customCollectionName;
private final Class<ID> fallbackIdType;
/**
* Creates a new {@link MappingMongoEntityInformation} for the given {@link MongoPersistentEntity}.
@@ -40,7 +43,18 @@ public class MappingMongoEntityInformation<T, ID extends Serializable> extends P
* @param entity must not be {@literal null}.
*/
public MappingMongoEntityInformation(MongoPersistentEntity<T> entity) {
this(entity, null);
this(entity, null, null);
}
/**
* Creates a new {@link MappingMongoEntityInformation} for the given {@link MongoPersistentEntity} and fallback
* identifier type.
*
* @param entity must not be {@literal null}.
* @param fallbackIdType can be {@literal null}.
*/
public MappingMongoEntityInformation(MongoPersistentEntity<T> entity, Class<ID> fallbackIdType) {
this(entity, (String) null, fallbackIdType);
}
/**
@@ -51,11 +65,26 @@ public class MappingMongoEntityInformation<T, ID extends Serializable> extends P
* @param customCollectionName can be {@literal null}.
*/
public MappingMongoEntityInformation(MongoPersistentEntity<T> entity, String customCollectionName) {
this(entity, customCollectionName, null);
}
/**
* Creates a new {@link MappingMongoEntityInformation} for the given {@link MongoPersistentEntity}, collection name
* and identifier type.
*
* @param entity must not be {@literal null}.
* @param customCollectionName can be {@literal null}.
* @param idType can be {@literal null}.
*/
@SuppressWarnings("unchecked")
private MappingMongoEntityInformation(MongoPersistentEntity<T> entity, String customCollectionName,
Class<ID> idType) {
super(entity);
this.entityMetadata = entity;
this.customCollectionName = customCollectionName;
this.fallbackIdType = idType != null ? idType : (Class<ID>) ObjectId.class;
}
/* (non-Javadoc)
@@ -71,4 +100,19 @@ public class MappingMongoEntityInformation<T, ID extends Serializable> extends P
public String getIdAttribute() {
return entityMetadata.getIdProperty().getName();
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.PersistentEntityInformation#getIdType()
*/
@Override
@SuppressWarnings("unchecked")
public Class<ID> getIdType() {
if (this.entityMetadata.hasIdProperty()) {
return super.getIdType();
}
return fallbackIdType != null ? fallbackIdType : (Class<ID>) ObjectId.class;
}
}

View File

@@ -47,6 +47,7 @@ import org.springframework.util.Assert;
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MongoRepositoryFactory extends RepositoryFactorySupport {
@@ -88,7 +89,8 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
@Override
protected Object getTargetRepository(RepositoryInformation information) {
MongoEntityInformation<?, Serializable> entityInformation = getEntityInformation(information.getDomainType());
MongoEntityInformation<?, Serializable> entityInformation = getEntityInformation(information.getDomainType(),
information);
return getTargetRepositoryViaReflection(information, entityInformation, mongoOperations);
}
@@ -105,9 +107,13 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.RepositoryFactorySupport#getEntityInformation(java.lang.Class)
*/
@Override
@SuppressWarnings("unchecked")
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass) {
return getEntityInformation(domainClass, null);
}
@SuppressWarnings("unchecked")
private <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass,
RepositoryInformation information) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(domainClass);
@@ -116,7 +122,8 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
String.format("Could not lookup mapping metadata for domain class %s!", domainClass.getName()));
}
return new MappingMongoEntityInformation<T, ID>((MongoPersistentEntity<T>) entity);
return new MappingMongoEntityInformation<T, ID>((MongoPersistentEntity<T>) entity,
information != null ? (Class<ID>) information.getIdType() : null);
}
/**

View File

@@ -26,6 +26,7 @@ import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
@@ -74,6 +75,20 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
this.mapper = new QueryMapper(converter);
}
/*
* (non-Javadoc)
* @see com.querydsl.mongodb.MongodbSerializer#visit(com.querydsl.core.types.Constant, java.lang.Void)
*/
@Override
public Object visit(Constant<?> expr, Void context) {
if (!ClassUtils.isAssignable(Enum.class, expr.getType())) {
return super.visit(expr, context);
}
return converter.convertToMongoType(expr.getConstant());
}
/*
* (non-Javadoc)
* @see com.mysema.query.mongodb.MongodbSerializer#getKeyForPath(com.mysema.query.types.Path, com.mysema.query.types.PathMetadata)
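A hedged sketch of the kind of setup the new visit(Constant, ...) override is meant to respect: an enum with a registered writing converter. Types, values and the Querydsl path are assumptions:

enum Color {
  RED, GREEN
}

@WritingConverter
class ColorToIntegerConverter implements Converter<Color, Integer> {

  @Override
  public Integer convert(Color source) {
    return source.ordinal();
  }
}

// With the converter registered via CustomConversions, a Querydsl predicate such as
//   person.favoriteColor.eq(Color.RED)
// is serialized through converter.convertToMongoType(...) and yields 0 rather than "RED".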

View File

@@ -50,6 +50,7 @@ import com.mongodb.WriteConcern;
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Viktor Khoroshko
*/
public class MongoDbFactoryParserIntegrationTests {
@@ -198,6 +199,52 @@ public class MongoDbFactoryParserIntegrationTests {
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-client-uri-and-details.xml"));
}
/**
* @see DATAMONGO-1293
*/
@Test
public void setsUpClientUriWithId() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-client-uri-and-id.xml"));
BeanDefinition definition = factory.getBeanDefinition("testMongo");
ConstructorArgumentValues constructorArguments = definition.getConstructorArgumentValues();
assertThat(constructorArguments.getArgumentCount(), is(1));
ValueHolder argument = constructorArguments.getArgumentValue(0, MongoClientURI.class);
assertThat(argument, is(notNullValue()));
}
/**
* @see DATAMONGO-1293
*/
@Test
public void setsUpUriWithId() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-uri-and-id.xml"));
BeanDefinition definition = factory.getBeanDefinition("testMongo");
ConstructorArgumentValues constructorArguments = definition.getConstructorArgumentValues();
assertThat(constructorArguments.getArgumentCount(), is(1));
ValueHolder argument = constructorArguments.getArgumentValue(0, MongoClientURI.class);
assertThat(argument, is(notNullValue()));
}
/**
* @see DATAMONGO-1293
*/
@Test(expected = BeanDefinitionParsingException.class)
public void rejectsClientUriPlusDetailedConfigurationAndWriteConcern() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-client-uri-write-concern-and-details.xml"));
}
/**
* @see DATAMONGO-1293
*/
@Test(expected = BeanDefinitionParsingException.class)
public void rejectsUriPlusDetailedConfigurationAndWriteConcern() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-client-uri-write-concern-and-details.xml"));
}
private static void assertWriteConcern(ClassPathXmlApplicationContext ctx, WriteConcern expectedWriteConcern) {
SimpleMongoDbFactory dbFactory = ctx.getBean("first", SimpleMongoDbFactory.class);

View File

@@ -0,0 +1,83 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.mockito.Mockito.*;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoTemplate.CloseableIterableCursorAdapter;
import org.springframework.data.mongodb.core.MongoTemplate.DbObjectCallback;
import org.springframework.data.util.CloseableIterator;
import com.mongodb.Cursor;
/**
* Unit tests for {@link CloseableIterableCursorAdapter}.
*
* @author Oliver Gierke
* @see DATAMONGO-1276
*/
@RunWith(MockitoJUnitRunner.class)
public class CloseableIterableCursorAdapterUnitTests {
@Mock PersistenceExceptionTranslator exceptionTranslator;
@Mock DbObjectCallback<Object> callback;
Cursor cursor;
CloseableIterator<Object> adapter;
@Before
public void setUp() {
this.cursor = doThrow(IllegalArgumentException.class).when(mock(Cursor.class));
this.adapter = new CloseableIterableCursorAdapter<Object>(cursor, exceptionTranslator, callback);
}
/**
* @see DATAMONGO-1276
*/
@Test(expected = IllegalArgumentException.class)
public void propagatesOriginalExceptionFromAdapterDotNext() {
cursor.next();
adapter.next();
}
/**
* @see DATAMONGO-1276
*/
@Test(expected = IllegalArgumentException.class)
public void propagatesOriginalExceptionFromAdapterDotHasNext() {
cursor.hasNext();
adapter.hasNext();
}
/**
* @see DATAMONGO-1276
*/
@Test(expected = IllegalArgumentException.class)
public void propagatesOriginalExceptionFromAdapterDotClose() {
cursor.close();
adapter.close();
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014-2015 the original author or authors.
* Copyright 2014-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -190,4 +190,12 @@ public class DefaultScriptOperationsTests {
public void scriptNamesShouldReturnEmptySetWhenNoScriptRegistered() {
assertThat(scriptOps.getScriptNames(), is(empty()));
}
/**
* @see DATAMONGO-1465
*/
@Test
public void executeShouldNotQuoteStrings() {
assertThat(scriptOps.execute(EXECUTABLE_SCRIPT, "spring-data"), is((Object) "spring-data"));
}
}

View File

@@ -0,0 +1,65 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import com.mongodb.BasicDBObject;
/**
* Unit tests for {@link GeoCommandStatistics}.
*
* @author Oliver Gierke
* @soundtrack Fruitcake - Jeff Coffin (The Inside of the Outside)
*/
public class GeoCommandStatisticsUnitTests {
/**
* @see DATAMONGO-1361
*/
@Test(expected = IllegalArgumentException.class)
public void rejectsNullCommandResult() {
GeoCommandStatistics.from(null);
}
/**
* @see DATAMONGO-1361
*/
@Test
public void fallsBackToNanIfNoAverageDistanceIsAvailable() {
GeoCommandStatistics statistics = GeoCommandStatistics.from(new BasicDBObject("stats", null));
assertThat(statistics.getAverageDistance(), is(Double.NaN));
statistics = GeoCommandStatistics.from(new BasicDBObject("stats", new BasicDBObject()));
assertThat(statistics.getAverageDistance(), is(Double.NaN));
}
/**
* @see DATAMONGO-1361
*/
@Test
public void returnsAverageDistanceIfPresent() {
GeoCommandStatistics statistics = GeoCommandStatistics
.from(new BasicDBObject("stats", new BasicDBObject("avgDistance", 1.5)));
assertThat(statistics.getAverageDistance(), is(1.5));
}
}

View File

@@ -36,7 +36,7 @@ import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
@ContextConfiguration("classpath:infrastructure.xml")
public class MongoAdminIntegrationTests {
private static Log logger = LogFactory.getLog(MongoAdminIntegrationTests.class);
private static final Log logger = LogFactory.getLog(MongoAdminIntegrationTests.class);
@SuppressWarnings("unused")
private DB testAdminDb;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -63,6 +63,7 @@ import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.LazyLoadingProxy;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.Index.Duplicates;
import org.springframework.data.mongodb.core.index.IndexField;
@@ -197,6 +198,7 @@ public class MongoTemplateTests {
template.dropCollection(SomeTemplate.class);
template.dropCollection(Address.class);
template.dropCollection(DocumentWithCollectionOfSamples.class);
template.dropCollection(WithGeoJson.class);
}
@Test
@@ -3035,7 +3037,7 @@ public class MongoTemplateTests {
*/
@Test
public void takesLimitIntoAccountWhenStreaming() {
Person youngestPerson = new Person("John", 20);
Person oldestPerson = new Person("Jane", 42);
@@ -3049,11 +3051,124 @@ public class MongoTemplateTests {
assertThat(stream.hasNext(), is(false));
}
/**
* @see DATAMONGO-1204
*/
@Test
public void resolvesCyclicDBRefCorrectly() {
SomeMessage message = new SomeMessage();
SomeContent content = new SomeContent();
template.save(message);
template.save(content);
message.dbrefContent = content;
content.dbrefMessage = message;
template.save(message);
template.save(content);
SomeMessage messageLoaded = template.findOne(query(where("id").is(message.id)), SomeMessage.class);
SomeContent contentLoaded = template.findOne(query(where("id").is(content.id)), SomeContent.class);
assertThat(messageLoaded.dbrefContent.id, is(contentLoaded.id));
assertThat(contentLoaded.dbrefMessage.id, is(messageLoaded.id));
}
/**
* @see DATAMONGO-1287
*/
@Test
public void shouldReuseAlreadyResolvedLazyLoadedDBRefWhenUsedAsPersistenceConstrcutorArgument() {
Document docInCtor = new Document();
docInCtor.id = "doc-in-ctor";
template.save(docInCtor);
DocumentWithLazyDBrefUsedInPresistenceConstructor source = new DocumentWithLazyDBrefUsedInPresistenceConstructor(
docInCtor);
template.save(source);
DocumentWithLazyDBrefUsedInPresistenceConstructor loaded = template.findOne(query(where("id").is(source.id)),
DocumentWithLazyDBrefUsedInPresistenceConstructor.class);
assertThat(loaded.refToDocUsedInCtor, not(instanceOf(LazyLoadingProxy.class)));
assertThat(loaded.refToDocNotUsedInCtor, nullValue());
}
/**
* @see DATAMONGO-1287
*/
@Test
public void shouldNotReuseLazyLoadedDBRefWhenTypeUsedInPersistenceConstrcutorButValueRefersToAnotherProperty() {
Document docNotUsedInCtor = new Document();
docNotUsedInCtor.id = "doc-but-not-used-in-ctor";
template.save(docNotUsedInCtor);
DocumentWithLazyDBrefUsedInPresistenceConstructor source = new DocumentWithLazyDBrefUsedInPresistenceConstructor(
null);
source.refToDocNotUsedInCtor = docNotUsedInCtor;
template.save(source);
DocumentWithLazyDBrefUsedInPresistenceConstructor loaded = template.findOne(query(where("id").is(source.id)),
DocumentWithLazyDBrefUsedInPresistenceConstructor.class);
assertThat(loaded.refToDocNotUsedInCtor, instanceOf(LazyLoadingProxy.class));
assertThat(loaded.refToDocUsedInCtor, nullValue());
}
/**
* @see DATAMONGO-1287
*/
@Test
public void shouldRespectParamterValueWhenAttemptingToReuseLazyLoadedDBRefUsedInPersistenceConstrcutor() {
Document docInCtor = new Document();
docInCtor.id = "doc-in-ctor";
template.save(docInCtor);
Document docNotUsedInCtor = new Document();
docNotUsedInCtor.id = "doc-but-not-used-in-ctor";
template.save(docNotUsedInCtor);
DocumentWithLazyDBrefUsedInPresistenceConstructor source = new DocumentWithLazyDBrefUsedInPresistenceConstructor(
docInCtor);
source.refToDocNotUsedInCtor = docNotUsedInCtor;
template.save(source);
DocumentWithLazyDBrefUsedInPresistenceConstructor loaded = template.findOne(query(where("id").is(source.id)),
DocumentWithLazyDBrefUsedInPresistenceConstructor.class);
assertThat(loaded.refToDocUsedInCtor, not(instanceOf(LazyLoadingProxy.class)));
assertThat(loaded.refToDocNotUsedInCtor, instanceOf(LazyLoadingProxy.class));
}
/**
* @see DATAMONGO-1401
*/
@Test
public void updateShouldWorkForTypesContainingGeoJsonTypes() {
WithGeoJson wgj = new WithGeoJson();
wgj.id = "1";
wgj.description = "datamongo-1401";
wgj.point = new GeoJsonPoint(1D, 2D);
template.save(wgj);
wgj.description = "datamongo-1401-update";
template.save(wgj);
assertThat(template.findOne(query(where("id").is(wgj.id)), WithGeoJson.class).point, is(equalTo(wgj.point)));
}
static class DoucmentWithNamedIdField {
@Id String someIdKey;
@Field(value = "val")//
@Field(value = "val") //
String value;
@Override
@@ -3351,6 +3466,7 @@ public class MongoTemplateTests {
String id;
String text;
String name;
@org.springframework.data.mongodb.core.mapping.DBRef SomeMessage dbrefMessage;
public String getName() {
return name;
@@ -3375,4 +3491,27 @@ public class MongoTemplateTests {
@org.springframework.data.mongodb.core.mapping.DBRef SomeContent dbrefContent;
SomeContent normalContent;
}
static class DocumentWithLazyDBrefUsedInPresistenceConstructor {
@Id String id;
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) Document refToDocUsedInCtor;
@org.springframework.data.mongodb.core.mapping.DBRef(lazy = true) Document refToDocNotUsedInCtor;
@PersistenceConstructor
public DocumentWithLazyDBrefUsedInPresistenceConstructor(Document refToDocUsedInCtor) {
this.refToDocUsedInCtor = refToDocUsedInCtor;
}
}
static class WithGeoJson {
@Id String id;
@Version //
Integer version;
String description;
GeoJsonPoint point;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -52,6 +52,7 @@ import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
@@ -66,6 +67,8 @@ import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBObject;
import com.mongodb.MapReduceCommand;
import com.mongodb.MapReduceOutput;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
@@ -422,6 +425,112 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
verify(this.db, times(1)).command(Mockito.any(DBObject.class));
}
/**
* @see DATAMONGO-1334
*/
@Test
public void mapReduceShouldUseZeroAsDefaultLimit() {
ArgumentCaptor<MapReduceCommand> captor = ArgumentCaptor.forClass(MapReduceCommand.class);
MapReduceOutput output = mock(MapReduceOutput.class);
when(output.results()).thenReturn(Collections.<DBObject> emptySet());
when(collection.mapReduce(Mockito.any(MapReduceCommand.class))).thenReturn(output);
Query query = new BasicQuery("{'foo':'bar'}");
template.mapReduce(query, "collection", "function(){}", "function(key,values){}", Wrapper.class);
verify(collection).mapReduce(captor.capture());
assertThat(captor.getValue().getLimit(), is(0));
}
/**
* @see DATAMONGO-1334
*/
@Test
public void mapReduceShouldPickUpLimitFromQuery() {
ArgumentCaptor<MapReduceCommand> captor = ArgumentCaptor.forClass(MapReduceCommand.class);
MapReduceOutput output = mock(MapReduceOutput.class);
when(output.results()).thenReturn(Collections.<DBObject> emptySet());
when(collection.mapReduce(Mockito.any(MapReduceCommand.class))).thenReturn(output);
Query query = new BasicQuery("{'foo':'bar'}");
query.limit(100);
template.mapReduce(query, "collection", "function(){}", "function(key,values){}", Wrapper.class);
verify(collection).mapReduce(captor.capture());
assertThat(captor.getValue().getLimit(), is(100));
}
/**
* @see DATAMONGO-1334
*/
@Test
public void mapReduceShouldPickUpLimitFromOptions() {
ArgumentCaptor<MapReduceCommand> captor = ArgumentCaptor.forClass(MapReduceCommand.class);
MapReduceOutput output = mock(MapReduceOutput.class);
when(output.results()).thenReturn(Collections.<DBObject> emptySet());
when(collection.mapReduce(Mockito.any(MapReduceCommand.class))).thenReturn(output);
Query query = new BasicQuery("{'foo':'bar'}");
template.mapReduce(query, "collection", "function(){}", "function(key,values){}",
new MapReduceOptions().limit(1000), Wrapper.class);
verify(collection).mapReduce(captor.capture());
assertThat(captor.getValue().getLimit(), is(1000));
}
/**
* @see DATAMONGO-1334
*/
@Test
public void mapReduceShouldPickUpLimitFromOptionsWhenQueryIsNotPresent() {
ArgumentCaptor<MapReduceCommand> captor = ArgumentCaptor.forClass(MapReduceCommand.class);
MapReduceOutput output = mock(MapReduceOutput.class);
when(output.results()).thenReturn(Collections.<DBObject> emptySet());
when(collection.mapReduce(Mockito.any(MapReduceCommand.class))).thenReturn(output);
template.mapReduce("collection", "function(){}", "function(key,values){}", new MapReduceOptions().limit(1000),
Wrapper.class);
verify(collection).mapReduce(captor.capture());
assertThat(captor.getValue().getLimit(), is(1000));
}
/**
* @see DATAMONGO-1334
*/
@Test
public void mapReduceShouldPickUpLimitFromOptionsEvenWhenQueryDefinesItDifferently() {
ArgumentCaptor<MapReduceCommand> captor = ArgumentCaptor.forClass(MapReduceCommand.class);
MapReduceOutput output = mock(MapReduceOutput.class);
when(output.results()).thenReturn(Collections.<DBObject> emptySet());
when(collection.mapReduce(Mockito.any(MapReduceCommand.class))).thenReturn(output);
Query query = new BasicQuery("{'foo':'bar'}");
query.limit(100);
template.mapReduce(query, "collection", "function(){}", "function(key,values){}",
new MapReduceOptions().limit(1000), Wrapper.class);
verify(collection).mapReduce(captor.capture());
assertThat(captor.getValue().getLimit(), is(1000));
}
class AutogenerateableId {
@Id BigInteger id;

View File

@@ -0,0 +1,131 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.core.Is.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.Map;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Integration tests for DATAMONGO-1289.
*
* @author Christoph Strobl
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class NoExplicitIdTests {
@Configuration
@EnableMongoRepositories(considerNestedRepositories = true)
static class Config extends AbstractMongoConfiguration {
@Override
protected String getDatabaseName() {
return "test";
}
@Override
public Mongo mongo() throws Exception {
return new MongoClient();
}
}
@Autowired MongoOperations mongoOps;
@Autowired TypeWithoutExplicitIdPropertyRepository repo;
@Before
public void setUp() {
mongoOps.dropCollection(TypeWithoutIdProperty.class);
}
/**
* @see DATAMONGO-1289
*/
@Test
public void saveAndRetrieveTypeWithoutIdPorpertyViaTemplate() {
TypeWithoutIdProperty noid = new TypeWithoutIdProperty();
noid.someString = "o.O";
mongoOps.save(noid);
TypeWithoutIdProperty retrieved = mongoOps.findOne(query(where("someString").is(noid.someString)),
TypeWithoutIdProperty.class);
assertThat(retrieved.someString, is(noid.someString));
}
/**
* @see DATAMONGO-1289
*/
@Test
public void saveAndRetrieveTypeWithoutIdPorpertyViaRepository() {
TypeWithoutIdProperty noid = new TypeWithoutIdProperty();
noid.someString = "o.O";
repo.save(noid);
TypeWithoutIdProperty retrieved = repo.findBySomeString(noid.someString);
assertThat(retrieved.someString, is(noid.someString));
}
/**
* @see DATAMONGO-1289
*/
@Test
@SuppressWarnings("unchecked")
public void saveAndRetrieveTypeWithoutIdPorpertyViaRepositoryFindOne() {
TypeWithoutIdProperty noid = new TypeWithoutIdProperty();
noid.someString = "o.O";
repo.save(noid);
Map<String, Object> map = mongoOps.findOne(query(where("someString").is(noid.someString)), Map.class,
"typeWithoutIdProperty");
TypeWithoutIdProperty retrieved = repo.findOne(map.get("_id").toString());
assertThat(retrieved.someString, is(noid.someString));
}
static class TypeWithoutIdProperty {
String someString;
}
static interface TypeWithoutExplicitIdPropertyRepository extends MongoRepository<TypeWithoutIdProperty, String> {
TypeWithoutIdProperty findBySomeString(String someString);
}
}

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core;
import java.util.Arrays;
import org.joda.time.LocalDate;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.mongodb.core.mapping.Document;
@@ -24,10 +25,10 @@ import org.springframework.data.mongodb.core.mapping.Document;
@Document(collection = "newyork")
public class Venue {
@Id
private String id;
@Id private String id;
private String name;
private double[] location;
private LocalDate openingDate;
@PersistenceConstructor
Venue(String name, double[] location) {
@@ -50,6 +51,14 @@ public class Venue {
return location;
}
public LocalDate getOpeningDate() {
return openingDate;
}
public void setOpeningDate(LocalDate openingDate) {
this.openingDate = openingDate;
}
@Override
public String toString() {
return "Venue [id=" + id + ", name=" + name + ", location=" + Arrays.toString(location) + "]";

View File

@@ -0,0 +1,90 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.mockito.Mockito.*;
import org.junit.Test;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MongoConverters.ObjectIdToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToObjectIdConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.TypeInformation;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* Unit tests for {@link AbstractMongoConverter}.
*
* @author Oliver Gierke
*/
public class AbstractMongoConverterUnitTests {
/**
* @see DATAMONGO-1324
*/
@Test
public void registersObjectIdConvertersExplicitly() {
DefaultConversionService conversionService = spy(new DefaultConversionService());
new SampleMongoConverter(conversionService).afterPropertiesSet();
verify(conversionService).addConverter(StringToObjectIdConverter.INSTANCE);
verify(conversionService).addConverter(ObjectIdToStringConverter.INSTANCE);
}
static class SampleMongoConverter extends AbstractMongoConverter {
public SampleMongoConverter(GenericConversionService conversionService) {
super(conversionService);
}
@Override
public MongoTypeMapper getTypeMapper() {
throw new UnsupportedOperationException();
}
@Override
public MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getMappingContext() {
throw new UnsupportedOperationException();
}
@Override
public <R> R read(Class<R> type, DBObject source) {
throw new UnsupportedOperationException();
}
@Override
public void write(Object source, DBObject sink) {
throw new UnsupportedOperationException();
}
@Override
public Object convertToMongoType(Object obj, TypeInformation<?> typeInformation) {
throw new UnsupportedOperationException();
}
@Override
public DBRef toDBRef(Object object, MongoPersistentProperty referingProperty) {
throw new UnsupportedOperationException();
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -21,7 +21,9 @@ import static org.junit.Assert.*;
import java.net.URL;
import java.text.DateFormat;
import java.text.Format;
import java.text.SimpleDateFormat;
import java.util.Arrays;
import java.util.Collections;
import java.util.Date;
import java.util.Locale;
import java.util.UUID;
@@ -32,8 +34,10 @@ import org.joda.time.DateTime;
import org.junit.Test;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.core.convert.converter.Converter;
import org.springframework.core.convert.converter.ConverterFactory;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigIntegerConverter;
import org.threeten.bp.LocalDateTime;
@@ -43,12 +47,11 @@ import com.mongodb.DBRef;
* Unit tests for {@link CustomConversions}.
*
* @author Oliver Gierke
* @auhtor Christoph Strobl
* @author Christoph Strobl
*/
public class CustomConversionsUnitTests {
@Test
@SuppressWarnings("unchecked")
public void findsBasicReadAndWriteConversions() {
CustomConversions conversions = new CustomConversions(Arrays.asList(FormatToStringConverter.INSTANCE,
@@ -62,7 +65,6 @@ public class CustomConversionsUnitTests {
}
@Test
@SuppressWarnings("unchecked")
public void considersSubtypesCorrectly() {
CustomConversions conversions = new CustomConversions(Arrays.asList(NumberToStringConverter.INSTANCE,
@@ -132,6 +134,7 @@ public class CustomConversionsUnitTests {
*/
@Test
public void doesNotConsiderTypeSimpleIfOnlyReadConverterIsRegistered() {
CustomConversions conversions = new CustomConversions(Arrays.asList(StringToFormatConverter.INSTANCE));
assertThat(conversions.isSimpleType(Format.class), is(false));
}
@@ -257,6 +260,17 @@ public class CustomConversionsUnitTests {
assertThat(customConversions.hasCustomWriteTarget(LocalDateTime.class), is(true));
}
/**
* @see DATAMONGO-1302
*/
@Test
public void registersConverterFactoryCorrectly() {
CustomConversions customConversions = new CustomConversions(Collections.singletonList(new FormatConverterFactory()));
assertThat(customConversions.getCustomWriteTarget(String.class, SimpleDateFormat.class), notNullValue());
}
private static Class<?> createProxyTypeFor(Class<?> type) {
ProxyFactory factory = new ProxyFactory();
@@ -331,4 +345,37 @@ public class CustomConversionsUnitTests {
}
}
@WritingConverter
static class FormatConverterFactory implements ConverterFactory<String, Format> {
@Override
public <T extends Format> Converter<String, T> getConverter(Class<T> targetType) {
return new StringToFormat<T>(targetType);
}
private static final class StringToFormat<T extends Format> implements Converter<String, T> {
private final Class<T> targetType;
public StringToFormat(Class<T> targetType) {
this.targetType = targetType;
}
@Override
public T convert(String source) {
if (source.length() == 0) {
return null;
}
try {
return targetType.newInstance();
} catch (Exception e) {
throw new IllegalArgumentException(e.getMessage(), e);
}
}
}
}
}

View File

@@ -79,6 +79,42 @@ public class DBObjectAccessorUnitTests {
new DBObjectAccessor(null);
}
/**
* @see DATAMONGO-1335
*/
@Test
public void writesAllNestingsCorrectly() {
MongoPersistentEntity<?> entity = context.getPersistentEntity(TypeWithTwoNestings.class);
BasicDBObject target = new BasicDBObject();
DBObjectAccessor accessor = new DBObjectAccessor(target);
accessor.put(entity.getPersistentProperty("id"), "id");
accessor.put(entity.getPersistentProperty("b"), "b");
accessor.put(entity.getPersistentProperty("c"), "c");
DBObject nestedA = DBObjectTestUtils.getAsDBObject(target, "a");
assertThat(nestedA, is(notNullValue()));
assertThat(nestedA.get("b"), is((Object) "b"));
assertThat(nestedA.get("c"), is((Object) "c"));
}
/**
* @see DATAMONGO-1471
*/
@Test
public void exposesAvailabilityOfFields() {
DBObjectAccessor accessor = new DBObjectAccessor(new BasicDBObject("a", new BasicDBObject("c", "d")));
MongoPersistentEntity<?> entity = context.getPersistentEntity(ProjectingType.class);
assertThat(accessor.hasValue(entity.getPersistentProperty("foo")), is(false));
assertThat(accessor.hasValue(entity.getPersistentProperty("a")), is(true));
assertThat(accessor.hasValue(entity.getPersistentProperty("name")), is(false));
}
static class ProjectingType {
String name;
@@ -91,4 +127,10 @@ public class DBObjectAccessorUnitTests {
String c;
}
static class TypeWithTwoNestings {
String id;
@Field("a.b") String b;
@Field("a.c") String c;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -48,9 +48,9 @@ public class DefaultMongoTypeMapperUnitTests {
@Before
public void setUp() {
configurableTypeInformationMapper = new ConfigurableTypeInformationMapper(Collections.singletonMap(String.class,
"1"));
simpleTypeInformationMapper = SimpleTypeInformationMapper.INSTANCE;
configurableTypeInformationMapper = new ConfigurableTypeInformationMapper(
Collections.singletonMap(String.class, "1"));
simpleTypeInformationMapper = new SimpleTypeInformationMapper();
typeMapper = new DefaultMongoTypeMapper();
}
@@ -81,8 +81,8 @@ public class DefaultMongoTypeMapperUnitTests {
@Test
public void writesClassNamesForUnmappedValuesIfConfigured() {
typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, Arrays.asList(
configurableTypeInformationMapper, simpleTypeInformationMapper));
typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY,
Arrays.asList(configurableTypeInformationMapper, simpleTypeInformationMapper));
writesTypeToField(new BasicDBObject(), String.class, "1");
writesTypeToField(new BasicDBObject(), Object.class, Object.class.getName());
@@ -101,11 +101,12 @@ public class DefaultMongoTypeMapperUnitTests {
@Test
public void readsTypeLoadingClassesForUnmappedTypesIfConfigured() {
typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, Arrays.asList(
configurableTypeInformationMapper, simpleTypeInformationMapper));
typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY,
Arrays.asList(configurableTypeInformationMapper, simpleTypeInformationMapper));
readsTypeFromField(new BasicDBObject(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, "1"), String.class);
readsTypeFromField(new BasicDBObject(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, Object.class.getName()), Object.class);
readsTypeFromField(new BasicDBObject(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, Object.class.getName()),
Object.class);
}
/**
@@ -144,7 +145,8 @@ public class DefaultMongoTypeMapperUnitTests {
@Test
public void readsTypeFromDefaultKeyByDefault() {
readsTypeFromField(new BasicDBObject(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, String.class.getName()), String.class);
readsTypeFromField(new BasicDBObject(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, String.class.getName()),
String.class);
}
@Test

View File

@@ -2066,6 +2066,14 @@ public class MappingMongoConverterUnitTests {
assertThat(target.map.get(FooBarEnum.FOO), is("spring"));
}
/**
* @see DATAMONGO-1471
*/
@Test
public void readsDocumentWithPrimitiveIdButNoValue() {
assertThat(converter.read(ClassWithIntId.class, new BasicDBObject()), is(notNullValue()));
}
static class GenericType<T> {
T content;
}
@@ -2402,10 +2410,10 @@ public class MappingMongoConverterUnitTests {
return null;
}
if (source.equals("foo-enum-value")) {
if ("foo-enum-value".equals(source)) {
return FooBarEnum.FOO;
}
if (source.equals("bar-enum-value")) {
if ("bar-enum-value".equals(source)) {
return FooBarEnum.BAR;
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -53,6 +53,7 @@ import org.springframework.data.mongodb.core.mapping.TextScore;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.test.util.BasicDbListBuilder;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
@@ -67,6 +68,7 @@ import com.mongodb.QueryBuilder;
* @author Patryk Wasik
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
*/
@RunWith(MockitoJUnitRunner.class)
public class QueryMapperUnitTests {
@@ -594,6 +596,28 @@ public class QueryMapperUnitTests {
assertThat(dbo.toString(), equalTo("{ \"embedded\" : { \"$in\" : [ { \"_id\" : \"1\"} , { \"_id\" : \"2\"}]}}"));
}
/**
* @see DATAMONGO-1406
*/
@Test
public void shouldMapQueryForNestedCustomizedPropertiesUsingConfiguredFieldNames() {
EmbeddedClass embeddedClass = new EmbeddedClass();
embeddedClass.customizedField = "hello";
Foo foo = new Foo();
foo.listOfItems = Arrays.asList(embeddedClass);
Query query = new Query(Criteria.where("listOfItems") //
.elemMatch(new Criteria(). //
andOperator(Criteria.where("customizedField").is(embeddedClass.customizedField))));
DBObject dbo = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(Foo.class));
assertThat(dbo, isBsonObject().containing("my_items.$elemMatch.$and",
new BasicDbListBuilder().add(new BasicDBObject("fancy_custom_name", embeddedClass.customizedField)).get()));
}
/**
* @see DATAMONGO-647
*/
@@ -822,10 +846,14 @@ public class QueryMapperUnitTests {
public class Foo {
@Id private ObjectId id;
EmbeddedClass embedded;
@Field("my_items") List<EmbeddedClass> listOfItems;
}
public class EmbeddedClass {
public String id;
@Field("fancy_custom_name") public String customizedField;
}
class IdWrapper {

View File

@@ -857,6 +857,33 @@ public class UpdateMapperUnitTests {
assertThat($set.get("concreteValue.name"), nullValue());
}
/**
* @see DATAMONGO-1423
*/
@Test
@SuppressWarnings("unchecked")
public void mappingShouldConsiderCustomConvertersForEnumMapKeys() {
CustomConversions conversions = new CustomConversions(
Arrays.asList(AllocationToStringConverter.INSTANCE, StringToAllocationConverter.INSTANCE));
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
mappingContext.afterPropertiesSet();
MappingMongoConverter converter = new MappingMongoConverter(mock(DbRefResolver.class), mappingContext);
converter.setCustomConversions(conversions);
converter.afterPropertiesSet();
UpdateMapper mapper = new UpdateMapper(converter);
Update update = new Update().set("enumAsMapKey", Collections.singletonMap(Allocation.AVAILABLE, 100));
DBObject result = mapper.getMappedObject(update.getUpdateObject(),
mappingContext.getPersistentEntity(ClassWithEnum.class));
assertThat(result, isBsonObject().containing("$set.enumAsMapKey.V", 100));
}
static class DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes {
ListModelWrapper concreteTypeWithListAttributeOfInterfaceType;
}
@@ -1083,6 +1110,7 @@ public class UpdateMapperUnitTests {
static class ClassWithEnum {
Allocation allocation;
Map<Allocation, String> enumAsMapKey;
static enum Allocation {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -22,6 +22,7 @@ import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.List;
import org.joda.time.LocalDate;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
@@ -49,6 +50,7 @@ import com.mongodb.WriteConcern;
/**
* @author Christoph Strobl
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
@@ -173,4 +175,13 @@ public abstract class AbstractGeoSpatialTests {
assertThat(venues.size(), is(11));
}
/**
* @see DATAMONGO-1360
*/
@Test
public void mapsQueryContainedInNearQuery() {
Query query = query(where("openingDate").lt(LocalDate.now()));
template.geoNear(NearQuery.near(1.5, 1.7).query(query), Venue.class);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2015 the original author or authors.
* Copyright 2015-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,6 +29,7 @@ import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.dao.DataAccessException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.geo.GeoResults;
@@ -36,6 +37,7 @@ import org.springframework.data.geo.Metric;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexType;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexed;
@@ -43,11 +45,15 @@ import org.springframework.data.mongodb.core.index.GeospatialIndex;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.test.util.BasicDbListBuilder;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoException;
import com.mongodb.WriteConcern;
/**
@@ -317,6 +323,66 @@ public class GeoJsonTests {
assertThat(venues.size(), is(2));
}
/**
* @see DATAMONGO-1453
*/
@Test
public void shouldConvertPointRepresentationCorrectlyWhenSourceCoordinatesUsesInteger() {
this.template.execute(template.getCollectionName(DocumentWithPropertyUsingGeoJsonType.class),
new CollectionCallback<Object>() {
@Override
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
BasicDBObject pointRepresentation = new BasicDBObject();
pointRepresentation.put("type", "Point");
pointRepresentation.put("coordinates", new BasicDbListBuilder().add(0).add(0).get());
BasicDBObject document = new BasicDBObject();
document.append("_id", "datamongo-1453");
document.append("geoJsonPoint", pointRepresentation);
return collection.save(document);
}
});
assertThat(template.findOne(query(where("id").is("datamongo-1453")),
DocumentWithPropertyUsingGeoJsonType.class).geoJsonPoint, is(equalTo(new GeoJsonPoint(0D, 0D))));
}
/**
* @see DATAMONGO-1453
*/
@Test
public void shouldConvertLineStringRepresentationCorrectlyWhenSourceCoordinatesUsesInteger() {
this.template.execute(template.getCollectionName(DocumentWithPropertyUsingGeoJsonType.class),
new CollectionCallback<Object>() {
@Override
public Object doInCollection(DBCollection collection) throws MongoException, DataAccessException {
BasicDBObject lineStringRepresentation = new BasicDBObject();
lineStringRepresentation.put("type", "LineString");
lineStringRepresentation.put("coordinates",
new BasicDbListBuilder().add(new BasicDbListBuilder().add(0).add(0).get())
.add(new BasicDbListBuilder().add(1).add(1).get()).get());
BasicDBObject document = new BasicDBObject();
document.append("_id", "datamongo-1453");
document.append("geoJsonLineString", lineStringRepresentation);
return collection.save(document);
}
});
assertThat(
template.findOne(query(where("id").is("datamongo-1453")),
DocumentWithPropertyUsingGeoJsonType.class).geoJsonLineString,
is(equalTo(new GeoJsonLineString(new Point(0D, 0D), new Point(1, 1)))));
}
private void addVenues() {
template.insert(new Venue2DSphere("Penn Station", -73.99408, 40.75057));

View File

@@ -26,6 +26,7 @@ import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Suite;
import org.junit.runners.Suite.SuiteClasses;
import org.springframework.data.annotation.Id;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.DBObjectTestUtils;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
@@ -148,6 +149,34 @@ public class MongoPersistentEntityIndexResolverUnitTests {
assertThat(indexDefinitions.get(0).getCollection(), equalTo("CollectionOverride"));
}
/**
* @see DATAMONGO-1297
*/
@Test
public void resolvesIndexOnDbrefWhenDefined() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(WithDbRef.class);
assertThat(indexDefinitions, hasSize(1));
assertThat(indexDefinitions.get(0).getCollection(), equalTo("withDbRef"));
assertThat(indexDefinitions.get(0).getIndexKeys(), equalTo(new BasicDBObjectBuilder().add("indexedDbRef", 1)
.get()));
}
/**
* @see DATAMONGO-1297
*/
@Test
public void resolvesIndexOnDbrefWhenDefinedOnNestedElement() {
List<IndexDefinitionHolder> indexDefinitions = prepareMappingContextAndResolveIndexForType(WrapperOfWithDbRef.class);
assertThat(indexDefinitions, hasSize(1));
assertThat(indexDefinitions.get(0).getCollection(), equalTo("wrapperOfWithDbRef"));
assertThat(indexDefinitions.get(0).getIndexKeys(),
equalTo(new BasicDBObjectBuilder().add("nested.indexedDbRef", 1).get()));
}
@Document(collection = "Zero")
static class IndexOnLevelZero {
@Indexed String indexedProperty;
@@ -182,6 +211,24 @@ public class MongoPersistentEntityIndexResolverUnitTests {
@Indexed @Field("customFieldName") String namedProperty;
}
@Document
static class WrapperOfWithDbRef {
WithDbRef nested;
}
@Document
static class WithDbRef {
@Indexed//
@DBRef//
NoIndex indexedDbRef;
}
@Document(collection = "no-index")
static class NoIndex {
@Id String id;
}
}
/**

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,12 +15,40 @@
*/
package org.springframework.data.mongodb.core.mapreduce;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import org.junit.Test;
/**
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
public class MapReduceOptionsTests {
@Test
public void testFinalize() {
new MapReduceOptions().finalizeFunction("code");
}
/**
* @see DATAMONGO-1334
*/
@Test
public void limitShouldBeIncludedCorrectly() {
MapReduceOptions options = new MapReduceOptions();
options.limit(10);
assertThat(options.getOptionsObject(), isBsonObject().containing("limit", 10));
}
/**
* @see DATAMONGO-1334
*/
@Test
public void limitShouldNotBePresentInDboWhenNotSet() {
assertThat(new MapReduceOptions().getOptionsObject(), isBsonObject().notContaining("limit"));
}
}

View File

@@ -118,13 +118,13 @@ public class MapReduceTests {
int size = 0;
for (ContentAndVersion cv : results) {
if (cv.getId().equals("Resume")) {
if ("Resume".equals(cv.getId())) {
assertEquals(6, cv.getValue().longValue());
}
if (cv.getId().equals("Schema")) {
if ("Schema".equals(cv.getId())) {
assertEquals(2, cv.getValue().longValue());
}
if (cv.getId().equals("mongoDB How-To")) {
if ("mongoDB How-To".equals(cv.getId())) {
assertEquals(2, cv.getValue().longValue());
}
size++;
@@ -141,13 +141,13 @@ public class MapReduceTests {
new MapReduceOptions().outputCollection("jmr2_out"), NumberAndVersion.class);
int size = 0;
for (NumberAndVersion nv : results) {
if (nv.getId().equals("1")) {
if ("1".equals(nv.getId())) {
assertEquals(2, nv.getValue().longValue());
}
if (nv.getId().equals("2")) {
if ("2".equals(nv.getId())) {
assertEquals(6, nv.getValue().longValue());
}
if (nv.getId().equals("3")) {
if ("3".equals(nv.getId())) {
assertEquals(2, nv.getValue().longValue());
}
size++;

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,8 +18,7 @@ package org.springframework.data.mongodb.core.query;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import nl.jqno.equalsverifier.EqualsVerifier;
import nl.jqno.equalsverifier.Warning;
import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import org.junit.Test;
import org.springframework.data.domain.Sort.Direction;
@@ -27,11 +26,15 @@ import org.springframework.data.domain.Sort.Direction;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import nl.jqno.equalsverifier.EqualsVerifier;
import nl.jqno.equalsverifier.Warning;
/**
* Unit tests for {@link BasicQuery}.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @author John Willemin
*/
public class BasicQueryUnitTests {
@@ -137,4 +140,48 @@ public class BasicQueryUnitTests {
assertThat(query1, is(not(equalTo(query2))));
assertThat(query1.hashCode(), is(not(query2.hashCode())));
}
/**
* @see DATAMONGO-1387
*/
@Test
public void returnsFieldsCorrectly() {
String qry = "{ \"name\" : \"Thomas\"}";
String fields = "{\"name\":1, \"age\":1}";
BasicQuery query1 = new BasicQuery(qry, fields);
assertThat(query1.getFieldsObject(), isBsonObject().containing("name").containing("age"));
}
/**
* @see DATAMONGO-1387
*/
@Test
public void handlesFieldsIncludeCorrectly() {
String qry = "{ \"name\" : \"Thomas\"}";
BasicQuery query1 = new BasicQuery(qry);
query1.fields().include("name");
assertThat(query1.getFieldsObject(), isBsonObject().containing("name"));
}
/**
* @see DATAMONGO-1387
*/
@Test
public void combinesFieldsIncludeCorrectly() {
String qry = "{ \"name\" : \"Thomas\"}";
String fields = "{\"name\":1, \"age\":1}";
BasicQuery query1 = new BasicQuery(qry, fields);
query1.fields().include("gender");
assertThat(query1.getFieldsObject(), isBsonObject().containing("name").containing("age").containing("gender"));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -23,9 +23,11 @@ import java.util.Map;
import org.joda.time.DateTime;
import org.junit.Test;
import org.springframework.data.mongodb.core.DBObjectTestUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.DBObject;
/**
* Test cases for {@link Update}.
@@ -484,4 +486,22 @@ public class UpdateTests {
public void pushShouldThrowExceptionWhenGivenNegativePosition() {
new Update().push("foo").atPosition(-1).each("booh");
}
/**
* @see DATAMONGO-1346
*/
@Test
public void registersMultiplePullAllClauses() {
Update update = new Update();
update.pullAll("field1", new String[] { "foo" });
update.pullAll("field2", new String[] { "bar" });
DBObject updateObject = update.getUpdateObject();
DBObject pullAll = DBObjectTestUtils.getAsDBObject(updateObject, "$pullAll");
assertThat(pullAll.get("field1"), is(notNullValue()));
assertThat(pullAll.get("field2"), is(notNullValue()));
}
}

View File

@@ -83,8 +83,10 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
dave = new Person("Dave", "Matthews", 42);
oliver = new Person("Oliver August", "Matthews", 4);
carter = new Person("Carter", "Beauford", 49);
carter.setSkills(Arrays.asList("Drums", "percussion", "vocals"));
Thread.sleep(10);
boyd = new Person("Boyd", "Tinsley", 45);
boyd.setSkills(Arrays.asList("Violin", "Electric Violin", "Viola", "Mandolin", "Vocals", "Guitar"));
stefan = new Person("Stefan", "Lessard", 34);
leroi = new Person("Leroi", "Moore", 41);
@@ -1211,14 +1213,36 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
}
/**
* @see DATAMONGO-990
* @see DATAMONGO-1425
*/
@Test
public void shouldFindByFirstnameForSpELExpressionWithParameterVariableOnly() {
public void findsPersonsByFirstnameNotContains() throws Exception {
List<Person> users = repository.findWithSpelByFirstnameForSpELExpressionWithParameterVariableOnly("Dave");
assertThat(users, hasSize(1));
assertThat(users.get(0), is(dave));
List<Person> result = repository.findByFirstnameNotContains("Boyd");
assertThat(result.size(), is((int) (repository.count() - 1)));
assertThat(result, not(hasItem(boyd)));
}
/**
* @see DATAMONGO-1425
*/
@Test
public void findBySkillsContains() throws Exception {
List<Person> result = repository.findBySkillsContains(Arrays.asList("Drums"));
assertThat(result.size(), is(1));
assertThat(result, hasItem(carter));
}
/**
* @see DATAMONGO-1425
*/
@Test
public void findBySkillsNotContains() throws Exception {
List<Person> result = repository.findBySkillsNotContains(Arrays.asList("Drums"));
assertThat(result.size(), is((int) (repository.count() - 1)));
assertThat(result, not(hasItem(carter)));
}
}

View File

@@ -89,8 +89,14 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
*/
List<Person> findByFirstnameLike(String firstname);
List<Person> findByFirstnameNotContains(String firstname);
List<Person> findByFirstnameLikeOrderByLastnameAsc(String firstname, Sort sort);
List<Person> findBySkillsContains(List<String> skills);
List<Person> findBySkillsNotContains(List<String> skills);
@Query("{'age' : { '$lt' : ?0 } }")
List<Person> findByAgeLessThan(int age, Sort sort);
@@ -309,7 +315,8 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
* @see DATAMONGO-745
*/
@Query("{lastname:?0, address.street:{$in:?1}}")
Page<Person> findByCustomQueryLastnameAndAddressStreetInList(String lastname, List<String> streetNames, Pageable page);
Page<Person> findByCustomQueryLastnameAndAddressStreetInList(String lastname, List<String> streetNames,
Pageable page);
/**
* @see DATAMONGO-950
@@ -334,19 +341,19 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
*/
@Query("{ firstname : { $in : ?0 }}")
Stream<Person> findByCustomQueryWithStreamingCursorByFirstnames(List<String> firstnames);
/**
* @see DATAMONGO-990
*/
@Query("{ firstname : ?#{[0]}}")
List<Person> findWithSpelByFirstnameForSpELExpressionWithParameterIndexOnly(String firstname);
/**
* @see DATAMONGO-990
*/
@Query("{ firstname : ?#{[0]}, email: ?#{principal.email} }")
List<Person> findWithSpelByFirstnameAndCurrentUserWithCustomQuery(String firstname);
/**
* @see DATAMONGO-990
*/

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,47 +17,42 @@ package org.springframework.data.mongodb.repository.query;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.mockito.Matchers.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.repository.query.StubParameterAccessor.*;
import java.lang.reflect.Method;
import java.math.BigInteger;
import java.util.List;
import org.bson.types.ObjectId;
import org.junit.Before;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.runners.MockitoJUnitRunner;
import org.mockito.stubbing.Answer;
import org.springframework.data.domain.Range;
import org.springframework.data.geo.Distance;
import org.springframework.data.geo.Metrics;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Polygon;
import org.springframework.data.geo.Shape;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.Person;
import org.springframework.data.mongodb.core.Venue;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexType;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexed;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.repository.Repository;
import org.springframework.data.repository.core.support.DefaultRepositoryMetadata;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.data.util.TypeInformation;
/**
* Unit test for {@link MongoQueryCreator}.
@@ -71,22 +66,20 @@ public class MongoQueryCreatorUnitTests {
Method findByFirstname, findByFirstnameAndFriend, findByFirstnameNotNull;
@Mock MongoConverter converter;
@Mock DbRefResolver resolver;
MappingContext<?, MongoPersistentProperty> context;
MongoMappingContext context;
MappingMongoConverter converter;
@Rule public ExpectedException expection = ExpectedException.none();
@Before
public void setUp() throws SecurityException, NoSuchMethodException {
context = new MongoMappingContext();
this.context = new MongoMappingContext();
doAnswer(new Answer<Object>() {
public Object answer(InvocationOnMock invocation) throws Throwable {
return invocation.getArguments()[0];
}
}).when(converter).convertToMongoType(any(), Mockito.any(TypeInformation.class));
this.converter = new MappingMongoConverter(resolver, context);
this.converter.afterPropertiesSet();
}
@Test
@@ -137,8 +130,8 @@ public class MongoQueryCreatorUnitTests {
Point point = new Point(10, 20);
Distance distance = new Distance(2.5, Metrics.KILOMETERS);
Query query = query(where("location").nearSphere(point).maxDistance(distance.getNormalizedValue()).and("firstname")
.is("Dave"));
Query query = query(
where("location").nearSphere(point).maxDistance(distance.getNormalizedValue()).and("firstname").is("Dave"));
assertBindsDistanceToQuery(point, distance, query);
}
@@ -148,8 +141,8 @@ public class MongoQueryCreatorUnitTests {
Point point = new Point(10, 20);
Distance distance = new Distance(2.5);
Query query = query(where("location").near(point).maxDistance(distance.getNormalizedValue()).and("firstname")
.is("Dave"));
Query query = query(
where("location").near(point).maxDistance(distance.getNormalizedValue()).and("firstname").is("Dave"));
assertBindsDistanceToQuery(point, distance, query);
}
@@ -234,23 +227,6 @@ public class MongoQueryCreatorUnitTests {
assertThat(query, is(query(new Criteria().orOperator(where("firstName").is("Dave"), where("age").is(42)))));
}
/**
* @see DATAMONGO-347
*/
@Test
public void createsQueryReferencingADBRefCorrectly() {
User user = new User();
com.mongodb.DBRef dbref = new com.mongodb.DBRef("user", "id");
when(converter.toDBRef(eq(user), Mockito.any(MongoPersistentProperty.class))).thenReturn(dbref);
PartTree tree = new PartTree("findByCreator", User.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, user), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("creator").is(dbref))));
}
/**
* @see DATAMONGO-418
*/
@@ -292,16 +268,14 @@ public class MongoQueryCreatorUnitTests {
private void assertBindsDistanceToQuery(Point point, Distance distance, Query reference) throws Exception {
when(converter.convertToMongoType("Dave")).thenReturn("Dave");
PartTree tree = new PartTree("findByLocationNearAndFirstname",
org.springframework.data.mongodb.repository.Person.class);
Method method = PersonRepository.class.getMethod("findByLocationNearAndFirstname", Point.class, Distance.class,
String.class);
MongoQueryMethod queryMethod = new MongoQueryMethod(method, new DefaultRepositoryMetadata(PersonRepository.class),
new MongoMappingContext());
MongoParameterAccessor accessor = new MongoParametersParameterAccessor(queryMethod, new Object[] { point, distance,
"Dave" });
MongoParameterAccessor accessor = new MongoParametersParameterAccessor(queryMethod,
new Object[] { point, distance, "Dave" });
Query query = new MongoQueryCreator(tree, new ConvertingParameterAccessor(converter, accessor), context)
.createQuery();
@@ -461,6 +435,7 @@ public class MongoQueryCreatorUnitTests {
/**
* @see DATAMONGO-1075
* @see DATAMONGO-1425
*/
@Test
public void shouldCreateRegexWhenUsingNotContainsOnStringProperty() {
@@ -469,7 +444,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "thew"), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("username").regex(".*thew.*").not())));
assertThat(query.getQueryObject(), is(query(where("username").not().regex(".*thew.*")).getQueryObject()));
}
/**
@@ -661,6 +636,35 @@ public class MongoQueryCreatorUnitTests {
assertThat(query, is(query(where("username").regex(".*"))));
}
/**
* @see DATAMONGO-1342
*/
@Test
public void bindsNullValueToContainsClause() {
PartTree partTree = new PartTree("emailAddressesContains", User.class);
ConvertingParameterAccessor accessor = getAccessor(converter, new Object[] { null });
Query query = new MongoQueryCreator(partTree, accessor, context).createQuery();
assertThat(query, is(query(where("emailAddresses").in((Object) null))));
}
/**
* @see DATAMONGO-1445
*/
@Test
public void doesNotPreConvertValues() {
PartTree tree = new PartTree("id", WithBigIntegerId.class);
BigInteger id = new BigInteger(new ObjectId().toString(), 16);
ConvertingParameterAccessor accessor = getAccessor(converter, new Object[] { id });
Query query = new MongoQueryCreator(tree, accessor, context).createQuery();
assertThat(query, is(query(where("id").is(id))));
}
interface PersonRepository extends Repository<Person, Long> {
List<Person> findByLocationNearAndFirstname(Point location, Distance maxDistance, String firstname);
@@ -689,4 +693,8 @@ public class MongoQueryCreatorUnitTests {
String street;
@GeoSpatialIndexed(type = GeoSpatialIndexType.GEO_2DSPHERE) Point geo;
}
static class WithBigIntegerId {
BigInteger id;
}
}

View File

@@ -24,6 +24,9 @@ import java.util.Collections;
import java.util.List;
import java.util.Map;
import javax.xml.bind.DatatypeConverter;
import org.bson.BSON;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -343,6 +346,23 @@ public class StringBasedMongoQueryUnitTests {
assertThat(query.getQueryObject(), is(reference.getQueryObject()));
}
/**
* @see DATAMONGO-1290
*/
@Test
public void shouldSupportNonQuotedBinaryDataReplacement() throws Exception {
byte[] binaryData = "Matthews".getBytes("UTF-8");
ConvertingParameterAccessor accesor = StubParameterAccessor.getAccessor(converter, binaryData);
StringBasedMongoQuery mongoQuery = createQueryForMethod("findByLastnameAsBinary", byte[].class);
org.springframework.data.mongodb.core.query.Query query = mongoQuery.createQuery(accesor);
org.springframework.data.mongodb.core.query.Query reference = new BasicQuery("{'lastname' : { '$binary' : '"
+ DatatypeConverter.printBase64Binary(binaryData) + "', '$type' : " + BSON.B_GENERAL + "}}");
assertThat(query.getQueryObject(), is(reference.getQueryObject()));
}
private StringBasedMongoQuery createQueryForMethod(String name, Class<?>... parameters) throws Exception {
Method method = SampleRepository.class.getMethod(name, parameters);
@@ -355,6 +375,9 @@ public class StringBasedMongoQueryUnitTests {
@Query("{ 'lastname' : ?0 }")
Person findByLastname(String lastname);
@Query("{ 'lastname' : ?0 }")
Person findByLastnameAsBinary(byte[] lastname);
@Query("{ 'lastname' : '?0' }")
Person findByLastnameQuoted(String lastname);

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -19,6 +19,8 @@ import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.DBObjectTestUtils.*;
import java.util.Collections;
import org.bson.types.ObjectId;
import org.hamcrest.Matchers;
import org.junit.Before;
@@ -26,11 +28,15 @@ import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.repository.Person.Sex;
import org.springframework.data.mongodb.repository.QAddress;
import org.springframework.data.mongodb.repository.QPerson;
@@ -171,10 +177,52 @@ public class SpringDataMongodbSerializerUnitTests {
assertThat($in, Matchers.<Object> arrayContaining(firstId, secondId));
}
/**
* @see DATAMONGO-1485
*/
@Test
public void takesCustomConversionForEnumsIntoAccount() {
MongoMappingContext context = new MongoMappingContext();
MappingMongoConverter converter = new MappingMongoConverter(dbFactory, context);
converter.setCustomConversions(new CustomConversions(Collections.singletonList(new SexTypeWriteConverter())));
converter.afterPropertiesSet();
this.converter = converter;
this.serializer = new SpringDataMongodbSerializer(this.converter);
Object mappedPredicate = this.serializer.handle(QPerson.person.sex.eq(Sex.FEMALE));
assertThat(mappedPredicate, is(instanceOf(DBObject.class)));
assertThat(((DBObject) mappedPredicate).get("sex"), is((Object) "f"));
}
class Address {
String id;
String street;
@Field("zip_code") String zipCode;
@Field("bar") String[] foo;
}
@WritingConverter
public class SexTypeWriteConverter implements Converter<Sex, String> {
@Override
public String convert(Sex source) {
if (source == null) {
return null;
}
switch (source) {
case MALE:
return "m";
case FEMALE:
return "f";
default:
throw new IllegalArgumentException("o_O");
}
}
}
}

View File

@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory id="testMongo" client-uri="mongodb://username:password@localhost/database" />
</beans>

View File

@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory client-uri="mongodb://username:password@localhost/database" write-concern="NORMAL" username="username" />
</beans>

View File

@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory id="testMongo" uri="mongodb://username:password@localhost/database" />
</beans>

View File

@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory uri="mongodb://username:password@localhost/database" write-concern="NORMAL" username="username" />
</beans>

View File

@@ -16,6 +16,7 @@ Import-Template:
javax.tools.*;version="0",
javax.net.*;version="0",
javax.validation.*;version="${validation:[=.=.=.=,+1.0.0)}";resolution:=optional,
javax.xml.bind.*;version=0,
org.aopalliance.*;version="[1.0.0, 2.0.0)";resolution:=optional,
org.bson.*;version="0",
org.objenesis.*;version="${objenesis:[=.=.=, +1.0.0)}";resolution:=optional,

View File

@@ -12,25 +12,25 @@ NOTE: `SimpleMongoConverter` has been deprecated in Spring Data MongoDB M3 as al
`MongoMappingConverter` has a few conventions for mapping objects to documents when no additional mapping metadata is provided. The conventions are:
* The short Java class name is mapped to the collection name in the following manner. The class '`com.bigbank.SavingsAccount`' maps to '`savingsAccount`' collection name.
* The short Java class name is mapped to the collection name in the following manner: the class `com.bigbank.SavingsAccount` maps to the `savingsAccount` collection name.
* All nested objects are stored as nested objects in the document and *not* as DBRefs
* The converter will use any Spring Converters registered with it to override the default mapping of object properties to document field/values.
* The fields of an object are used to convert to and from fields in the document. Public JavaBean properties are not used.
* If you have a single non-zero-argument constructor whose constructor argument names match top-level field names of the document, that constructor will be used. Otherwise the zero-argument constructor is used. If there is more than one non-zero-argument constructor, an exception will be thrown. A minimal class following these conventions is sketched below.
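To make these conventions concrete, here is a minimal sketch (the nested `Customer` type is a hypothetical example, not part of the library): a class stored in the `savingsAccount` collection, with an embedded nested object and a single non-zero-argument constructor.

[source,java]
----
// Minimal sketch of the conventions above. The short class name maps to the
// "savingsAccount" collection, the nested Customer object is embedded in the
// document (not stored as a DBRef), and the single non-zero-argument
// constructor is used because its parameter names match top-level field names.
public class SavingsAccount {

  private String id;       // becomes the document's _id field
  private Customer owner;  // stored as a nested document
  private double balance;

  public SavingsAccount(String id, Customer owner, double balance) {
    this.id = id;
    this.owner = owner;
    this.balance = balance;
  }

  // Hypothetical nested value type, embedded in the savingsAccount document.
  public static class Customer {
    String firstname;
    String lastname;
  }
}
----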
[[mapping.conventions.id-field]]
=== How the '_id' field is handled in the mapping layer
=== How the `_id` field is handled in the mapping layer
MongoDB requires that you have an '_id' field for all documents. If you don't provide one the driver will assign a ObjectId with a generated value. The "_id" field can be of any type the, other than arrays, so long as it is unique. The driver naturally supports all primitive types and Dates. When using the `MongoMappingConverter` there are certain rules that govern how properties from the Java class is mapped to this '_id' field.
MongoDB requires that you have an `_id` field for all documents. If you don't provide one, the driver will assign an ObjectId with a generated value. The `_id` field can be of any type, other than arrays, so long as it is unique. The driver naturally supports all primitive types and Dates. When using the `MongoMappingConverter` there are certain rules that govern how properties from the Java class are mapped to this `_id` field.
The following outlines what field will be mapped to the '_id' document field:
The following outlines what field will be mapped to the `_id` document field:
* A field annotated with `@Id` (`org.springframework.data.annotation.Id`) will be mapped to the '_id' field.
* A field without an annotation but named 'id' will be mapped to the '_id' field.
* The default field name for identifiers is '_id' and can be customized via the `@Field` annotation.
* A field annotated with `@Id` (`org.springframework.data.annotation.Id`) will be mapped to the `_id` field.
* A field without an annotation but named `id` will be mapped to the `_id` field.
* The default field name for identifiers is `_id` and can be customized via the `@Field` annotation.
[cols="1,2", options="header"]
.Examples for the translation of '_id'-field definitions
.Examples for the translation of `_id` field definitions
|===
| Field definition
| Resulting Id-Fieldname in MongoDB
@@ -41,24 +41,208 @@ The following outlines what field will be mapped to the '_id' document field:
| `@Field` `String` id
| `_id`
| `@Field('x')` `String` id
| `@Field("x")` `String` id
| `x`
| `@Id` `String` x
| `_id`
| `@Field('x')` `@Id` `String` x
| `@Field("x")` `@Id` `String` x
| `_id`
|===
The following outlines what type conversion, if any, will be done on the property mapped to the `_id` document field.
* If a field named `id` is declared as a `String` or `BigInteger` in the Java class, it will be converted to and stored as an `ObjectId` if possible. `ObjectId` as a field type is also valid. If you specify a value for `id` in your application, the conversion to an `ObjectId` is delegated to the MongoDB driver. If the specified `id` value cannot be converted to an `ObjectId`, then the value will be stored as is in the document's `_id` field.
* If a field named `id` is not declared as a `String`, `BigInteger`, or `ObjectId` in the Java class, then you should assign it a value in your application so it can be stored as is in the document's `_id` field.
* If no field named `id` is present in the Java class, then an implicit `_id` field will be generated by the driver but not mapped to a property or field of the Java class.
When querying and updating, `MongoTemplate` will use the converter to handle conversions of the `Query` and `Update` objects that correspond to the above rules for saving documents, so field names and types used in your queries will be able to match what is in your domain classes.
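As a minimal illustration of these identifier rules, consider a hypothetical `Person` class with a `String` id and an assumed `mongoTemplate` bean; the value is converted to an `ObjectId` on save where possible and can still be queried by its `String` form.
[source,java]
----
@Document
public class Person {

  @Id String id;   // mapped to "_id" and stored as an ObjectId when convertible
  String name;
}

// Saving lets the mapping layer populate and convert the identifier.
Person person = new Person();
person.name = "Joe";
mongoTemplate.save(person);

// The identifier can then be used in queries as a plain String.
Person loaded = mongoTemplate.findOne(
    Query.query(Criteria.where("id").is(person.id)), Person.class);
----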
[[mapping-conversion]]
== Data mapping and type conversion
This section explains how types are mapped to a MongoDB representation and vice versa. Spring Data MongoDB supports all types that can be represented as BSON, MongoDB's internal document format.
In addition to these types, Spring Data MongoDB provides a set of built-in converters to map additional types. You can provide your own converters to adjust type conversion; see <<mapping-explicit-converters>> for further details.
[cols="3,1,6", options="header"]
.Type
|===
| Type
| Type conversion
| Sample
| `String`
| native
| `{"firstname" : "Dave"}`
| `double`, `Double`, `float`, `Float`
| native
| `{"weight" : 42.5}`
| `int`, `Integer`, `short`, `Short`
| native +
32-bit integer
| `{"height" : 42}`
| `long`, `Long`
| native +
64-bit integer
| `{"height" : 42}`
| `Date`, `Timestamp`
| native
| `{"date" : ISODate("2019-11-12T23:00:00.809Z")}`
| `byte[]`
| native
| `{"bin" : { "$binary" : "AQIDBA==", "$type" : "00" }}`
| `java.util.UUID` (Legacy UUID)
| native
| `{"uuid" : { "$binary" : "MEaf1CFQ6lSphaa3b9AtlA==", "$type" : "03" }}`
| `Date`
| native
| `{"date" : ISODate("2019-11-12T23:00:00.809Z")}`
| `ObjectId`
| native
| `{"_id" : ObjectId("5707a2690364aba3136ab870")}`
| Array, `List`, `BasicDBList`
| native
| `{"cookies" : [ … ]}`
| `boolean`, `Boolean`
| native
| `{"active" : true}`
| `null`
| native
| `{"value" : null}`
| `DBObject`
| native
| `{"value" : { … }}`
| `AtomicInteger` +
calling `get()` before the actual conversion
| converter +
32-bit integer
| `{"value" : "741" }`
| `AtomicLong` +
calling `get()` before the actual conversion
| converter +
64-bit integer
| `{"value" : "741" }`
| `BigInteger`
| converter +
`String`
| `{"value" : "741" }`
| `BigDecimal`
| converter +
`String`
| `{"value" : "741.99" }`
| `URL`
| converter
| `{"website" : "http://projects.spring.io/spring-data-mongodb/" }`
| `Locale`
| converter
| `{"locale : "en_US" }`
| `char`, `Character`
| converter
| `{"char" : "a" }`
| `NamedMongoScript`
| converter +
`Code`
| `{"_id" : "script name", value: (some javascript code)`}
| `java.util.Currency`
| converter
| `{"currencyCode" : "EUR"}`
| `LocalDate` +
(Joda, Java 8, JSR310-BackPort)
| converter
| `{"date" : ISODate("2019-11-12T00:00:00.000Z")}`
| `LocalDateTime`, `LocalTime`, `Instant` +
(Joda, Java 8, JSR310-BackPort)
| converter
| `{"date" : ISODate("2019-11-12T23:00:00.809Z")}`
| `DateTime` (Joda)
| converter
| `{"date" : ISODate("2019-11-12T23:00:00.809Z")}`
| `DateMidnight` (Joda)
| converter
| `{"date" : ISODate("2019-11-12T00:00:00.000Z")}`
| `ZoneId` (Java 8, JSR310-BackPort)
| converter
| `{"zoneId" : "ECT - Europe/Paris"}`
| `Box`
| converter
| `{"box" : { "first" : { "x" : 1.0 , "y" : 2.0} , "second" : { "x" : 3.0 , "y" : 4.0}}`
| `Polygon`
| converter
| `{"polygon" : { "points" : [ { "x" : 1.0 , "y" : 2.0} , { "x" : 3.0 , "y" : 4.0} , { "x" : 4.0 , "y" : 5.0}]}}`
| `Circle`
| converter
| `{"circle" : { "center" : { "x" : 1.0 , "y" : 2.0} , "radius" : 3.0 , "metric" : "NEUTRAL"}}`
| `Point`
| converter
| `{"point" : { "x" : 1.0 , "y" : 2.0}}`
| `GeoJsonPoint`
| converter
| `{"point" : { "type" : "Point" , "coordinates" : [3.0 , 4.0] }}`
| `GeoJsonMultiPoint`
| converter
| `{"geoJsonLineString" : {"type":"MultiPoint", "coordinates": [ [ 0 , 0 ], [ 0 , 1 ], [ 1 , 1 ] ] }}`
| `Sphere`
| converter
| `{"sphere" : { "center" : { "x" : 1.0 , "y" : 2.0} , "radius" : 3.0 , "metric" : "NEUTRAL"}}`
| `GeoJsonPolygon`
| converter
| `{"polygon" : { "type" : "Polygon", "coordinates" : [[ [ 0 , 0 ], [ 3 , 6 ], [ 6 , 1 ], [ 0 , 0 ] ]] }}`
| `GeoJsonMultiPolygon`
| converter
| `{"geoJsonMultiPolygon" : { "type" : "MultiPolygon", "coordinates" : [
[ [ [ -73.958 , 40.8003 ] , [ -73.9498 , 40.7968 ] ] ],
[ [ [ -73.973 , 40.7648 ] , [ -73.9588 , 40.8003 ] ] ]
] }}`
| `GeoJsonLineString`
| converter
| `{ "geoJsonLineString" : { "type" : "LineString", "coordinates" : [ [ 40 , 5 ], [ 41 , 6 ] ] }}`
| `GeoJsonMultiLineString`
| converter
| `{"geoJsonLineString" : { "type" : "MultiLineString", coordinates: [
[ [ -73.97162 , 40.78205 ], [ -73.96374 , 40.77715 ] ],
[ [ -73.97880 , 40.77247 ], [ -73.97036 , 40.76811 ] ]
] }}`
|===
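To make the table above concrete, here is a small, hypothetical domain type combining several of the listed mappings, together with the document shape you could expect under the default conversions (field names and values are illustrative only).
[source,java]
----
public class Account {

  ObjectId id;        // native: stored in "_id"
  String owner;       // native: plain string
  BigDecimal balance; // converter: stored as a String, e.g. "741.99"
  Date openedAt;      // native: stored as an ISODate
  boolean active;     // native: boolean
}

// Expected document shape (illustrative):
// { "_id" : ObjectId("…"), "owner" : "Dave", "balance" : "741.99",
//   "openedAt" : ISODate("2019-11-12T23:00:00.809Z"), "active" : true }
----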
[[mapping-configuration]]
== Mapping Configuration
@@ -108,11 +292,11 @@ public class GeoSpatialAppConfig extends AbstractMongoConfiguration {
----
====
`AbstractMongoConfiguration` requires you to implement methods that define a `com.mongodb.Mongo` as well as provide a database name. `AbstractMongoConfiguration` also has a method you can override named `getMappingBasePackage(…)` which tells the converter where to scan for classes annotated with the `@Document` annotation.
You can add additional converters to the converter by overriding the method `afterMappingMongoConverterCreation`. Also shown in the above example is a `LoggingEventListener` which logs `MongoMappingEvent` s that are posted onto Spring's `ApplicationContextEvent` infrastructure.
NOTE: `AbstractMongoConfiguration` creates a `MongoTemplate` instance and registers it with the container under the name `mongoTemplate`.
You can also override the method `UserCredentials getUserCredentials()` to provide the username and password information to connect to the database.
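The following is a minimal sketch of such a configuration class. The database name and base package are hypothetical; the overridden methods follow the `AbstractMongoConfiguration` contract described above.
[source,java]
----
@Configuration
public class MongoAppConfig extends AbstractMongoConfiguration {

  @Override
  protected String getDatabaseName() {
    return "database"; // hypothetical database name
  }

  @Override
  public Mongo mongo() throws Exception {
    return new MongoClient("localhost"); // MongoClient extends com.mongodb.Mongo
  }

  @Override
  protected String getMappingBasePackage() {
    return "com.example.domain"; // hypothetical package scanned for @Document classes
  }
}
----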
@@ -165,7 +349,7 @@ The `base-package` property tells it where to scan for classes annotated with th
[[mapping-usage]]
== Metadata based Mapping
To take full advantage of the object mapping functionality inside the Spring Data/MongoDB support, you should annotate your mapped objects with the `@Document` annotation. Although it is not necessary for the mapping framework to have this annotation (your POJOs will be mapped correctly, even without any annotations), it allows the classpath scanner to find and pre-process your domain objects to extract the necessary metadata. If you don't use this annotation, your application will take a slight performance hit the first time you store a domain object because the mapping framework needs to build up its internal metadata model so it knows about the properties of your domain object and how to persist them.
.Example domain object
====
@@ -415,7 +599,7 @@ Simply declaring these beans in your Spring ApplicationContext will cause them t
[[mapping-explicit-converters]]
=== Overriding Mapping with explicit Converters
When storing and querying your objects it is convenient to have a `MongoConverter` instance handle the mapping of all Java types to DBObjects. However, sometimes you may want the `MongoConverter` to do most of the work but allow you to selectively handle the conversion for a particular type or to optimize performance.
To selectively handle the conversion yourself, register one or more `org.springframework.core.convert.converter.Converter` instances with the `MongoConverter`.
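As a sketch, a custom write converter for a hypothetical `Person` type (with assumed `getId()` and `getFirstName()` accessors) might look as follows; it could then be registered through `CustomConversions`, for example by overriding `customConversions()` in a Java configuration class.
[source,java]
----
public class PersonWriteConverter implements Converter<Person, DBObject> {

  public DBObject convert(Person source) {
    DBObject dbo = new BasicDBObject();
    dbo.put("_id", source.getId());
    dbo.put("name", source.getFirstName());
    return dbo;
  }
}

// Hypothetical registration in a Java configuration class:
// @Override
// public CustomConversions customConversions() {
//   return new CustomConversions(Arrays.asList(new PersonWriteConverter()));
// }
----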

View File

@@ -28,7 +28,7 @@ public class Person {
----
====
We have a quite simple domain object here. Note that it has a property named `id` of type `ObjectId`. The default serialization mechanism used in `MongoTemplate` (which is backing the repository support) regards properties named `id` as the document id. Currently we support `String`, `ObjectId` and `BigInteger` as id-types.
.Basic repository interface to persist Person entities
====
@@ -99,7 +99,7 @@ class ApplicationConfig extends AbstractMongoConfiguration {
----
====
As our domain repository extends `PagingAndSortingRepository` it provides you with CRUD operations as well as methods for paginated and sorted access to the entities. Working with the repository instance is just a matter of dependency injecting it into a client. So accessing the second page of `Person` s at a page size of 10 would simply look something like this:
.Paging access to Person entities
====
@@ -139,17 +139,17 @@ public interface PersonRepository extends PagingAndSortingRepository<Person, Str
Page<Person> findByFirstname(String firstname, Pageable pageable); <2>
Person findByShippingAddresses(Address address); <3>
Stream<Person> findAllBy(); <4>
}
----
<1> The method shows a query for all people with the given lastname. The query will be derived by parsing the method name for constraints, which can be concatenated with `And` and `Or`. Thus the method name will result in a query expression of `{"lastname" : lastname}`.
<2> Applies pagination to a query. Just equip your method signature with a `Pageable` parameter and let the method return a `Page` instance and we will automatically page the query accordingly.
<3> Shows that you can query based on properties which are not a primitive type.
<4> Uses a Java 8 `Stream` which reads and converts individual elements while iterating the stream.
====
NOTE: For version 1.0 we currently don't support referring to parameters that are mapped as `DBRef` in the domain class.
@@ -212,10 +212,18 @@ NOTE: Note that for version 1.0 we currently don't support referring to paramete
| `findByFirstnameContaining(String name)`
| `{"firstname" : name} (name as regex)`
| `NotContaining` on String
| `findByFirstnameNotContaining(String name)`
| `{"firstname" : { "$not" : name}} (name as regex)`
| `Containing` on Collection
| `findByAddressesContaining(Address address)`
| `{"addresses" : { "$in" : address}}`
| `NotContaining` on Collection
| `findByAddressesNotContaining(Address address)`
| `{"addresses" : { "$not" : { "$in" : address}}}`
| `Regex`
| `findByFirstnameRegex(String firstname)`
| `{"firstname" : {"$regex" : firstname }}`
@@ -328,7 +336,7 @@ public interface PersonRepository extends MongoRepository<Person, String>
// Metric: {'geoNear' : 'person', 'near' : [x, y], 'maxDistance' : distance,
// 'distanceMultiplier' : metric.multiplier, 'spherical' : true }
GeoResults<Person> findByLocationNear(Point location, Distance distance);
// Metric: {'geoNear' : 'person', 'near' : [x, y], 'minDistance' : min,
// 'maxDistance' : max, 'distanceMultiplier' : metric.multiplier,
// 'spherical' : true }

View File

@@ -195,11 +195,11 @@ public class AppConfig {
This approach allows you to use the standard `com.mongodb.Mongo` API that you may already be used to using, but it also pollutes the code with the `UnknownHostException` checked exception. The use of the checked exception is not desirable as Java based bean metadata uses methods as a means to set object dependencies, making the calling code cluttered.
An alternative is to register an instance of `com.mongodb.Mongo` with the container using Spring's `MongoClientFactoryBean`. As compared to instantiating a `com.mongodb.Mongo` instance directly, the FactoryBean approach does not throw a checked exception and has the added advantage of also providing the container with an `ExceptionTranslator` implementation that translates MongoDB exceptions to exceptions in Spring's portable `DataAccessException` hierarchy for data access classes annotated with the `@Repository` annotation. This hierarchy and use of `@Repository` is described in http://docs.spring.io/spring/docs/current/spring-framework-reference/html/dao.html[Spring's DAO support features].
An example of Java based bean metadata that supports exception translation on `@Repository` annotated classes is shown below:
.Registering a com.mongodb.Mongo object using Spring's MongoClientFactoryBean and enabling Spring's exception translation support
====
[source,java]
----
@@ -209,8 +209,8 @@ public class AppConfig {
/*
 * Factory bean that creates the com.mongodb.Mongo instance
 */
public @Bean MongoClientFactoryBean mongo() {
  MongoClientFactoryBean mongo = new MongoClientFactoryBean();
  mongo.setHost("localhost");
  return mongo;
}
@@ -218,7 +218,7 @@ public class AppConfig {
----
====
To access the `com.mongodb.Mongo` object created by the `MongoClientFactoryBean` in other `@Configuration` or your own classes, use a "`private @Autowired Mongo mongo;`" field.
[[mongo.mongo-xml-config]]
=== Registering a Mongo instance using XML based metadata
@@ -366,7 +366,7 @@ public class MongoConfiguration {
[[mongo.mongo-db-factory-xml]]
=== Registering a MongoDbFactory instance using XML based metadata
The mongo namespace provides a convenient way to create a `SimpleMongoDbFactory` as compared to using the `<beans/>` namespace. Simple usage is shown below:
[source,xml]
----
@@ -643,21 +643,21 @@ NOTE: This example is meant to show the use of save, update and remove operation
The query syntax used in the example is explained in more detail in the section <<mongo.query,Querying Documents>>.
[[mongo-template.id-handling]]
=== How the `_id` field is handled in the mapping layer
MongoDB requires that you have an `_id` field for all documents. If you don't provide one, the driver will assign an `ObjectId` with a generated value. When using the `MongoMappingConverter` there are certain rules that govern how properties from the Java class are mapped to this `_id` field.
The following outlines what property will be mapped to the `_id` document field:
* A property or field annotated with `@Id` (`org.springframework.data.annotation.Id`) will be mapped to the `_id` field.
* A property or field without an annotation but named `id` will be mapped to the `_id` field.
The following outlines what type conversion, if any, will be done on the property mapped to the `_id` document field when using the `MappingMongoConverter`, the default for `MongoTemplate`.
* An id property or field declared as a String in the Java class will be converted to and stored as an `ObjectId` if possible using a Spring `Converter<String, ObjectId>`. Valid conversion rules are delegated to the MongoDB Java driver. If it cannot be converted to an ObjectId, then the value will be stored as a string in the database.
* An id property or field declared as `BigInteger` in the Java class will be converted to and stored as an `ObjectId` using a Spring `Converter<BigInteger, ObjectId>`.
If no field or property specified above is present in the Java class, then an implicit `_id` field will be generated by the driver but not mapped to a property or field of the Java class.
When querying and updating, `MongoTemplate` will use the converter to handle conversions of the `Query` and `Update` objects that correspond to the above rules for saving documents, so field names and types used in your queries will be able to match what is in your domain classes.
@@ -815,7 +815,7 @@ There are two ways to manage the collection name that is used for operating on t
The MongoDB driver supports inserting a collection of documents in one operation. The methods in the MongoOperations interface that support this functionality are listed below
* *insert* inserts an object. If there is an existing document with the same id then an error is generated.
* *insertAll* takes a `Collection` of objects as the first parameter. This method inspects each object and inserts it to the appropriate collection based on the rules specified above.
* *save* saves the object overwriting any object that might exist with the same id.
[[mongo-template.save-insert.batch]]
@@ -823,12 +823,12 @@ The MongoDB driver supports inserting a collection of documents in one operation
The MongoDB driver supports inserting a collection of documents in one operation. The methods in the MongoOperations interface that support this functionality are listed below
* *insert* methods that take a `Collection` as the first argument. This inserts a list of objects in a single batch write to the database (see the sketch below).
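The following is a short sketch of these operations, assuming a configured `mongoTemplate` bean and a hypothetical `Person` class with the shown constructor and setter; the method names are those of `MongoOperations`.
[source,java]
----
Person joe = new Person("Joe", 34);

// insert: fails if a document with the same id already exists
mongoTemplate.insert(joe);

// batch insert: writes the whole collection of objects in one batch
mongoTemplate.insert(Arrays.asList(new Person("Ann", 31), new Person("Bob", 29)), Person.class);

// save: inserts or overwrites the document with the same id
joe.setAge(35);
mongoTemplate.save(joe);
----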
[[mongodb-template-update]]
=== Updating documents in a collection
For updates we can elect to update the first document found using the `updateFirst` method of `MongoOperations`, or we can update all documents that were found to match the query using the method `updateMulti`. Here is an example of an update of all SAVINGS accounts where we are adding a one-time $50.00 bonus to the balance using the `$inc` operator.
.Updating documents using the MongoTemplate
====
@@ -1045,7 +1045,7 @@ There are also methods on the Criteria class for geospatial queries. Here is a l
* `Criteria` *within* `(Circle circle)` Creates a geospatial criterion using `$geoWithin $center` operators.
* `Criteria` *within* `(Box box)` Creates a geospatial criterion using a `$geoWithin $box` operation.
* `Criteria` *withinSphere* `(Circle circle)` Creates a geospatial criterion using `$geoWithin $center` operators.
* `Criteria` *near* `(Point point)` Creates a geospatial criterion using a `$near` operation
* `Criteria` *nearSphere* `(Point point)` Creates a geospatial criterion using `$nearSphere$center` operations. This is only available for MongoDB 1.7 and higher.
* `Criteria` *minDistance* `(double minDistance)` Creates a geospatial criterion using the `$minDistance` operation, for use with `$near`.
* `Criteria` *maxDistance* `(double maxDistance)` Creates a geospatial criterion using the `$maxDistance` operation, for use with `$near`.
@@ -1059,7 +1059,7 @@ The `Query` class has some additional methods used to provide options for the qu
* `Field` *fields* `()` used to define fields to be included in the query results
* `Query` *limit* `(int limit)` used to limit the size of the returned results to the provided limit (used for paging)
* `Query` *skip* `(int skip)` used to skip the provided number of documents in the results (used for paging)
* `Query` *with* `(Sort sort)` used to provide a sort definition for the results (see the combined sketch below)
[[mongo-template.querying]]
=== Methods for querying for documents
@@ -1382,7 +1382,7 @@ Executing this will result in a collection as shown below.
{ "_id" : "d", "value" : 1 }
----
Assuming that the map and reduce functions are located in `map.js` and `reduce.js` and bundled in your jar so they are available on the classpath, you can execute a map-reduce operation and obtain the results as shown below:
[source,java]
----
@@ -1490,7 +1490,7 @@ Spring provides integration with MongoDB's group operation by providing methods
[[mongo.group.example]]
=== Example Usage
In order to understand how group operations work the following example is used, which is somewhat artificial. For a more realistic example consult the book 'MongoDB - The definitive guide'. A collection named `group_test_collection` is created with the following rows:
[source]
----
@@ -1502,7 +1502,7 @@ In order to understand how group operations work the following example is used,
{ "_id" : ObjectId("4ec1d25d41421e2015da64f6"), "x" : 3 }
----
We would like to group by the only field in each row, the `x` field, and aggregate the number of times each specific value of `x` occurs. To do this we need to create an initial document that contains our count variable and also a reduce function which will increment it each time it is encountered. The Java code to execute the group operation is shown below:
[source,java]
----
@@ -1660,7 +1660,7 @@ Note that the aggregation operations not listed here are currently not supported
[[mongo.aggregation.projection]]
=== Projection Expressions
Projection expressions are used to define the fields that are the outcome of a particular aggregation step. Projection expressions can be defined via the `project` method of the `Aggregation` class, either by passing a list of `String` s or an aggregation framework `Fields` object. The projection can be extended with additional fields through a fluent API via the `and(String)` method and aliased via the `as(String)` method.
Note that one can also define fields with aliases via the static factory method `Fields.field` of the aggregation framework that can then be used to construct a new `Fields` instance.
.Projection expression examples
@@ -1804,7 +1804,7 @@ ZipInfoStats firstZipInfoStats = result.getMappedResults().get(0);
----
* The class `ZipInfo` maps the structure of the given input-collection. The class `ZipInfoStats` defines the structure in the desired output format.
* As a first step we use the `group` operation to define a group from the input-collection. The grouping criteria is the combination of the fields `"state"` and `"city"`, which forms the id structure of the group. We aggregate the value of the `"population"` property from the grouped elements by using the `sum` operator, saving the result in the field `"pop"`.
* In a second step we use the `sort` operation to sort the intermediate result by the fields `"pop"`, `"state"` and `"city"` in ascending order, such that the smallest city is at the top and the biggest city is at the bottom of the result. Note that the sorting on `"state"` and `"city"` is implicitly performed against the group id fields, which Spring Data MongoDB takes care of.
* In the third step we use a `group` operation again to group the intermediate result by `"state"`. Note that `"state"` again implicitly references a group-id field. We select the name and the population count of the biggest and smallest city with calls to the `last(…)` and `first(…)` operators, respectively, via the `project` operation.
* As the fourth step we select the `"state"` field from the previous `group` operation. Note that `"state"` again implicitly references a group-id field. As we do not want an implicitly generated id to appear, we exclude the id from the previous operation via `and(previousOperation()).exclude()`. As we want to populate the nested `City` structures in our output class accordingly, we have to emit appropriate sub-documents with the `nested` method.
@@ -2001,7 +2001,7 @@ public class PersonReadConverter implements Converter<DBObject, Person> {
[[mongo.custom-converters.xml]]
=== Registering Spring Converters with the MongoConverter
The Mongo Spring namespace provides a convenient way to register Spring `Converter` s with the `MappingMongoConverter`. The configuration snippet below shows how to manually register converter beans as well as configuring the wrapping `MappingMongoConverter` into a `MongoTemplate`.
[source,xml]
----
@@ -2211,7 +2211,7 @@ Here is a list of execute callback methods.
* `<T> T` *execute* `(String collectionName, DbCallback<T> action)` Executes a DbCallback on the collection of the given name translating any exceptions as necessary.
* `<T> T` *executeInSession* `(DbCallback<T> action)` Executes the given DbCallback within the same connection to the database so as to ensure consistency in a write heavy environment where you may read the data that you wrote.
Here is an example that uses the `CollectionCallback` to return information about an index:
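A minimal sketch of such a callback, assuming a hypothetical `geolocation` collection with an index named `location_2d`, could look like this; it simply scans the index metadata exposed by the driver.
[source,java]
----
boolean hasIndex = mongoTemplate.execute("geolocation", new CollectionCallback<Boolean>() {

  public Boolean doInCollection(DBCollection collection) throws MongoException, DataAccessException {
    for (DBObject index : collection.getIndexInfo()) {
      if ("location_2d".equals(index.get("name"))) {
        return true;  // the index we are looking for is present
      }
    }
    return false;
  }
});
----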

View File

@@ -1,6 +1,197 @@
Spring Data MongoDB Changelog
=============================
Changes in version 1.8.5.RELEASE (2016-09-20)
---------------------------------------------
* DATAMONGO-1494 - Release 1.8.5 (Gosling SR5).
* DATAMONGO-1492 - Interface AggregationExpression in package org.springframework.data.mongodb.core.aggregation should be public.
* DATAMONGO-1485 - Querydsl MongodbSerializer does not take registered converters for Enums into account.
* DATAMONGO-1479 - MappingMongoConverter.convertToMongoType causes StackOverflowError for parameterized map value types.
* DATAMONGO-1471 - MappingMongoConverter attempts to set null value on potentially primitive identifier.
* DATAMONGO-1465 - String arguments passed to DefaultScriptOperations.execute() appear quoted in script.
* DATAMONGO-1453 - Parse error into GeoJsonPoint if coordinates are "integers".
* DATAMONGO-1449 - Replace legacy for loop with foreach in MappingMongoConverter.
* DATAMONGO-1445 - @Id annotated attribute of type BigInteger does not work with query methods.
* DATAMONGO-1437 - DefaultDbRefResolver swallows cause of non DataAccessException translatable Exception.
* DATAMONGO-1425 - NOT_CONTAINS keyword issues CONTAINS query.
* DATAMONGO-1423 - Nested document update doesn't apply converters on embedded maps.
* DATAMONGO-1412 - Document mapping rules for Java types to MongoDB representation.
* DATAMONGO-1406 - Query mapper does not use @Field field name when querying nested fields in combination with nested keywords.
* DATAMONGO-1401 - GeoJsonPoint error on update.
Changes in version 1.9.3.RELEASE (2016-09-20)
---------------------------------------------
* DATAMONGO-1493 - Typos in reference documentation.
* DATAMONGO-1492 - Interface AggregationExpression in package org.springframework.data.mongodb.core.aggregation should be public.
* DATAMONGO-1486 - Changes to MappingMongoConverter Result in Class Cast Exception.
* DATAMONGO-1485 - Querydsl MongodbSerializer does not take registered converters for Enums into account.
* DATAMONGO-1479 - MappingMongoConverter.convertToMongoType causes StackOverflowError for parameterized map value types.
* DATAMONGO-1471 - MappingMongoConverter attempts to set null value on potentially primitive identifier.
* DATAMONGO-1465 - String arguments passed to DefaultScriptOperations.execute() appear quoted in script.
* DATAMONGO-1463 - Upgrade to MongoDB Java driver 2.14.3.
* DATAMONGO-1453 - Parse error into GeoJsonPoint if coordinates are "integers".
* DATAMONGO-1450 - Release 1.9.3 (Hopper SR3).
* DATAMONGO-1406 - Query mapper does not use @Field field name when querying nested fields in combination with nested keywords.
Changes in version 1.10.0.M1 (2016-07-27)
-----------------------------------------
* DATAMONGO-1464 - Pagination - Optimize out the count query for paging.
* DATAMONGO-1463 - Upgrade to MongoDB Java driver 2.14.3.
* DATAMONGO-1462 - Integrate version badge from spring.io.
* DATAMONGO-1460 - User placeholder property for JSR-303 API.
* DATAMONGO-1459 - Add support for any-match mode in query-by-example.
* DATAMONGO-1457 - Add support for $slice in projection stage of aggregation.
* DATAMONGO-1456 - Add support for $diacriticInsensitivity to text search.
* DATAMONGO-1455 - Add support for $caseSensitive to text search.
* DATAMONGO-1453 - Parse error into GeoJsonPoint if coordinates are "integers".
* DATAMONGO-1449 - Replace legacy for loop with foreach in MappingMongoConverter.
* DATAMONGO-1437 - DefaultDbRefResolver swallows cause of non DataAccessException translatable Exception.
* DATAMONGO-1431 - Add overload of MongoOperations.stream(…) to take an explicit collection name.
* DATAMONGO-1425 - NOT_CONTAINS keyword issues CONTAINS query.
* DATAMONGO-1424 - Add support for "notLike" keyword in derived queries.
* DATAMONGO-1423 - Nested document update doesn't apply converters on embedded maps.
* DATAMONGO-1420 - Update Spring Data MongoDB version in Github readme.
* DATAMONGO-1419 - Remove deprecations in AbstractMongoEventListener.
* DATAMONGO-1418 - Add support for $out operand for Aggregation.
* DATAMONGO-1416 - Standard bootstrap issues warning in converter registration.
* DATAMONGO-1412 - Document mapping rules for Java types to MongoDB representation.
* DATAMONGO-1411 - Enable MongoDB build on TravisCI.
* DATAMONGO-1409 - Release 1.10 M1 (Ingalls).
* DATAMONGO-1404 - Add support of $max and $min update operations.
* DATAMONGO-1403 - Add maxExecutionTimeMs alias for @Meta(maxExcecutionTime).
* DATAMONGO-1399 - Allow adding hole to GeoJson Polygon.
* DATAMONGO-1394 - References not handled correctly when using QueryDSL.
* DATAMONGO-1391 - Support Mongo 3.2 syntax for $unwind in aggregation.
* DATAMONGO-1271 - Provide read lifecycle events when loading DBRefs.
* DATAMONGO-1194 - Improve DBRef resolution for collections.
* DATAMONGO-832 - Add support for $slice in Update.push.
Changes in version 1.9.2.RELEASE (2016-06-15)
---------------------------------------------
* DATAMONGO-1449 - Replace legacy for loop with foreach in MappingMongoConverter.
* DATAMONGO-1437 - DefaultDbRefResolver swallows cause of non DataAccessException translatable Exception.
* DATAMONGO-1425 - NOT_CONTAINS keyword issues CONTAINS query.
* DATAMONGO-1423 - Nested document update doesn't apply converters on embedded maps.
* DATAMONGO-1416 - Standard bootstrap issues warning in converter registration.
* DATAMONGO-1412 - Document mapping rules for Java types to MongoDB representation.
* DATAMONGO-1411 - Enable MongoDB build on TravisCI.
* DATAMONGO-1410 - Release 1.9.2 (Hopper SR2).
Changes in version 1.9.1.RELEASE (2016-04-06)
---------------------------------------------
* DATAMONGO-1408 - Release 1.9.1 (Hopper SR1).
Changes in version 1.9.0.RELEASE (2016-04-06)
---------------------------------------------
* DATAMONGO-1407 - Add pull request template.
* DATAMONGO-1405 - Release 1.9 GA (Hopper).
* DATAMONGO-1401 - GeoJsonPoint error on update.
* DATAMONGO-1398 - Update documentation for Spring Data MongoDB 1.9.
* DATAMONGO-1396 - Exception when creating geo within Criteria using Aggregation.
Changes in version 1.9.0.RC1 (2016-03-18)
-----------------------------------------
* DATAMONGO-1400 - Adapt to rename of Spring Data Commons' Tuple to Pair.
* DATAMONGO-1397 - MongoTemplate.geoNear() do not log the Query.
* DATAMONGO-1392 - Release 1.9 RC1 (Hopper).
* DATAMONGO-1389 - Adapt test case to changes made for improved type prediction infrastructure.
* DATAMONGO-1387 - BasicQuery.fields().include() doesn't stick, even though Query.fields().include() does.
* DATAMONGO-1373 - Problem with custom annotations with AliasFor annotated attributes.
* DATAMONGO-1326 - Add support for $lookup to aggregation.
* DATAMONGO-1245 - Add support for Query-By-Example.
Changes in version 1.8.4.RELEASE (2016-02-23)
---------------------------------------------
* DATAMONGO-1381 - Release 1.8.4 (Gosling SR4).
* DATAMONGO-1380 - Improve logging in MongoChangeSetPersister.
* DATAMONGO-1378 - Update reference documentation: Change Query.sort() to Query.with(Sort sort).
* DATAMONGO-1377 - Update JavaDoc: Use @EnableMongoRepositories instead of @EnableJpaRepositories.
* DATAMONGO-1376 - Move away from SimpleTypeInformationMapper.INSTANCE.
* DATAMONGO-1375 - Fix typo in MongoOperations JavaDoc.
* DATAMONGO-1361 - geoNear() queries fail when the accompanying query returns no results.
* DATAMONGO-1360 - Cannot query with JSR310.
* DATAMONGO-1270 - Update documentation to reflect deprecation of MongoFactoryBean.
Changes in version 1.9.0.M1 (2016-02-12)
----------------------------------------
* DATAMONGO-1380 - Improve logging in MongoChangeSetPersister.
* DATAMONGO-1378 - Update reference documentation: Change Query.sort() to Query.with(Sort sort).
* DATAMONGO-1377 - Update JavaDoc: Use @EnableMongoRepositories instead of @EnableJpaRepositories.
* DATAMONGO-1376 - Move away from SimpleTypeInformationMapper.INSTANCE.
* DATAMONGO-1375 - Fix typo in MongoOperations JavaDoc.
* DATAMONGO-1372 - Add converter for Currency.
* DATAMONGO-1371 - Add code of conduct.
* DATAMONGO-1366 - Release 1.9 M1 (Hopper).
* DATAMONGO-1361 - geoNear() queries fail when the accompanying query returns no results.
* DATAMONGO-1360 - Cannot query with JSR310.
* DATAMONGO-1349 - Upgrade to mongo-java-driver 2.14.0.
* DATAMONGO-1346 - Cannot add two pullAll to an Update.
* DATAMONGO-1345 - Add support for projections on repository query methods.
* DATAMONGO-1342 - Potential NullPointerException in MongoQueryCreator.nextAsArray(…).
* DATAMONGO-1341 - Remove package cycle between core and core.index.
* DATAMONGO-1337 - General code quality improvements.
* DATAMONGO-1335 - DBObjectAccessor doesn't write properties correctly if multiple ones are nested.
* DATAMONGO-1334 - MapResultOptions limit not implemented.
* DATAMONGO-1324 - StringToObjectIdConverter not properly registered causing drop in performance on identifier conversion.
* DATAMONGO-1317 - Assert compatibility with MongoDB Java driver 3.2.
* DATAMONGO-1314 - Fix typo in Exception message.
* DATAMONGO-1312 - Cannot convert generic sub-document fields.
* DATAMONGO-1303 - Add build profile for MongoDB 3.1 driver.
* DATAMONGO-1302 - CustomConversions should allow registration of ConverterFactory.
* DATAMONGO-1297 - Unique Index on DBRef.
* DATAMONGO-1293 - MongoDbFactoryParser should allow id attribute in addition to client-uri.
* DATAMONGO-1291 - Allow @Document to be used as meta-annotation.
* DATAMONGO-1290 - @Query annotation with byte[] parameter does not work.
* DATAMONGO-1289 - NullPointerException when saving an object with no "id" field or @Id annotation.
* DATAMONGO-1288 - Update.inc(String, Number) method fails to work with AtomicInteger.
* DATAMONGO-1287 - MappingMongoConverter eagerly fetches and converts lazy DbRef to change them afterwards by proxies.
* DATAMONGO-1276 - MongoTemplate.CloseableIterableCursorAdapter does not null check return values from PersistenceExceptionTranslator.
* DATAMONGO-1270 - Update documentation to reflect deprecation of MongoFactoryBean.
* DATAMONGO-1238 - Support for Querydsl 4.
* DATAMONGO-1204 - ObjectPath equality check breaks due to changes MongoDB V3.
* DATAMONGO-1163 - Allow @Indexed to be used as meta-annotation.
* DATAMONGO-934 - Add support for the bulk operations introduced in MongoDB 2.6.
Changes in version 1.8.2.RELEASE (2015-12-18)
---------------------------------------------
* DATAMONGO-1355 - Release 1.8.2 (Gosling).
* DATAMONGO-1346 - Cannot add two pullAll to an Update.
* DATAMONGO-1342 - Potential NullPointerException in MongoQueryCreator.nextAsArray(…).
* DATAMONGO-1337 - General code quality improvements.
* DATAMONGO-1335 - DBObjectAccessor doesn't write properties correctly if multiple ones are nested.
* DATAMONGO-1334 - MapResultOptions limit not implemented.
* DATAMONGO-1324 - StringToObjectIdConverter not properly registered causing drop in performance on identifier conversion.
* DATAMONGO-1317 - Assert compatibility with MongoDB Java driver 3.2.
* DATAMONGO-1290 - @Query annotation with byte[] parameter does not work.
* DATAMONGO-1289 - NullPointerException when saving an object with no "id" field or @Id annotation.
* DATAMONGO-1287 - MappingMongoConverter eagerly fetches and converts lazy DbRef to change them afterwards by proxies.
* DATAMONGO-1204 - ObjectPath equality check breaks due to changes MongoDB V3.
Changes in version 1.8.1.RELEASE (2015-11-15)
---------------------------------------------
* DATAMONGO-1316 - Release 1.8.1 (Gosling).
* DATAMONGO-1312 - Cannot convert generic sub-document fields.
* DATAMONGO-1302 - CustomConversions should allow registration of ConverterFactory.
* DATAMONGO-1297 - Unique Index on DBRef.
* DATAMONGO-1293 - MongoDbFactoryParser should allow id attribute in addition to client-uri.
* DATAMONGO-1276 - MongoTemplate.CloseableIterableCursorAdapter does not null check return values from PersistenceExceptionTranslator.
Changes in version 1.6.4.RELEASE (2015-10-14)
---------------------------------------------
* DATAMONGO-1304 - Release 1.6.4 (Evans).
Changes in version 1.8.0.RELEASE (2015-09-01)
---------------------------------------------
* DATAMONGO-1282 - Release 1.8 GA (Gosling).

View File

@@ -1,4 +1,4 @@
Spring Data MongoDB 1.8.5
Copyright (c) [2010-2015] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").