Compare commits


16 Commits

Author SHA1 Message Date
Spring Buildmaster
9375e7b981 DATAMONGO-492 - Release 1.0.3.RELEASE. 2012-07-24 07:01:21 -07:00
Oliver Gierke
30a4682369 DATAMONGO-492 - Prepare changelog for 1.0.3.RELEASE. 2012-07-24 15:34:33 +02:00
Oliver Gierke
356e6acd43 DATAMONGO-474 - Populating ids after save now inspects the field only.
So far, the algorithm that inspects whether an id property has to be set after a save(…) operation used the plain BeanWrapper.getProperty(PersistentProperty property) method. This caused problems when the getter of the id field returned something entirely different (to be precise: a complex type not convertible out of the box).

We now inspect the id field only to retrieve the value.
2012-07-24 13:30:04 +02:00
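The underlying idea, as a minimal sketch in plain reflection (the helper class and method are hypothetical, not the Spring Data internals): the value is read from the declared id field directly, so a getter exposing a different, non-convertible type no longer gets in the way.

import java.lang.reflect.Field;

class IdFieldInspection {

    // Hypothetical helper: read the raw id value straight from the backing field,
    // bypassing any getter that might return a non-convertible representation.
    static Object readIdField(Object entity, String idFieldName) throws ReflectiveOperationException {
        Field field = entity.getClass().getDeclaredField(idFieldName);
        field.setAccessible(true);
        return field.get(entity); // null means the id still needs to be populated
    }
}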
Oliver Gierke
09ed4aaf24 DATAMONGO-489 - Ensure read collections get converted to the appropriate target type.
When reading BasicDBLists we now make sure the resulting collection is eventually converted into the actual target type. That type might be an array and thus needs an additional round of massaging before being returned as the value.
2012-07-23 16:41:28 +02:00
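A minimal sketch of that extra conversion step, assuming a hypothetical helper outside the converter: the elements are read into a List first, and a trailing step copies them into an array when the target property is an array type.

import java.lang.reflect.Array;
import java.util.List;

class ReadCollectionMassaging {

    // Hypothetical helper: return the items unchanged for collection targets, but
    // copy them into a newly created array when the target type is an array.
    static Object massageToTargetType(List<Object> items, Class<?> targetType) {
        if (!targetType.isArray()) {
            return items;
        }
        Object array = Array.newInstance(targetType.getComponentType(), items.size());
        for (int i = 0; i < items.size(); i++) {
            Array.set(array, i, items.get(i));
        }
        return array;
    }
}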
Oliver Gierke
c7995eb462 DATAMONGO-485 - Added test case to show complex ids are working. 2012-07-17 12:46:34 +02:00
Amol Nayak
2e6a9a6ee7 DATAMONGO-480 - Consider WriteResult for insert(…) and save(…) methods. 2012-07-16 18:53:38 +02:00
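Condensed into a small sketch, using only the driver calls visible in the diff below (the helper class itself is hypothetical): the WriteResult returned by the 2.x driver is captured and any reported error is surfaced as an exception.

import org.springframework.dao.DataIntegrityViolationException;

import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.WriteResult;

class CheckedInsert {

    // Hypothetical helper mirroring the new behaviour: keep the WriteResult and
    // fail fast if the driver reports an error for the write.
    static void insertChecked(DBCollection collection, DBObject document) {
        WriteResult result = collection.insert(document);
        String error = result.getError(); // null when the write succeeded
        if (error != null) {
            throw new DataIntegrityViolationException("Insert of " + document + " failed: " + error);
        }
    }
}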
Oliver Gierke
a82fbade95 DATAMONGO-483 - Indexes now use the field name even if index name is defined. 2012-07-16 18:36:56 +02:00
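The effect is easiest to see on an annotated property; the snippet mirrors the unit test added further down and is only illustrative.

import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Field;

class IndexedPerson {

    // The index key becomes { "fieldname" : 1 } while "indexName" is only used as
    // the index name; previously the name attribute leaked into the index key.
    @Indexed(name = "indexName")
    @Field("fieldname")
    String field;
}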
Oliver Gierke
7a9ba3fe3e DATAMONGO-482 - Fixed typo in reference documentation. 2012-07-16 17:43:01 +02:00
Oliver Gierke
134e7762a7 DATAMONGO-474 - Fixed criteria mapping for MongoTemplate.group(…).
The criteria object handed to the group operation needs to be mapped correctly so that complex values are converted. Improved error handling along the way.
2012-07-16 16:37:18 +02:00
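In condensed form (the helper class is hypothetical; the calls are the ones visible in the diff below): the group condition is run through the QueryMapper before it is put into the command document.

import org.springframework.data.mongodb.core.QueryMapper;
import org.springframework.data.mongodb.core.query.Criteria;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class GroupConditionMapping {

    // Hypothetical helper: map the criteria the same way regular queries are
    // mapped so complex values (e.g. ObjectIds) end up properly converted.
    static DBObject groupCondition(QueryMapper mapper, Criteria criteria) {
        DBObject dbo = new BasicDBObject();
        dbo.put("cond", mapper.getMappedObject(criteria.getCriteriaObject(), null));
        return dbo;
    }
}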
Oliver Gierke
e41299ff38 DATAMONGO-475 - Fixed debug output in map-reduce operations.
We now use SerializationUtils.serializeToJsonSafely(…) instead of plain toString(), as the latter might cause SerializationExceptions for complex objects.
2012-07-16 14:55:32 +02:00
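The logging pattern in isolation (logger name and message are illustrative only): command and query documents are rendered via serializeToJsonSafely(…), which degrades gracefully where a plain toString() would fail.

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.mongodb.core.SerializationUtils;

import com.mongodb.DBObject;

class SafeDebugLogging {

    private static final Log LOGGER = LogFactory.getLog(SafeDebugLogging.class);

    // Illustrative only: serialize the document safely before logging it.
    static void logCommand(DBObject command) {
        if (LOGGER.isDebugEnabled()) {
            LOGGER.debug(String.format("Executing command: %s",
                SerializationUtils.serializeToJsonSafely(command)));
        }
    }
}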
Oliver Gierke
5cf7a86023 DATAMONGO-469 - Fixed parsing of And keyword in derived queries. 2012-06-26 13:44:22 +02:00
Oliver Gierke
0aacb887de DATAMONGO-470 - Implemented equals(…) and hashCode() for Query and Criteria. 2012-06-26 13:36:00 +02:00
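A short usage sketch: two structurally identical queries now compare equal, which the reworked MongoQueryCreator tests further down rely on.

import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.query.Query;

class QueryEqualityExample {

    // Both queries describe { "firstName" : "Oliver" } and are now equal.
    static boolean sameQuery() {
        Query first = new Query(where("firstName").is("Oliver"));
        Query second = new Query(where("firstName").is("Oliver"));
        return first.equals(second) && first.hashCode() == second.hashCode(); // expected: true
    }
}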
Oliver Gierke
ba81f21aba DATAMONGO-467 - Fix identifier handling for Querydsl.
As we try to massage the value of the id property into an ObjectId where possible, we need to do the same when mapping the Querydsl query. Adapted SpringDataMongodbSerializer accordingly.
2012-06-25 13:11:37 +02:00
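In condensed form (the helper is hypothetical; convertId(…) is the call used in the diff further down): values bound to the _id key are handed to the QueryMapper, which turns a valid hex String into an ObjectId where possible.

import org.springframework.data.mongodb.core.QueryMapper;

class QuerydslIdConversion {

    // Hypothetical helper: delegate id values to the QueryMapper so that e.g.
    // a 24-character hex String is converted into an ObjectId.
    static Object convertIdValue(QueryMapper mapper, Object rawId) {
        return mapper.convertId(rawId);
    }
}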
Oliver Gierke
43dee69fe0 DATAMONGO-464 - Fixed resource synchronization in MongoDbUtils.
MongoDbUtils now correctly returns DB instances for databases other than the first one bound. Previously, the lookup for an alternate database returned the first database bound.
2012-06-25 11:16:31 +02:00
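The behaviour can be exercised much like the new unit test at the bottom of the diff; a compressed version (requires a running MongoDB instance, exception handling simplified):

import org.springframework.data.mongodb.core.MongoDbUtils;
import org.springframework.transaction.support.TransactionSynchronizationManager;

import com.mongodb.DB;
import com.mongodb.Mongo;

class MongoDbUtilsMultiDbExample {

    public static void main(String[] args) throws Exception {
        Mongo mongo = new Mongo();
        TransactionSynchronizationManager.initSynchronization();

        DB first = MongoDbUtils.getDB(mongo, "first");
        DB second = MongoDbUtils.getDB(mongo, "second");

        // Before the fix the second lookup returned the DB bound first;
        // now each database name resolves to its own instance.
        System.out.println(first.getName());  // first
        System.out.println(second.getName()); // second
    }
}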
Oliver Gierke
1be1297ef9 DATAMONGO-466 - QueryMapper now only tries id conversion for the top-level document.
So far the QueryMapper tried to map id properties of nested documents to ObjectIds, which it should not do.
2012-06-22 15:14:37 +02:00
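Expressed as a small before/after on a query document (mirroring the unit test added further down; the mapper and the entity metadata are assumed to be supplied by the caller):

import org.bson.types.ObjectId;
import org.springframework.data.mongodb.core.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

class TopLevelIdMappingOnly {

    static DBObject mapQuery(QueryMapper mapper, MongoPersistentEntity<?> entity) {
        BasicDBObject query = new BasicDBObject("id", new ObjectId().toString());
        query.put("nested", new BasicDBObject("id", new ObjectId().toString()));

        DBObject mapped = mapper.getMappedObject(query, entity);
        // mapped.get("_id") is now an ObjectId, while the nested "id" stays a String.
        return mapped;
    }
}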
Spring Buildmaster
dad0789356 DATAMONGO-463 - Prepare next development iteration. 2012-06-20 03:53:31 -07:00
25 changed files with 722 additions and 137 deletions

View File

@@ -6,7 +6,7 @@
<name>Spring Data MongoDB Distribution</name>
<description>Spring Data project for MongoDB</description>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.0.2.RELEASE</version>
<version>1.0.3.RELEASE</version>
<packaging>pom</packaging>
<modules>
<module>spring-data-mongodb</module>

View File

@@ -4,7 +4,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.2.RELEASE</version>
<version>1.0.3.RELEASE</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>

View File

@@ -4,7 +4,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.2.RELEASE</version>
<version>1.0.3.RELEASE</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-log4j</artifactId>

View File

@@ -6,7 +6,7 @@
<name>Spring Data MongoDB Parent</name>
<description>Spring Data project for MongoDB</description>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.0.2.RELEASE</version>
<version>1.0.3.RELEASE</version>
<packaging>pom</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
@@ -198,7 +198,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.0.2.RELEASE</version>
<version>1.0.3.RELEASE</version>
</dependency>
<!-- Logging -->

View File

@@ -4,7 +4,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.2.RELEASE</version>
<version>1.0.3.RELEASE</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb</artifactId>

View File

@@ -15,14 +15,15 @@
*/
package org.springframework.data.mongodb.core;
import com.mongodb.DB;
import com.mongodb.Mongo;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Helper class featuring helper methods for internal MongoDb classes.
* <p/>
@@ -78,7 +79,7 @@ public abstract class MongoDbUtils {
DB db = null;
if (TransactionSynchronizationManager.isSynchronizationActive() && dbHolder.doesNotHoldNonDefaultDB()) {
// Spring transaction management is active ->
db = dbHolder.getDB();
db = dbHolder.getDB(databaseName);
if (db != null && !dbHolder.isSynchronizedWithTransaction()) {
LOGGER.debug("Registering Spring transaction synchronization for existing Mongo DB");
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(dbHolder, mongo));
@@ -110,9 +111,9 @@ public abstract class MongoDbUtils {
LOGGER.debug("Registering Spring transaction synchronization for new Hibernate Session");
DbHolder holderToUse = dbHolder;
if (holderToUse == null) {
holderToUse = new DbHolder(db);
holderToUse = new DbHolder(databaseName, db);
} else {
holderToUse.addDB(db);
holderToUse.addDB(databaseName, db);
}
TransactionSynchronizationManager.registerSynchronization(new MongoSynchronization(holderToUse, mongo));
holderToUse.setSynchronizedWithTransaction(true);

View File

@@ -109,6 +109,7 @@ import com.mongodb.util.JSON;
* @author Graeme Rocher
* @author Mark Pollack
* @author Oliver Gierke
* @author Amol Nayak
*/
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
@@ -334,11 +335,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
Assert.notNull(query);
DBObject queryObject = query.getQueryObject();
DBObject sortObject = query.getSortObject();
DBObject fieldsObject = query.getFieldsObject();
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("find using query: " + queryObject + " fields: " + fieldsObject + " in collection: "
+ collectionName);
LOGGER.debug(String.format("Executing query: %s sort: %s fields: %s in collection: $s",
SerializationUtils.serializeToJsonSafely(queryObject), sortObject, fieldsObject, collectionName));
}
this.executeQueryInternal(new FindCallback(queryObject, fieldsObject), preparer, dch, collectionName);
@@ -733,11 +735,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT, collectionName,
entityClass, dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
collection.insert(dbDoc);
wr = collection.insert(dbDoc);
} else {
collection.insert(dbDoc, writeConcernToUse);
wr = collection.insert(dbDoc, writeConcernToUse);
}
handleAnyWriteResultErrors(wr, dbDoc, "insert");
return dbDoc.get(ID);
}
});
@@ -756,11 +760,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.INSERT_LIST, collectionName, null,
null, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
collection.insert(dbDocList);
wr = collection.insert(dbDocList);
} else {
collection.insert(dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
wr = collection.insert(dbDocList.toArray((DBObject[]) new BasicDBObject[dbDocList.size()]), writeConcernToUse);
}
handleAnyWriteResultErrors(wr, null, "insert_list");
return null;
}
});
@@ -787,11 +793,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
MongoAction mongoAction = new MongoAction(writeConcern, MongoActionOperation.SAVE, collectionName, entityClass,
dbDoc, null);
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
WriteResult wr;
if (writeConcernToUse == null) {
collection.save(dbDoc);
wr = collection.save(dbDoc);
} else {
collection.save(dbDoc, writeConcernToUse);
wr = collection.save(dbDoc, writeConcernToUse);
}
handleAnyWriteResultErrors(wr, dbDoc, "save");
return dbDoc.get(ID);
}
});
@@ -879,7 +887,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Returns a {@link Query} for the given entity by its id.
*
*
* @param object must not be {@literal null}.
* @return
*/
@@ -997,26 +1005,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
LOGGER.debug("Executing MapReduce on collection [" + command.getInput() + "], mapFunction [" + mapFunc
+ "], reduceFunction [" + reduceFunc + "]");
}
CommandResult commandResult = null;
try {
if (command.getOutputType() == MapReduceCommand.OutputType.INLINE) {
commandResult = executeCommand(commandObject, getDb().getOptions());
} else {
commandResult = executeCommand(commandObject);
}
commandResult.throwOnError();
} catch (RuntimeException ex) {
this.potentiallyConvertRuntimeException(ex);
}
String error = commandResult.getErrorMessage();
if (error != null) {
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ commandObject);
}
CommandResult commandResult = command.getOutputType() == MapReduceCommand.OutputType.INLINE ? executeCommand(
commandObject, getDb().getOptions()) : executeCommand(commandObject);
handleCommandError(commandResult, commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("MapReduce command result = [" + commandResult + "]");
LOGGER.debug(String.format("MapReduce command result = [%s]",
SerializationUtils.serializeToJsonSafely(commandObject)));
}
MapReduceOutput mapReduceOutput = new MapReduceOutput(inputCollection, commandObject, commandResult);
List<T> mappedResults = new ArrayList<T>();
DbObjectCallback<T> callback = new ReadDbObjectCallback<T>(mongoConverter, entityClass);
@@ -1041,7 +1039,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
if (criteria == null) {
dbo.put("cond", null);
} else {
dbo.put("cond", criteria.getCriteriaObject());
dbo.put("cond", mapper.getMappedObject(criteria.getCriteriaObject(), null));
}
// If initial document was a JavaScript string, potentially loaded by Spring's Resource abstraction, load it and
// convert to DBObject
@@ -1067,21 +1065,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject commandObject = new BasicDBObject("group", dbo);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Executing Group with DBObject [" + commandObject.toString() + "]");
}
CommandResult commandResult = null;
try {
commandResult = executeCommand(commandObject, getDb().getOptions());
commandResult.throwOnError();
} catch (RuntimeException ex) {
this.potentiallyConvertRuntimeException(ex);
}
String error = commandResult.getErrorMessage();
if (error != null) {
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ commandObject);
LOGGER.debug(String.format("Executing Group with DBObject [%s]",
SerializationUtils.serializeToJsonSafely(commandObject)));
}
CommandResult commandResult = executeCommand(commandObject, getDb().getOptions());
handleCommandError(commandResult, commandObject);
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Group command result = [" + commandResult + "]");
}
@@ -1188,7 +1178,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Create the specified collection using the provided options
*
*
* @param collectionName
* @param collectionOptions
* @return the collection that was created
@@ -1210,7 +1200,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Map the results of an ad-hoc query on the default MongoDB collection to an object using the template's converter
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
*
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param fields the document that specifies the fields to be returned
@@ -1235,7 +1225,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* The query document is specified as a standard DBObject and so is the fields specification.
* <p/>
* Can be overridden by subclasses.
*
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param fields the document that specifies the fields to be returned
@@ -1265,7 +1255,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* Map the results of an ad-hoc query on the default MongoDB collection to a List using the template's converter.
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
*
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param fields the document that specifies the fields to be returned
@@ -1304,7 +1294,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* The first document that matches the query is returned and also removed from the collection in the database.
* <p/>
* The query document is specified as a standard DBObject and so is the fields specification.
*
*
* @param collectionName name of the collection to retrieve the objects from
* @param query the query document that specifies the criteria used to find a record
* @param entityClass the parameterized type of the returned list.
@@ -1351,7 +1341,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
/**
* Populates the id property of the saved object, if it's not set already.
*
*
* @param savedObject
* @param id
*/
@@ -1372,7 +1362,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
try {
Object idValue = wrapper.getProperty(idProp);
Object idValue = wrapper.getProperty(idProp, idProp.getType(), true);
if (idValue != null) {
return;
@@ -1404,7 +1394,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* <li>Execute the given {@link ConnectionCallback} for a {@link DBObject}.</li>
* <li>Apply the given {@link DbObjectCallback} to each of the {@link DBObject}s to obtain the result.</li>
* <ol>
*
*
* @param <T>
* @param collectionCallback the callback to retrieve the {@link DBObject} with
* @param objectCallback the {@link DbObjectCallback} to transform {@link DBObject}s into the actual domain type
@@ -1527,9 +1517,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
String error = wr.getError();
if (error != null) {
String message = String.format("Execution of %s%s failed: %s", operation, query == null ? "" : "' using '"
+ query.toString() + "' query", error);
String message;
if (operation.equals("insert") || operation.equals("save")) {
// assuming the insert operations will begin with insert string
message = String.format("Insert/Save for %s failed: %s", query, error);
} else if (operation.equals("insert_list")) {
message = String.format("Insert list failed: %s", error);
} else {
message = String.format("Execution of %s%s failed: %s", operation,
query == null ? "" : "' using '" + query.toString() + "' query", error);
}
if (WriteResultChecking.EXCEPTION == this.writeResultChecking) {
throw new DataIntegrityViolationException(message);
@@ -1552,6 +1549,27 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return resolved == null ? ex : resolved;
}
/**
* Inspects the given {@link CommandResult} for errors and potentially throws an
* {@link InvalidDataAccessApiUsageException} for that error.
*
* @param result must not be {@literal null}.
* @param source must not be {@literal null}.
*/
private void handleCommandError(CommandResult result, DBObject source) {
try {
result.throwOnError();
} catch (MongoException ex) {
String error = result.getErrorMessage();
error = error == null ? "NO MESSAGE" : error;
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ source, ex);
}
}
private static final MongoConverter getDefaultMongoConverter(MongoDbFactory factory) {
MappingMongoConverter converter = new MappingMongoConverter(factory, new MongoMappingContext());
converter.afterPropertiesSet();

View File

@@ -141,7 +141,11 @@ public class QueryMapper {
*/
private boolean isIdKey(String key, MongoPersistentEntity<?> entity) {
if (null != entity && entity.getIdProperty() != null) {
if (entity == null) {
return false;
}
if (entity.getIdProperty() != null) {
MongoPersistentProperty idProperty = entity.getIdProperty();
return idProperty.getName().equals(key) || idProperty.getFieldName().equals(key);
}

View File

@@ -728,7 +728,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @return the converted {@link Collections}, will never be {@literal null}.
*/
@SuppressWarnings("unchecked")
private Collection<?> readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue) {
private Object readCollectionOrArray(TypeInformation<?> targetType, BasicDBList sourceValue) {
Assert.notNull(targetType);
@@ -750,7 +750,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
}
return items;
return getPotentiallyConvertedSimpleRead(items, targetType.getType());
}
/**

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,9 +13,9 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
@@ -24,14 +24,29 @@ import java.lang.annotation.Target;
/**
* Mark a class to use compound indexes.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
@Target({ ElementType.TYPE })
@Documented
@Retention(RetentionPolicy.RUNTIME)
public @interface CompoundIndex {
/**
* The actual index definition in JSON format. The keys of the JSON document are the fields to be indexed, the values
* define the index direction (1 for ascending, -1 for descending).
*
* @return
*/
String def();
/**
It does not make sense to use this attribute, as the direction has to be defined in the {@link #def()}
attribute.
*
* @return
*/
@Deprecated
IndexDirection direction() default IndexDirection.ASCENDING;
boolean unique() default false;
@@ -40,8 +55,18 @@ public @interface CompoundIndex {
boolean dropDups() default false;
/**
* The name of the index to be created.
*
* @return
*/
String name() default "";
/**
* The collection the index will be created in. Will default to the collection the annotated domain class will be
* stored in.
*
* @return
*/
String collection() default "";
}

View File

@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import java.lang.reflect.Field;
@@ -27,7 +26,6 @@ import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PropertyHandler;
import org.springframework.data.mapping.event.MappingContextEvent;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -39,10 +37,10 @@ import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* Component that inspects {@link BasicMongoPersistentEntity} instances contained in the given
* {@link MongoMappingContext} for indexing metadata and ensures the indexes to be available.
* Component that inspects {@link MongoPersistentEntity} instances contained in the given {@link MongoMappingContext}
* for indexing metadata and ensures the indexes to be available.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Jon Brisbin
* @author Oliver Gierke
*/
public class MongoPersistentEntityIndexCreator implements
@@ -97,12 +95,12 @@ public class MongoPersistentEntityIndexCreator implements
if (type.isAnnotationPresent(CompoundIndexes.class)) {
CompoundIndexes indexes = type.getAnnotation(CompoundIndexes.class);
for (CompoundIndex index : indexes.value()) {
String indexColl = index.collection();
if ("".equals(indexColl)) {
indexColl = entity.getCollection();
}
ensureIndex(indexColl, index.name(), index.def(), index.direction(), index.unique(), index.dropDups(),
index.sparse());
String indexColl = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
DBObject definition = (DBObject) JSON.parse(index.def());
ensureIndex(indexColl, index.name(), definition, index.unique(), index.dropDups(), index.sparse());
if (log.isDebugEnabled()) {
log.debug("Created compound index " + index);
}
@@ -111,10 +109,14 @@ public class MongoPersistentEntityIndexCreator implements
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
public void doWithPersistentProperty(MongoPersistentProperty persistentProperty) {
Field field = persistentProperty.getField();
if (field.isAnnotationPresent(Indexed.class)) {
Indexed index = field.getAnnotation(Indexed.class);
String name = index.name();
if (!StringUtils.hasText(name)) {
name = persistentProperty.getFieldName();
} else {
@@ -126,11 +128,17 @@ public class MongoPersistentEntityIndexCreator implements
}
}
}
String collection = StringUtils.hasText(index.collection()) ? index.collection() : entity.getCollection();
ensureIndex(collection, name, null, index.direction(), index.unique(), index.dropDups(), index.sparse());
int direction = index.direction() == IndexDirection.ASCENDING ? 1 : -1;
DBObject definition = new BasicDBObject(persistentProperty.getFieldName(), direction);
ensureIndex(collection, name, definition, index.unique(), index.dropDups(), index.sparse());
if (log.isDebugEnabled()) {
log.debug("Created property index " + index);
}
} else if (field.isAnnotationPresent(GeoSpatialIndexed.class)) {
GeoSpatialIndexed index = field.getAnnotation(GeoSpatialIndexed.class);
@@ -155,21 +163,15 @@ public class MongoPersistentEntityIndexCreator implements
}
}
protected void ensureIndex(String collection, final String name, final String def, final IndexDirection direction,
final boolean unique, final boolean dropDups, final boolean sparse) {
DBObject defObj;
if (null != def) {
defObj = (DBObject) JSON.parse(def);
} else {
defObj = new BasicDBObject();
defObj.put(name, (direction == IndexDirection.ASCENDING ? 1 : -1));
}
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse) {
DBObject opts = new BasicDBObject();
opts.put("name", name);
opts.put("dropDups", dropDups);
opts.put("sparse", sparse);
opts.put("unique", unique);
mongoDbFactory.getDb().getCollection(collection).ensureIndex(defObj, opts);
}
mongoDbFactory.getDb().getCollection(collection).ensureIndex(indexDefinition, opts);
}
}

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.springframework.util.ObjectUtils.*;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
@@ -29,6 +31,7 @@ import org.springframework.data.mongodb.core.geo.Circle;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.geo.Shape;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
@@ -429,11 +432,9 @@ public class Criteria implements CriteriaDefinition {
}
/*
* (non-Javadoc)
*
* @see org.springframework.datastore.document.mongodb.query.Criteria#
* getCriteriaObject(java.lang.String)
*/
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.CriteriaDefinition#getCriteriaObject()
*/
public DBObject getCriteriaObject() {
if (this.criteriaChain.size() == 1) {
return criteriaChain.get(0).getSingleCriteriaObject();
@@ -496,4 +497,63 @@ public class Criteria implements CriteriaDefinition {
}
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Criteria that = (Criteria) obj;
boolean keyEqual = this.key == null ? that.key == null : this.key.equals(that.key);
boolean criteriaEqual = this.criteria.equals(that.criteria);
boolean valueEqual = isEqual(this.isValue, that.isValue);
return keyEqual && criteriaEqual && valueEqual;
}
/**
* Checks the given objects for equality. Handles {@link Pattern} and arrays correctly.
*
* @param left
* @param right
* @return
*/
private boolean isEqual(Object left, Object right) {
if (left == null) {
return right == null;
}
if (left instanceof Pattern) {
return right instanceof Pattern ? ((Pattern) left).pattern().equals(((Pattern) right).pattern()) : false;
}
return ObjectUtils.nullSafeEquals(left, right);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += nullSafeHashCode(key);
result += criteria.hashCode();
result += nullSafeHashCode(isValue);
return result;
}
}

View File

@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.core.query;
import static org.springframework.data.mongodb.core.SerializationUtils.*;
import static org.springframework.util.ObjectUtils.*;
import java.util.ArrayList;
import java.util.LinkedHashMap;
@@ -152,4 +153,50 @@ public class Query {
return String.format("Query: %s, Fields: %s, Sort: %s", serializeToJsonSafely(getQueryObject()),
serializeToJsonSafely(getFieldsObject()), serializeToJsonSafely(getSortObject()));
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Query that = (Query) obj;
boolean criteriaEqual = this.criteria.equals(that.criteria);
boolean fieldsEqual = this.fieldSpec == null ? that.fieldSpec == null : this.fieldSpec.equals(that.fieldSpec);
boolean sortEqual = this.sort == null ? that.sort == null : this.sort.equals(that.sort);
boolean hintEqual = this.hint == null ? that.hint == null : this.hint.equals(that.hint);
boolean skipEqual = this.skip == that.skip;
boolean limitEqual = this.limit == that.limit;
return criteriaEqual && fieldsEqual && sortEqual && hintEqual && skipEqual && limitEqual;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * criteria.hashCode();
result += 31 * nullSafeHashCode(fieldSpec);
result += 31 * nullSafeHashCode(sort);
result += 31 * nullSafeHashCode(hint);
result += 31 * skip;
result += 31 * limit;
return result;
}
}

View File

@@ -116,13 +116,9 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
return create(part, iterator);
}
PersistentPropertyPath<MongoPersistentProperty> path2 = context.getPersistentPropertyPath(part.getProperty());
Criteria criteria = from(part.getType(),
where(path2.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
PersistentPropertyPath<MongoPersistentProperty> path = context.getPersistentPropertyPath(part.getProperty());
return from(part.getType(), where(path.toDotPath(MongoPersistentProperty.PropertyToFieldNameConverter.INSTANCE)),
(PotentiallyConvertingIterator) iterator);
return criteria.andOperator(criteria);
}
/*

View File

@@ -28,6 +28,7 @@ import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.QueryMapper;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -235,6 +236,7 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
private final MongoConverter converter;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final QueryMapper mapper;
/**
* Creates a new {@link SpringDataMongodbSerializer} for the given {@link MappingContext}.
@@ -244,6 +246,7 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
public SpringDataMongodbSerializer(MongoConverter converter) {
this.mappingContext = converter.getMappingContext();
this.converter = converter;
this.mapper = new QueryMapper(converter);
}
@Override
@@ -258,6 +261,10 @@ public class QueryDslMongoRepository<T, ID extends Serializable> extends SimpleM
@Override
protected DBObject asDBObject(String key, Object value) {
if ("_id".equals(key)) {
return super.asDBObject(key, mapper.convertId(value));
}
return super.asDBObject(key, value instanceof Pattern ? value : converter.convertToMongoType(value));
}
}

View File

@@ -0,0 +1,65 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import com.mongodb.DB;
import com.mongodb.Mongo;
/**
* Unit tests for {@link MongoDbUtils}.
*
* @author Oliver Gierke
*/
public class MongoDbUtilsUnitTests {
Mongo mongo;
@Before
public void setUp() throws Exception {
this.mongo = new Mongo();
TransactionSynchronizationManager.initSynchronization();
}
@After
public void tearDown() {
for (Object key : TransactionSynchronizationManager.getResourceMap().keySet()) {
TransactionSynchronizationManager.unbindResource(key);
}
TransactionSynchronizationManager.clearSynchronization();
}
@Test
public void returnsNewInstanceForDifferentDatabaseName() {
DB first = MongoDbUtils.getDB(mongo, "first");
assertThat(first, is(notNullValue()));
assertThat(MongoDbUtils.getDB(mongo, "first"), is(first));
DB second = MongoDbUtils.getDB(mongo, "second");
assertThat(second, is(not(first)));
assertThat(MongoDbUtils.getDB(mongo, "second"), is(second));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -39,6 +39,7 @@ import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.convert.converter.Converter;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
@@ -71,6 +72,7 @@ import com.mongodb.WriteResult;
*
* @author Oliver Gierke
* @author Thomas Risberg
* @author Amol Nayak
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
@@ -160,6 +162,112 @@ public class MongoTemplateTests {
mongoTemplate.updateFirst(q, u, Person.class);
}
/**
* @see DATAMONGO-480
*/
@Test
public void throwsExceptionForDuplicateIds() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
Person person = new Person(new ObjectId(), "Amol");
person.setAge(28);
template.insert(person);
try {
template.insert(person);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(e.getMessage(), containsString("E11000 duplicate key error index: database.person.$_id_ dup key:"));
}
}
/**
* @see DATAMONGO-480
*/
@Test
public void throwsExceptionForUpdateWithInvalidPushOperator() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
ObjectId id = new ObjectId();
Person person = new Person(id, "Amol");
person.setAge(28);
template.insert(person);
try {
Query query = new Query(Criteria.where("firstName").is("Amol"));
Update upd = new Update().push("age", 29);
template.updateFirst(query, upd, Person.class);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(e.getMessage(),
is("Execution of update with '{ \"$push\" : { \"age\" : 29}}'' using '{ \"firstName\" : \"Amol\"}' "
+ "query failed: Cannot apply $push/$pushAll modifier to non-array"));
}
}
/**
* @see DATAMONGO-480
*/
@Test
public void throwsExceptionForIndexViolationIfConfigured() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
template.indexOps(Person.class).ensureIndex(new Index().on("firstName", Order.DESCENDING).unique());
Person person = new Person(new ObjectId(), "Amol");
person.setAge(28);
template.save(person);
person = new Person(new ObjectId(), "Amol");
person.setAge(28);
try {
template.save(person);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(e.getMessage(),
containsString("E11000 duplicate key error index: database.person.$firstName_-1 dup key:"));
}
}
/**
* @see DATAMONGO-480
*/
@Test
public void rejectsDuplicateIdInInsertAll() {
MongoTemplate template = new MongoTemplate(factory);
template.setWriteResultChecking(WriteResultChecking.EXCEPTION);
ObjectId id = new ObjectId();
Person person = new Person(id, "Amol");
person.setAge(28);
List<Person> records = new ArrayList<Person>();
records.add(person);
records.add(person);
try {
template.insertAll(records);
fail("Expected DataIntegrityViolationException!");
} catch (DataIntegrityViolationException e) {
assertThat(
e.getMessage(),
startsWith("Insert list failed: E11000 duplicate key error index: database.person.$_id_ dup key: { : ObjectId"));
}
}
@Test
public void testEnsureIndex() throws Exception {
@@ -190,7 +298,7 @@ public class MongoTemplateTests {
assertThat(dropDupes, is(true));
List<IndexInfo> indexInfoList = template.indexOps(Person.class).getIndexInfo();
System.out.println(indexInfoList);
assertThat(indexInfoList.size(), is(2));
IndexInfo ii = indexInfoList.get(1);
assertThat(ii.isUnique(), is(true));
@@ -932,7 +1040,7 @@ public class MongoTemplateTests {
DBRef first = new DBRef(factory.getDb(), "foo", new ObjectId());
DBRef second = new DBRef(factory.getDb(), "bar", new ObjectId());
template.updateFirst(null, Update.update("dbRefs", Arrays.asList(first, second)), ClassWithDBRefs.class);
template.updateFirst(null, update("dbRefs", Arrays.asList(first, second)), ClassWithDBRefs.class);
}
class ClassWithDBRefs {
@@ -1095,7 +1203,7 @@ public class MongoTemplateTests {
template.save(second);
Query query = query(where("field").not().regex("Matthews"));
System.out.println(query.getQueryObject());
List<Sample> result = template.find(query, Sample.class);
assertThat(result.size(), is(1));
assertThat(result.get(0).field, is("Beauford"));

View File

@@ -20,6 +20,7 @@ import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import java.math.BigInteger;
import java.util.regex.Pattern;
import org.bson.types.ObjectId;
import org.junit.Before;
@@ -136,6 +137,31 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
assertThat(entity.id, is(notNullValue()));
}
/**
* @see DATAMONGO-474
*/
@Test
public void setsUnpopulatedIdField() {
NotAutogenerateableId entity = new NotAutogenerateableId();
template.populateIdIfNecessary(entity, 5);
assertThat(entity.id, is(5));
}
/**
* @see DATAMONGO-474
*/
@Test
public void doesNotSetAlreadyPopulatedId() {
NotAutogenerateableId entity = new NotAutogenerateableId();
entity.id = 5;
template.populateIdIfNecessary(entity, 7);
assertThat(entity.id, is(5));
}
class AutogenerateableId {
@Id
@@ -146,6 +172,10 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Id
Integer id;
public Pattern getId() {
return Pattern.compile(".");
}
}
/**
@@ -161,9 +191,10 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
return template;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoOperationsUnitTests#getOperations()
*/
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperationsUnitTests#getOperationsForExceptionHandling()
*/
@Override
protected MongoOperations getOperationsForExceptionHandling() {
MongoTemplate template = spy(this.template);
@@ -171,9 +202,10 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
return template;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.MongoOperationsUnitTests#getOperations()
*/
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperationsUnitTests#getOperations()
*/
@Override
protected MongoOperations getOperations() {
return this.template;

View File

@@ -1059,6 +1059,63 @@ public class MappingMongoConverterUnitTests {
assertThat(sink.get("url"), is((Object) "http://springsource.org"));
}
/**
* @see DATAMONGO-485
*/
@Test
public void writesComplexIdCorrectly() {
ComplexId id = new ComplexId();
id.innerId = 4711L;
ClassWithComplexId entity = new ClassWithComplexId();
entity.complexId = id;
DBObject dbObject = new BasicDBObject();
converter.write(entity, dbObject);
Object idField = dbObject.get("_id");
assertThat(idField, is(notNullValue()));
assertThat(idField, is(instanceOf(DBObject.class)));
assertThat(((DBObject) idField).get("innerId"), is((Object) 4711L));
}
/**
* @see DATAMONGO-485
*/
@Test
public void readsComplexIdCorrectly() {
DBObject innerId = new BasicDBObject("innerId", 4711L);
DBObject entity = new BasicDBObject("_id", innerId);
ClassWithComplexId result = converter.read(ClassWithComplexId.class, entity);
assertThat(result.complexId, is(notNullValue()));
assertThat(result.complexId.innerId, is(4711L));
}
/**
* @see DATAMONGO-489
*/
@Test
public void readsArraysAsMapValuesCorrectly() {
BasicDBList list = new BasicDBList();
list.add("Foo");
list.add("Bar");
DBObject map = new BasicDBObject("key", list);
DBObject wrapper = new BasicDBObject("mapOfStrings", map);
ClassWithMapProperty result = converter.read(ClassWithMapProperty.class, wrapper);
assertThat(result.mapOfStrings, is(notNullValue()));
String[] values = result.mapOfStrings.get("key");
assertThat(values, is(notNullValue()));
assertThat(values, is(arrayWithSize(2)));
}
static class GenericType<T> {
T content;
}
@@ -1120,6 +1177,7 @@ public class MappingMongoConverterUnitTests {
Map<Locale, String> map;
Map<String, List<String>> mapOfLists;
Map<String, Object> mapOfObjects;
Map<String, String[]> mapOfStrings;
}
static class ClassWithNestedMaps {
@@ -1194,6 +1252,16 @@ public class MappingMongoConverterUnitTests {
URL url;
}
static class ClassWithComplexId {
@Id
ComplexId complexId;
}
static class ComplexId {
Long innerId;
}
private class LocalDateToDateConverter implements Converter<LocalDate, Date> {
public Date convert(LocalDate source) {

View File

@@ -0,0 +1,82 @@
/*
* Copyright 2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.util.Collections;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import com.mongodb.DBObject;
/**
* Unit tests for {@link MongoPersistentEntityIndexCreator}.
*
* @author Oliver Gierke
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoPersistentEntityIndexCreatorUnitTests {
@Mock
MongoDbFactory factory;
@Test
public void buildsIndexDefinitionUsingFieldName() {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(Collections.singleton(Person.class));
mappingContext.afterPropertiesSet();
DummyMongoPersistentEntityIndexCreator creator = new DummyMongoPersistentEntityIndexCreator(mappingContext, factory);
assertThat(creator.indexDefinition, is(notNullValue()));
assertThat(creator.indexDefinition.keySet(), hasItem("fieldname"));
assertThat(creator.name, is("indexName"));
}
static class Person {
@Indexed(name = "indexName")
@Field("fieldname")
String field;
}
static class DummyMongoPersistentEntityIndexCreator extends MongoPersistentEntityIndexCreator {
DBObject indexDefinition;
String name;
public DummyMongoPersistentEntityIndexCreator(MongoMappingContext mappingContext, MongoDbFactory mongoDbFactory) {
super(mappingContext, mongoDbFactory);
}
@Override
protected void ensureIndex(String collection, String name, DBObject indexDefinition, boolean unique,
boolean dropDups, boolean sparse) {
this.name = name;
this.indexDefinition = indexDefinition;
}
}
}

View File

@@ -83,7 +83,7 @@ public class QueryMapperUnitTests {
public void convertsStringIntoObjectId() {
DBObject query = new BasicDBObject("_id", new ObjectId().toString());
DBObject result = mapper.getMappedObject(query, null);
DBObject result = mapper.getMappedObject(query, context.getPersistentEntity(IdWrapper.class));
assertThat(result.get("_id"), is(instanceOf(ObjectId.class)));
}
@@ -91,7 +91,7 @@ public class QueryMapperUnitTests {
public void handlesBigIntegerIdsCorrectly() {
DBObject dbObject = new BasicDBObject("id", new BigInteger("1"));
DBObject result = mapper.getMappedObject(dbObject, null);
DBObject result = mapper.getMappedObject(dbObject, context.getPersistentEntity(IdWrapper.class));
assertThat(result.get("_id"), is((Object) "1"));
}
@@ -100,7 +100,7 @@ public class QueryMapperUnitTests {
ObjectId id = new ObjectId();
DBObject dbObject = new BasicDBObject("id", new BigInteger(id.toString(), 16));
DBObject result = mapper.getMappedObject(dbObject, null);
DBObject result = mapper.getMappedObject(dbObject, context.getPersistentEntity(IdWrapper.class));
assertThat(result.get("_id"), is((Object) id));
}
@@ -198,6 +198,29 @@ public class QueryMapperUnitTests {
assertThat(result, is(query.getQueryObject()));
}
@Test
public void doesNotHandleNestedFieldsWithDefaultIdNames() {
BasicDBObject dbObject = new BasicDBObject("id", new ObjectId().toString());
dbObject.put("nested", new BasicDBObject("id", new ObjectId().toString()));
MongoPersistentEntity<?> entity = context.getPersistentEntity(ClassWithDefaultId.class);
DBObject result = mapper.getMappedObject(dbObject, entity);
assertThat(result.get("_id"), is(instanceOf(ObjectId.class)));
assertThat(((DBObject) result.get("nested")).get("id"), is(instanceOf(String.class)));
}
class IdWrapper {
Object id;
}
class ClassWithDefaultId {
String id;
ClassWithDefaultId nested;
}
class Sample {
@Id

View File

@@ -82,12 +82,22 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByFirstName", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "Oliver"), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("firstName").is("Oliver"))));
}
creator.createQuery();
/**
* @see DATAMONGO-469
*/
@Test
public void createsAndQueryCorrectly() {
creator = new MongoQueryCreator(new PartTree("findByFirstNameAndFriend", Person.class), getAccessor(converter,
"Oliver", new Person()), context);
creator.createQuery();
Person person = new Person();
MongoQueryCreator creator = new MongoQueryCreator(new PartTree("findByFirstNameAndFriend", Person.class),
getAccessor(converter, "Oliver", person), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("firstName").is("Oliver").and("friend").is(person))));
}
@Test
@@ -96,7 +106,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByFirstNameNotNull", Person.class);
Query query = new MongoQueryCreator(tree, getAccessor(converter), context).createQuery();
assertThat(query.getQueryObject(), is(new Query(Criteria.where("firstName").ne(null)).getQueryObject()));
assertThat(query, is(new Query(Criteria.where("firstName").ne(null))));
}
@Test
@@ -105,7 +115,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByFirstNameIsNull", Person.class);
Query query = new MongoQueryCreator(tree, getAccessor(converter), context).createQuery();
assertThat(query.getQueryObject(), is(new Query(Criteria.where("firstName").is(null)).getQueryObject()));
assertThat(query, is(new Query(Criteria.where("firstName").is(null))));
}
@Test
@@ -137,7 +147,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, 18), context);
Query reference = query(where("age").lte(18));
assertThat(creator.createQuery().getQueryObject(), is(reference.getQueryObject()));
assertThat(creator.createQuery(), is(reference));
}
@Test
@@ -147,7 +157,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, 18), context);
Query reference = query(where("age").gte(18));
assertThat(creator.createQuery().getQueryObject(), is(reference.getQueryObject()));
assertThat(creator.createQuery(), is(reference));
}
/**
@@ -160,7 +170,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(partTree, getAccessor(converter, "Oliver"), context);
Query reference = query(where("foo").is("Oliver"));
assertThat(creator.createQuery().getQueryObject(), is(reference.getQueryObject()));
assertThat(creator.createQuery(), is(reference));
}
/**
@@ -172,7 +182,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByAgeExists", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, true), context);
Query query = query(where("age").exists(true));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
assertThat(creator.createQuery(), is(query));
}
/**
@@ -184,7 +194,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByFirstNameRegex", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, ".*"), context);
Query query = query(where("firstName").regex(".*"));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
assertThat(creator.createQuery(), is(query));
}
/**
@@ -196,7 +206,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByActiveTrue", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter), context);
Query query = query(where("active").is(true));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
assertThat(creator.createQuery(), is(query));
}
/**
@@ -208,7 +218,7 @@ public class MongoQueryCreatorUnitTests {
PartTree tree = new PartTree("findByActiveFalse", Person.class);
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter), context);
Query query = query(where("active").is(false));
assertThat(creator.createQuery().getQueryObject(), is(query.getQueryObject()));
assertThat(creator.createQuery(), is(query));
}
/**
@@ -221,8 +231,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "Dave", 42), context);
Query query = creator.createQuery();
assertThat(query.getQueryObject(),
is(query(new Criteria().orOperator(where("firstName").is("Dave"), where("age").is(42))).getQueryObject()));
assertThat(query, is(query(new Criteria().orOperator(where("firstName").is("Dave"), where("age").is(42)))));
}
private void assertBindsDistanceToQuery(Point point, Distance distance, Query reference) throws Exception {
@@ -240,7 +249,7 @@ public class MongoQueryCreatorUnitTests {
Query query = new MongoQueryCreator(tree, new ConvertingParameterAccessor(converter, accessor), context)
.createQuery();
assertThat(query.getQueryObject(), is(query.getQueryObject()));
assertThat(query, is(query));
}
interface PersonRepository extends Repository<Person, Long> {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2012 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.repository.support;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.bson.types.ObjectId;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -34,6 +35,8 @@ import org.springframework.data.mongodb.repository.support.QueryDslMongoReposito
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mysema.query.types.expr.BooleanOperation;
import com.mysema.query.types.path.PathBuilder;
import com.mysema.query.types.path.StringPath;
/**
@@ -98,7 +101,25 @@ public class SpringDataMongodbSerializerUnitTests {
assertThat(serializer.getKeyForPath(address, address.getMetadata()), is(""));
}
/**
* @see DATAMONGO-467
*/
@Test
public void convertsIdPropertyCorrectly() {
ObjectId id = new ObjectId();
PathBuilder<Address> builder = new PathBuilder<Address>(Address.class, "address");
StringPath idPath = builder.getString("id");
DBObject result = (DBObject) serializer.visit((BooleanOperation) idPath.eq(id.toString()), (Void) null);
assertThat(result.get("_id"), is(notNullValue()));
assertThat(result.get("_id"), is(instanceOf(ObjectId.class)));
assertThat(result.get("_id"), is((Object) id));
}
class Address {
String id;
String street;
@Field("zip_code")
String zipCode;

View File

@@ -2135,7 +2135,7 @@ MapReduceResults&lt;ValueObject&gt; results = mongoOperations.mapReduce(query, "
<section id="mongo.group">
<title>Group Operations</title>
<para>As an alternative to usiing Map-Reduce to perform data aggregation,
<para>As an alternative to using Map-Reduce to perform data aggregation,
you can use the <ulink
url="http://www.mongodb.org/display/DOCS/Aggregation#Aggregation-Group"><literal>group</literal>
operation</ulink> which feels similar to using SQL's group by query style,

View File

@@ -1,6 +1,23 @@
Spring Data Document Changelog
=============================================
** Bug
* [DATAMONGO-467] - String @id field is not mapped to ObjectId when using QueryDSL ".id" path
* [DATAMONGO-469] - Query creation from method names using AND criteria does not work anymore
* [DATAMONGO-474] - Wrong property is used for Id mapping
* [DATAMONGO-475] - 'group' operation fails where query references non primitive property
* [DATAMONGO-480] - The WriteResultChecking is not used in case of insert or save of documents.
* [DATAMONGO-483] - @Indexed(unique=true, name="foo") puts name's value to the 'key' in the MongoDB
* [DATAMONGO-489] - ClassCastException when loading Map<String, String[]>
** Improvement
* [DATAMONGO-466] - QueryMapper shouldn't map id properties of nested classes
* [DATAMONGO-470] - Criteria and Query should have proper equals(…) and hashCode() method.
* [DATAMONGO-482] - typo in documentation - 2 i's in usiing
** Task
* [DATAMONGO-492] - Release 1.0.3
Changes in version 1.0.2.RELEASE MongoDB (2012-06-20)
-----------------------------------------------------
** Bug