Compare commits

16 Commits

Author SHA1 Message Date
Spring Buildmaster
c2aacc03ff DATAMONGO-772 - Release version 1.3.2.RELEASE. 2013-10-25 08:11:23 -07:00
Thomas Darimont
1cf544a530 DATAMONGO-772 - Prepare 1.3.2.RELEASE.
Updated version of SD-CMNS to 1.6.2.RELEASE.
2013-10-25 16:51:57 +02:00
Komi Serge Innocent
bbb097cafc DATAMONGO-746 - Creating an IndexInfo now also works with Doubles.
DefaultIndexOperations is now able to detect that, in contrast to what's currently documented in the MongoDB reference documentation, MongoDB apparently returns double values for the index direction.

Original pull request: #67.
2013-10-25 11:00:17 +02:00
Thomas Darimont
feafd50b59 DATAMONGO-769 - Improve support for arithmetic operators in AggregationFramework.
We now support the usage of field references in arithmetic projection operations.

Original pull request: #80.
2013-10-14 08:59:42 +02:00
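The DATAMONGO-769 change above adds builders such as `plus(String)`, `minus(String)`, `multiply(String)`, `divide(String)` and `mod(String)` that accept a field reference instead of a number. The following stand-alone sketch (the `render` helper is hypothetical, not part of the Spring Data API) only illustrates the kind of expression these builders produce:

```java
// Illustrative sketch only: shows the shape of the expression rendered when an
// arithmetic projection operates on another field, e.g. a projection of
// "netPrice" combined with plus("discount") renders as
// { $add : ["$netPrice", "$discount"] }.
class ArithmeticProjectionSketch {

    static String render(String operator, String projectedField, String operandField) {
        // field references are rendered with a leading '$', as in the aggregation framework
        return String.format("{ \"$%s\" : [\"$%s\", \"$%s\"] }", operator, projectedField, operandField);
    }

    public static void main(String[] args) {
        System.out.println(render("add", "netPrice", "discount"));
        System.out.println(render("subtract", "total", "fees"));
    }
}
```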
Oliver Gierke
b51cf05f90 DATAMONGO-752 - Improved keyword detection in QueryMapper.
The check for keywords in QueryMapper now selectively decides between checks for a nested keyword (DBObject) object and the check for a simple key. This allows the usage of criteria values starting with $ (e.g. { 'myvalue' : '$334' }) without the value being considered a keyword and thus erroneously triggering a potential conversion of the value.

Moved more logic for a keyword into the Keyword value object.
2013-10-13 13:50:54 +02:00
Thomas Darimont
b8196ac9ed DATAMONGO-771 - Fix raw JSON string handling in MongoTemplate.insert(…).
We now support insertion of JSON objects as plain strings via MongoTemplate.insert(…).

Original pull request: #79.
2013-10-08 12:46:17 +02:00
Oliver Gierke
e643d39fa6 DATAMONGO-761 - Fix path key lookup for non-properties in SpringDataMongoDBSerializer.
In our Querydsl MongodbSerializer implementation we now only inspect the MongoPersistentProperty for a field name if the given path is really a property path. Previously we tried to always resolve a persistent property even if the given path was an array index path, a map key or the like.
2013-10-08 12:41:58 +02:00
Thomas Darimont
6abdb0aa46 DATAMONGO-768 - Improve documentation of how to use @PersistenceConstructor.
Added an additional section to chapter 7.3 that describes the parameter value binding when using the @PersistenceConstructor annotation, including a small usage example. Added a concrete example for the @Value annotation that uses SpEL.

Original pull request: #77.
2013-10-01 14:05:25 +02:00
Thomas Darimont
34063ff647 DATAMONGO-759 - Improved rendering of GroupOperation.
GroupOperation now renders the _id field as null if no group fields were added to the operation. Previously it was rendered as an empty document (i.e. { }). While this was technically correct as well, we're now closer to what the MongoDB reference documentation describes.

Original pull request: #73.
2013-09-30 19:25:13 +02:00
Thomas Darimont
857f366b56 DATAMONGO-757 - Align output of projection operation with MongoDB defaults.
Adjusted FieldProjection to generate an appropriate representation of included / excluded fields (namely :1 for included and :0 for excluded).
Polished guards so that only _id is allowed to be excluded (DATAMONGO-758).

Original pull request: #76.
2013-09-27 12:56:34 +02:00
Thomas Darimont
f7540d45c6 DATAMONGO-758 - Current MongoDB versions only support explicitly excluding the _id property in a projection.
Added guard to FieldProjection.from within ProjectionOption to prevent users from excluding fields other than "_id".
2013-09-26 17:55:58 +02:00
Thomas Darimont
3d2ae8117f DATAMONGO-753 - Add support for nested field references in aggregation operations.
Aggregation pipelines now correctly handle nested field references in aggregation operations. We introduced FieldsExposingAggregationOperation to mark AggregationOperations that change the set of exposed fields available for processing by later AggregationOperations. Extracted context state out of AggregationOperation to ExposedFieldsAggregationContext for better separation of concerns. Modified toDbObject(…) in Aggregation to only replace the aggregation context when the current AggregationOperation is a FieldExposingAggregationOperation.

Original pull request: #74.
2013-09-26 14:27:39 +02:00
Spring Buildmaster
6b3bd8f621 DATAMONGO-751 - Prepare next development iteration. 2013-09-09 23:27:16 -07:00
Spring Buildmaster
b17ec47003 DATAMONGO-751 - Release version 1.3.1.RELEASE. 2013-09-09 23:27:13 -07:00
Oliver Gierke
8c7b558d39 DATAMONGO-751 - Prepare 1.3.1.RELEASE. 2013-09-09 23:12:40 -07:00
Spring Buildmaster
a3faabf718 DATAMONGO-740 - Prepare next development iteration. 2013-09-09 15:58:43 -07:00
30 changed files with 831 additions and 142 deletions

@@ -26,7 +26,7 @@ Add the Maven dependency:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.2.3.RELEASE</version>
<version>1.3.2.RELEASE</version>
</dependency>
```
@@ -36,7 +36,7 @@ If you'd rather like the latest snapshots of the upcoming major version, use our
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.3.0.BUILD-SNAPSHOT</version>
<version>1.4.0.BUILD-SNAPSHOT</version>
</dependency>
<repository>

pom.xml
@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.RELEASE</version>
<version>1.3.2.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -29,7 +29,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.6.0.RELEASE</springdata.commons>
<springdata.commons>1.6.2.RELEASE</springdata.commons>
<mongo>2.10.1</mongo>
</properties>
@@ -37,9 +37,9 @@
<developer>
<id>ogierke</id>
<name>Oliver Gierke</name>
<email>ogierke at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<email>ogierke at gopivotal.com</email>
<organization>Pivotal Inc.</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Project Lead</role>
</roles>
@@ -48,9 +48,9 @@
<developer>
<id>trisberg</id>
<name>Thomas Risberg</name>
<email>trisberg at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<email>trisberg at gopivotal.com</email>
<organization>Pivotal Inc.</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -59,9 +59,9 @@
<developer>
<id>mpollack</id>
<name>Mark Pollack</name>
<email>mpollack at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<email>mpollack at gopivotal.com</email>
<organization>Pivotal Inc.</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
@@ -70,14 +70,25 @@
<developer>
<id>jbrisbin</id>
<name>Jon Brisbin</name>
<email>jbrisbin at vmware.com</email>
<organization>SpringSource</organization>
<organizationUrl>http://www.springsource.com</organizationUrl>
<email>jbrisbin at gopivotal.com</email>
<organization>Pivotal Inc.</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>-6</timezone>
</developer>
<developer>
<id>tdarimont</id>
<name>Thomas Darimont</name>
<email>tdarimont at gopivotal.com</email>
<organization>Pivotal Inc.</organization>
<organizationUrl>http://www.gopivotal.com</organizationUrl>
<roles>
<role>Developer</role>
</roles>
<timezone>+1</timezone>
</developer>
</developers>
<dependencies>

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.RELEASE</version>
<version>1.3.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -52,7 +52,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.3.0.RELEASE</version>
<version>1.3.2.RELEASE</version>
</dependency>
<dependency>

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.RELEASE</version>
<version>1.3.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.RELEASE</version>
<version>1.3.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.3.0.RELEASE</version>
<version>1.3.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

@@ -1,5 +1,5 @@
/*
* Copyright 2011 the original author or authors.
* Copyright 2011-2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import static org.springframework.data.domain.Sort.Direction.*;
import java.util.ArrayList;
import java.util.List;
@@ -22,7 +24,6 @@ import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.index.IndexDefinition;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.util.Assert;
import com.mongodb.DBCollection;
@@ -34,9 +35,13 @@ import com.mongodb.MongoException;
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Komi Innocent
*/
public class DefaultIndexOperations implements IndexOperations {
private static final Double ONE = Double.valueOf(1);
private static final Double MINUS_ONE = Double.valueOf(-1);
private final MongoOperations mongoOperations;
private final String collectionName;
@@ -135,12 +140,17 @@ public class DefaultIndexOperations implements IndexOperations {
Object value = keyDbObject.get(key);
if (Integer.valueOf(1).equals(value)) {
indexFields.add(IndexField.create(key, Order.ASCENDING));
} else if (Integer.valueOf(-1).equals(value)) {
indexFields.add(IndexField.create(key, Order.DESCENDING));
} else if ("2d".equals(value)) {
if ("2d".equals(value)) {
indexFields.add(IndexField.geo(key));
} else {
Double keyValue = new Double(value.toString());
if (ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, ASC));
} else if (MINUS_ONE.equals(keyValue)) {
indexFields.add(IndexField.create(key, DESC));
}
}
}
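The hunk above can be summarized in a self-contained sketch. The class and method names below are hypothetical stand-ins for `DefaultIndexOperations`, chosen so the sketch compiles without the MongoDB driver:

```java
// Simplified stand-alone version of the DATAMONGO-746 logic: MongoDB may report
// an index direction as a Double (1.0 / -1.0) rather than an Integer, so the
// value is normalized through Double before comparison.
class IndexDirectionSketch {

    static String direction(Object value) {
        if ("2d".equals(value)) {
            return "GEO"; // geospatial indexes carry the "2d" marker instead of a number
        }
        Double keyValue = Double.valueOf(value.toString());
        if (Double.valueOf(1).equals(keyValue)) {
            return "ASC";
        } else if (Double.valueOf(-1).equals(keyValue)) {
            return "DESC";
        }
        return "UNKNOWN";
    }

    public static void main(String[] args) {
        System.out.println(direction(Integer.valueOf(1)));  // works for Integers…
        System.out.println(direction(Double.valueOf(-1)));  // …and for Doubles
    }
}
```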

View File

@@ -700,10 +700,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
initializeVersionProperty(objectToSave);
BasicDBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
writer.write(objectToSave, dbDoc);
DBObject dbDoc = toDbObject(objectToSave, writer);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
Object id = insertDBObject(collectionName, dbDoc, objectToSave.getClass());
@@ -712,6 +711,26 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
maybeEmitEvent(new AfterSaveEvent<T>(objectToSave, dbDoc));
}
/**
* @param objectToSave
* @param writer
* @return
*/
private <T> DBObject toDbObject(T objectToSave, MongoWriter<T> writer) {
if (!(objectToSave instanceof String)) {
DBObject dbDoc = new BasicDBObject();
writer.write(objectToSave, dbDoc);
return dbDoc;
} else {
try {
return (DBObject) JSON.parse((String) objectToSave);
} catch (JSONParseException e) {
throw new MappingException("Could not parse given String to save into a JSON document!", e);
}
}
}
private void initializeVersionProperty(Object entity) {
MongoPersistentEntity<?> mongoPersistentEntity = getPersistentEntity(entity.getClass());
@@ -851,19 +870,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
assertUpdateableIdIfNotSet(objectToSave);
DBObject dbDoc = new BasicDBObject();
maybeEmitEvent(new BeforeConvertEvent<T>(objectToSave));
if (!(objectToSave instanceof String)) {
writer.write(objectToSave, dbDoc);
} else {
try {
dbDoc = (DBObject) JSON.parse((String) objectToSave);
} catch (JSONParseException e) {
throw new MappingException("Could not parse given String to save into a JSON document!", e);
}
}
DBObject dbDoc = toDbObject(objectToSave, writer);
maybeEmitEvent(new BeforeSaveEvent<T>(objectToSave, dbDoc));
Object id = saveDBObject(collectionName, dbDoc, objectToSave.getClass());

View File

@@ -227,8 +227,9 @@ public class Aggregation {
operationDocuments.add(operation.toDBObject(context));
if (operation instanceof AggregationOperationContext) {
context = (AggregationOperationContext) operation;
if (operation instanceof FieldsExposingAggregationOperation) {
FieldsExposingAggregationOperation exposedFieldsOperation = (FieldsExposingAggregationOperation) operation;
context = new ExposedFieldsAggregationOperationContext(exposedFieldsOperation.getFields());
}
}
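The pipeline loop above can be modeled minimally. The `Op` interface below is a simplified stand-in for `AggregationOperation` / `FieldsExposingAggregationOperation`, assumed only for this sketch:

```java
// Minimal model of the DATAMONGO-753 change: the aggregation context is only
// replaced when an operation exposes new fields, not whenever the operation
// happens to be a context itself.
import java.util.List;

interface Op {
    /** Fields this operation exposes to later stages; null if it exposes none. */
    List<String> exposedFields();
}

class PipelineContextSketch {

    static List<String> finalContext(List<Op> pipeline, List<String> initial) {
        List<String> context = initial;
        for (Op op : pipeline) {
            List<String> exposed = op.exposedFields();
            if (exposed != null) {
                context = exposed; // replace the context only for field-exposing operations
            }
        }
        return context;
    }

    public static void main(String[] args) {
        Op skip = () -> null;                 // e.g. a $sort stage: exposes nothing new
        Op group = () -> List.of("total");    // e.g. a $group stage: narrows the fields
        System.out.println(finalContext(List.of(skip, group, skip), List.of("a", "b")));
    }
}
```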

View File

@@ -29,6 +29,7 @@ import org.springframework.util.CompositeIterator;
* Value object to capture the fields exposed by an {@link AggregationOperation}.
*
* @author Oliver Gierke
* @author Thomas Darimont
* @since 1.3
*/
public class ExposedFields implements Iterable<ExposedField> {
@@ -151,13 +152,47 @@ public class ExposedFields implements Iterable<ExposedField> {
return null;
}
/**
* Returns whether the {@link ExposedFields} exposes no non-synthetic fields at all.
*
* @return
*/
boolean exposesNoNonSyntheticFields() {
return originalFields.isEmpty();
}
/**
* Returns whether the {@link ExposedFields} exposes a single non-synthetic field only.
*
* @return
*/
boolean exposesSingleNonSyntheticFieldOnly() {
return originalFields.size() == 1;
}
/**
* Returns whether the {@link ExposedFields} exposes no fields at all.
*
* @return
*/
boolean exposesNoFields() {
return exposedFieldsCount() == 0;
}
/**
* Returns whether the {@link ExposedFields} exposes a single field only.
*
* @return
*/
public boolean exposesSingleFieldOnly() {
return originalFields.size() + syntheticFields.size() == 1;
boolean exposesSingleFieldOnly() {
return exposedFieldsCount() == 1;
}
/**
* @return
*/
private int exposedFieldsCount() {
return originalFields.size() + syntheticFields.size();
}
/*

@@ -17,17 +17,32 @@ package org.springframework.data.mongodb.core.aggregation;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.ExposedField;
import org.springframework.data.mongodb.core.aggregation.ExposedFields.FieldReference;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* Support class to implement {@link AggregationOperation}s that will become an {@link AggregationOperationContext} as
* well defining {@link ExposedFields}.
* {@link AggregationOperationContext} that combines the available field references from a given
* {@code AggregationOperationContext} and a {@link FieldsExposingAggregationOperation}.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.3
* @since 1.4
*/
public abstract class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
class ExposedFieldsAggregationOperationContext implements AggregationOperationContext {
private final ExposedFields exposedFields;
/**
* Creates a new {@link ExposedFieldsAggregationOperationContext} from the given {@link ExposedFields}.
*
* @param exposedFields must not be {@literal null}.
*/
public ExposedFieldsAggregationOperationContext(ExposedFields exposedFields) {
Assert.notNull(exposedFields, "ExposedFields must not be null!");
this.exposedFields = exposedFields;
}
/*
* (non-Javadoc)
@@ -54,7 +69,7 @@ public abstract class ExposedFieldsAggregationOperationContext implements Aggreg
@Override
public FieldReference getReference(String name) {
ExposedField field = getFields().getField(name);
ExposedField field = exposedFields.getField(name);
if (field != null) {
return new FieldReference(field);
@@ -62,6 +77,4 @@ public abstract class ExposedFieldsAggregationOperationContext implements Aggreg
throw new IllegalArgumentException(String.format("Invalid reference '%s'!", name));
}
protected abstract ExposedFields getFields();
}

@@ -0,0 +1,32 @@
/*
* Copyright 2013 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
/**
* {@link AggregationOperation} that exposes new {@link ExposedFields} that can be used for later aggregation pipeline
* {@code AggregationOperation}s.
*
* @author Thomas Darimont
*/
public interface FieldsExposingAggregationOperation extends AggregationOperation {
/**
* Returns the fields exposed by the {@link AggregationOperation}.
*
* @return will never be {@literal null}.
*/
ExposedFields getFields();
}

@@ -38,9 +38,13 @@ import com.mongodb.DBObject;
* @author Oliver Gierke
* @since 1.3
*/
public class GroupOperation extends ExposedFieldsAggregationOperationContext implements AggregationOperation {
public class GroupOperation implements FieldsExposingAggregationOperation {
/**
* Holds the non-synthetic fields which are the fields of the group-id structure.
*/
private final ExposedFields idFields;
private final ExposedFields nonSynthecticFields;
private final List<Operation> operations;
/**
@@ -50,7 +54,7 @@ public class GroupOperation extends ExposedFieldsAggregationOperationContext imp
*/
public GroupOperation(Fields fields) {
this.nonSynthecticFields = ExposedFields.nonSynthetic(fields);
this.idFields = ExposedFields.nonSynthetic(fields);
this.operations = new ArrayList<Operation>();
}
@@ -74,7 +78,7 @@ public class GroupOperation extends ExposedFieldsAggregationOperationContext imp
Assert.notNull(groupOperation, "GroupOperation must not be null!");
Assert.notNull(nextOperations, "NextOperations must not be null!");
this.nonSynthecticFields = groupOperation.nonSynthecticFields;
this.idFields = groupOperation.idFields;
this.operations = new ArrayList<Operation>(nextOperations.size() + 1);
this.operations.addAll(groupOperation.operations);
this.operations.addAll(nextOperations);
@@ -261,7 +265,7 @@ public class GroupOperation extends ExposedFieldsAggregationOperationContext imp
@Override
public ExposedFields getFields() {
ExposedFields fields = this.nonSynthecticFields.and(new ExposedField(Fields.UNDERSCORE_ID, true));
ExposedFields fields = this.idFields.and(new ExposedField(Fields.UNDERSCORE_ID, true));
for (Operation operation : operations) {
fields = fields.and(operation.asField());
@@ -279,16 +283,20 @@ public class GroupOperation extends ExposedFieldsAggregationOperationContext imp
BasicDBObject operationObject = new BasicDBObject();
if (nonSynthecticFields.exposesSingleFieldOnly()) {
if (idFields.exposesNoNonSyntheticFields()) {
FieldReference reference = context.getReference(nonSynthecticFields.iterator().next());
operationObject.put(Fields.UNDERSCORE_ID, null);
} else if (idFields.exposesSingleNonSyntheticFieldOnly()) {
FieldReference reference = context.getReference(idFields.iterator().next());
operationObject.put(Fields.UNDERSCORE_ID, reference.toString());
} else {
BasicDBObject inner = new BasicDBObject();
for (ExposedField field : nonSynthecticFields) {
for (ExposedField field : idFields) {
FieldReference reference = context.getReference(field);
inner.put(field.getName(), reference.toString());
}
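The three `_id` rendering cases above are easy to mirror in a self-contained sketch (the `renderId` helper and its string output are illustrative only, not the real `DBObject` rendering):

```java
// Stand-alone sketch of the DATAMONGO-759 rules: no id fields → _id : null,
// exactly one → _id : "$field", several → a nested document of references.
import java.util.List;
import java.util.StringJoiner;

class GroupIdSketch {

    static String renderId(List<String> idFields) {
        if (idFields.isEmpty()) {
            return "null"; // DATAMONGO-759: null instead of an empty document { }
        }
        if (idFields.size() == 1) {
            return "\"$" + idFields.get(0) + "\"";
        }
        StringJoiner inner = new StringJoiner(", ", "{ ", " }");
        for (String field : idFields) {
            inner.add("\"" + field + "\" : \"$" + field + "\"");
        }
        return inner.toString();
    }

    public static void main(String[] args) {
        System.out.println(renderId(List.of()));
        System.out.println(renderId(List.of("customerId")));
        System.out.println(renderId(List.of("a", "b")));
    }
}
```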

@@ -41,9 +41,11 @@ import com.mongodb.DBObject;
* @author Oliver Gierke
* @since 1.3
*/
public class ProjectionOperation extends ExposedFieldsAggregationOperationContext implements AggregationOperation {
public class ProjectionOperation implements FieldsExposingAggregationOperation {
private static final List<Projection> NONE = Collections.emptyList();
private static final String EXCLUSION_ERROR = "Exclusion of field %s not allowed. Projections by the mongodb "
+ "aggregation framework only support the exclusion of the %s field!";
private final List<Projection> projections;
@@ -60,7 +62,7 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
* @param fields must not be {@literal null}.
*/
public ProjectionOperation(Fields fields) {
this(NONE, ProjectionOperationBuilder.FieldProjection.from(fields, true));
this(NONE, ProjectionOperationBuilder.FieldProjection.from(fields));
}
/**
@@ -117,23 +119,29 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
/**
* Excludes the given fields from the projection.
*
* @param fields must not be {@literal null}.
* @param fieldNames must not be {@literal null}.
* @return
*/
public ProjectionOperation andExclude(String... fields) {
List<FieldProjection> excludeProjections = FieldProjection.from(Fields.fields(fields), false);
public ProjectionOperation andExclude(String... fieldNames) {
for (String fieldName : fieldNames) {
Assert.isTrue(Fields.UNDERSCORE_ID.equals(fieldName),
String.format(EXCLUSION_ERROR, fieldName, Fields.UNDERSCORE_ID));
}
List<FieldProjection> excludeProjections = FieldProjection.from(Fields.fields(fieldNames), false);
return new ProjectionOperation(this.projections, excludeProjections);
}
/**
* Includes the given fields into the projection.
*
* @param fields must not be {@literal null}.
* @param fieldNames must not be {@literal null}.
* @return
*/
public ProjectionOperation andInclude(String... fields) {
public ProjectionOperation andInclude(String... fieldNames) {
List<FieldProjection> projections = FieldProjection.from(Fields.fields(fields), true);
List<FieldProjection> projections = FieldProjection.from(Fields.fields(fieldNames), true);
return new ProjectionOperation(this.projections, projections);
}
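The new `andExclude(…)` guard can be reproduced stand-alone; the surrounding class is a hypothetical stand-in for `ProjectionOperation`:

```java
// Stand-alone version of the DATAMONGO-758 guard: the MongoDB aggregation
// framework only allows the _id field to be excluded from a projection.
class ExclusionGuardSketch {

    private static final String EXCLUSION_ERROR = "Exclusion of field %s not allowed. Projections by the mongodb "
            + "aggregation framework only support the exclusion of the %s field!";

    static void assertExcludable(String... fieldNames) {
        for (String fieldName : fieldNames) {
            if (!"_id".equals(fieldName)) {
                throw new IllegalArgumentException(String.format(EXCLUSION_ERROR, fieldName, "_id"));
            }
        }
    }

    public static void main(String[] args) {
        assertExcludable("_id"); // allowed
        try {
            assertExcludable("name"); // rejected
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```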
@@ -147,12 +155,12 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
return new ProjectionOperation(this.projections, FieldProjection.from(fields, true));
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFieldsAggregationOperationContext#getFields()
* @see org.springframework.data.mongodb.core.aggregation.FieldsExposingAggregationOperation#getFields()
*/
@Override
protected ExposedFields getFields() {
public ExposedFields getFields() {
ExposedFields fields = null;
@@ -184,6 +192,7 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
* Builder for {@link ProjectionOperation}s on a field.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public static class ProjectionOperationBuilder implements AggregationOperation {
@@ -258,6 +267,18 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
return project("add", number);
}
/**
* Generates an {@code $add} expression that adds the value of the given field to the previously mentioned field.
*
* @param fieldReference
* @return
*/
public ProjectionOperationBuilder plus(String fieldReference) {
Assert.notNull(fieldReference, "Field reference must not be null!");
return project("add", Fields.field(fieldReference));
}
/**
* Generates an {@code $subtract} expression that subtracts the given number from the previously mentioned field.
*
@@ -270,6 +291,19 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
return project("subtract", number);
}
/**
* Generates an {@code $subtract} expression that subtracts the value of the given field from the previously mentioned
* field.
*
* @param fieldReference
* @return
*/
public ProjectionOperationBuilder minus(String fieldReference) {
Assert.notNull(fieldReference, "Field reference must not be null!");
return project("subtract", Fields.field(fieldReference));
}
/**
* Generates an {@code $multiply} expression that multiplies the given number with the previously mentioned field.
*
@@ -282,6 +316,19 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
return project("multiply", number);
}
/**
* Generates an {@code $multiply} expression that multiplies the value of the given field with the previously
* mentioned field.
*
* @param fieldReference
* @return
*/
public ProjectionOperationBuilder multiply(String fieldReference) {
Assert.notNull(fieldReference, "Field reference must not be null!");
return project("multiply", Fields.field(fieldReference));
}
/**
* Generates an {@code $divide} expression that divides the previously mentioned field by the given number.
*
@@ -295,6 +342,19 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
return project("divide", number);
}
/**
* Generates an {@code $divide} expression that divides the value of the given field by the previously mentioned
* field.
*
* @param fieldReference
* @return
*/
public ProjectionOperationBuilder divide(String fieldReference) {
Assert.notNull(fieldReference, "Field reference must not be null!");
return project("divide", Fields.field(fieldReference));
}
/**
* Generates an {@code $mod} expression that divides the previously mentioned field by the given number and returns
* the remainder.
@@ -309,7 +369,21 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
return project("mod", number);
}
/* (non-Javadoc)
/**
* Generates an {@code $mod} expression that divides the value of the given field by the previously mentioned field
* and returns the remainder.
*
* @param fieldReference
* @return
*/
public ProjectionOperationBuilder mod(String fieldReference) {
Assert.notNull(fieldReference, "Field reference must not be null!");
return project("mod", Fields.field(fieldReference));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
@@ -362,6 +436,7 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
* A {@link FieldProjection} to map a result of a previous {@link AggregationOperation} to a new field.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
static class FieldProjection extends Projection {
@@ -386,20 +461,31 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
this.value = value;
}
/**
* Factory method to easily create {@link FieldProjection}s for the given {@link Fields}. Fields are projected as
* references with their given name. A field {@code foo} will be projected as: {@code foo : 1 } .
*
* @param fields the {@link Fields} to in- or exclude, must not be {@literal null}.
* @return
*/
public static List<? extends Projection> from(Fields fields) {
return from(fields, null);
}
/**
* Factory method to easily create {@link FieldProjection}s for the given {@link Fields}.
*
* @param fields the {@link Fields} to in- or exclude, must not be {@literal null}.
* @param include whether to include or exclude the fields.
* @param value to use for the given field.
* @return
*/
public static List<FieldProjection> from(Fields fields, boolean include) {
public static List<FieldProjection> from(Fields fields, Object value) {
Assert.notNull(fields, "Fields must not be null!");
List<FieldProjection> projections = new ArrayList<FieldProjection>();
for (Field field : fields) {
projections.add(new FieldProjection(field, include ? null : 0));
projections.add(new FieldProjection(field, value));
}
return projections;
@@ -411,13 +497,32 @@ public class ProjectionOperation extends ExposedFieldsAggregationOperationContex
*/
@Override
public DBObject toDBObject(AggregationOperationContext context) {
if (value != null) {
return new BasicDBObject(field.getName(), value);
}
FieldReference reference = context.getReference(field.getTarget());
return new BasicDBObject(field.getName(), reference.toString());
return new BasicDBObject(field.getName(), renderFieldValue(context));
}
private Object renderFieldValue(AggregationOperationContext context) {
// implicit reference or explicit include?
if (value == null || Boolean.TRUE.equals(value)) {
// check whether referenced field exists in the context
FieldReference reference = context.getReference(field.getTarget());
if (field.getName().equals(field.getTarget())) {
// render field as included
return 1;
}
// render field reference
return reference.toString();
} else if (Boolean.FALSE.equals(value)) {
// render field as excluded
return 0;
}
return value;
}
}
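The rendering rules of `renderFieldValue(…)` can be captured in a self-contained sketch. The context lookup is simplified to a plain `"$"` prefix; class and method names are hypothetical:

```java
// Stand-alone sketch of the FieldProjection rendering rules: an included field
// renders as 1, a renamed field as a "$target" reference, an excluded field as
// 0, and any other explicit value verbatim.
class FieldValueSketch {

    static Object render(String name, String target, Object value) {
        if (value == null || Boolean.TRUE.equals(value)) {
            if (name.equals(target)) {
                return 1; // plain inclusion: { name : 1 }
            }
            return "$" + target; // renamed field: { name : "$target" }
        } else if (Boolean.FALSE.equals(value)) {
            return 0; // exclusion: { name : 0 }
        }
        return value; // explicit value, rendered as-is
    }

    public static void main(String[] args) {
        System.out.println(render("foo", "foo", null));    // 1
        System.out.println(render("alias", "foo", null));  // $foo
        System.out.println(render("_id", "_id", false));   // 0
    }
}
```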

@@ -71,9 +71,9 @@ public class TypeBasedAggregationOperationContext implements AggregationOperatio
return mapper.getMappedObject(dbObject, mappingContext.getPersistentEntity(type));
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.ExposedFields.AvailableField)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperationContext#getReference(org.springframework.data.mongodb.core.aggregation.Field)
*/
@Override
public FieldReference getReference(Field field) {

@@ -29,7 +29,7 @@ import com.mongodb.DBObject;
* @author Oliver Gierke
* @since 1.3
*/
public class UnwindOperation extends ExposedFieldsAggregationOperationContext implements AggregationOperation {
public class UnwindOperation implements AggregationOperation {
private final ExposedField field;
@@ -44,15 +44,6 @@ public class UnwindOperation extends ExposedFieldsAggregationOperationContext im
this.field = new ExposedField(field, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.ExposedFieldsAggregationOperationContext#getFields()
*/
@Override
protected ExposedFields getFields() {
return ExposedFields.from(field);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)

@@ -48,7 +48,6 @@ import com.mongodb.DBRef;
public class QueryMapper {
private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final String N_OR_PATTERN = "\\$.*or";
private final ConversionService conversionService;
private final MongoConverter converter;
@@ -79,7 +78,7 @@ public class QueryMapper {
@SuppressWarnings("deprecation")
public DBObject getMappedObject(DBObject query, MongoPersistentEntity<?> entity) {
if (Keyword.isKeyword(query)) {
if (isNestedKeyword(query)) {
return getMappedKeyword(new Keyword(query), entity);
}
@@ -97,7 +96,7 @@ public class QueryMapper {
continue;
}
if (Keyword.isKeyword(key)) {
if (isKeyword(key)) {
result.putAll(getMappedKeyword(new Keyword(query, key), entity));
continue;
}
@@ -107,7 +106,7 @@ public class QueryMapper {
Object rawValue = query.get(key);
String newKey = field.getMappedKey();
if (Keyword.isKeyword(rawValue) && !field.isIdField()) {
if (isNestedKeyword(rawValue) && !field.isIdField()) {
Keyword keyword = new Keyword((DBObject) rawValue);
result.put(newKey, getMappedKeyword(field, keyword));
} else {
@@ -121,16 +120,16 @@ public class QueryMapper {
/**
* Returns the given {@link DBObject} representing a keyword by mapping the keyword's value.
*
* @param query the {@link DBObject} representing a keyword (e.g. {@code $ne : … } )
* @param keyword the {@link DBObject} representing a keyword (e.g. {@code $ne : … } )
* @param entity
* @return
*/
private DBObject getMappedKeyword(Keyword query, MongoPersistentEntity<?> entity) {
private DBObject getMappedKeyword(Keyword keyword, MongoPersistentEntity<?> entity) {
// $or/$nor
if (query.key.matches(N_OR_PATTERN) || query.value instanceof Iterable) {
if (keyword.isOrOrNor() || keyword.hasIterableValue()) {
Iterable<?> conditions = (Iterable<?>) query.value;
Iterable<?> conditions = keyword.getValue();
BasicDBList newConditions = new BasicDBList();
for (Object condition : conditions) {
@@ -138,10 +137,10 @@ public class QueryMapper {
: convertSimpleOrDBObject(condition, entity));
}
return new BasicDBObject(query.key, newConditions);
return new BasicDBObject(keyword.getKey(), newConditions);
}
return new BasicDBObject(query.key, convertSimpleOrDBObject(query.value, entity));
return new BasicDBObject(keyword.getKey(), convertSimpleOrDBObject(keyword.getValue(), entity));
}
/**
@@ -154,10 +153,12 @@ public class QueryMapper {
private DBObject getMappedKeyword(Field property, Keyword keyword) {
boolean needsAssociationConversion = property.isAssociation() && !keyword.isExists();
Object value = needsAssociationConversion ? convertAssociation(keyword.value, property.getProperty())
: getMappedValue(property.with(keyword.key), keyword.value);
Object value = keyword.getValue();
return new BasicDBObject(keyword.key, value);
Object convertedValue = needsAssociationConversion ? convertAssociation(value, property.getProperty())
: getMappedValue(property.with(keyword.getKey()), value);
return new BasicDBObject(keyword.key, convertedValue);
}
/**
@@ -195,7 +196,7 @@ public class QueryMapper {
}
}
if (Keyword.isKeyword(value)) {
if (isNestedKeyword(value)) {
return getMappedKeyword(new Keyword((DBObject) value), null);
}
@@ -289,6 +290,39 @@ public class QueryMapper {
return delegateConvertToMongoType(id, null);
}
/**
* Returns whether the given {@link Object} is a nested keyword, i.e. a {@link DBObject} whose single key is a keyword.
*
* @param candidate
* @return
*/
protected boolean isNestedKeyword(Object candidate) {
if (!(candidate instanceof BasicDBObject)) {
return false;
}
BasicDBObject dbObject = (BasicDBObject) candidate;
Set<String> keys = dbObject.keySet();
if (keys.size() != 1) {
return false;
}
return isKeyword(keys.iterator().next().toString());
}
/**
* Returns whether the given {@link String} is a MongoDB keyword. The default implementation considers every key
* starting with {@literal $} a keyword.
*
* @param candidate
* @return
*/
protected boolean isKeyword(String candidate) {
return candidate.startsWith("$");
}
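The nested/plain split above is what lets criteria values such as { 'myvalue' : '$334' } pass through unconverted: only a single-key document whose key starts with $ is treated as a keyword, never a plain string value. A stdlib-only sketch of the same two checks, using java.util.Map in place of BasicDBObject (hypothetical standalone code, not the shipped class):

```java
import java.util.Map;

public class KeywordCheck {

	// Mirrors QueryMapper.isKeyword(String): a plain key is a keyword if it starts with "$".
	static boolean isKeyword(String candidate) {
		return candidate.startsWith("$");
	}

	// Mirrors QueryMapper.isNestedKeyword(Object): only a single-entry document
	// whose sole key is a keyword counts, e.g. { "$ne" : ... }.
	static boolean isNestedKeyword(Object candidate) {
		if (!(candidate instanceof Map)) {
			return false;
		}
		Map<?, ?> document = (Map<?, ?>) candidate;
		if (document.size() != 1) {
			return false;
		}
		return isKeyword(document.keySet().iterator().next().toString());
	}

	public static void main(String[] args) {
		// A criteria value starting with "$" is not a document, so it is never a nested keyword.
		System.out.println(isNestedKeyword("$334"));            // false
		System.out.println(isNestedKeyword(Map.of("$ne", 42))); // true
		System.out.println(isKeyword("$center"));               // true, but only when used as a key
	}
}
```

With this split, a value like $334 stays intact while operator documents such as { $ne : … } are still detected and mapped.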
/**
* Value object to capture a query keyword representation.
*
@@ -296,8 +330,10 @@ public class QueryMapper {
*/
private static class Keyword {
String key;
Object value;
private static final String N_OR_PATTERN = "\\$.*or";
private final String key;
private final Object value;
public Keyword(DBObject source, String key) {
this.key = key;
@@ -322,25 +358,21 @@ public class QueryMapper {
return "$exists".equalsIgnoreCase(key);
}
/**
* Returns whether the given value actually represents a keyword. If this returns {@literal true} it's safe to call
* the constructor.
*
* @param value
* @return
*/
public static boolean isKeyword(Object value) {
public boolean isOrOrNor() {
return key.matches(N_OR_PATTERN);
}
if (value instanceof String) {
return ((String) value).startsWith("$");
}
public boolean hasIterableValue() {
return value instanceof Iterable;
}
if (!(value instanceof DBObject)) {
return false;
}
public String getKey() {
return key;
}
DBObject dbObject = (DBObject) value;
return dbObject.keySet().size() == 1 && dbObject.keySet().iterator().next().startsWith("$");
@SuppressWarnings("unchecked")
public <T> T getValue() {
return (T) value;
}
}
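The isOrOrNor() helper above delegates to the N_OR_PATTERN regex \$.*or. A quick stdlib check of what that pattern matches (hypothetical standalone snippet; note that String.matches(…) requires the whole key to match):

```java
public class NOrPatternCheck {

	// Same pattern as QueryMapper.N_OR_PATTERN: "$", anything, ending in "or".
	private static final String N_OR_PATTERN = "\\$.*or";

	static boolean isOrOrNor(String key) {
		return key.matches(N_OR_PATTERN);
	}

	public static void main(String[] args) {
		System.out.println(isOrOrNor("$or"));  // true
		System.out.println(isOrOrNor("$nor")); // true
		System.out.println(isOrOrNor("$ne"));  // false
		System.out.println(isOrOrNor("or"));   // false: no leading "$"
	}
}
```

The pattern is deliberately loose: any key starting with $ and ending in "or" would match, which covers both $or and $nor with a single expression.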


@@ -28,6 +28,7 @@ import com.mongodb.DBObject;
import com.mysema.query.mongodb.MongodbSerializer;
import com.mysema.query.types.Path;
import com.mysema.query.types.PathMetadata;
import com.mysema.query.types.PathType;
/**
* Custom {@link MongodbSerializer} to take mapping information into account when building keys for constraints.
@@ -61,6 +62,10 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
@Override
protected String getKeyForPath(Path<?> expr, PathMetadata<?> metadata) {
if (!metadata.getPathType().equals(PathType.PROPERTY)) {
return super.getKeyForPath(expr, metadata);
}
Path<?> parent = metadata.getParent();
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(parent.getType());
MongoPersistentProperty property = entity.getPersistentProperty(metadata.getName());


@@ -90,6 +90,7 @@ import com.mongodb.WriteResult;
* @author Amol Nayak
* @author Patryk Wasik
* @author Thomas Darimont
* @author Komi Innocent
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:infrastructure.xml")
@@ -333,6 +334,43 @@ public class MongoTemplateTests {
assertThat(field, is(IndexField.create("age", Direction.DESC)));
}
/**
* @see DATAMONGO-746
*/
@Test
public void testReadIndexInfoForIndicesCreatedViaMongoShellCommands() throws Exception {
String command = "db." + template.getCollectionName(Person.class)
+ ".ensureIndex({'age':-1}, {'unique':true, 'sparse':true})";
template.indexOps(Person.class).dropAllIndexes();
assertThat(template.indexOps(Person.class).getIndexInfo().isEmpty(), is(true));
factory.getDb().eval(command);
List<DBObject> indexInfo = template.getCollection(template.getCollectionName(Person.class)).getIndexInfo();
String indexKey = null;
boolean unique = false;
for (DBObject ix : indexInfo) {
if ("age_-1".equals(ix.get("name"))) {
indexKey = ix.get("key").toString();
unique = (Boolean) ix.get("unique");
}
}
assertThat(indexKey, is("{ \"age\" : -1.0}"));
assertThat(unique, is(true));
IndexInfo info = template.indexOps(Person.class).getIndexInfo().get(1);
assertThat(info.isUnique(), is(true));
assertThat(info.isSparse(), is(true));
List<IndexField> indexFields = info.getIndexFields();
IndexField field = indexFields.get(0);
assertThat(field, is(IndexField.create("age", Direction.DESC)));
}
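The test above captures the DATAMONGO-746 quirk: for shell-created indexes MongoDB reports the key direction as a double ({ "age" : -1.0 }) rather than the documented integer, so IndexInfo parsing must accept both numeric types. A minimal stdlib sketch of such tolerant parsing (hypothetical helper, not the actual DefaultIndexOperations code):

```java
public class IndexDirectionParse {

	// Accepts Integer or Double direction values as found in getIndexInfo() documents.
	static int direction(Object raw) {
		if (raw instanceof Number) {
			return ((Number) raw).intValue(); // -1.0 and -1 both normalize to -1
		}
		throw new IllegalArgumentException("Unexpected index direction: " + raw);
	}

	public static void main(String[] args) {
		System.out.println(direction(Double.valueOf(-1.0))); // -1 (mongo shell style)
		System.out.println(direction(Integer.valueOf(-1)));  // -1 (driver style)
	}
}
```

Normalizing through Number rather than casting to Integer is what keeps the Double case from throwing a ClassCastException.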
@Test
public void testProperHandlingOfDifferentIdTypesWithMappingMongoConverter() throws Exception {
testProperHandlingOfDifferentIdTypes(this.mappingTemplate);
@@ -1996,6 +2034,24 @@ public class MongoTemplateTests {
assertThat(result.get(2).getClass(), is((Object) VerySpecialDoc.class));
}
/**
* @see DATAMONGO-771
*/
@Test
public void allowInsertWithPlainJsonString() {
String id = "4711";
String value = "bubu";
String json = String.format("{_id:%s, field: '%s'}", id, value);
template.insert(json, "sample");
List<Sample> result = template.findAll(Sample.class);
assertThat(result.size(), is(1));
assertThat(result.get(0).id, is(id));
assertThat(result.get(0).field, is(value));
}
static interface Model {
String value();


@@ -15,7 +15,7 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.hamcrest.CoreMatchers.*;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.domain.Sort.Direction.*;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
@@ -84,6 +84,7 @@ public class AggregationTests {
mongoTemplate.dropCollection(INPUT_COLLECTION);
mongoTemplate.dropCollection(Product.class);
mongoTemplate.dropCollection(UserWithLikes.class);
mongoTemplate.dropCollection(DATAMONGO753.class);
}
/**
@@ -439,6 +440,11 @@ public class AggregationTests {
.and("netPrice").multiply(2).as("netPriceMul2") //
.and("netPrice").divide(1.19).as("netPriceDiv119") //
.and("spaceUnits").mod(2).as("spaceUnitsMod2") //
.and("spaceUnits").plus("spaceUnits").as("spaceUnitsPlusSpaceUnits") //
.and("spaceUnits").minus("spaceUnits").as("spaceUnitsMinusSpaceUnits") //
.and("spaceUnits").multiply("spaceUnits").as("spaceUnitsMultiplySpaceUnits") //
.and("spaceUnits").divide("spaceUnits").as("spaceUnitsDivideSpaceUnits") //
.and("spaceUnits").mod("spaceUnits").as("spaceUnitsModSpaceUnits") //
);
AggregationResults<DBObject> result = mongoTemplate.aggregate(agg, DBObject.class);
@@ -452,7 +458,66 @@ public class AggregationTests {
assertThat((Double) resultList.get(0).get("netPriceMul2"), is(netPrice * 2));
assertThat((Double) resultList.get(0).get("netPriceDiv119"), is(netPrice / 1.19));
assertThat((Integer) resultList.get(0).get("spaceUnitsMod2"), is(spaceUnits % 2));
assertThat((Integer) resultList.get(0).get("spaceUnitsPlusSpaceUnits"), is(spaceUnits + spaceUnits));
assertThat((Integer) resultList.get(0).get("spaceUnitsMinusSpaceUnits"), is(spaceUnits - spaceUnits));
assertThat((Integer) resultList.get(0).get("spaceUnitsMultiplySpaceUnits"), is(spaceUnits * spaceUnits));
assertThat((Double) resultList.get(0).get("spaceUnitsDivideSpaceUnits"), is((double) (spaceUnits / spaceUnits)));
assertThat((Integer) resultList.get(0).get("spaceUnitsModSpaceUnits"), is(spaceUnits % spaceUnits));
}
/**
* @see DATAMONGO-753
* @see http://stackoverflow.com/questions/18653574/spring-data-mongodb-aggregation-framework-invalid-reference-in-group-operati
*/
@Test
public void allowsNestedFieldReferencesAsGroupIdsInGroupExpressions() {
mongoTemplate.insert(new DATAMONGO753().withPDs(new PD("A", 1), new PD("B", 1), new PD("C", 1)));
mongoTemplate.insert(new DATAMONGO753().withPDs(new PD("B", 1), new PD("B", 1), new PD("C", 1)));
TypedAggregation<DATAMONGO753> agg = newAggregation(DATAMONGO753.class, //
unwind("pd"), //
group("pd.pDch") // the nested field expression
.sum("pd.up").as("uplift"), //
project("_id", "uplift"));
AggregationResults<DBObject> result = mongoTemplate.aggregate(agg, DBObject.class);
List<DBObject> stats = result.getMappedResults();
assertThat(stats.size(), is(3));
assertThat(stats.get(0).get("_id").toString(), is("C"));
assertThat((Integer) stats.get(0).get("uplift"), is(2));
assertThat(stats.get(1).get("_id").toString(), is("B"));
assertThat((Integer) stats.get(1).get("uplift"), is(3));
assertThat(stats.get(2).get("_id").toString(), is("A"));
assertThat((Integer) stats.get(2).get("uplift"), is(1));
}
/**
* @see DATAMONGO-753
* @see http://stackoverflow.com/questions/18653574/spring-data-mongodb-aggregation-framework-invalid-reference-in-group-operati
*/
@Test
public void aliasesNestedFieldInProjectionImmediately() {
mongoTemplate.insert(new DATAMONGO753().withPDs(new PD("A", 1), new PD("B", 1), new PD("C", 1)));
mongoTemplate.insert(new DATAMONGO753().withPDs(new PD("B", 1), new PD("B", 1), new PD("C", 1)));
TypedAggregation<DATAMONGO753> agg = newAggregation(DATAMONGO753.class, //
unwind("pd"), //
project().and("pd.up").as("up"));
AggregationResults<DBObject> results = mongoTemplate.aggregate(agg, DBObject.class);
List<DBObject> mappedResults = results.getMappedResults();
assertThat(mappedResults, hasSize(6));
for (DBObject element : mappedResults) {
assertThat(element.get("up"), is((Object) 1));
}
}
private void assertLikeStats(LikeStats like, String id, long count) {
@@ -502,4 +567,22 @@ public class AggregationTests {
assertThat(tagCount.getN(), is(n));
}
static class DATAMONGO753 {
PD[] pd;
DATAMONGO753 withPDs(PD... pds) {
this.pd = pds;
return this;
}
}
static class PD {
String pDch;
@org.springframework.data.mongodb.core.mapping.Field("alias") int up;
public PD(String pDch, int up) {
this.pDch = pDch;
this.up = up;
}
}
}


@@ -15,32 +15,83 @@
*/
package org.springframework.data.mongodb.core.aggregation;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
/**
* Unit tests for {@link Aggregation}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class AggregationUnitTests {
public @Rule ExpectedException exception = ExpectedException.none();
@Test(expected = IllegalArgumentException.class)
public void rejectsNullAggregationOperation() {
Aggregation.newAggregation((AggregationOperation[]) null);
newAggregation((AggregationOperation[]) null);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullTypedAggregationOperation() {
Aggregation.newAggregation(String.class, (AggregationOperation[]) null);
newAggregation(String.class, (AggregationOperation[]) null);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNoAggregationOperation() {
Aggregation.newAggregation(new AggregationOperation[0]);
newAggregation(new AggregationOperation[0]);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNoTypedAggregationOperation() {
Aggregation.newAggregation(String.class, new AggregationOperation[0]);
newAggregation(String.class, new AggregationOperation[0]);
}
/**
* @see DATAMONGO-753
*/
@Test
public void checkForCorrectFieldScopeTransfer() {
exception.expect(IllegalArgumentException.class);
exception.expectMessage("Invalid reference");
exception.expectMessage("'b'");
newAggregation( //
project("a", "b"), //
group("a").count().as("cnt"), // a was introduced to the context by the project operation
project("cnt", "b") // b was removed from the context by the group operation
).toDbObject("foo", Aggregation.DEFAULT_CONTEXT); // -> triggers IllegalArgumentException
}
/**
* @see DATAMONGO-753
*/
@Test
public void unwindOperationShouldNotChangeAvailableFields() {
newAggregation( //
project("a", "b"), //
unwind("a"), //
project("a", "b") // b should still be available
).toDbObject("foo", Aggregation.DEFAULT_CONTEXT);
}
/**
* @see DATAMONGO-753
*/
@Test
public void matchOperationShouldNotChangeAvailableFields() {
newAggregation( //
project("a", "b"), //
match(where("a").gte(1)), //
project("a", "b") // b should still be available
).toDbObject("foo", Aggregation.DEFAULT_CONTEXT);
}
}


@@ -37,6 +37,38 @@ public class GroupOperationUnitTests {
new GroupOperation((Fields) null);
}
/**
* @see DATAMONGO-759
*/
@Test
public void groupOperationWithNoGroupIdFieldsShouldGenerateNullAsGroupId() {
GroupOperation operation = new GroupOperation(Fields.from());
ExposedFields fields = operation.getFields();
DBObject groupClause = extractDbObjectFromGroupOperation(operation);
assertThat(fields.exposesSingleFieldOnly(), is(true));
assertThat(fields.exposesNoFields(), is(false));
assertThat(groupClause.get(UNDERSCORE_ID), is(nullValue()));
}
/**
* @see DATAMONGO-759
*/
@Test
public void groupOperationWithNoGroupIdFieldsButAdditionalFieldsShouldGenerateNullAsGroupId() {
GroupOperation operation = new GroupOperation(Fields.from()).count().as("cnt").last("foo").as("foo");
ExposedFields fields = operation.getFields();
DBObject groupClause = extractDbObjectFromGroupOperation(operation);
assertThat(fields.exposesSingleFieldOnly(), is(false));
assertThat(fields.exposesNoFields(), is(false));
assertThat(groupClause.get(UNDERSCORE_ID), is(nullValue()));
assertThat((BasicDBObject) groupClause.get("cnt"), is(new BasicDBObject("$sum", 1)));
assertThat((BasicDBObject) groupClause.get("foo"), is(new BasicDBObject("$last", "$foo")));
}
@Test
public void createsGroupOperationWithSingleField() {


@@ -25,12 +25,14 @@ import org.springframework.data.mongodb.core.DBObjectUtils;
import org.springframework.data.mongodb.core.aggregation.ProjectionOperation.ProjectionOperationBuilder;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Unit tests for {@link ProjectionOperation}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class ProjectionOperationUnitTests {
@@ -65,7 +67,7 @@ public class ProjectionOperationUnitTests {
DBObject dbObject = operation.toDBObject(Aggregation.DEFAULT_CONTEXT);
DBObject projectClause = DBObjectUtils.getAsDBObject(dbObject, PROJECT);
assertThat(projectClause.get("foo"), is((Object) "$foo"));
assertThat((Integer) projectClause.get("foo"), is(1));
assertThat(projectClause.get("bar"), is((Object) "$foobar"));
}
@@ -183,12 +185,89 @@ public class ProjectionOperationUnitTests {
assertThat(oper.get(MOD), is((Object) Arrays.<Object> asList("$a", 3)));
}
/**
* @see DATAMONGO-758
*/
@Test(expected = IllegalArgumentException.class)
public void excludeShouldThrowExceptionForFieldsOtherThanUnderscoreId() {
new ProjectionOperation().andExclude("foo");
}
/**
* @see DATAMONGO-758
*/
@Test
public void excludeShouldAllowExclusionOfUnderscoreId() {
ProjectionOperation projectionOp = new ProjectionOperation().andExclude(Fields.UNDERSCORE_ID);
DBObject dbObject = projectionOp.toDBObject(Aggregation.DEFAULT_CONTEXT);
DBObject projectClause = DBObjectUtils.getAsDBObject(dbObject, PROJECT);
assertThat((Integer) projectClause.get(Fields.UNDERSCORE_ID), is(0));
}
/**
* @see DATAMONGO-757
*/
@Test
public void usesImplictAndExplicitFieldAliasAndIncludeExclude() {
ProjectionOperation operation = Aggregation.project("foo").and("foobar").as("bar").andInclude("inc1", "inc2")
.andExclude("_id");
DBObject dbObject = operation.toDBObject(Aggregation.DEFAULT_CONTEXT);
DBObject projectClause = DBObjectUtils.getAsDBObject(dbObject, PROJECT);
assertThat(projectClause.get("foo"), is((Object) 1)); // implicit
assertThat(projectClause.get("bar"), is((Object) "$foobar")); // explicit
assertThat(projectClause.get("inc1"), is((Object) 1)); // include shortcut
assertThat(projectClause.get("inc2"), is((Object) 1));
assertThat(projectClause.get("_id"), is((Object) 0));
}
@Test(expected = IllegalArgumentException.class)
public void arithmenticProjectionOperationModByZeroException() {
new ProjectionOperation().and("a").mod(0);
}
/**
* @see DATAMONGO-769
*/
@Test
public void allowArithmeticOperationsWithFieldReferences() {
ProjectionOperation operation = Aggregation.project() //
.and("foo").plus("bar").as("fooPlusBar") //
.and("foo").minus("bar").as("fooMinusBar") //
.and("foo").multiply("bar").as("fooMultiplyBar") //
.and("foo").divide("bar").as("fooDivideBar") //
.and("foo").mod("bar").as("fooModBar");
DBObject dbObject = operation.toDBObject(Aggregation.DEFAULT_CONTEXT);
DBObject projectClause = DBObjectUtils.getAsDBObject(dbObject, PROJECT);
assertThat((BasicDBObject) projectClause.get("fooPlusBar"), //
is(new BasicDBObject("$add", dbList("$foo", "$bar"))));
assertThat((BasicDBObject) projectClause.get("fooMinusBar"), //
is(new BasicDBObject("$subtract", dbList("$foo", "$bar"))));
assertThat((BasicDBObject) projectClause.get("fooMultiplyBar"), //
is(new BasicDBObject("$multiply", dbList("$foo", "$bar"))));
assertThat((BasicDBObject) projectClause.get("fooDivideBar"), //
is(new BasicDBObject("$divide", dbList("$foo", "$bar"))));
assertThat((BasicDBObject) projectClause.get("fooModBar"), //
is(new BasicDBObject("$mod", dbList("$foo", "$bar"))));
}
public static BasicDBList dbList(Object... items) {
BasicDBList list = new BasicDBList();
for (Object item : items) {
list.add(item);
}
return list;
}
private static DBObject exctractOperation(String field, DBObject fromProjectClause) {
return (DBObject) fromProjectClause.get(field);
}


@@ -438,6 +438,34 @@ public class QueryMapperUnitTests {
assertThat(inClause.get(0), is(instanceOf(com.mongodb.DBRef.class)));
}
/**
* @see DATAMONGO-752
*/
@Test
public void mapsSimpleValuesStartingWith$Correctly() {
Query query = query(where("myvalue").is("$334"));
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
assertThat(result.keySet(), hasSize(1));
assertThat(result.get("myvalue"), is((Object) "$334"));
}
/**
* @see DATAMONGO-752
*/
@Test
public void mapsKeywordAsSimpleValuesCorrectly() {
Query query = query(where("myvalue").is("$center"));
DBObject result = mapper.getMappedObject(query.getQueryObject(), null);
assertThat(result.keySet(), hasSize(1));
assertThat(result.get("myvalue"), is((Object) "$center"));
}
class IdWrapper {
Object id;
}


@@ -36,6 +36,7 @@ import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mysema.query.types.expr.BooleanOperation;
import com.mysema.query.types.path.PathBuilder;
import com.mysema.query.types.path.SimplePath;
import com.mysema.query.types.path.StringPath;
/**
@@ -46,8 +47,7 @@ import com.mysema.query.types.path.StringPath;
@RunWith(MockitoJUnitRunner.class)
public class SpringDataMongodbSerializerUnitTests {
@Mock
MongoDbFactory dbFactory;
@Mock MongoDbFactory dbFactory;
MongoConverter converter;
SpringDataMongodbSerializer serializer;
@@ -117,10 +117,23 @@ public class SpringDataMongodbSerializerUnitTests {
assertThat(result.get("_id"), is((Object) id));
}
/**
* @see DATAMONGO-761
*/
@Test
public void looksUpKeyForNonPropertyPath() {
PathBuilder<Address> builder = new PathBuilder<Address>(Address.class, "address");
SimplePath<Object> firstElementPath = builder.getArray("foo", String[].class).get(0);
String path = serializer.getKeyForPath(firstElementPath, firstElementPath.getMetadata());
assertThat(path, is("0"));
}
class Address {
String id;
String street;
@Field("zip_code")
String zipCode;
@Field("zip_code") String zipCode;
@Field("bar") String[] foo;
}
}


@@ -56,7 +56,7 @@
<xi:include href="introduction/why-sd-doc.xml"/>
<xi:include href="introduction/requirements.xml"/>
<xi:include href="introduction/getting-started.xml"/>
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.6.0.RELEASE/src/docbkx/repositories.xml">
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.6.2.RELEASE/src/docbkx/repositories.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repositories.xml" />
</xi:include>
</part>
@@ -76,10 +76,10 @@
<part id="appendix">
<title>Appendix</title>
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.6.0.RELEASE/src/docbkx/repository-namespace-reference.xml">
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.6.2.RELEASE/src/docbkx/repository-namespace-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-namespace-reference.xml" />
</xi:include>
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.6.0.RELEASE/src/docbkx/repository-query-keywords-reference.xml">
<xi:include href="https://raw.github.com/SpringSource/spring-data-commons/1.6.2.RELEASE/src/docbkx/repository-query-keywords-reference.xml">
<xi:fallback href="../../../spring-data-commons/src/docbkx/repository-query-keywords-reference.xml" />
</xi:include>
</part>


@@ -369,7 +369,11 @@ public class Person {
Spring Framework . Within the mapping framework it can be applied to
constructor arguments. This lets you use a Spring Expression
Language statement to transform a key's value retrieved in the
database before it is used to construct a domain object.</para>
database before it is used to construct a domain object. To reference
a property of the given document, use an expression such as
<code>@Value("#root.myProperty")</code>, where
<literal>root</literal> refers to the root of the
document.</para>
</listitem>
<listitem>
@@ -444,7 +448,75 @@ public class Person&lt;T extends Address&gt; {
// other getters/setters omitted
</programlisting>
<para></para>
<para/>
</section>
<section id="mapping-custom-object-construction">
<title>Customized Object Construction</title>
<para>The Mapping Subsystem allows the customization of the object
construction by annotating a constructor with the
<literal>@PersistenceConstructor</literal> annotation. The values to be
used for the constructor parameters are resolved in the following
way:</para>
<itemizedlist>
<listitem>
<para>If a parameter is annotated with the <code>@Value</code>
annotation, the given expression is evaluated and the result is used
as the parameter value.</para>
</listitem>
<listitem>
<para>If the Java type has a property whose name matches the given
field of the input document, its property information is used to
select the constructor parameter to pass the input field value to.
This works only if parameter name information is present in the
compiled .class files, which can be achieved by compiling the source
with debug information or by using the
<literal>-parameters</literal> command-line switch for javac in Java
8.</para>
</listitem>
<listitem>
<para>Otherwise a <classname>MappingException</classname> is
thrown, indicating that the given constructor parameter could not be
bound.</para>
</listitem>
</itemizedlist>
<programlisting language="java">class OrderItem {
@Id String id;
int quantity;
double unitPrice;
OrderItem(String id, @Value("#root.qty ?: 0") int quantity, double unitPrice) {
this.id = id;
this.quantity = quantity;
this.unitPrice = unitPrice;
}
// getters/setters omitted
}
DBObject input = new BasicDBObject("id", "4711");
input.put("unitPrice", 2.5);
input.put("qty",5);
OrderItem item = converter.read(OrderItem.class, input);</programlisting>
<note>
<para>The SpEL expression in the <literal>@Value</literal> annotation
of the <literal>quantity</literal> parameter falls back to the value
<literal>0</literal> if the given property path cannot be
resolved.</para>
</note>
<para>Additional examples for using the
<classname>@PersistenceConstructor</classname> annotation can be found
in the <ulink
url="https://github.com/spring-projects/spring-data-mongodb/blob/master/spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/convert/MappingMongoConverterUnitTests.java">MappingMongoConverterUnitTests</ulink>
test suite.</para>
</section>
<section id="mapping-usage-indexes">
@@ -608,7 +680,7 @@ public class Person {
}</programlisting>
<para></para>
<para/>
</section>
</section>
</chapter>


@@ -1,6 +1,29 @@
Spring Data MongoDB Changelog
=============================
Changes in version 1.3.2.RELEASE (2013-10-25)
---------------------------------------------
** Bug
* [DATAMONGO-746] IndexInfo cannot be read for indices created via mongo shell
* [DATAMONGO-752] QueryMapper prevents searching for values that start with a $ [dollarsign]
* [DATAMONGO-753] Add support for nested field references in group operations
* [DATAMONGO-758] Reject excludes other than _id in projection operations
* [DATAMONGO-759] Render group operation without non-synthetic fields correctly
* [DATAMONGO-761] ClassCastException in SpringDataMongodbSerializer.getKeyForPath
* [DATAMONGO-768] Improve documentation of how to use @PersistenceConstructor
** Improvement
* [DATAMONGO-757] Projections should follow MongoDB conventions more precisely
* [DATAMONGO-769] Support arithmetic operators for properties
* [DATAMONGO-771] Saving raw JSON through MongoTemplate.insert(…) fails
** Task
* [DATAMONGO-772] Release 1.3.2
Changes in version 1.3.1.RELEASE (2013-09-09)
---------------------------------------------
** Task
* [DATAMONGO-751] Upgraded to Spring Data Commons 1.6.1.
Changes in version 1.3.0.RELEASE (2013-09-09)
---------------------------------------------
** Bug


@@ -1,4 +1,4 @@
Spring Data Document 1.3.0 RELEASE
Spring Data Document 1.3.2.RELEASE
Copyright (c) [2010-2013] Pivotal Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").


@@ -1,4 +1,4 @@
SPRING DATA MongoDB 1.3.0.RELEASE
SPRING DATA MongoDB 1.3.2.RELEASE
-----------------------------
Spring Data MongoDB is released under the terms of the Apache Software License Version 2.0 (see license.txt).