Compare commits


39 Commits

Author SHA1 Message Date
Spring Buildmaster
01468b640a DATAMONGO-1228 - Release version 1.8.0.M1 (Gosling M1). 2015-06-02 01:29:01 -07:00
Oliver Gierke
4d96b036a2 DATAMONGO-1228 - Prepare 1.8.0.M1 (Gosling M1). 2015-06-02 09:29:53 +02:00
Oliver Gierke
2d1ac15e24 DATAMONGO-1228 - Updated changelog. 2015-06-02 08:24:47 +02:00
Oliver Gierke
2c27e8576f DATAMONGO-990 - Polishing.
Removed EvaluationExpressionContext from all AbstractMongoQuery implementations that don't actually need it and from AbstractMongoQuery itself, too. Cleaned up test cases after that.

Moved SpEL related tests into AbstractPersonRepositoryIntegrationTests to make sure they're executed for all sub-types. JavaDoc and assertion polishes.

Original pull request: #285.
2015-06-01 17:27:58 +02:00
Thomas Darimont
67f638d953 DATAMONGO-990 - Add support for SpEL expressions in @Query.
Ported and adapted support for SpEL expressions in @Query annotations from Spring Data JPA. StringBasedMongoQuery can now evaluate SpEL fragments in queries with the help of the given EvaluationContextProvider. Introduced EvaluationContextProvider to AbstractMongoQuery. Exposed access to actual parameter values in MongoParameterAccessor.

Original pull request: #285.
2015-06-01 17:27:58 +02:00
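
For illustration, a minimal sketch of how such a SpEL fragment might be used in a @Query annotation after this change; the repository, domain type and field name are hypothetical, and the ?#{[0]} fragment is evaluated against the actual method arguments:

import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

// Hypothetical repository: the SpEL fragment ?#{[0]} is replaced with the value of the
// first method argument before the query is executed.
interface PersonRepository extends MongoRepository<Person, String> {

    @Query("{ 'lastname': ?#{[0]} }")
    List<Person> findByLastnameViaExpression(String lastname);
}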
Oliver Gierke
ea5bd5f7d3 DATAMONGO-1210 - Polishing.
Moved getTypeHint(…) method to Field class.

Original pull request: #292.
2015-06-01 13:21:07 +02:00
Christoph Strobl
394f695416 DATAMONGO-1210 - Fixed type hints for usage with findAndModify(…).
We now inspect the actual field type during update mapping and provide a type hint accordingly. Simple, non-interface and non-abstract types will no longer be decorated with the _class attribute. We now honor positional parameters when trying to map paths to properties. This allows more precise type mapping, since we now have access to the meta model, which lets us check whether a type hint (aka _class) is required.

We now add a special type hint indicating nested types to the converter. This allows more fine-grained removal of the _class property without breaking the contract of MongoWriter.convertToMongoType(…).

Original pull request: #292.
2015-06-01 13:21:07 +02:00
Stefan Ganzer
e4db466ab9 DATAMONGO-1210 - Add breaking test case for findAndModify/addToSet/each.
The problem stems from the inconsistent handling of type hints: MongoTemplate.save(…) does not add a type hint, but findAndModify(…) does. The same values are then treated differently by MongoDB, depending on whether they carry a type hint or not. To verify this behavior, you can manually add the (superfluous) type hint to the saved object - findAndModify will then work as expected.

Additional tests demonstrate that findAndModify(…) removes type hints from complex documents in collections that are either nested in another collection or in a document, or doesn't add them in the first place.

Original pull requests: #290, #291.
Related pull request: #292.
CLA: 119820150506013701 (Stefan Ganzer)
2015-06-01 13:21:01 +02:00
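
For context, a minimal sketch of the findAndModify/addToSet/each combination the new tests exercise; template, Wrapper, id and the field names are hypothetical:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.mongodb.core.query.Update;

// Hypothetical usage: add two values to a nested collection via $addToSet/$each and
// apply the update through findAndModify on an existing MongoTemplate instance.
Update update = new Update().addToSet("values").each("spring", "data");
template.findAndModify(query(where("_id").is(id)), update, Wrapper.class);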
Christoph Strobl
ee04c014c9 DATAMONGO-1134 - Add support for $geoIntersects.
We now support $geoIntersects via Criteria.intersects(…) using GeoJSON types.

Original pull request: #295.
2015-06-01 12:36:20 +02:00
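
A minimal sketch of the new criteria method; the field name and polygon coordinates are made up:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.geo.GeoJsonPolygon;
import org.springframework.data.mongodb.core.query.Query;

// Hypothetical query: match documents whose "area" field intersects the given GeoJSON polygon.
GeoJsonPolygon polygon = new GeoJsonPolygon(
        new Point(0, 0), new Point(0, 5), new Point(5, 5), new Point(5, 0), new Point(0, 0));
Query q = query(where("area").intersects(polygon));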
Christoph Strobl
ea84f08de8 DATAMONGO-1216 - Skip authentication via AuthDB for MongoClient.
We now skip authentication via an explicit AuthDB when requesting a DB via a MongoClient instance.

Related ticket: DATACMNS-1218
Original pull request: #296.
2015-06-01 12:10:14 +02:00
Christoph Strobl
7d8a2b2d56 DATAMONGO-1218 - Deprecate non-MongoClient related configuration options in XML namespace.
We added deprecation hints to the description sections of elements and attributes within the spring-mongo.xsd of 1.7. For 1.8 we also added a client-uri attribute to db-factory that creates a MongoClientURI instead of a MongoURI to be passed on to the MongoDbFactory. Just like 'uri', 'client-uri' does not allow additional configuration options such as username or password next to it.

Original pull request: #296
2015-06-01 12:10:14 +02:00
Christoph Strobl
995d1e5aac DATAMONGO-1202 - Polishing.
Moved and renamed types into test class.
Added collection cleanup and missing author information.

Original pull request: #293.
2015-06-01 09:23:35 +02:00
Thomas Darimont
3b918492ae DATAMONGO-1202 - More robust type inspection for @Indexed properties.
We now use TypeInformation in IndexResolver to look up the root PersistentEntity for resolving @Indexed properties, ensuring that we retrieve the same PersistentEntity that was stored. Previously we used the Class to look up the PersistentEntity, which yielded a partially processed result.

Original pull request: #293.
2015-06-01 09:08:31 +02:00
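
As a reminder of what the resolver operates on, a minimal sketch of an entity with an @Indexed property; the domain type is hypothetical:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;

// Hypothetical entity: the index resolver derives an IndexDefinition for "email"
// from the mapping metadata of the persistent entity.
@Document
class Customer {

    @Id String id;

    @Indexed(unique = true) String email;
}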
Christoph Strobl
66b419163c DATAMONGO-1193 - Prevent unnecessary database lookups when resolving DBRefs on 2.x driver.
We now check the driver version in use before requesting a DB instance from the factory. Potential improvements to the fetch strategy for the MongoDB Java Driver 3 will be handled in DATAMONGO-1194.

Related tickets: DATAMONGO-1194.
Original pull request: #286.
2015-06-01 08:09:50 +02:00
Oliver Gierke
52bff39c22 DATAMONGO-1224 - Ensure Spring Framework 4.2 compatibility.
Removed obsolete generics in MongoPersistentEntityIndexCreator to make sure MappingContextEvents are delivered to the listener on Spring 4.2, which applies stricter generics handling to ApplicationEvents.

Tweaked PersonBeforeSaveListener in test code to actually reflect how an ApplicationEventListener for MongoDB would be implemented.

Removed the deprecated (and now removed) usage of ConversionServiceFactory in AbstractMongoConverter. Added MongoMappingEventPublisher.publishEvent(Object) as a no-op.
2015-05-25 13:12:47 +02:00
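
For illustration, a minimal sketch of how such a listener is typically implemented against this release line; the domain type and the audit field are hypothetical:

import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;

import com.mongodb.DBObject;

// Hypothetical listener: extending AbstractMongoEventListener scopes the callbacks to the
// Person domain type, and the listener is picked up as a regular Spring bean.
class PersonBeforeSaveListener extends AbstractMongoEventListener<Person> {

    @Override
    public void onBeforeSave(Person source, DBObject dbo) {
        dbo.put("audited", true);
    }
}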
Domenique Tilleuil
d151a13e87 DATAMONGO-1208 - Use QueryCursorPreparer for streaming in MongoTemplate.
We now use the QueryCursorPreparer to honor skip, limit, sort, etc. for streaming.

Original pull request: #297.
Polishing pull request: #298.
2015-05-21 09:00:33 +02:00
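
A minimal sketch of streaming with pagination settings that are now honored; template and Person are assumed to exist:

import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.util.CloseableIterator;

// Hypothetical usage: skip, limit and sort defined on the query are now applied to the
// underlying cursor before streaming starts.
Query query = new Query().skip(10).limit(100);
CloseableIterator<Person> iterator = template.stream(query, Person.class);
try {
    while (iterator.hasNext()) {
        Person person = iterator.next();
        // process person ...
    }
} finally {
    iterator.close();
}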
Oliver Gierke
5e7e7d3598 DATAMONGO-1221 - Removed <relativePath /> element from parent POM declaration. 2015-05-15 15:07:30 +02:00
Oliver Gierke
356248bd05 DATAMONGO-1213 - Included section on dependency management in reference documentation.
Related ticket: DATACMNS-687.
2015-05-04 14:51:34 +02:00
Oliver Gierke
73a60153f6 DATAMONGO-1211 - Adapt to changes in Spring Data Commons.
Tweaked method signatures in MongoRepositoryFactory after some signature changes in Spring Data Commons. Use the newly introduced getTargetRepositoryViaReflection(…) to obtain the repository instance via the super class.

Added repositoryBaseClass() attribute to @EnableMongoRepositories.

Related tickets: DATACMNS-542.
2015-05-02 14:49:31 +02:00
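
A minimal sketch of the new attribute in use; the base class name is hypothetical:

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

// Hypothetical configuration: all repository proxies are backed by the given
// custom base class instead of the default implementation.
@Configuration
@EnableMongoRepositories(repositoryBaseClass = MyRepositoryBaseClass.class)
class MongoRepositoryConfig {
}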
Oliver Gierke
67cf0e62a7 DATAMONGO-1207 - Fixed potential NPE in MongoTemplate.doInsertAll(…).
If a collection containing null values was handed to MongoTemplate.insertAll(…), a NullPointerException was caused by the unguarded attempt to look up the class of the element. We now explicitly handle this case and skip the element.

Some code cleanups in MongoTemplate.doInsertAll(…).
2015-05-02 14:49:31 +02:00
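
A minimal sketch of the now-tolerated input; template and Person are assumed to exist:

import java.util.Arrays;

// Hypothetical usage: the null element no longer causes a NullPointerException,
// it is simply skipped during batch insertion.
template.insertAll(Arrays.asList(new Person("Dave"), null, new Person("Carter")));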
Oliver Gierke
21fbcc3e67 DATAMONGO-1196 - Upgraded build profiles after MongoDB 3.0 Java driver GA release. 2015-04-01 17:11:55 +02:00
Oliver Gierke
0d63ff92a0 DATAMONGO-1192 - Switched to Spring 4.1's CollectionFactory. 2015-03-31 17:16:44 +02:00
Oliver Gierke
983645e222 DATAMONGO-1189 - After release cleanups. 2015-03-23 14:00:52 +01:00
Spring Buildmaster
d2805bfa47 DATAMONGO-1189 - Prepare next development iteration. 2015-03-23 13:03:26 +01:00
Spring Buildmaster
3f16b30631 DATAMONGO-1189 - Release version 1.7.0.RELEASE (Fowler GA). 2015-03-23 13:03:07 +01:00
Oliver Gierke
8ebcbe3c5c DATAMONGO-1189 - Prepare 1.7.0.RELEASE (Fowler GA). 2015-03-23 12:34:49 +01:00
Oliver Gierke
363bed5c37 DATAMONGO-1189 - Updated changelog. 2015-03-23 12:03:56 +01:00
Christoph Strobl
1547a646dd DATAMONGO-1189 - DATAJPA-692 - Polish reference docs before release.
Add repository query return types to reference doc.
Fall back to locally available Spring Data Commons reference docs, as the remote variant doesn't seem to work currently.
2015-03-23 11:17:25 +01:00
Oliver Gierke
1408d51065 DATAMONGO-979 - Polishing.
Minor JavaDoc and code style polishes.

Original pull request: #272.
2015-03-23 09:32:52 +01:00
Thomas Darimont
f5c319f18f DATAMONGO-979 - Add support for $size expression in project and group aggregation pipeline.
Introduced AggregationExpression interface to be able to represent arbitrary MongoDB expressions that can be used in projection and group operations. Supported function expressions are provided via the AggregationFunctionExpressions enum.

Original pull request: #272.
2015-03-23 09:32:26 +01:00
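
A minimal sketch of the new expression support in a projection; the field names are made up:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
import static org.springframework.data.mongodb.core.aggregation.Fields.field;

import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationFunctionExpressions;

// Hypothetical pipeline: project the number of elements of the "comments" array as "commentCount".
Aggregation aggregation = newAggregation(
        project("title").and(AggregationFunctionExpressions.SIZE.of(field("comments"))).as("commentCount"));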
Christoph Strobl
a3c29054d0 DATAMONGO-1124 - Switch log level for cyclic reference index warnings to INFO.
Reduce log level from warn to info to avoid noise during application startup.

Original pull request: #282.
2015-03-23 09:00:24 +01:00
Oliver Gierke
01533ca34c DATAMONGO-1181 - Register GeoJsonModule with @EnableSpringDataWebSupport.
Added the necessary configuration infrastructure to automatically register the GeoJsonModule as Spring bean when @EnableSpringDataWebSupport is used. This is implemented by exposing a configuration class annotated with @SpringDataWebConfigurationMixin.

Added Spring WebMVC as test dependency to be able to write an integration test. Polished GeoJsonModule to hide the actual serializers.

Original pull request: #283.
Related ticket: DATACMNS-660.
2015-03-17 19:40:57 +01:00
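
A minimal sketch of a web configuration that now gets the module registered automatically; the class name is hypothetical:

import org.springframework.context.annotation.Configuration;
import org.springframework.data.web.config.EnableSpringDataWebSupport;
import org.springframework.web.servlet.config.annotation.EnableWebMvc;

// Hypothetical configuration: enabling Spring Data web support now also exposes the
// GeoJsonModule so GeoJSON types can be bound in Spring MVC controllers.
@Configuration
@EnableWebMvc
@EnableSpringDataWebSupport
class WebConfiguration {
}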
Christoph Strobl
a1f6dc6db4 DATAMONGO-1181 - Add Jackson Module for GeoJSON types.
Added GeoJsonModule providing JsonDeserializers for GeoJsonPoint, GeoJsonMultiPoint, GeoJsonLineString, GeoJsonMultiLineString, GeoJsonPolygon and GeoJsonMultiPolygon.

Original pull request: #283.
2015-03-17 19:40:57 +01:00
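
Outside of the web support integration, the module can also be registered manually; a minimal sketch, exception handling omitted:

import com.fasterxml.jackson.databind.ObjectMapper;

import org.springframework.data.mongodb.core.geo.GeoJsonModule;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;

// Register the module and deserialize a GeoJSON point payload (readValue throws IOException).
ObjectMapper mapper = new ObjectMapper();
mapper.registerModule(new GeoJsonModule());
GeoJsonPoint point = mapper.readValue("{ \"type\": \"Point\", \"coordinates\": [10.0, 20.0] }", GeoJsonPoint.class);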
Oliver Gierke
37d53d936d DATAMONGO-1179 - Polishing. 2015-03-10 14:29:22 +01:00
Christoph Strobl
bc0a2df653 DATAMONGO-1179 - Update reference documentation.
Added new-features section. Updated links and requirements. Added section for GeoJSON support. Updated Script Operations section. Added return type Stream to repositories section. Updated keyword list.

Original pull request: #281.
2015-03-10 14:29:22 +01:00
Oliver Gierke
7e50fd8273 DATAMONGO-1180 - Polishing.
Fixed copyright ranges in license headers. Added a unit test to PartTreeMongoQueryUnitTests to verify that the root exception is propagated correctly.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:20:53 +01:00
Thomas Darimont
ba560ffbad DATAMONGO-1180 - Fixed incorrect exception message creation in PartTreeMongoQuery.
The JSONParseException caught in PartTreeMongoQuery is now passed to the IllegalStateException we throw from the method. Previously it was passed to the String.format(…) varargs. Verified by manually throwing a JSONParseException in the debugger.

Original pull request: #280.
Related pull request: #259.
2015-03-10 12:20:53 +01:00
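
For clarity, a minimal sketch of the broken versus the fixed pattern; the message text and variable names are made up:

// Broken: the cause ends up in the String.format(…) varargs and is silently dropped.
throw new IllegalStateException(String.format("Could not parse query %s!", source, o_O));

// Fixed: the cause is passed to the exception constructor and is therefore preserved.
throw new IllegalStateException(String.format("Could not parse query %s!", source), o_O);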
Oliver Gierke
50ca32c8b9 DATAMONGO-1173 - After release cleanups. 2015-03-05 19:41:05 +01:00
Spring Buildmaster
bdfe3af505 DATAMONGO-1173 - Prepare next development iteration. 2015-03-05 07:47:13 -08:00
70 changed files with 3575 additions and 333 deletions

pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.7.0.RC1</version>
<version>1.8.0.M1</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,8 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.6.0.RC1</version>
<relativePath>../spring-data-build/parent/pom.xml</relativePath>
<version>1.7.0.M1</version>
</parent>
<modules>
@@ -29,7 +28,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.10.0.RC1</springdata.commons>
<springdata.commons>1.11.0.M1</springdata.commons>
<mongo>2.13.0</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
</properties>
@@ -108,7 +107,7 @@
<id>mongo-next</id>
<properties>
<mongo>2.13.0-SNAPSHOT</mongo>
<mongo>2.14.0-SNAPSHOT</mongo>
</properties>
<repositories>
@@ -124,7 +123,7 @@
<id>mongo3</id>
<properties>
<mongo>3.0.0-beta3</mongo>
<mongo>3.0.0</mongo>
</properties>
</profile>
@@ -158,14 +157,14 @@
<repositories>
<repository>
<id>spring-libs-milestone</id>
<url>http://repo.spring.io/libs-milestone</url>
<url>https://repo.spring.io/libs-milestone</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>spring-plugins-release</id>
<url>http://repo.spring.io/plugins-release</url>
<url>https://repo.spring.io/plugins-release</url>
</pluginRepository>
</pluginRepositories>


@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.7.0.RC1</version>
<version>1.8.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.7.0.RC1</version>
<version>1.8.0.M1</version>
</dependency>
<dependency>


@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.7.0.RC1</version>
<version>1.8.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.7.0.RC1</version>
<version>1.8.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.7.0.RC1</version>
<version>1.8.0.M1</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -145,6 +145,13 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson}</version>
<optional>true</optional>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
@@ -158,6 +165,13 @@
<version>${equalsverifier}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 by the original author(s).
* Copyright 2011-2015 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,6 +34,7 @@ import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
import com.mongodb.Mongo;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoURI;
/**
@@ -42,6 +43,7 @@ import com.mongodb.MongoURI;
* @author Jon Brisbin
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
*/
public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@@ -64,29 +66,28 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
Object source = parserContext.extractSource(element);
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String uri = element.getAttribute("uri");
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
setPropertyValue(dbFactoryBuilder, element, "write-concern", "writeConcern");
if (StringUtils.hasText(uri)) {
if (StringUtils.hasText(mongoRef) || StringUtils.hasText(dbname) || userCredentials != null) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!", source);
}
BeanDefinition mongoUri = getMongoUri(element);
dbFactoryBuilder.addConstructorArgValue(getMongoUri(uri));
if (mongoUri != null) {
if (element.getAttributes().getLength() >= 2 && !element.hasAttribute("write-concern")) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!",
parserContext.extractSource(element));
}
dbFactoryBuilder.addConstructorArgValue(mongoUri);
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
BeanComponentDefinitionBuilder helper = new BeanComponentDefinitionBuilder(element, parserContext);
String mongoRef = element.getAttribute("mongo-ref");
String dbname = element.getAttribute("dbname");
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Defaulting
if (StringUtils.hasText(mongoRef)) {
dbFactoryBuilder.addConstructorArgReference(mongoRef);
@@ -147,14 +148,24 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
}
/**
* Creates a {@link BeanDefinition} for a {@link MongoURI}.
* Creates a {@link BeanDefinition} for a {@link MongoURI} or {@link MongoClientURI} depending on configured
* attributes.
*
* @param uri
* @return
* @param element must not be {@literal null}.
* @return {@literal null} in case no client-/uri defined.
*/
private BeanDefinition getMongoUri(String uri) {
private BeanDefinition getMongoUri(Element element) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoURI.class);
boolean hasClientUri = element.hasAttribute("client-uri");
if (!hasClientUri && !element.hasAttribute("uri")) {
return null;
}
Class<?> type = hasClientUri ? MongoClientURI.class : MongoURI.class;
String uri = hasClientUri ? element.getAttribute("client-uri") : element.getAttribute("uri");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(type);
builder.addConstructorArgValue(uri);
return builder.getBeanDefinition();


@@ -0,0 +1,34 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.context.annotation.Bean;
import org.springframework.data.mongodb.core.geo.GeoJsonModule;
import org.springframework.data.web.config.SpringDataWebConfigurationMixin;
/**
* Configuration class to expose {@link GeoJsonModule} as a Spring bean.
*
* @author Oliver Gierke
*/
@SpringDataWebConfigurationMixin
public class GeoJsonConfiguration {
@Bean
public GeoJsonModule geoJsonModule() {
return new GeoJsonModule();
}
}


@@ -123,7 +123,7 @@ public abstract class MongoDbUtils {
DB db = mongo.getDB(databaseName);
if (requiresAuthDbAuthentication(credentials)) {
if (!(mongo instanceof MongoClient) && requiresAuthDbAuthentication(credentials)) {
ReflectiveDbInvoker.authenticate(mongo, db, credentials, authenticationDatabaseName);
}


@@ -136,6 +136,7 @@ import com.mongodb.util.JSONParseException;
* @author Thomas Darimont
* @author Chuong Ngo
* @author Christoph Strobl
* @author Doménique Tilleuil
*/
@SuppressWarnings("deprecation")
public class MongoTemplate implements MongoOperations, ApplicationContextAware {
@@ -335,9 +336,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
DBObject mappedQuery = queryMapper.getMappedObject(query.getQueryObject(), persistentEntity);
DBCursor cursor = collection.find(mappedQuery, mappedFields);
QueryCursorPreparer cursorPreparer = new QueryCursorPreparer(query, entityType);
ReadDbObjectCallback<T> readCallback = new ReadDbObjectCallback<T>(mongoConverter, entityType);
return new CloseableIterableCusorAdapter<T>(cursor, exceptionTranslator, readCallback);
return new CloseableIterableCusorAdapter<T>(cursorPreparer.prepare(cursor), exceptionTranslator, readCallback);
}
});
}
@@ -841,27 +844,33 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
}
protected <T> void doInsertAll(Collection<? extends T> listToSave, MongoWriter<T> writer) {
Map<String, List<T>> objs = new HashMap<String, List<T>>();
for (T o : listToSave) {
Map<String, List<T>> elementsByCollection = new HashMap<String, List<T>>();
for (T element : listToSave) {
if (element == null) {
continue;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(element.getClass());
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(o.getClass());
if (entity == null) {
throw new InvalidDataAccessApiUsageException("No Persitent Entity information found for the class "
+ o.getClass().getName());
throw new InvalidDataAccessApiUsageException("No PersistentEntity information found for " + element.getClass());
}
String collection = entity.getCollection();
List<T> collectionElements = elementsByCollection.get(collection);
List<T> objList = objs.get(collection);
if (null == objList) {
objList = new ArrayList<T>();
objs.put(collection, objList);
if (null == collectionElements) {
collectionElements = new ArrayList<T>();
elementsByCollection.put(collection, collectionElements);
}
objList.add(o);
collectionElements.add(element);
}
for (Map.Entry<String, List<T>> entry : objs.entrySet()) {
for (Map.Entry<String, List<T>> entry : elementsByCollection.entrySet()) {
doInsertBatch(entry.getKey(), entry.getValue(), this.mongoConverter);
}
}


@@ -0,0 +1,37 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import com.mongodb.DBObject;
/**
* An {@link AggregationExpression} can be used with field expressions in aggregation pipeline stages like
* {@code project} and {@code group}.
*
* @author Thomas Darimont
* @author Oliver Gierke
*/
interface AggregationExpression {
/**
* Turns the {@link AggregationExpression} into a {@link DBObject} within the given
* {@link AggregationOperationContext}.
*
* @param context
* @return
*/
DBObject toDbObject(AggregationOperationContext context);
}


@@ -0,0 +1,105 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.List;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* An enum of supported {@link AggregationExpression}s in aggregation pipeline stages.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.10
*/
public enum AggregationFunctionExpressions {
SIZE;
/**
* Returns an {@link AggregationExpression} build from the current {@link Enum} name and the given parameters.
*
* @param parameters must not be {@literal null}
* @return
*/
public AggregationExpression of(Object... parameters) {
Assert.notNull(parameters, "Parameters must not be null!");
return new FunctionExpression(name().toLowerCase(), parameters);
}
/**
* An {@link AggregationExpression} representing a function call.
*
* @author Thomas Darimont
* @author Oliver Gierke
* @since 1.10
*/
static class FunctionExpression implements AggregationExpression {
private final String name;
private final Object[] values;
/**
* Creates a new {@link FunctionExpression} for the given name and values.
*
* @param name must not be {@literal null} or empty.
* @param values must not be {@literal null}.
*/
public FunctionExpression(String name, Object[] values) {
Assert.hasText(name, "Name must not be null!");
Assert.notNull(values, "Values must not be null!");
this.name = name;
this.values = values;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.Expression#toDbObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public DBObject toDbObject(AggregationOperationContext context) {
List<Object> args = new ArrayList<Object>(values.length);
for (int i = 0; i < values.length; i++) {
args.add(unpack(values[i], context));
}
return new BasicDBObject("$" + name, args);
}
private static Object unpack(Object value, AggregationOperationContext context) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
if (value instanceof Field) {
return context.getReference((Field) value).toString();
}
return value;
}
}
}


@@ -193,6 +193,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.LAST, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $last}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder last(AggregationExpression expr) {
return newBuilder(GroupOps.LAST, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $first}-expression for the given field-reference.
*
@@ -203,6 +213,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.FIRST, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for a {@code $first}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder first(AggregationExpression expr) {
return newBuilder(GroupOps.FIRST, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $avg}-expression for the given field-reference.
*
@@ -213,6 +233,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.AVG, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $avg}-expression for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder avg(AggregationExpression expr) {
return newBuilder(GroupOps.AVG, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $push}-expression for the given field-reference.
*
@@ -247,6 +277,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.MIN, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $min}-expression that for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder min(AggregationExpression expr) {
return newBuilder(GroupOps.MIN, null, expr);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $max}-expression that for the given field-reference.
*
@@ -257,6 +297,16 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
return newBuilder(GroupOps.MAX, reference, null);
}
/**
* Generates an {@link GroupOperationBuilder} for an {@code $max}-expression that for the given {@link AggregationExpression}.
*
* @param expr
* @return
*/
public GroupOperationBuilder max(AggregationExpression expr) {
return newBuilder(GroupOps.MAX, null, expr);
}
private GroupOperationBuilder newBuilder(Keyword keyword, String reference, Object value) {
return new GroupOperationBuilder(this, new Operation(keyword, null, reference, value));
}
@@ -369,6 +419,11 @@ public class GroupOperation implements FieldsExposingAggregationOperation {
public Object getValue(AggregationOperationContext context) {
if (reference == null) {
if (value instanceof AggregationExpression) {
return ((AggregationExpression) value).toDbObject(context);
}
return value;
}


@@ -121,6 +121,10 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return new ExpressionProjectionOperationBuilder(expression, this, params);
}
public ProjectionOperationBuilder and(AggregationExpression expression) {
return new ProjectionOperationBuilder(expression, this, null);
}
/**
* Excludes the given fields from the projection.
*
@@ -420,9 +424,13 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
if (this.previousProjection != null) {
return this.operation.andReplaceLastOneWith(this.previousProjection.withAlias(alias));
} else {
return this.operation.and(new FieldProjection(Fields.field(alias, name), null));
}
if (value instanceof AggregationExpression) {
return this.operation.and(new ExpressionProjection(Fields.field(alias), (AggregationExpression) value));
}
return this.operation.and(new FieldProjection(Fields.field(alias, name), null));
}
/**
@@ -552,6 +560,10 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return project("mod", Fields.field(fieldReference));
}
public ProjectionOperationBuilder size() {
return project("size");
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDBObject(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
@@ -940,4 +952,31 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
*/
public abstract DBObject toDBObject(AggregationOperationContext context);
}
/**
* @author Thomas Darimont
*/
static class ExpressionProjection extends Projection {
private final AggregationExpression expression;
private final Field field;
/**
* Creates a new {@link ExpressionProjection}.
*
* @param field
* @param expression
*/
public ExpressionProjection(Field field, AggregationExpression expression) {
super(field);
this.field = field;
this.expression = expression;
}
@Override
public DBObject toDBObject(AggregationOperationContext context) {
return new BasicDBObject(field.getName(), expression.toDbObject(context));
}
}
}


@@ -1,5 +1,5 @@
/*
* Copyright 2011-2013 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,7 +20,7 @@ import java.math.BigInteger;
import org.bson.types.ObjectId;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.data.convert.EntityInstantiators;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigIntegerToObjectIdConverter;
@@ -46,10 +46,8 @@ public abstract class AbstractMongoConverter implements MongoConverter, Initiali
*
* @param conversionService
*/
@SuppressWarnings("deprecation")
public AbstractMongoConverter(GenericConversionService conversionService) {
this.conversionService = conversionService == null ? ConversionServiceFactory.createDefaultConversionService()
: conversionService;
this.conversionService = conversionService == null ? new DefaultConversionService() : conversionService;
}
/**


@@ -106,7 +106,7 @@ public class DefaultDbRefResolver implements DbRefResolver {
*/
@Override
public DBObject fetch(DBRef dbRef) {
return ReflectiveDBRefResolver.fetch(mongoDbFactory.getDb(), dbRef);
return ReflectiveDBRefResolver.fetch(mongoDbFactory, dbRef);
}
/**


@@ -29,10 +29,10 @@ import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.core.CollectionFactory;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.data.convert.CollectionFactory;
import org.springframework.data.convert.EntityInstantiator;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association;
@@ -136,8 +136,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param typeMapper the typeMapper to set
*/
public void setTypeMapper(MongoTypeMapper typeMapper) {
this.typeMapper = typeMapper == null ? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY,
mappingContext) : typeMapper;
this.typeMapper = typeMapper == null
? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext) : typeMapper;
}
/*
@@ -238,7 +238,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
PersistentEntityParameterValueProvider<MongoPersistentProperty> parameterProvider = new PersistentEntityParameterValueProvider<MongoPersistentProperty>(
entity, provider, path.getCurrentObject());
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider, path);
return new ConverterAwareSpELExpressionParameterValueProvider(evaluator, conversionService, parameterProvider,
path);
}
private <S extends Object> S read(final MongoPersistentEntity<S> entity, final DBObject dbo, final ObjectPath path) {
@@ -510,8 +511,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
: new BasicDBObject();
addCustomTypeKeyIfNecessary(ClassTypeInformation.from(prop.getRawType()), obj, propDbObj);
MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass()) ? mappingContext
.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);
MongoPersistentEntity<?> entity = isSubtype(prop.getType(), obj.getClass())
? mappingContext.getPersistentEntity(obj.getClass()) : mappingContext.getPersistentEntity(type);
writeInternal(obj, propDbObj, entity);
accessor.put(prop, propDbObj);
@@ -700,8 +701,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (mapKeyDotReplacement == null) {
throw new MappingException(String.format("Map key %s contains dots but no replacement was configured! Make "
+ "sure map keys don't contain dots in the first place or configure an appropriate replacement!", source));
throw new MappingException(String.format(
"Map key %s contains dots but no replacement was configured! Make "
+ "sure map keys don't contain dots in the first place or configure an appropriate replacement!",
source));
}
return source.replaceAll("\\.", mapKeyDotReplacement);
@@ -719,8 +722,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return (String) key;
}
return conversions.hasCustomWriteTarget(key.getClass(), String.class) ? (String) getPotentiallyConvertedSimpleWrite(key)
: key.toString();
return conversions.hasCustomWriteTarget(key.getClass(), String.class)
? (String) getPotentiallyConvertedSimpleWrite(key) : key.toString();
}
/**
@@ -889,16 +892,16 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Class<?> rawComponentType = componentType == null ? null : componentType.getType();
collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>() : CollectionFactory
.createCollection(collectionType, rawComponentType, sourceValue.size());
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>()
: CollectionFactory.createCollection(collectionType, rawComponentType, sourceValue.size());
for (int i = 0; i < sourceValue.size(); i++) {
Object dbObjItem = sourceValue.get(i);
if (dbObjItem instanceof DBRef) {
items.add(DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem),
path));
items.add(
DBRef.class.equals(rawComponentType) ? dbObjItem : read(componentType, readRef((DBRef) dbObjItem), path));
} else if (dbObjItem instanceof DBObject) {
items.add(read(componentType, (DBObject) dbObjItem, path));
} else {
@@ -1016,10 +1019,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
this.write(obj, newDbo);
if (typeInformation == null) {
return removeTypeInfoRecursively(newDbo);
return removeTypeInfo(newDbo, true);
}
return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfoRecursively(newDbo);
if (typeInformation.getType().equals(NestedDocument.class)) {
return removeTypeInfo(newDbo, false);
}
return !obj.getClass().equals(typeInformation.getType()) ? newDbo : removeTypeInfo(newDbo, true);
}
public BasicDBList maybeConvertList(Iterable<?> source, TypeInformation<?> typeInformation) {
@@ -1033,12 +1040,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
/**
* Removes the type information from the conversion result.
* Removes the type information from the entire conversion result.
*
* @param object
* @param recursively whether to apply the removal recursively
* @return
*/
private Object removeTypeInfoRecursively(Object object) {
private Object removeTypeInfo(Object object, boolean recursively) {
if (!(object instanceof DBObject)) {
return object;
@@ -1046,19 +1054,29 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
DBObject dbObject = (DBObject) object;
String keyToRemove = null;
for (String key : dbObject.keySet()) {
if (typeMapper.isTypeKey(key)) {
keyToRemove = key;
if (recursively) {
Object value = dbObject.get(key);
if (value instanceof BasicDBList) {
for (Object element : (BasicDBList) value) {
removeTypeInfo(element, recursively);
}
} else {
removeTypeInfo(value, recursively);
}
}
Object value = dbObject.get(key);
if (value instanceof BasicDBList) {
for (Object element : (BasicDBList) value) {
removeTypeInfoRecursively(element);
if (typeMapper.isTypeKey(key)) {
keyToRemove = key;
if (!recursively) {
break;
}
} else {
removeTypeInfoRecursively(value);
}
}
@@ -1122,8 +1140,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*
* @author Oliver Gierke
*/
private class ConverterAwareSpELExpressionParameterValueProvider extends
SpELExpressionParameterValueProvider<MongoPersistentProperty> {
private class ConverterAwareSpELExpressionParameterValueProvider
extends SpELExpressionParameterValueProvider<MongoPersistentProperty> {
private final ObjectPath path;
@@ -1135,7 +1153,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param delegate must not be {@literal null}.
*/
public ConverterAwareSpELExpressionParameterValueProvider(SpELExpressionEvaluator evaluator,
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate, ObjectPath path) {
ConversionService conversionService, ParameterValueProvider<MongoPersistentProperty> delegate,
ObjectPath path) {
super(evaluator, conversionService, delegate);
this.path = path;
@@ -1194,4 +1213,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
DBObject readRef(DBRef ref) {
return dbRefResolver.fetch(ref);
}
/**
* Marker class used to indicate we have a non root document object here that might be used within an update - so we
* need to preserve type hints for potential nested elements but need to remove it on top level.
*
* @author Christoph Strobl
* @since 1.8
*/
static class NestedDocument {
}
}


@@ -34,10 +34,13 @@ import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.PersistentPropertyPath;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter.NestedDocument;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty.PropertyToFieldNameConverter;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBList;
@@ -58,6 +61,7 @@ public class QueryMapper {
private static final List<String> DEFAULT_ID_NAMES = Arrays.asList("id", "_id");
private static final DBObject META_TEXT_SCORE = new BasicDBObject("$meta", "textScore");
static final ClassTypeInformation<?> NESTED_DOCUMENT = ClassTypeInformation.from(NestedDocument.class);
private enum MetaMapping {
FORCE, WHEN_PRESENT, IGNORE;
@@ -222,7 +226,7 @@ public class QueryMapper {
protected DBObject getMappedKeyword(Keyword keyword, MongoPersistentEntity<?> entity) {
// $or/$nor
if (keyword.isOrOrNor() || keyword.hasIterableValue()) {
if (keyword.isOrOrNor() || (keyword.hasIterableValue() && !keyword.isGeometry())) {
Iterable<?> conditions = keyword.getValue();
BasicDBList newConditions = new BasicDBList();
@@ -250,8 +254,8 @@ public class QueryMapper {
boolean needsAssociationConversion = property.isAssociation() && !keyword.isExists();
Object value = keyword.getValue();
Object convertedValue = needsAssociationConversion ? convertAssociation(value, property) : getMappedValue(
property.with(keyword.getKey()), value);
Object convertedValue = needsAssociationConversion ? convertAssociation(value, property)
: getMappedValue(property.with(keyword.getKey()), value);
return new BasicDBObject(keyword.key, convertedValue);
}
@@ -473,8 +477,8 @@ public class QueryMapper {
}
try {
return conversionService.canConvert(id.getClass(), ObjectId.class) ? conversionService
.convert(id, ObjectId.class) : delegateConvertToMongoType(id, null);
return conversionService.canConvert(id.getClass(), ObjectId.class) ? conversionService.convert(id, ObjectId.class)
: delegateConvertToMongoType(id, null);
} catch (ConversionException o_O) {
return delegateConvertToMongoType(id, null);
}
@@ -552,6 +556,16 @@ public class QueryMapper {
return key.matches(N_OR_PATTERN);
}
/**
* Returns whether the current keyword is the {@code $geometry} keyword.
*
* @return
* @since 1.8
*/
public boolean isGeometry() {
return "$geometry".equalsIgnoreCase(key);
}
public boolean hasIterableValue() {
return value instanceof Iterable;
}
@@ -657,6 +671,10 @@ public class QueryMapper {
public Association<MongoPersistentProperty> getAssociation() {
return null;
}
public TypeInformation<?> getTypeHint() {
return ClassTypeInformation.OBJECT;
}
}
/**
@@ -816,7 +834,7 @@ public class QueryMapper {
try {
PropertyPath path = PropertyPath.from(pathExpression, entity.getTypeInformation());
PropertyPath path = PropertyPath.from(pathExpression.replaceAll("\\.\\d", ""), entity.getTypeInformation());
PersistentPropertyPath<MongoPersistentProperty> propertyPath = mappingContext.getPersistentPropertyPath(path);
Iterator<MongoPersistentProperty> iterator = propertyPath.iterator();
@@ -862,6 +880,27 @@ public class QueryMapper {
protected Converter<MongoPersistentProperty, String> getAssociationConverter() {
return new AssociationConverter(getAssociation());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper.Field#getTypeHint()
*/
@Override
public TypeInformation<?> getTypeHint() {
MongoPersistentProperty property = getProperty();
if (property == null) {
return super.getTypeHint();
}
if (property.getActualType().isInterface()
|| java.lang.reflect.Modifier.isAbstract(property.getActualType().getModifiers())) {
return ClassTypeInformation.OBJECT;
}
return NESTED_DOCUMENT;
}
}
/**


@@ -20,9 +20,9 @@ import static org.springframework.util.ReflectionUtils.*;
import java.lang.reflect.Method;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
@@ -51,12 +51,14 @@ class ReflectiveDBRefResolver {
* @param ref must not be {@literal null}.
* @return the document that this references.
*/
public static DBObject fetch(DB db, DBRef ref) {
public static DBObject fetch(MongoDbFactory factory, DBRef ref) {
Assert.notNull(ref, "DBRef to fetch must not be null!");
if (isMongo3Driver()) {
return db.getCollection(ref.getCollectionName()).findOne(ref.getId());
Assert.notNull(factory, "DbFactory to fetch DB from must not be null!");
return factory.getDb().getCollection(ref.getCollectionName()).findOne(ref.getId());
}
return (DBObject) invokeMethod(FETCH_METHOD, ref);


@@ -1,5 +1,5 @@
/*
* Copyright 2013-2014 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,6 +29,7 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update.Modifier;
import org.springframework.data.mongodb.core.query.Update.Modifiers;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import com.mongodb.BasicDBObject;
@@ -65,8 +66,8 @@ public class UpdateMapper extends QueryMapper {
*/
@Override
protected Object delegateConvertToMongoType(Object source, MongoPersistentEntity<?> entity) {
return entity == null ? super.delegateConvertToMongoType(source, null) : converter.convertToMongoType(source,
entity.getTypeInformation());
return entity == null ? super.delegateConvertToMongoType(source, null)
: converter.convertToMongoType(source, getTypeHintForEntity(entity));
}
/*
@@ -97,14 +98,14 @@ public class UpdateMapper extends QueryMapper {
if (rawValue instanceof Modifier) {
value = getMappedValue((Modifier) rawValue);
value = getMappedValue(field, (Modifier) rawValue);
} else if (rawValue instanceof Modifiers) {
DBObject modificationOperations = new BasicDBObject();
for (Modifier modifier : ((Modifiers) rawValue).getModifiers()) {
modificationOperations.putAll(getMappedValue(modifier).toMap());
modificationOperations.putAll(getMappedValue(field, modifier).toMap());
}
value = modificationOperations;
@@ -132,12 +133,28 @@ public class UpdateMapper extends QueryMapper {
return value instanceof Query;
}
private DBObject getMappedValue(Modifier modifier) {
private DBObject getMappedValue(Field field, Modifier modifier) {
Object value = converter.convertToMongoType(modifier.getValue(), ClassTypeInformation.OBJECT);
TypeInformation<?> typeHint = field == null ? ClassTypeInformation.OBJECT : field.getTypeHint();
Object value = converter.convertToMongoType(modifier.getValue(), typeHint);
return new BasicDBObject(modifier.getKey(), value);
}
private TypeInformation<?> getTypeHintForEntity(MongoPersistentEntity<?> entity) {
return processTypeHintForNestedDocuments(entity.getTypeInformation());
}
private TypeInformation<?> processTypeHintForNestedDocuments(TypeInformation<?> info) {
Class<?> type = info.getActualType().getType();
if (type.isInterface() || java.lang.reflect.Modifier.isAbstract(type.getModifiers())) {
return info;
}
return NESTED_DOCUMENT;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.QueryMapper#createPropertyField(org.springframework.data.mongodb.core.mapping.MongoPersistentEntity, java.lang.String, org.springframework.data.mapping.context.MappingContext)
@@ -146,8 +163,8 @@ public class UpdateMapper extends QueryMapper {
protected Field createPropertyField(MongoPersistentEntity<?> entity, String key,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
return entity == null ? super.createPropertyField(entity, key, mappingContext) : //
new MetadataBackedUpdateField(entity, key, mappingContext);
return entity == null ? super.createPropertyField(entity, key, mappingContext)
: new MetadataBackedUpdateField(entity, key, mappingContext);
}
/**
@@ -233,7 +250,35 @@ public class UpdateMapper extends QueryMapper {
protected String mapPropertyName(MongoPersistentProperty property) {
String mappedName = PropertyToFieldNameConverter.INSTANCE.convert(property);
return iterator.hasNext() && iterator.next().equals("$") ? String.format("%s.$", mappedName) : mappedName;
boolean inspect = iterator.hasNext();
while (inspect) {
String partial = iterator.next();
boolean isPositional = isPositionalParameter(partial);
if (isPositional) {
mappedName += "." + partial;
}
inspect = isPositional && iterator.hasNext();
}
return mappedName;
}
boolean isPositionalParameter(String partial) {
if (partial.equals("$")) {
return true;
}
try {
Long.valueOf(partial);
return true;
} catch (NumberFormatException e) {
return false;
}
}
}


@@ -0,0 +1,341 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.springframework.data.geo.Point;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.Module;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.databind.node.ArrayNode;
/**
* A Jackson {@link Module} to register custom {@link JsonSerializer} and {@link JsonDeserializer}s for GeoJSON types.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @since 1.7
*/
public class GeoJsonModule extends SimpleModule {
private static final long serialVersionUID = -8723016728655643720L;
public GeoJsonModule() {
addDeserializer(GeoJsonPoint.class, new GeoJsonPointDeserializer());
addDeserializer(GeoJsonMultiPoint.class, new GeoJsonMultiPointDeserializer());
addDeserializer(GeoJsonLineString.class, new GeoJsonLineStringDeserializer());
addDeserializer(GeoJsonMultiLineString.class, new GeoJsonMultiLineStringDeserializer());
addDeserializer(GeoJsonPolygon.class, new GeoJsonPolygonDeserializer());
addDeserializer(GeoJsonMultiPolygon.class, new GeoJsonMultiPolygonDeserializer());
}
/**
* @author Christoph Strobl
* @since 1.7
*/
private static abstract class GeoJsonDeserializer<T extends GeoJson<?>> extends JsonDeserializer<T> {
/*
* (non-Javadoc)
* @see com.fasterxml.jackson.databind.JsonDeserializer#deserialize(com.fasterxml.jackson.core.JsonParser, com.fasterxml.jackson.databind.DeserializationContext)
*/
@Override
public T deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException, JsonProcessingException {
JsonNode node = jp.readValueAsTree();
JsonNode coordinates = node.get("coordinates");
if (coordinates != null && coordinates.isArray()) {
return doDeserialize((ArrayNode) coordinates);
}
return null;
}
/**
* Perform the actual deserialization given the {@literal coordinates} as {@link ArrayNode}.
*
* @param coordinates
* @return
*/
protected abstract T doDeserialize(ArrayNode coordinates);
/**
* Get the {@link GeoJsonPoint} representation of given {@link ArrayNode} assuming {@code node.[0]} represents
* {@literal x - coordinate} and {@code node.[1]} is {@literal y}.
*
* @param node can be {@literal null}.
* @return {@literal null} when given a {@code null} value.
*/
protected GeoJsonPoint toGeoJsonPoint(ArrayNode node) {
if (node == null) {
return null;
}
return new GeoJsonPoint(node.get(0).asDouble(), node.get(1).asDouble());
}
/**
* Get the {@link Point} representation of given {@link ArrayNode} assuming {@code node.[0]} represents
* {@literal x - coordinate} and {@code node.[1]} is {@literal y}.
*
* @param node can be {@literal null}.
* @return {@literal null} when given a {@code null} value.
*/
protected Point toPoint(ArrayNode node) {
if (node == null) {
return null;
}
return new Point(node.get(0).asDouble(), node.get(1).asDouble());
}
/**
* Get the points nested within given {@link ArrayNode}.
*
* @param node can be {@literal null}.
* @return {@literal empty list} when given a {@code null} value.
*/
protected List<Point> toPoints(ArrayNode node) {
if (node == null) {
return Collections.emptyList();
}
List<Point> points = new ArrayList<Point>(node.size());
for (JsonNode coordinatePair : node) {
if (coordinatePair.isArray()) {
points.add(toPoint((ArrayNode) coordinatePair));
}
}
return points;
}
protected GeoJsonLineString toLineString(ArrayNode node) {
return new GeoJsonLineString(toPoints((ArrayNode) node));
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal Point}.
*
* <pre>
* <code>
* { "type": "Point", "coordinates": [10.0, 20.0] }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonPointDeserializer extends GeoJsonDeserializer<GeoJsonPoint> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonPoint doDeserialize(ArrayNode coordinates) {
return toGeoJsonPoint(coordinates);
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal LineString}.
*
* <pre>
* <code>
* {
* "type": "LineString",
* "coordinates": [
* [10.0, 20.0], [30.0, 40.0], [50.0, 60.0]
* ]
* }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonLineStringDeserializer extends GeoJsonDeserializer<GeoJsonLineString> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonLineString doDeserialize(ArrayNode coordinates) {
return new GeoJsonLineString(toPoints(coordinates));
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal MultiPoint}.
*
* <pre>
* <code>
* {
* "type": "MultiPoint",
* "coordinates": [
* [10.0, 20.0], [30.0, 40.0], [50.0, 60.0]
* ]
* }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonMultiPointDeserializer extends GeoJsonDeserializer<GeoJsonMultiPoint> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonMultiPoint doDeserialize(ArrayNode coordinates) {
return new GeoJsonMultiPoint(toPoints(coordinates));
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal MultiLineString}.
*
* <pre>
* <code>
* {
* "type": "MultiLineString",
* "coordinates": [
* [ [10.0, 20.0], [30.0, 40.0] ],
* [ [50.0, 60.0] , [70.0, 80.0] ]
* ]
* }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonMultiLineStringDeserializer extends GeoJsonDeserializer<GeoJsonMultiLineString> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonMultiLineString doDeserialize(ArrayNode coordinates) {
List<GeoJsonLineString> lines = new ArrayList<GeoJsonLineString>(coordinates.size());
for (JsonNode lineString : coordinates) {
if (lineString.isArray()) {
lines.add(toLineString((ArrayNode) lineString));
}
}
return new GeoJsonMultiLineString(lines);
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal Polygon}.
*
* <pre>
* <code>
* {
* "type": "Polygon",
* "coordinates": [
* [ [100.0, 0.0], [101.0, 0.0], [101.0, 1.0], [100.0, 1.0], [100.0, 0.0] ]
* ]
* }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonPolygonDeserializer extends GeoJsonDeserializer<GeoJsonPolygon> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonPolygon doDeserialize(ArrayNode coordinates) {
for (JsonNode ring : coordinates) {
// currently we do not support holes in polygons.
return new GeoJsonPolygon(toPoints((ArrayNode) ring));
}
return null;
}
}
/**
* {@link JsonDeserializer} converting GeoJSON representation of {@literal MultiPolygon}.
*
* <pre>
* <code>
* {
* "type": "MultiPolygon",
* "coordinates": [
* [[[102.0, 2.0], [103.0, 2.0], [103.0, 3.0], [102.0, 3.0], [102.0, 2.0]]],
* [[[100.0, 0.0], [101.0, 0.0], [101.0, 1.0], [100.0, 1.0], [100.0, 0.0]],
* [[100.2, 0.2], [100.8, 0.2], [100.8, 0.8], [100.2, 0.8], [100.2, 0.2]]]
* ]
* }
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 1.7
*/
private static class GeoJsonMultiPolygonDeserializer extends GeoJsonDeserializer<GeoJsonMultiPolygon> {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.GeoJsonModule.GeoJsonDeserializer#doDeserialize(com.fasterxml.jackson.databind.node.ArrayNode)
*/
@Override
protected GeoJsonMultiPolygon doDeserialize(ArrayNode coordinates) {
List<GeoJsonPolygon> polygones = new ArrayList<GeoJsonPolygon>(coordinates.size());
for (JsonNode polygon : coordinates) {
for (JsonNode ring : (ArrayNode) polygon) {
polygones.add(new GeoJsonPolygon(toPoints((ArrayNode) ring)));
}
}
return new GeoJsonMultiPolygon(polygones);
}
}
}
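To put the module to work it has to be registered with a Jackson ObjectMapper like any other module. A minimal sketch (the JSON literal is just the Point example from the Javadoc above; the module only contributes deserializers):

import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.data.mongodb.core.geo.GeoJsonModule;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;

class GeoJsonModuleUsageSketch {

    public static void main(String[] args) throws Exception {

        ObjectMapper mapper = new ObjectMapper();
        mapper.registerModule(new GeoJsonModule());

        // handled by GeoJsonPointDeserializer under the covers
        GeoJsonPoint point = mapper.readValue("{ \"type\": \"Point\", \"coordinates\": [10.0, 20.0] }",
                GeoJsonPoint.class);

        System.out.println(point.getX() + ", " + point.getY()); // 10.0, 20.0
    }
}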

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014 the original author or authors.
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,22 +16,24 @@
package org.springframework.data.mongodb.core.index;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
import org.springframework.data.util.TypeInformation;
/**
* {@link IndexResolver} finds those {@link IndexDefinition}s to be created for a given class.
*
* @author Christoph Strobl
* @author Thomas Darimont
* @since 1.5
*/
interface IndexResolver {
/**
* Find and create {@link IndexDefinition}s for properties of given {@code type}. {@link IndexDefinition}s are created
* Find and create {@link IndexDefinition}s for properties of given {@link TypeInformation}. {@link IndexDefinition}s are created
* for properties and types with {@link Indexed}, {@link CompoundIndexes} or {@link GeoSpatialIndexed}.
*
* @param type
* @param typeInformation
* @return Empty {@link Iterable} in case no {@link IndexDefinition} could be resolved for type.
*/
Iterable<? extends IndexDefinitionHolder> resolveIndexForClass(Class<?> type);
Iterable<? extends IndexDefinitionHolder> resolveIndexFor(TypeInformation<?> typeInformation);
}
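For orientation, a hypothetical domain type carrying the annotations the resolver looks for; Venue and its fields are illustrative only and not part of this change:

import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexed;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
class Venue {

    @Indexed(unique = true) String name; // picked up as a single-field index definition
    @GeoSpatialIndexed Point location;   // picked up as a geo-spatial index definition
}

As the diffs below show, the index creator now hands the entity's TypeInformation, rather than the raw Class, to resolveIndexFor(…).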

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -60,4 +60,10 @@ public class MongoMappingEventPublisher implements ApplicationEventPublisher {
indexCreator.onApplicationEvent((MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>) event);
}
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationEventPublisher#publishEvent(java.lang.Object)
*/
public void publishEvent(Object event) {}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2014 the original author or authors.
* Copyright 2011-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -29,7 +29,6 @@ import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexRes
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
/**
@@ -43,8 +42,7 @@ import org.springframework.util.Assert;
* @author Laurent Canet
* @author Christoph Strobl
*/
public class MongoPersistentEntityIndexCreator implements
ApplicationListener<MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty>> {
public class MongoPersistentEntityIndexCreator implements ApplicationListener<MappingContextEvent<?, ?>> {
private static final Logger LOGGER = LoggerFactory.getLogger(MongoPersistentEntityIndexCreator.class);
@@ -54,7 +52,7 @@ public class MongoPersistentEntityIndexCreator implements
private final IndexResolver indexResolver;
/**
* Creats a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
@@ -65,7 +63,7 @@ public class MongoPersistentEntityIndexCreator implements
}
/**
* Creats a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* Creates a new {@link MongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext} and
* {@link MongoDbFactory}.
*
* @param mappingContext must not be {@literal null}.
@@ -92,7 +90,7 @@ public class MongoPersistentEntityIndexCreator implements
* (non-Javadoc)
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
*/
public void onApplicationEvent(MappingContextEvent<MongoPersistentEntity<?>, MongoPersistentProperty> event) {
public void onApplicationEvent(MappingContextEvent<?, ?> event) {
if (!event.wasEmittedBy(mappingContext)) {
return;
@@ -102,7 +100,7 @@ public class MongoPersistentEntityIndexCreator implements
// Double check type as Spring infrastructure does not consider nested generics
if (entity instanceof MongoPersistentEntity) {
checkForIndexes(event.getPersistentEntity());
checkForIndexes((MongoPersistentEntity<?>) entity);
}
}
@@ -125,15 +123,15 @@ public class MongoPersistentEntityIndexCreator implements
private void checkForAndCreateIndexes(MongoPersistentEntity<?> entity) {
if (entity.findAnnotation(Document.class) != null) {
for (IndexDefinitionHolder indexToCreate : indexResolver.resolveIndexForClass(entity.getType())) {
for (IndexDefinitionHolder indexToCreate : indexResolver.resolveIndexFor(entity.getTypeInformation())) {
createIndex(indexToCreate);
}
}
}
private void createIndex(IndexDefinitionHolder indexDefinition) {
mongoDbFactory.getDb().getCollection(indexDefinition.getCollection())
.createIndex(indexDefinition.getIndexKeys(), indexDefinition.getIndexOptions());
mongoDbFactory.getDb().getCollection(indexDefinition.getCollection()).createIndex(indexDefinition.getIndexKeys(),
indexDefinition.getIndexOptions());
}
/**

View File

@@ -36,6 +36,7 @@ import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -51,6 +52,7 @@ import com.mongodb.util.JSON;
* scanning related annotations.
*
* @author Christoph Strobl
* @author Thomas Darimont
* @since 1.5
*/
public class MongoPersistentEntityIndexResolver implements IndexResolver {
@@ -70,13 +72,12 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
this.mappingContext = mappingContext;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexResolver#resolveIndexForClass(java.lang.Class)
/* (non-Javadoc)
* @see org.springframework.data.mongodb.core.index.IndexResolver#resolveIndexForClass(org.springframework.data.util.TypeInformation)
*/
@Override
public List<IndexDefinitionHolder> resolveIndexForClass(Class<?> type) {
return resolveIndexForEntity(mappingContext.getPersistentEntity(type));
public Iterable<? extends IndexDefinitionHolder> resolveIndexFor(TypeInformation<?> typeInformation) {
return resolveIndexForEntity(mappingContext.getPersistentEntity(typeInformation));
}
/**
@@ -117,7 +118,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
indexInformation.add(indexDefinitionHolder);
}
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
LOGGER.info(e.getMessage());
}
}
});
@@ -155,7 +156,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
indexInformation.addAll(resolveIndexForClass(persistentProperty.getActualType(), propertyDotPath,
collection, guard));
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
LOGGER.info(e.getMessage());
}
}
@@ -205,7 +206,7 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
appendTextIndexInformation("", indexDefinitionBuilder, root,
new TextIndexIncludeOptions(IncludeStrategy.DEFAULT), new CycleGuard());
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage());
LOGGER.info(e.getMessage());
}
TextIndexDefinition indexDefinition = indexDefinitionBuilder.build();
@@ -256,9 +257,9 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
appendTextIndexInformation(propertyDotPath, indexDefinitionBuilder,
mappingContext.getPersistentEntity(persistentProperty.getActualType()), optionsForNestedType, guard);
} catch (CyclicPropertyReferenceException e) {
LOGGER.warn(e.getMessage(), e);
LOGGER.info(e.getMessage(), e);
} catch (InvalidDataAccessApiUsageException e) {
LOGGER.warn(
LOGGER.info(
String.format("Potentially invald index structure discovered. Breaking operation for %s.",
entity.getName()), e);
}

View File

@@ -190,8 +190,8 @@ public class Criteria implements CriteriaDefinition {
*/
public Criteria in(Object... o) {
if (o.length > 1 && o[1] instanceof Collection) {
throw new InvalidMongoDbApiUsageException("You can only pass in one argument of type "
+ o[1].getClass().getName());
throw new InvalidMongoDbApiUsageException(
"You can only pass in one argument of type " + o[1].getClass().getName());
}
criteria.put("$in", Arrays.asList(o));
return this;
@@ -433,7 +433,23 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates a geospatical criterion using a {@literal $maxDistance} operation, for use with $near
* Creates a criterion using the {@code $geoIntersects} operator, which matches intersections of the given {@code geoJson}
* structure and the document's one. Requires MongoDB 2.4 or better.
*
* @param geoJson must not be {@literal null}.
* @return
* @since 1.8
*/
@SuppressWarnings("rawtypes")
public Criteria intersects(GeoJson geoJson) {
Assert.notNull(geoJson, "GeoJson must not be null!");
criteria.put("$geoIntersects", geoJson);
return this;
}
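A usage sketch for the new criterion, assuming a hypothetical Venue document with a GeoJSON location field and an available MongoOperations instance:

// hypothetical helper; Venue is an illustrative domain type
List<Venue> findVenuesIntersecting(MongoOperations operations) {

    GeoJsonPolygon area = new GeoJsonPolygon( // closed ring: first and last point match
            new Point(0, 0), new Point(0, 10), new Point(10, 10), new Point(10, 0), new Point(0, 0));

    return operations.find(Query.query(Criteria.where("location").intersects(area)), Venue.class);
}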
/**
* Creates a geo-spatial criterion using a {@literal $maxDistance} operation, for use with $near
*
* @see http://docs.mongodb.org/manual/reference/operator/query/maxDistance/
* @param maxDistance
@@ -526,8 +542,8 @@ public class Criteria implements CriteriaDefinition {
private Criteria registerCriteriaChainElement(Criteria criteria) {
if (lastOperatorWasNot()) {
throw new IllegalArgumentException("operator $not is not allowed around criteria chain element: "
+ criteria.getCriteriaObject());
throw new IllegalArgumentException(
"operator $not is not allowed around criteria chain element: " + criteria.getCriteriaObject());
} else {
criteriaChain.add(criteria);
}

View File

@@ -27,6 +27,7 @@ import org.springframework.context.annotation.ComponentScan.Filter;
import org.springframework.context.annotation.Import;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
import org.springframework.data.repository.config.DefaultRepositoryBaseClass;
import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
@@ -107,6 +108,14 @@ public @interface EnableMongoRepositories {
*/
Class<?> repositoryFactoryBeanClass() default MongoRepositoryFactoryBean.class;
/**
* Configure the repository base class to be used to create repository proxies for this particular configuration.
*
* @return
* @since 1.8
*/
Class<?> repositoryBaseClass() default DefaultRepositoryBaseClass.class;
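A sketch of how the new attribute might be wired up; MyRepositoryImpl is a hypothetical base class (typically extending SimpleMongoRepository):

@Configuration
@EnableMongoRepositories(basePackages = "com.example.repositories",
        repositoryBaseClass = MyRepositoryImpl.class)
class MongoRepositoryConfig {}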
/**
* Configures the name of the {@link MongoTemplate} bean to be used with the repositories detected.
*

View File

@@ -60,8 +60,8 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
*/
public AbstractMongoQuery(MongoQueryMethod method, MongoOperations operations) {
Assert.notNull(operations);
Assert.notNull(method);
Assert.notNull(operations, "MongoOperations must not be null!");
Assert.notNull(method, "MongoQueryMethod must not be null!");
this.method = method;
this.operations = operations;
@@ -308,8 +308,8 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
Object execute(Query query) {
MongoEntityMetadata<?> metadata = method.getEntityInformation();
return countProjection ? operations.count(query, metadata.getJavaType()) : operations.findOne(query,
metadata.getJavaType(), metadata.getCollectionName());
return countProjection ? operations.count(query, metadata.getJavaType())
: operations.findOne(query, metadata.getJavaType(), metadata.getCollectionName());
}
}

View File

@@ -41,6 +41,7 @@ import com.mongodb.DBRef;
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
public class ConvertingParameterAccessor implements MongoParameterAccessor {
@@ -216,7 +217,7 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
/**
* Returns the given object as {@link Collection}. Will do a copy of it if it implements {@link Iterable} or is an
* array. Will return an empty {@link Collection} in case {@literal null} is given. Will wrap all other types into a
* single-element collction
* single-element collection.
*
* @param source
* @return
@@ -240,6 +241,15 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
return source.getClass().isArray() ? CollectionUtils.arrayToList(source) : Collections.singleton(source);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.MongoParameterAccessor#getValues()
*/
@Override
public Object[] getValues() {
return delegate.getValues();
}
/**
* Custom {@link Iterator} that adds a method to access elements in a converted manner.
*

View File

@@ -26,6 +26,7 @@ import org.springframework.data.repository.query.ParameterAccessor;
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
public interface MongoParameterAccessor extends ParameterAccessor {
@@ -51,4 +52,12 @@ public interface MongoParameterAccessor extends ParameterAccessor {
* @since 1.6
*/
TextCriteria getFullText();
/**
* Returns the raw parameter values of the underlying query method.
*
* @return
* @since 1.8
*/
Object[] getValues();
}

View File

@@ -29,20 +29,23 @@ import org.springframework.util.ClassUtils;
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
public class MongoParametersParameterAccessor extends ParametersParameterAccessor implements MongoParameterAccessor {
private final MongoQueryMethod method;
private final Object[] values;
/**
* Creates a new {@link MongoParametersParameterAccessor}.
*
* @param method must not be {@literal null}.
* @param values must not be {@@iteral null}.
* @param values must not be {@literal null}.
*/
public MongoParametersParameterAccessor(MongoQueryMethod method, Object[] values) {
super(method.getParameters(), values);
this.method = method;
this.values = values;
}
public Range<Distance> getDistanceRange() {
@@ -117,8 +120,17 @@ public class MongoParametersParameterAccessor extends ParametersParameterAccesso
return ((TextCriteria) fullText);
}
throw new IllegalArgumentException(String.format(
"Expected full text parameter to be one of String, Term or TextCriteria but found %s.",
ClassUtils.getShortName(fullText.getClass())));
throw new IllegalArgumentException(
String.format("Expected full text parameter to be one of String, Term or TextCriteria but found %s.",
ClassUtils.getShortName(fullText.getClass())));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.MongoParameterAccessor#getValues()
*/
@Override
public Object[] getValues() {
return values;
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2002-2014 the original author or authors.
* Copyright 2002-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -34,6 +34,7 @@ import com.mongodb.util.JSONParseException;
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
public class PartTreeMongoQuery extends AbstractMongoQuery {
@@ -45,7 +46,7 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
* Creates a new {@link PartTreeMongoQuery} from the given {@link QueryMethod} and {@link MongoTemplate}.
*
* @param method must not be {@literal null}.
* @param template must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
*/
public PartTreeMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations) {
@@ -97,8 +98,8 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
return result;
} catch (JSONParseException o_O) {
throw new IllegalStateException(String.format("Invalid query or field specification in %s!", getQueryMethod(),
o_O));
throw new IllegalStateException(String.format("Invalid query or field specification in %s!", getQueryMethod()),
o_O);
}
}

View File

@@ -26,6 +26,11 @@ import org.slf4j.LoggerFactory;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.repository.query.EvaluationContextProvider;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.Expression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.DBObject;
@@ -43,7 +48,7 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
private static final String COUND_AND_DELETE = "Manually defined query for %s cannot be both a count and delete query at the same time!";
private static final Logger LOG = LoggerFactory.getLogger(StringBasedMongoQuery.class);
private static final ParameterBindingParser PARSER = ParameterBindingParser.INSTANCE;
private static final ParameterBindingParser BINDING_PARSER = ParameterBindingParser.INSTANCE;
private final String query;
private final String fieldSpec;
@@ -51,33 +56,49 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
private final boolean isDeleteQuery;
private final List<ParameterBinding> queryParameterBindings;
private final List<ParameterBinding> fieldSpecParameterBindings;
private final SpelExpressionParser expressionParser;
private final EvaluationContextProvider evaluationContextProvider;
/**
* Creates a new {@link StringBasedMongoQuery} for the given {@link MongoQueryMethod} and {@link MongoOperations}.
*
* @param method must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
* @param expressionParser must not be {@literal null}.
* @param evaluationContextProvider must not be {@literal null}.
*/
public StringBasedMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations) {
this(method.getAnnotatedQuery(), method, mongoOperations);
public StringBasedMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations,
SpelExpressionParser expressionParser, EvaluationContextProvider evaluationContextProvider) {
this(method.getAnnotatedQuery(), method, mongoOperations, expressionParser, evaluationContextProvider);
}
/**
* Creates a new {@link StringBasedMongoQuery} for the given {@link String}, {@link MongoQueryMethod} and
* {@link MongoOperations}.
*
* Creates a new {@link StringBasedMongoQuery} for the given {@link String}, {@link MongoQueryMethod},
* {@link MongoOperations}, {@link SpelExpressionParser} and {@link EvaluationContextProvider}.
*
* @param query must not be {@literal null}.
* @param method must not be {@literal null}.
* @param template must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
* @param expressionParser must not be {@literal null}.
*/
public StringBasedMongoQuery(String query, MongoQueryMethod method, MongoOperations mongoOperations) {
public StringBasedMongoQuery(String query, MongoQueryMethod method, MongoOperations mongoOperations,
SpelExpressionParser expressionParser, EvaluationContextProvider evaluationContextProvider) {
super(method, mongoOperations);
this.query = query;
this.queryParameterBindings = PARSER.parseParameterBindingsFrom(query);
Assert.notNull(query, "Query must not be null!");
Assert.notNull(expressionParser, "SpelExpressionParser must not be null!");
this.fieldSpec = method.getFieldSpecification();
this.fieldSpecParameterBindings = PARSER.parseParameterBindingsFrom(method.getFieldSpecification());
this.expressionParser = expressionParser;
this.evaluationContextProvider = evaluationContextProvider;
this.queryParameterBindings = new ArrayList<ParameterBinding>();
this.query = BINDING_PARSER.parseAndCollectParameterBindingsFromQueryIntoBindings(query,
this.queryParameterBindings);
this.fieldSpecParameterBindings = new ArrayList<ParameterBinding>();
this.fieldSpec = BINDING_PARSER.parseAndCollectParameterBindingsFromQueryIntoBindings(
method.getFieldSpecification(), this.fieldSpecParameterBindings);
this.isCountQuery = method.hasAnnotatedQuery() ? method.getQueryAnnotation().count() : false;
this.isDeleteQuery = method.hasAnnotatedQuery() ? method.getQueryAnnotation().delete() : false;
@@ -140,7 +161,8 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
* @param bindings
* @return
*/
private String replacePlaceholders(String input, ConvertingParameterAccessor accessor, List<ParameterBinding> bindings) {
private String replacePlaceholders(String input, ConvertingParameterAccessor accessor,
List<ParameterBinding> bindings) {
if (bindings.isEmpty()) {
return input;
@@ -170,7 +192,8 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
*/
private String getParameterValueForBinding(ConvertingParameterAccessor accessor, ParameterBinding binding) {
Object value = accessor.getBindableValue(binding.getParameterIndex());
Object value = binding.isExpression() ? evaluateExpression(binding.getExpression(), accessor.getValues())
: accessor.getBindableValue(binding.getParameterIndex());
if (value instanceof String && binding.isQuoted()) {
return (String) value;
@@ -179,6 +202,21 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
return JSON.serialize(value);
}
/**
* Evaluates the given {@code expressionString}.
*
* @param expressionString
* @param parameterValues
* @return
*/
private Object evaluateExpression(String expressionString, Object[] parameterValues) {
EvaluationContext evaluationContext = evaluationContextProvider
.getEvaluationContext(getQueryMethod().getParameters(), parameterValues);
Expression expression = expressionParser.parseExpression(expressionString);
return expression.getValue(evaluationContext, Object.class);
}
/**
* A parser that extracts the parameter bindings from a given query string.
*
@@ -192,6 +230,7 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
private static final String PARSEABLE_PARAMETER = "\"" + PARAMETER_PREFIX + "$1\"";
private static final Pattern PARAMETER_BINDING_PATTERN = Pattern.compile("\\?(\\d+)");
private static final Pattern PARSEABLE_BINDING_PATTERN = Pattern.compile("\"?" + PARAMETER_PREFIX + "(\\d+)\"?");
private static final Pattern PARAMETER_EXPRESSION_PATTERN = Pattern.compile("((:|\\?)#\\{([^}]+)\\})");
private final static int PARAMETER_INDEX_GROUP = 1;
@@ -199,23 +238,53 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
* Returns a list of {@link ParameterBinding}s found in the given {@code input} or an
* {@link Collections#emptyList()}.
*
* @param input
* @param conversionService must not be {@literal null}.
* @param input can be {@literal null} or empty.
* @param bindings must not be {@literal null}.
* @return
*/
public List<ParameterBinding> parseParameterBindingsFrom(String input) {
public String parseAndCollectParameterBindingsFromQueryIntoBindings(String input, List<ParameterBinding> bindings) {
if (!StringUtils.hasText(input)) {
return Collections.emptyList();
return input;
}
List<ParameterBinding> bindings = new ArrayList<ParameterBinding>();
Assert.notNull(bindings, "Parameter bindings must not be null!");
String parseableInput = makeParameterReferencesParseable(input);
String transformedInput = transformQueryAndCollectExpressionParametersIntoBindings(input, bindings);
String parseableInput = makeParameterReferencesParseable(transformedInput);
collectParameterReferencesIntoBindings(bindings, JSON.parse(parseableInput));
return bindings;
return transformedInput;
}
private String transformQueryAndCollectExpressionParametersIntoBindings(String input,
List<ParameterBinding> bindings) {
Matcher matcher = PARAMETER_EXPRESSION_PATTERN.matcher(input);
StringBuilder result = new StringBuilder();
int lastPos = 0;
int exprIndex = 0;
while (matcher.find()) {
int startOffSet = matcher.start();
result.append(input.subSequence(lastPos, startOffSet));
result.append("'?expr").append(exprIndex).append("'");
lastPos = matcher.end();
bindings.add(new ParameterBinding(exprIndex, true, matcher.group(3)));
exprIndex++;
}
result.append(input.subSequence(lastPos, input.length()));
return result.toString();
}
private String makeParameterReferencesParseable(String input) {
@@ -236,9 +305,10 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
} else if (value instanceof Pattern) {
String string = ((Pattern) value).toString().trim();
Matcher valueMatcher = PARSEABLE_BINDING_PATTERN.matcher(string);
while (valueMatcher.find()) {
int paramIndex = Integer.parseInt(valueMatcher.group(PARAMETER_INDEX_GROUP));
/*
@@ -292,6 +362,7 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
private final int parameterIndex;
private final boolean quoted;
private final String expression;
/**
* Creates a new {@link ParameterBinding} with the given {@code parameterIndex} and {@code quoted} information.
@@ -300,9 +371,14 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
* @param quoted whether or not the parameter is already quoted.
*/
public ParameterBinding(int parameterIndex, boolean quoted) {
this(parameterIndex, quoted, null);
}
public ParameterBinding(int parameterIndex, boolean quoted, String expression) {
this.parameterIndex = parameterIndex;
this.quoted = quoted;
this.expression = expression;
}
public boolean isQuoted() {
@@ -314,7 +390,15 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
}
public String getParameter() {
return "?" + parameterIndex;
return "?" + (isExpression() ? "expr" : "") + parameterIndex;
}
public String getExpression() {
return expression;
}
public boolean isExpression() {
return this.expression != null;
}
}
}
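Taken together, a declared query can now embed SpEL fragments. A sketch with an illustrative repository: ?#{[0]} (or :#{[0]}) refers to the first method argument and is evaluated through the EvaluationContextProvider before the resulting value is bound into the query:

public interface PersonRepository extends MongoRepository<Person, String> {

    @Query("{ 'lastname': ?#{[0]} }")
    List<Person> findByLastnameViaExpression(String lastname);
}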

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2012 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -32,31 +32,38 @@ import org.springframework.data.mongodb.repository.query.PartTreeMongoQuery;
import org.springframework.data.mongodb.repository.query.StringBasedMongoQuery;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;
import org.springframework.data.repository.core.NamedQueries;
import org.springframework.data.repository.core.RepositoryInformation;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.core.support.RepositoryFactorySupport;
import org.springframework.data.repository.query.EvaluationContextProvider;
import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert;
/**
* Factory to create {@link MongoRepository} instances.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class MongoRepositoryFactory extends RepositoryFactorySupport {
private static final SpelExpressionParser EXPRESSION_PARSER = new SpelExpressionParser();
private final MongoOperations mongoOperations;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/**
* Creates a new {@link MongoRepositoryFactory} with the given {@link MongoOperations}.
*
* @param mongoOperations must not be {@literal null}
* @param mongoOperations must not be {@literal null}.
*/
public MongoRepositoryFactory(MongoOperations mongoOperations) {
Assert.notNull(mongoOperations);
this.mongoOperations = mongoOperations;
this.mappingContext = mongoOperations.getConverter().getMappingContext();
}
@@ -67,67 +74,31 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
*/
@Override
protected Class<?> getRepositoryBaseClass(RepositoryMetadata metadata) {
return isQueryDslRepository(metadata.getRepositoryInterface()) ? QueryDslMongoRepository.class
: SimpleMongoRepository.class;
boolean isQueryDslRepository = QUERY_DSL_PRESENT
&& QueryDslPredicateExecutor.class.isAssignableFrom(metadata.getRepositoryInterface());
return isQueryDslRepository ? QueryDslMongoRepository.class : SimpleMongoRepository.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.RepositoryFactorySupport#getTargetRepository(org.springframework.data.repository.core.RepositoryMetadata)
* @see org.springframework.data.repository.core.support.RepositoryFactorySupport#getTargetRepository(org.springframework.data.repository.core.RepositoryInformation)
*/
@Override
@SuppressWarnings({ "rawtypes", "unchecked" })
protected Object getTargetRepository(RepositoryMetadata metadata) {
protected Object getTargetRepository(RepositoryInformation information) {
Class<?> repositoryInterface = metadata.getRepositoryInterface();
MongoEntityInformation<?, Serializable> entityInformation = getEntityInformation(metadata.getDomainType());
if (isQueryDslRepository(repositoryInterface)) {
return new QueryDslMongoRepository(entityInformation, mongoOperations);
} else {
return new SimpleMongoRepository(entityInformation, mongoOperations);
}
MongoEntityInformation<?, Serializable> entityInformation = getEntityInformation(information.getDomainType());
return getTargetRepositoryViaReflection(information, entityInformation, mongoOperations);
}
private static boolean isQueryDslRepository(Class<?> repositoryInterface) {
return QUERY_DSL_PRESENT && QueryDslPredicateExecutor.class.isAssignableFrom(repositoryInterface);
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.RepositoryFactorySupport#getQueryLookupStrategy(org.springframework.data.repository.query.QueryLookupStrategy.Key)
* @see org.springframework.data.repository.core.support.RepositoryFactorySupport#getQueryLookupStrategy(org.springframework.data.repository.query.QueryLookupStrategy.Key, org.springframework.data.repository.query.EvaluationContextProvider)
*/
@Override
protected QueryLookupStrategy getQueryLookupStrategy(Key key) {
return new MongoQueryLookupStrategy();
}
/**
* {@link QueryLookupStrategy} to create {@link PartTreeMongoQuery} instances.
*
* @author Oliver Gierke
*/
private class MongoQueryLookupStrategy implements QueryLookupStrategy {
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryLookupStrategy#resolveQuery(java.lang.reflect.Method, org.springframework.data.repository.core.RepositoryMetadata, org.springframework.data.repository.core.NamedQueries)
*/
public RepositoryQuery resolveQuery(Method method, RepositoryMetadata metadata, NamedQueries namedQueries) {
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, mappingContext);
String namedQueryName = queryMethod.getNamedQueryName();
if (namedQueries.hasQuery(namedQueryName)) {
String namedQuery = namedQueries.getQuery(namedQueryName);
return new StringBasedMongoQuery(namedQuery, queryMethod, mongoOperations);
} else if (queryMethod.hasAnnotatedQuery()) {
return new StringBasedMongoQuery(queryMethod, mongoOperations);
} else {
return new PartTreeMongoQuery(queryMethod, mongoOperations);
}
}
protected QueryLookupStrategy getQueryLookupStrategy(Key key, EvaluationContextProvider evaluationContextProvider) {
return new MongoQueryLookupStrategy(evaluationContextProvider);
}
/*
@@ -141,10 +112,45 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(domainClass);
if (entity == null) {
throw new MappingException(String.format("Could not lookup mapping metadata for domain class %s!",
domainClass.getName()));
throw new MappingException(
String.format("Could not lookup mapping metadata for domain class %s!", domainClass.getName()));
}
return new MappingMongoEntityInformation<T, ID>((MongoPersistentEntity<T>) entity);
}
/**
* {@link QueryLookupStrategy} to create {@link PartTreeMongoQuery} instances.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
private class MongoQueryLookupStrategy implements QueryLookupStrategy {
private final EvaluationContextProvider evaluationContextProvider;
public MongoQueryLookupStrategy(EvaluationContextProvider evaluationContextProvider) {
this.evaluationContextProvider = evaluationContextProvider;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.query.QueryLookupStrategy#resolveQuery(java.lang.reflect.Method, org.springframework.data.repository.core.RepositoryMetadata, org.springframework.data.repository.core.NamedQueries)
*/
public RepositoryQuery resolveQuery(Method method, RepositoryMetadata metadata, NamedQueries namedQueries) {
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, mappingContext);
String namedQueryName = queryMethod.getNamedQueryName();
if (namedQueries.hasQuery(namedQueryName)) {
String namedQuery = namedQueries.getQuery(namedQueryName);
return new StringBasedMongoQuery(namedQuery, queryMethod, mongoOperations, EXPRESSION_PARSER,
evaluationContextProvider);
} else if (queryMethod.hasAnnotatedQuery()) {
return new StringBasedMongoQuery(queryMethod, mongoOperations, EXPRESSION_PARSER, evaluationContextProvider);
} else {
return new PartTreeMongoQuery(queryMethod, mongoOperations);
}
}
}
}

View File

@@ -5,4 +5,5 @@ http\://www.springframework.org/schema/data/mongo/spring-mongo-1.3.xsd=org/sprin
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.4.xsd=org/springframework/data/mongodb/config/spring-mongo-1.4.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.5.xsd=org/springframework/data/mongodb/config/spring-mongo-1.5.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.7.xsd=org/springframework/data/mongodb/config/spring-mongo-1.7.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo.xsd=org/springframework/data/mongodb/config/spring-mongo-1.7.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo-1.8.xsd=org/springframework/data/mongodb/config/spring-mongo-1.8.xsd
http\://www.springframework.org/schema/data/mongo/spring-mongo.xsd=org/springframework/data/mongodb/config/spring-mongo-1.8.xsd

View File

@@ -18,7 +18,7 @@
<xsd:element name="mongo" type="mongoType">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.MongoFactoryBean"><![CDATA[
Defines a Mongo instance used for accessing MongoDB'.
Deprecated since 1.7 - use mongo-client instead. Defines a Mongo instance used for accessing MongoDB.
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
@@ -31,7 +31,7 @@ Defines a Mongo instance used for accessing MongoDB'.
<xsd:element name="mongo-client" type="mongoClientType">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.MongoClientFactoryBean"><![CDATA[
Defines a MongoClient instance used for accessing MongoDB'.
Defines a MongoClient instance used for accessing MongoDB.
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
@@ -72,14 +72,14 @@ The name of the database to connect to. Default is 'db'.
<xsd:attribute name="authentication-dbname" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the authentication database to connect to. Default is 'db'.
Deprecated since 1.7 - Please use MongoClient internal authentication. The name of the authentication database to connect to. Default is 'db'.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
@@ -93,14 +93,14 @@ The host to connect to a MongoDB server. Default is localhost
<xsd:attribute name="username" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The username to use when connecting to a MongoDB server.
Deprecated since 1.7 - Please use MongoClient internal authentication. The username to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="password" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The password to use when connecting to a MongoDB server.
Deprecated since 1.7 - Please use MongoClient internal authentication. The password to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
@@ -382,6 +382,11 @@ The name of the Mongo object that determines what server to monitor. (by default
</xsd:attributeGroup>
-->
<xsd:complexType name="mongoType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Deprecated since 1.7.
]]></xsd:documentation>
</xsd:annotation>
<xsd:sequence minOccurs="0" maxOccurs="1">
<xsd:element name="options" type="optionsType">
<xsd:annotation>
@@ -439,6 +444,11 @@ The comma delimited list of host:port entries to use for replica set/pairs.
</xsd:complexType>
<xsd:complexType name="optionsType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Deprecated since 1.7.
]]></xsd:documentation>
</xsd:annotation>
<xsd:attribute name="connections-per-host" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
@@ -542,6 +552,11 @@ The SSLSocketFactory to use for the SSL connection. If none is configured here,
</xsd:complexType>
<xsd:complexType name="mongoClientType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Configuration options for 'MongoClient' - @since 1.7
]]></xsd:documentation>
</xsd:annotation>
<xsd:sequence minOccurs="0" maxOccurs="1">
<xsd:element name="client-options" type="clientOptionsType">
<xsd:annotation>
@@ -593,10 +608,15 @@ The comma delimited list of username:password@database entries to use for authen
</xsd:complexType>
<xsd:complexType name="clientOptionsType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Configuration options for 'MongoClientOptions' - @since 1.7
]]></xsd:documentation>
</xsd:annotation>
<xsd:attribute name="description" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The MongoClient description.
The MongoClient description.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>

View File

@@ -0,0 +1,894 @@
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns="http://www.springframework.org/schema/data/mongo"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:beans="http://www.springframework.org/schema/beans"
xmlns:tool="http://www.springframework.org/schema/tool"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:repository="http://www.springframework.org/schema/data/repository"
targetNamespace="http://www.springframework.org/schema/data/mongo"
elementFormDefault="qualified" attributeFormDefault="unqualified">
<xsd:import namespace="http://www.springframework.org/schema/beans" />
<xsd:import namespace="http://www.springframework.org/schema/tool" />
<xsd:import namespace="http://www.springframework.org/schema/context" />
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd" />
<xsd:element name="mongo" type="mongoType">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.MongoFactoryBean"><![CDATA[
Deprecated since 1.7 - use mongo-client instead. Defines a Mongo instance used for accessing MongoDB.
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.Mongo"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
<xsd:element name="mongo-client" type="mongoClientType">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.MongoClientFactoryBean"><![CDATA[
Defines a MongoClient instance used for accessing MongoDB.
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.MongoClient"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
<xsd:element name="db-factory">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoDbFactory for connecting to a specific database
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongoDbFactory").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a Mongo instance. If not configured a default com.mongodb.Mongo instance will be created.
]]>
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="dbname" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the database to connect to. Default is 'db'.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="authentication-dbname" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
Deprecated since 1.7 - Please use MongoClient internal authentication. The name of the authentication database to connect to. Default is 'db'.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="username" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
Deprecated since 1.7 - Please use MongoClient internal authentication. The username to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="password" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
Deprecated since 1.7 - Please use MongoClient internal authentication. The password to use when connecting to a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="uri" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
Deprecated since 1.8 - Please use client-uri instead. The Mongo URI string.]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="client-uri" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The MongoClientURI string.]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:attributeGroup name="mongo-repository-attributes">
<xsd:attribute name="mongo-template-ref" type="mongoTemplateRef" default="mongoTemplate">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoTemplate. Will default to 'mongoTemplate'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="create-query-indexes" type="xsd:boolean" default="false">
<xsd:annotation>
<xsd:documentation>
Enables creation of indexes for queries that get derived from the method name
and thus reference domain class properties. Defaults to false.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attributeGroup>
<xsd:element name="repositories">
<xsd:complexType>
<xsd:complexContent>
<xsd:extension base="repository:repositories">
<xsd:attributeGroup ref="mongo-repository-attributes"/>
<xsd:attributeGroup ref="repository:repository-attributes"/>
</xsd:extension>
</xsd:complexContent>
</xsd:complexType>
</xsd:element>
<xsd:element name="mapping-converter">
<xsd:annotation>
<xsd:documentation><![CDATA[Defines a MongoConverter for getting rich mapping functionality.]]></xsd:documentation>
<xsd:appinfo>
<tool:exports type="org.springframework.data.mongodb.core.convert.MappingMongoConverter" />
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="custom-converters" minOccurs="0">
<xsd:annotation>
<xsd:documentation><![CDATA[
Top-level element that contains one or more custom converters to be used for mapping
domain objects to and from Mongo's DBObject]]>
</xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:sequence>
<xsd:element name="converter" type="customConverterType" minOccurs="0" maxOccurs="unbounded"/>
</xsd:sequence>
<xsd:attribute name="base-package" type="xsd:string" />
</xsd:complexType>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the MappingMongoConverter instance (by default "mappingConverter").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="base-package" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The base package in which to scan for entities annotated with @Document
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a DbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="type-mapper-ref" type="typeMapperRef" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoTypeMapper to be used by this MappingMongoConverter.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mapping.model.MappingContext">
The reference to a MappingContext. Will default to 'mappingContext'.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="disable-validation" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.event.ValidatingMongoEventListener">
Disables JSR-303 validation on MongoDB documents before they are saved. By default it is set to false.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="xsd:boolean xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<xsd:attribute name="abbreviate-field-names" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.CamelCaseAbbreviatingFieldNamingStrategy">
Enables abbreviating the field names for domain class properties to the
first character of their camel case names, e.g. fooBar -> fb. Defaults to false.
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="xsd:boolean xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<xsd:attribute name="field-naming-strategy-ref" type="fieldNamingStrategyRef" use="optional">
<xsd:annotation>
<xsd:documentation source="org.springframework.data.mongodb.core.mapping.FieldNamingStrategy">
The reference to a FieldNamingStrategy.
</xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="jmx">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines JMX Model MBeans for monitoring a MongoDB server.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="mongo-ref" type="mongoRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the Mongo object that determines what server to monitor. (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
<xsd:element name="auditing">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="org.springframework.data.mongodb.core.mapping.event.AuditingEventListener" />
<tool:exports type="org.springframework.data.auditing.IsNewAwareAuditingHandler" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:complexType>
<xsd:attributeGroup ref="repository:auditing-attributes" />
<xsd:attribute name="mapping-context-ref" type="mappingContextRef" />
</xsd:complexType>
</xsd:element>
<xsd:simpleType name="typeMapperRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoTypeMapper"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mappingContextRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mapping.model.MappingContext"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="fieldNamingStrategyRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.mapping.FieldNamingStrategy"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoTemplateRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.MongoTemplate"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="mongoRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.MongoFactoryBean"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="sslSocketFactoryRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="javax.net.ssl.SSLSocketFactory"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:simpleType name="writeConcernEnumeration">
<xsd:restriction base="xsd:token">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICAS_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
<xsd:simpleType name="readPreferenceEnumeration">
<xsd:restriction base="xsd:token">
<xsd:enumeration value="PRIMARY" />
<xsd:enumeration value="PRIMARY_PREFERRED" />
<xsd:enumeration value="SECONDARY" />
<xsd:enumeration value="SECONDARY_PREFERRED" />
<xsd:enumeration value="NEAREST" />
</xsd:restriction>
</xsd:simpleType>
<!-- MLP
<xsd:attributeGroup name="writeConcern">
<xsd:attribute name="write-concern">
<xsd:simpleType>
<xsd:restriction base="xsd:string">
<xsd:enumeration value="NONE" />
<xsd:enumeration value="NORMAL" />
<xsd:enumeration value="SAFE" />
<xsd:enumeration value="FSYNC_SAFE" />
<xsd:enumeration value="REPLICA_SAFE" />
<xsd:enumeration value="JOURNAL_SAFE" />
<xsd:enumeration value="MAJORITY" />
</xsd:restriction>
</xsd:simpleType>
</xsd:attribute>
</xsd:attributeGroup>
-->
<xsd:complexType name="mongoType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Deprecated since 1.7.
]]></xsd:documentation>
</xsd:annotation>
<xsd:sequence minOccurs="0" maxOccurs="1">
<xsd:element name="options" type="optionsType">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo driver options
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.MongoOptions"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<!-- MLP
<xsd:attributeGroup ref="writeConcern" />
-->
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongo").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="replica-set" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The comma delimited list of host:port entries to use for replica set/pairs.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:complexType name="optionsType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Deprecated since 1.7.
]]></xsd:documentation>
</xsd:annotation>
<xsd:attribute name="connections-per-host" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The number of connections allowed per host. Callers block once the pool is exhausted. Default is 10. The MONGO.POOLSIZE system property can override this.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="threads-allowed-to-block-for-connection-multiplier" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The multiplier for connectionsPerHost that determines the number of threads allowed to block waiting for a connection. Default is 5.
If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5,
then up to 50 threads can block; any more than that and an exception will be thrown.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-wait-time" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum wait time of a thread blocking for a connection. Default is 120000 ms (2 minutes)
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="connect-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The connect timeout in milliseconds. The default is 0, which means an infinite timeout.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The socket timeout in milliseconds. The default is 0, which means an infinite timeout.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-keep-alive" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The keep-alive flag, which controls whether socket keep-alive is enabled. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="auto-connect-retry" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls whether the system automatically retries on connect. Default is false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-auto-connect-retry-time" type="xsd:long">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum amount of time in milliseconds to spend retrying to open a connection to the same server. Default is 0, which means the default of 15s is used when autoConnectRetry is on.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-number" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This specifies the number of servers to wait for on the write operation, and exception raising behavior. The 'w' option to the getlasterror command. Defaults to 0.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls the timeout for write operations in milliseconds. The 'wtimeout' option to the getlasterror command. Defaults to 0 (indefinite); a value greater than zero is the number of milliseconds to wait.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-fsync" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="slave-ok" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls if the driver is allowed to read from secondaries or slaves. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="ssl" type="xsd:boolean">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls if the driver should use an SSL connection. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="ssl-socket-factory-ref" type="sslSocketFactoryRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The SSLSocketFactory to use for the SSL connection. If none is configured here, SSLSocketFactory#getDefault() will be used.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:complexType name="mongoClientType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Configuration options for 'MongoClient' - @since 1.7
]]></xsd:documentation>
</xsd:annotation>
<xsd:sequence minOccurs="0" maxOccurs="1">
<xsd:element name="client-options" type="clientOptionsType">
<xsd:annotation>
<xsd:documentation><![CDATA[
The Mongo driver options
]]></xsd:documentation>
<xsd:appinfo>
<tool:annotation>
<tool:exports type="com.mongodb.MongoClientOptions"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
</xsd:sequence>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the mongo definition (by default "mongoClient").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="port" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The port to connect to MongoDB server. Default is 27017
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="host" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The host to connect to a MongoDB server. Default is localhost
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="replica-set" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The comma delimited list of host:port entries to use for replica set/pairs.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="credentials" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The comma delimited list of username:password@database entries to use for authentication. Appending ?uri.authMechanism allows specifying the authentication challenge mechanism.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
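<!--
Usage sketch (not part of the schema): a hypothetical configuration backed by mongoClientType,
assuming the element is exposed as mongo-client under the "mongo" prefix. The credentials value
follows the username:password@database format described above; all concrete values (user, password,
database, mechanism) are illustrative assumptions only.

<mongo:mongo-client id="mongoClient" host="localhost" port="27017"
	credentials="jsmith:secret@admin?uri.authMechanism=PLAIN" />
-->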
<xsd:complexType name="clientOptionsType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Configuration options for 'MongoClientOptions' - @since 1.7
]]></xsd:documentation>
</xsd:annotation>
<xsd:attribute name="description" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The MongoClient description.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="min-connections-per-host" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The minimum number of connections per host.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="connections-per-host" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The number of connections allowed per host. Requests will block once all connections are in use. Default is 10. The system property MONGO.POOLSIZE can override this value.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="threads-allowed-to-block-for-connection-multiplier" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The multiplier for connectionsPerHost that determines the number of threads allowed to block waiting for a connection. Default is 5.
If connectionsPerHost is 10 and threadsAllowedToBlockForConnectionMultiplier is 5,
then up to 50 threads can block; any more than that and an exception will be thrown.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-wait-time" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum wait time of a thread blocking for a connection. Default is 120000 ms (2 minutes)
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-connection-idle-time" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum idle time for a pooled connection.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-connection-life-time" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum life time for a pooled connection.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="connect-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The connect timeout in milliseconds. The default is 0, which means an infinite timeout.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The socket timeout in milliseconds. The default is 0, which means an infinite timeout.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="socket-keep-alive" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The keep-alive flag, which controls whether socket keep-alive is enabled. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="read-preference">
<xsd:annotation>
<xsd:documentation><![CDATA[
The read preference.
]]></xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="readPreferenceEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation><![CDATA[
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
]]></xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<xsd:attribute name="heartbeat-frequency" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The frequency at which the driver will attempt to determine the current state of each server in the cluster.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="min-heartbeat-frequency" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
In the event that the driver has to frequently re-check a server's availability, it will wait at least this long since the previous check to avoid wasted effort.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="heartbeat-connect-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The connect timeout for connections used for the cluster heartbeat.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="heartbeat-socket-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The socket timeout for connections used for the cluster heartbeat.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="ssl" type="xsd:boolean">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls if the driver should use an SSL connection. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="ssl-socket-factory-ref" type="sslSocketFactoryRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The SSLSocketFactory to use for the SSL connection. If none is configured here, SSLSocketFactory#getDefault() will be used.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
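<!--
Usage sketch (not part of the schema): a hypothetical client-options element nested inside a
mongo-client definition, assuming the "mongo" prefix. Note that read-preference and write-concern
each accept either one of the enumerated constants or a free-form string; the concrete values shown
are illustrative assumptions only.

<mongo:mongo-client id="mongoClient">
	<mongo:client-options connections-per-host="50"
		read-preference="SECONDARY_PREFERRED"
		write-concern="MAJORITY"
		socket-timeout="1500" />
</mongo:mongo-client>
-->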
<xsd:group name="beanElementGroup">
<xsd:choice>
<xsd:element ref="beans:bean"/>
<xsd:element ref="beans:ref"/>
</xsd:choice>
</xsd:group>
<xsd:complexType name="customConverterType">
<xsd:annotation>
<xsd:documentation><![CDATA[
Element defining a custom converter.
]]></xsd:documentation>
</xsd:annotation>
<xsd:group ref="beanElementGroup" minOccurs="0" maxOccurs="1"/>
<xsd:attribute name="ref" type="xsd:string">
<xsd:annotation>
<xsd:documentation>
A reference to a custom converter.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref"/>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
<xsd:simpleType name="converterRef">
<xsd:annotation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
<xsd:union memberTypes="xsd:string"/>
</xsd:simpleType>
<xsd:element name="template">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a MongoTemplate for working with a specific database
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the template definition (by default "mongoTemplate").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="converter-ref" type="converterRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a MongoConverter instance.
]]>
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string"
use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoDbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to
type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="write-concern">
<xsd:annotation>
<xsd:documentation>
The WriteConcern that will be the default value used when asking the MongoDbFactory for a DB object
</xsd:documentation>
</xsd:annotation>
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
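<!--
Usage sketch (not part of the schema): a hypothetical template definition wired to an existing
MongoDbFactory bean, assuming the "mongo" prefix. The referenced bean name "mongoDbFactory" and the
write-concern value are assumptions for illustration.

<mongo:template id="mongoTemplate" db-factory-ref="mongoDbFactory" write-concern="SAFE" />
-->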
<xsd:element name="gridFsTemplate">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines a GridFsTemplate for working with GridFS in a specific database
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="id" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The name of the GridFsTemplate definition (by default "gridFsTemplate").]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="converter-ref" type="converterRef" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The reference to a MongoConverter instance.
]]>
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.core.convert.MongoConverter"/>
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="db-factory-ref" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation>
The reference to a MongoDbFactory.
</xsd:documentation>
<xsd:appinfo>
<tool:annotation kind="ref">
<tool:assignable-to type="org.springframework.data.mongodb.MongoDbFactory" />
</tool:annotation>
</xsd:appinfo>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="bucket" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The GridFs bucket string.]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
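<!--
Usage sketch (not part of the schema): a hypothetical gridFsTemplate definition, assuming the
"mongo" prefix. The db-factory-ref and bucket values are assumptions for illustration.

<mongo:gridFsTemplate id="gridFsTemplate" db-factory-ref="mongoDbFactory" bucket="files" />
-->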
</xsd:schema>

View File

@@ -0,0 +1,53 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.GeoJsonConfiguration;
import org.springframework.data.mongodb.core.geo.GeoJsonModule;
import org.springframework.data.web.config.EnableSpringDataWebSupport;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
/**
* Integration tests for {@link GeoJsonConfiguration}.
*
* @author Oliver Gierke
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class GeoJsonConfigurationIntegrationTests {
@Configuration
@EnableSpringDataWebSupport
static class Config {}
@Autowired GeoJsonModule geoJsonModule;
/**
* @see DATAMONGO-1181
*/
@Test
public void picksUpGeoJsonModuleConfigurationByDefault() {
assertThat(geoJsonModule, is(notNullValue()));
}
}

View File

@@ -41,6 +41,7 @@ import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoURI;
import com.mongodb.WriteConcern;
@@ -174,6 +175,29 @@ public class MongoDbFactoryParserIntegrationTests {
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-uri-and-details.xml"));
}
/**
* @see DATAMONGO-1218
*/
@Test
public void setsUpMongoDbFactoryUsingAMongoClientUri() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-client-uri.xml"));
BeanDefinition definition = factory.getBeanDefinition("mongoDbFactory");
ConstructorArgumentValues constructorArguments = definition.getConstructorArgumentValues();
assertThat(constructorArguments.getArgumentCount(), is(1));
ValueHolder argument = constructorArguments.getArgumentValue(0, MongoClientURI.class);
assertThat(argument, is(notNullValue()));
}
/**
* @see DATAMONGO-1218
*/
@Test(expected = BeanDefinitionParsingException.class)
public void rejectsClientUriPlusDetailedConfiguration() {
reader.loadBeanDefinitions(new ClassPathResource("namespace/mongo-client-uri-and-details.xml"));
}
private static void assertWriteConcern(ClassPathXmlApplicationContext ctx, WriteConcern expectedWriteConcern) {
SimpleMongoDbFactory dbFactory = ctx.getBean("first", SimpleMongoDbFactory.class);

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2013 the original author or authors.
* Copyright 2012-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.core;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.junit.Assume.*;
import static org.mockito.Matchers.*;
import static org.mockito.Mockito.*;
@@ -27,35 +28,37 @@ import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.invocation.InvocationOnMock;
import org.mockito.runners.MockitoJUnitRunner;
import org.mockito.stubbing.Answer;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mongodb.CannotGetMongoDbConnectionException;
import org.springframework.data.mongodb.util.MongoClientVersion;
import org.springframework.transaction.support.TransactionSynchronization;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.transaction.support.TransactionSynchronizationUtils;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
/**
* Unit tests for {@link MongoDbUtils}.
*
* @author Oliver Gierke
* @author Randy Watler
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoDbUtilsUnitTests {
@Mock Mongo mongo;
@Mock MongoClient mongoClientMock;
@Mock DB dbMock;
@Before
public void setUp() throws Exception {
when(mongo.getDB(anyString())).then(new Answer<DB>() {
public DB answer(InvocationOnMock invocation) throws Throwable {
return mock(DB.class);
}
});
when(mongo.getDB(anyString())).thenReturn(dbMock).thenReturn(mock(DB.class));
when(mongoClientMock.getDB(anyString())).thenReturn(dbMock);
TransactionSynchronizationManager.initSynchronization();
}
@@ -151,6 +154,38 @@ public class MongoDbUtilsUnitTests {
assertThat(TransactionSynchronizationManager.getResourceMap().isEmpty(), is(true));
}
/**
* @see DATAMONGO-1218
*/
@Test
@SuppressWarnings("deprecation")
public void getDBDAuthenticateViaAuthDbWhenCalledWithMongoInstance() {
assumeThat(MongoClientVersion.isMongo3Driver(), is(false));
when(dbMock.getName()).thenReturn("db");
try {
MongoDbUtils.getDB(mongo, "db", new UserCredentials("shallan", "davar"), "authdb");
} catch (CannotGetMongoDbConnectionException e) {
// need to catch that one since we cannot answer the reflective call sufficiently
}
verify(mongo, times(1)).getDB("authdb");
}
/**
* @see DATAMONGO-1218
*/
@Test
@SuppressWarnings("deprecation")
public void getDBDShouldSkipAuthenticationViaAuthDbWhenCalledWithMongoClientInstance() {
MongoDbUtils.getDB(mongoClientMock, "db", new UserCredentials("dalinar", "kholin"), "authdb");
verify(mongoClientMock, never()).getDB("authdb");
}
/**
* Simulate transaction rollback/commit completion protocol on managed transaction synchronizations which will unbind
* managed transaction resources. Does not swallow exceptions for testing purposes.

View File

@@ -73,8 +73,10 @@ import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.util.CloseableIterator;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
@@ -187,9 +189,14 @@ public class MongoTemplateTests {
template.dropCollection(DocumentWithCollection.class);
template.dropCollection(DocumentWithCollectionOfSimpleType.class);
template.dropCollection(DocumentWithMultipleCollections.class);
template.dropCollection(DocumentWithNestedCollection.class);
template.dropCollection(DocumentWithEmbeddedDocumentWithCollection.class);
template.dropCollection(DocumentWithNestedList.class);
template.dropCollection(DocumentWithDBRefCollection.class);
template.dropCollection(SomeContent.class);
template.dropCollection(SomeTemplate.class);
template.dropCollection(Address.class);
template.dropCollection(DocumentWithCollectionOfSamples.class);
}
@Test
@@ -2204,6 +2211,243 @@ public class MongoTemplateTests {
assertThat(retrieved.model.value(), equalTo("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldRetainTypeInformationWithinUpdatedTypeOnDocumentWithNestedCollectionWhenWholeCollectionIsReplaced() {
DocumentWithNestedCollection doc = new DocumentWithNestedCollection();
Map<String, Model> entry = new HashMap<String, Model>();
entry.put("key1", new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
entry.put("key2", new ModelA("value2"));
Query query = query(where("id").is(doc.id));
Update update = Update.update("models", Collections.singletonList(entry));
assertThat(template.findOne(query, DocumentWithNestedCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithNestedCollection.class);
DocumentWithNestedCollection retrieved = template.findOne(query, DocumentWithNestedCollection.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0).entrySet(), hasSize(2));
assertThat(retrieved.models.get(0).get("key1"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key1").value(), equalTo("value1"));
assertThat(retrieved.models.get(0).get("key2"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key2").value(), equalTo("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldRetainTypeInformationWithinUpdatedTypeOnDocumentWithNestedCollectionWhenFirstElementIsReplaced() {
DocumentWithNestedCollection doc = new DocumentWithNestedCollection();
Map<String, Model> entry = new HashMap<String, Model>();
entry.put("key1", new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
entry.put("key2", new ModelA("value2"));
Query query = query(where("id").is(doc.id));
Update update = Update.update("models.0", entry);
assertThat(template.findOne(query, DocumentWithNestedCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithNestedCollection.class);
DocumentWithNestedCollection retrieved = template.findOne(query, DocumentWithNestedCollection.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0).entrySet(), hasSize(2));
assertThat(retrieved.models.get(0).get("key1"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key1").value(), equalTo("value1"));
assertThat(retrieved.models.get(0).get("key2"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key2").value(), equalTo("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationOnDocumentWithNestedCollectionObjectInsertedAtSecondIndex() {
DocumentWithNestedCollection doc = new DocumentWithNestedCollection();
Map<String, Model> entry = new HashMap<String, Model>();
entry.put("key1", new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("models.1", Collections.singletonMap("key2", new ModelA("value2")));
assertThat(template.findOne(query, DocumentWithNestedCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithNestedCollection.class);
DocumentWithNestedCollection retrieved = template.findOne(query, DocumentWithNestedCollection.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0).entrySet(), hasSize(1));
assertThat(retrieved.models.get(1).entrySet(), hasSize(1));
assertThat(retrieved.models.get(0).get("key1"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get("key1").value(), equalTo("value1"));
assertThat(retrieved.models.get(1).get("key2"), instanceOf(ModelA.class));
assertThat(retrieved.models.get(1).get("key2").value(), equalTo("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldRetainTypeInformationWithinUpdatedTypeOnEmbeddedDocumentWithCollectionWhenUpdatingPositionedElement()
throws Exception {
List<Model> models = new ArrayList<Model>();
models.add(new ModelA("value1"));
DocumentWithEmbeddedDocumentWithCollection doc = new DocumentWithEmbeddedDocumentWithCollection(
new DocumentWithCollection(models));
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("embeddedDocument.models.0", new ModelA("value2"));
assertThat(template.findOne(query, DocumentWithEmbeddedDocumentWithCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithEmbeddedDocumentWithCollection.class);
DocumentWithEmbeddedDocumentWithCollection retrieved = template.findOne(query,
DocumentWithEmbeddedDocumentWithCollection.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.embeddedDocument.models, hasSize(1));
assertThat(retrieved.embeddedDocument.models.get(0).value(), is("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationWithinUpdatedTypeOnEmbeddedDocumentWithCollectionWhenUpdatingSecondElement()
throws Exception {
List<Model> models = new ArrayList<Model>();
models.add(new ModelA("value1"));
DocumentWithEmbeddedDocumentWithCollection doc = new DocumentWithEmbeddedDocumentWithCollection(
new DocumentWithCollection(models));
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("embeddedDocument.models.1", new ModelA("value2"));
assertThat(template.findOne(query, DocumentWithEmbeddedDocumentWithCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithEmbeddedDocumentWithCollection.class);
DocumentWithEmbeddedDocumentWithCollection retrieved = template.findOne(query,
DocumentWithEmbeddedDocumentWithCollection.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.embeddedDocument.models, hasSize(2));
assertThat(retrieved.embeddedDocument.models.get(0).value(), is("value1"));
assertThat(retrieved.embeddedDocument.models.get(1).value(), is("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationWithinUpdatedTypeOnEmbeddedDocumentWithCollectionWhenRewriting()
throws Exception {
List<Model> models = Arrays.<Model> asList(new ModelA("value1"));
DocumentWithEmbeddedDocumentWithCollection doc = new DocumentWithEmbeddedDocumentWithCollection(
new DocumentWithCollection(models));
template.save(doc);
Query query = query(where("id").is(doc.id));
Update update = Update.update("embeddedDocument",
new DocumentWithCollection(Arrays.<Model> asList(new ModelA("value2"))));
assertThat(template.findOne(query, DocumentWithEmbeddedDocumentWithCollection.class), notNullValue());
template.findAndModify(query, update, DocumentWithEmbeddedDocumentWithCollection.class);
DocumentWithEmbeddedDocumentWithCollection retrieved = template.findOne(query,
DocumentWithEmbeddedDocumentWithCollection.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.embeddedDocument.models, hasSize(1));
assertThat(retrieved.embeddedDocument.models.get(0).value(), is("value2"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyShouldAddTypeInformationWithinUpdatedTypeOnDocumentWithNestedLists() {
DocumentWithNestedList doc = new DocumentWithNestedList();
List<Model> entry = new ArrayList<Model>();
entry.add(new ModelA("value1"));
doc.models.add(entry);
template.save(doc);
Query query = query(where("id").is(doc.id));
assertThat(template.findOne(query, DocumentWithNestedList.class), notNullValue());
Update update = Update.update("models.0.1", new ModelA("value2"));
template.findAndModify(query, update, DocumentWithNestedList.class);
DocumentWithNestedList retrieved = template.findOne(query, DocumentWithNestedList.class);
assertThat(retrieved, is(notNullValue()));
assertThat(retrieved.id, is(doc.id));
assertThat(retrieved.models.get(0), hasSize(2));
assertThat(retrieved.models.get(0).get(0), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get(0).value(), equalTo("value1"));
assertThat(retrieved.models.get(0).get(1), instanceOf(ModelA.class));
assertThat(retrieved.models.get(0).get(1).value(), equalTo("value2"));
}
/**
* @see DATAMONGO-407
*/
@@ -2609,6 +2853,33 @@ public class MongoTemplateTests {
assertThat(template.findOne(query, DocumentWithCollectionOfSimpleType.class).values, hasSize(3));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void findAndModifyAddToSetWithEachShouldNotAddDuplicatesNorTypeHintForSimpleDocuments() {
DocumentWithCollectionOfSamples doc = new DocumentWithCollectionOfSamples();
doc.samples = Arrays.asList(new Sample(null, "sample1"));
template.save(doc);
Query query = query(where("id").is(doc.id));
assertThat(template.findOne(query, DocumentWithCollectionOfSamples.class), notNullValue());
Update update = new Update().addToSet("samples").each(new Sample(null, "sample2"), new Sample(null, "sample1"));
template.findAndModify(query, update, DocumentWithCollectionOfSamples.class);
DocumentWithCollectionOfSamples retrieved = template.findOne(query, DocumentWithCollectionOfSamples.class);
assertThat(retrieved, notNullValue());
assertThat(retrieved.samples, hasSize(2));
assertThat(retrieved.samples.get(0).field, is("sample1"));
assertThat(retrieved.samples.get(1).field, is("sample2"));
}
/**
* @see DATAMONGO-888
*/
@@ -2723,6 +2994,61 @@ public class MongoTemplateTests {
assertThat(template.findAll(DBObject.class, "collection"), hasSize(0));
}
/**
* @see DATAMONGO-1207
*/
@Test
public void ignoresNullElementsForInsertAll() {
Address newYork = new Address("NY", "New York");
Address washington = new Address("DC", "Washington");
template.insertAll(Arrays.asList(newYork, null, washington));
List<Address> result = template.findAll(Address.class);
assertThat(result, hasSize(2));
assertThat(result, hasItems(newYork, washington));
}
/**
* @see DATAMONGO-1208
*/
@Test
public void takesSortIntoAccountWhenStreaming() {
Person youngestPerson = new Person("John", 20);
Person oldestPerson = new Person("Jane", 42);
template.insertAll(Arrays.asList(oldestPerson, youngestPerson));
Query q = new Query();
q.with(new Sort(Direction.ASC, "age"));
CloseableIterator<Person> stream = template.stream(q, Person.class);
assertThat(stream.next().getAge(), is(youngestPerson.getAge()));
assertThat(stream.next().getAge(), is(oldestPerson.getAge()));
}
/**
* @see DATAMONGO-1208
*/
@Test
public void takesLimitIntoAccountWhenStreaming() {
Person youngestPerson = new Person("John", 20);
Person oldestPerson = new Person("Jane", 42);
template.insertAll(Arrays.asList(oldestPerson, youngestPerson));
Query q = new Query();
q.with(new PageRequest(0, 1, new Sort(Direction.ASC, "age")));
CloseableIterator<Person> stream = template.stream(q, Person.class);
assertThat(stream.next().getAge(), is(youngestPerson.getAge()));
assertThat(stream.hasNext(), is(false));
}
static class DoucmentWithNamedIdField {
@Id String someIdKey;
@@ -2798,12 +3124,36 @@ public class MongoTemplateTests {
List<String> values;
}
static class DocumentWithCollectionOfSamples {
@Id String id;
List<Sample> samples;
}
static class DocumentWithMultipleCollections {
@Id String id;
List<String> string1;
List<String> string2;
}
static class DocumentWithNestedCollection {
@Id String id;
List<Map<String, Model>> models = new ArrayList<Map<String, Model>>();
}
static class DocumentWithNestedList {
@Id String id;
List<List<Model>> models = new ArrayList<List<Model>>();
}
static class DocumentWithEmbeddedDocumentWithCollection {
@Id String id;
DocumentWithCollection embeddedDocument;
DocumentWithEmbeddedDocumentWithCollection(DocumentWithCollection embeddedDocument) {
this.embeddedDocument = embeddedDocument;
}
}
static interface Model {
String value();
@@ -2909,6 +3259,41 @@ public class MongoTemplateTests {
String state;
String city;
Address() {}
Address(String state, String city) {
this.state = state;
this.city = city;
}
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (!(obj instanceof Address)) {
return false;
}
Address that = (Address) obj;
return ObjectUtils.nullSafeEquals(this.city, that.city) && //
ObjectUtils.nullSafeEquals(this.state, that.state);
}
@Override
public int hashCode() {
int result = 17;
result += 31 * ObjectUtils.nullSafeHashCode(this.city);
result += 31 * ObjectUtils.nullSafeHashCode(this.state);
return result;
}
}
static class VersionedPerson {

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,8 +17,11 @@ package org.springframework.data.mongodb.core.aggregation;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.aggregation.AggregationFunctionExpressions.*;
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
import java.util.Arrays;
import org.junit.Test;
import org.springframework.data.mongodb.core.DBObjectTestUtils;
@@ -29,6 +32,7 @@ import com.mongodb.DBObject;
* Unit tests for {@link GroupOperation}.
*
* @author Oliver Gierke
* @author Thomas Darimont
*/
public class GroupOperationUnitTests {
@@ -183,6 +187,23 @@ public class GroupOperationUnitTests {
assertThat(push, is((DBObject) new BasicDBObject("$addToSet", 42)));
}
/**
* @see DATAMONGO-979
*/
@Test
public void shouldRenderSizeExpressionInGroup() {
GroupOperation groupOperation = Aggregation //
.group("username") //
.first(SIZE.of(field("tags"))) //
.as("tags_count");
DBObject groupClause = extractDbObjectFromGroupOperation(groupOperation);
DBObject tagsCount = DBObjectTestUtils.getAsDBObject(groupClause, "tags_count");
assertThat(tagsCount.get("$first"), is((Object) new BasicDBObject("$size", Arrays.asList("$tags"))));
}
private DBObject extractDbObjectFromGroupOperation(GroupOperation groupOperation) {
DBObject dbObject = groupOperation.toDBObject(Aggregation.DEFAULT_CONTEXT);
DBObject groupClause = DBObjectTestUtils.getAsDBObject(dbObject, "$group");

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013 the original author or authors.
* Copyright 2013-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,6 +17,8 @@ package org.springframework.data.mongodb.core.aggregation;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.aggregation.AggregationFunctionExpressions.*;
import static org.springframework.data.mongodb.core.aggregation.Fields.*;
import static org.springframework.data.mongodb.util.DBObjectUtils.*;
import java.util.Arrays;
@@ -334,6 +336,41 @@ public class ProjectionOperationUnitTests {
"$date", 86400000))))));
}
/**
* @see DATAMONGO-979
*/
@Test
public void shouldRenderSizeExpressionInProjection() {
ProjectionOperation operation = Aggregation //
.project() //
.and("tags") //
.size()//
.as("tags_count");
DBObject dbObject = operation.toDBObject(Aggregation.DEFAULT_CONTEXT);
DBObject projected = exctractOperation("$project", dbObject);
assertThat(projected.get("tags_count"), is((Object) new BasicDBObject("$size", Arrays.asList("$tags"))));
}
/**
* @see DATAMONGO-979
*/
@Test
public void shouldRenderGenericSizeExpressionInProjection() {
ProjectionOperation operation = Aggregation //
.project() //
.and(SIZE.of(field("tags"))) //
.as("tags_count");
DBObject dbObject = operation.toDBObject(Aggregation.DEFAULT_CONTEXT);
DBObject projected = exctractOperation("$project", dbObject);
assertThat(projected.get("tags_count"), is((Object) new BasicDBObject("$size", Arrays.asList("$tags"))));
}
private static DBObject exctractOperation(String field, DBObject fromProjectClause) {
return (DBObject) fromProjectClause.get(field);
}

View File

@@ -775,6 +775,21 @@ public class QueryMapperUnitTests {
assertThat(dbo, isBsonObject().containing("geoJsonPoint.$geoWithin.$geometry.type", "Polygon"));
}
/**
* @see DATAMONGO-1134
*/
@Test
public void intersectsShouldUseGeoJsonRepresentationCorrectly() {
Query query = query(where("geoJsonPoint").intersects(
new GeoJsonPolygon(new Point(0, 0), new Point(100, 100), new Point(100, 0), new Point(0, 0))));
DBObject dbo = mapper.getMappedObject(query.getQueryObject(), context.getPersistentEntity(ClassWithGeoTypes.class));
assertThat(dbo, isBsonObject().containing("geoJsonPoint.$geoIntersects.$geometry.type", "Polygon"));
assertThat(dbo, isBsonObject().containing("geoJsonPoint.$geoIntersects.$geometry.coordinates"));
}
@Document
public class Foo {
@Id private ObjectId id;

View File

@@ -0,0 +1,97 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.core.Is.*;
import static org.hamcrest.core.IsNull.*;
import static org.junit.Assert.*;
import static org.junit.Assume.*;
import static org.mockito.Matchers.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.util.MongoClientVersion.*;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.data.mongodb.MongoDbFactory;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBRef;
/**
* Unit tests for {@link ReflectiveDBRefResolver}.
*
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class ReflectiveDBRefResolverUnitTests {
@Mock MongoDbFactory dbFactoryMock;
@Mock DBRef dbRefMock;
@Mock DB dbMock;
@Mock DBCollection collectionMock;
@Before
public void setUp() {
when(dbRefMock.getCollectionName()).thenReturn("collection-1");
when(dbRefMock.getId()).thenReturn("id-1");
when(dbFactoryMock.getDb()).thenReturn(dbMock);
when(dbMock.getCollection(eq("collection-1"))).thenReturn(collectionMock);
when(collectionMock.findOne(eq("id-1"))).thenReturn(new BasicDBObject("_id", "id-1"));
}
/**
* @see DATAMONGO-1193
*/
@Test
public void fetchShouldNotLookUpDbWhenUsingDriverVersion2() {
assumeThat(isMongo3Driver(), is(false));
ReflectiveDBRefResolver.fetch(dbFactoryMock, dbRefMock);
verify(dbFactoryMock, never()).getDb();
verify(dbFactoryMock, never()).getDb(anyString());
}
/**
* @see DATAMONGO-1193
*/
@Test
public void fetchShouldUseDbToResolveDbRefWhenUsingDriverVersion3() {
assumeThat(isMongo3Driver(), is(true));
assertThat(ReflectiveDBRefResolver.fetch(dbFactoryMock, dbRefMock), notNullValue());
verify(dbFactoryMock, times(1)).getDb();
}
/**
* @see DATAMONGO-1193
*/
@Test(expected = IllegalArgumentException.class)
public void fetchShouldThrowExceptionWhenDbFactoryIsNullUsingDriverVersion3() {
assumeThat(isMongo3Driver(), is(true));
ReflectiveDBRefResolver.fetch(null, dbRefMock);
}
}

View File

@@ -20,8 +20,10 @@ import static org.hamcrest.collection.IsMapContaining.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import static org.springframework.data.mongodb.core.DBObjectTestUtils.*;
import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.hamcrest.Matcher;
@@ -428,8 +430,8 @@ public class UpdateMapperUnitTests {
public void rendersNestedDbRefCorrectly() {
Update update = new Update().pull("nested.dbRefAnnotatedList.id", "2");
DBObject mappedObject = mapper
.getMappedObject(update.getUpdateObject(), context.getPersistentEntity(Wrapper.class));
DBObject mappedObject = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(Wrapper.class));
DBObject pullClause = getAsDBObject(mappedObject, "$pull");
assertThat(pullClause.containsField("mapped.dbRefAnnotatedList"), is(true));
@@ -524,7 +526,6 @@ public class UpdateMapperUnitTests {
assertThat(((DBObject) updateValue).get("_class").toString(),
equalTo("org.springframework.data.mongodb.core.convert.UpdateMapperUnitTests$ModelImpl"));
}
}
/**
@@ -595,6 +596,106 @@ public class UpdateMapperUnitTests {
assertThat($unset, equalTo(new BasicDBObjectBuilder().add("dbRefAnnotatedList.$", 1).get()));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void mappingEachOperatorShouldNotAddTypeInfoForNonInterfaceNonAbstractTypes() {
Update update = new Update().addToSet("nestedDocs").each(new NestedDocument("nested-1"),
new NestedDocument("nested-2"));
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DocumentWithNestedCollection.class));
assertThat(mappedUpdate, isBsonObject().notContaining("$addToSet.nestedDocs.$each.[0]._class"));
assertThat(mappedUpdate, isBsonObject().notContaining("$addToSet.nestedDocs.$each.[1]._class"));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void mappingEachOperatorShouldAddTypeHintForInterfaceTypes() {
Update update = new Update().addToSet("models").each(new ModelImpl(1), new ModelImpl(2));
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(ListModelWrapper.class));
assertThat(mappedUpdate, isBsonObject().containing("$addToSet.models.$each.[0]._class", ModelImpl.class.getName()));
assertThat(mappedUpdate, isBsonObject().containing("$addToSet.models.$each.[1]._class", ModelImpl.class.getName()));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void mappingEachOperatorShouldAddTypeHintForAbstractTypes() {
Update update = new Update().addToSet("list").each(new ConcreteChildClass("foo", "one"),
new ConcreteChildClass("bar", "two"));
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(ParentClass.class));
assertThat(mappedUpdate,
isBsonObject().containing("$addToSet.aliased.$each.[0]._class", ConcreteChildClass.class.getName()));
assertThat(mappedUpdate,
isBsonObject().containing("$addToSet.aliased.$each.[1]._class", ConcreteChildClass.class.getName()));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void mappingShouldOnlyRemoveTypeHintFromTopLevelTypeInCaseOfNestedDocument() {
WrapperAroundInterfaceType wait = new WrapperAroundInterfaceType();
wait.interfaceType = new ModelImpl(1);
Update update = new Update().addToSet("listHoldingConcretyTypeWithInterfaceTypeAttribute").each(wait);
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DomainTypeWithListOfConcreteTypesHavingSingleInterfaceTypeAttribute.class));
assertThat(mappedUpdate,
isBsonObject().notContaining("$addToSet.listHoldingConcretyTypeWithInterfaceTypeAttribute.$each.[0]._class"));
assertThat(mappedUpdate,
isBsonObject().containing(
"$addToSet.listHoldingConcretyTypeWithInterfaceTypeAttribute.$each.[0].interfaceType._class",
ModelImpl.class.getName()));
}
/**
* @see DATAMONGO-1210
*/
@Test
public void mappingShouldRetainTypeInformationOfNestedListWhenUpdatingConcreteyParentType() {
ListModelWrapper lmw = new ListModelWrapper();
lmw.models = Collections.<Model> singletonList(new ModelImpl(1));
Update update = new Update().set("concreteTypeWithListAttributeOfInterfaceType", lmw);
DBObject mappedUpdate = mapper.getMappedObject(update.getUpdateObject(),
context.getPersistentEntity(DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes.class));
assertThat(mappedUpdate, isBsonObject().notContaining("$set.concreteTypeWithListAttributeOfInterfaceType._class"));
assertThat(mappedUpdate, isBsonObject()
.containing("$set.concreteTypeWithListAttributeOfInterfaceType.models.[0]._class", ModelImpl.class.getName()));
}
static class DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes {
ListModelWrapper concreteTypeWithListAttributeOfInterfaceType;
}
static class DomainTypeWithListOfConcreteTypesHavingSingleInterfaceTypeAttribute {
List<WrapperAroundInterfaceType> listHoldingConcretyTypeWithInterfaceTypeAttribute;
}
static class WrapperAroundInterfaceType {
Model interfaceType;
}
@org.springframework.data.mongodb.core.mapping.Document(collection = "DocumentWithReferenceToInterface")
static interface DocumentWithReferenceToInterface {
@@ -631,7 +732,7 @@ public class UpdateMapperUnitTests {
private @Id String id;
@org.springframework.data.mongodb.core.mapping.DBRef//
@org.springframework.data.mongodb.core.mapping.DBRef //
private InterfaceDocumentDefinitionWithoutId referencedDocument;
public String getId() {
@@ -692,10 +793,10 @@ public class UpdateMapperUnitTests {
String id;
@Field("aliased")//
@Field("aliased") //
List<? extends AbstractChildClass> list;
@Field//
@Field //
List<Model> listOfInterface;
public ParentClass(String id, List<? extends AbstractChildClass> list) {
@@ -728,6 +829,10 @@ public class UpdateMapperUnitTests {
static class DomainEntity {
List<NestedEntity> collectionOfNestedEntities;
public List<NestedEntity> getCollectionOfNestedEntities() {
return collectionOfNestedEntities;
}
}
static class NestedEntity {
@@ -753,10 +858,10 @@ public class UpdateMapperUnitTests {
@Id public String id;
@org.springframework.data.mongodb.core.mapping.DBRef//
@org.springframework.data.mongodb.core.mapping.DBRef //
public List<Entity> dbRefAnnotatedList;
@org.springframework.data.mongodb.core.mapping.DBRef//
@org.springframework.data.mongodb.core.mapping.DBRef //
public Entity dbRefProperty;
}
@@ -770,4 +875,18 @@ public class UpdateMapperUnitTests {
@Field("mapped") DocumentWithDBRefCollection nested;
}
static class DocumentWithNestedCollection {
List<NestedDocument> nestedDocs;
}
static class NestedDocument {
String name;
public NestedDocument(String name) {
super();
this.name = name;
}
}
}

View File

@@ -0,0 +1,137 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import java.io.IOException;
import java.util.Arrays;
import org.junit.Before;
import org.junit.Test;
import org.springframework.data.geo.Point;
import com.fasterxml.jackson.core.JsonParseException;
import com.fasterxml.jackson.databind.JsonMappingException;
import com.fasterxml.jackson.databind.ObjectMapper;
/**
* @author Christoph Strobl
*/
public class GeoJsonModuleUnitTests {
ObjectMapper mapper;
@Before
public void setUp() {
mapper = new ObjectMapper();
mapper.registerModule(new GeoJsonModule());
}
/**
* @see DATAMONGO-1181
*/
@Test
public void shouldDeserializeJsonPointCorrectly() throws JsonParseException, JsonMappingException, IOException {
String json = "{ \"type\": \"Point\", \"coordinates\": [10.0, 20.0] }";
assertThat(mapper.readValue(json, GeoJsonPoint.class), is(new GeoJsonPoint(10D, 20D)));
}
/**
* @see DATAMONGO-1181
*/
@Test
public void shouldDeserializeGeoJsonLineStringCorrectly() throws JsonParseException, JsonMappingException,
IOException {
String json = "{ \"type\": \"LineString\", \"coordinates\": [ [10.0, 20.0], [30.0, 40.0], [50.0, 60.0] ]}";
assertThat(mapper.readValue(json, GeoJsonLineString.class),
is(new GeoJsonLineString(Arrays.asList(new Point(10, 20), new Point(30, 40), new Point(50, 60)))));
}
/**
* @see DATAMONGO-1181
*/
@Test
public void shouldDeserializeGeoJsonMultiPointCorrectly() throws JsonParseException, JsonMappingException,
IOException {
String json = "{ \"type\": \"MultiPoint\", \"coordinates\": [ [10.0, 20.0], [30.0, 40.0], [50.0, 60.0] ]}";
assertThat(mapper.readValue(json, GeoJsonLineString.class),
is(new GeoJsonMultiPoint(Arrays.asList(new Point(10, 20), new Point(30, 40), new Point(50, 60)))));
}
/**
* @see DATAMONGO-1181
*/
@Test
@SuppressWarnings("unchecked")
public void shouldDeserializeGeoJsonMultiLineStringCorrectly() throws JsonParseException, JsonMappingException,
IOException {
String json = "{ \"type\": \"MultiLineString\", \"coordinates\": [ [ [10.0, 20.0], [30.0, 40.0] ], [ [50.0, 60.0] , [70.0, 80.0] ] ]}";
assertThat(
mapper.readValue(json, GeoJsonMultiLineString.class),
is(new GeoJsonMultiLineString(Arrays.asList(new Point(10, 20), new Point(30, 40)), Arrays.asList(new Point(50,
60), new Point(70, 80)))));
}
/**
* @see DATAMONGO-1181
*/
@Test
public void shouldDeserializeGeoJsonPolygonCorrectly() throws JsonParseException, JsonMappingException, IOException {
String json = "{ \"type\": \"Polygon\", \"coordinates\": [ [ [100.0, 0.0], [101.0, 0.0], [101.0, 1.0], [100.0, 1.0], [100.0, 0.0] ] ]}";
assertThat(
mapper.readValue(json, GeoJsonPolygon.class),
is(new GeoJsonPolygon(Arrays.asList(new Point(100, 0), new Point(101, 0), new Point(101, 1), new Point(100, 1),
new Point(100, 0)))));
}
/**
* @see DATAMONGO-1181
*/
@Test
public void shouldDeserializeGeoJsonMultiPolygonCorrectly() throws JsonParseException, JsonMappingException,
IOException {
String json = "{ \"type\": \"Polygon\", \"coordinates\": ["
+ "[[[102.0, 2.0], [103.0, 2.0], [103.0, 3.0], [102.0, 3.0], [102.0, 2.0]]],"
+ "[[[100.0, 0.0], [101.0, 0.0], [101.0, 1.0], [100.0, 1.0], [100.0, 0.0]],"
+ "[[100.2, 0.2], [100.8, 0.2], [100.8, 0.8], [100.2, 0.8], [100.2, 0.2]]]"//
+ "]}";
assertThat(
mapper.readValue(json, GeoJsonMultiPolygon.class),
is(new GeoJsonMultiPolygon(Arrays.asList(
new GeoJsonPolygon(Arrays.asList(new Point(102, 2), new Point(103, 2), new Point(103, 3),
new Point(102, 3), new Point(102, 2))),
new GeoJsonPolygon(Arrays.asList(new Point(100, 0), new Point(101, 0), new Point(101, 1),
new Point(100, 1), new Point(100, 0))),
new GeoJsonPolygon(Arrays.asList(new Point(100.2, 0.2), new Point(100.8, 0.2), new Point(100.8, 0.8),
new Point(100.2, 0.8), new Point(100.2, 0.2)))))));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2012-2014 the original author or authors.
* Copyright 2012-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -18,16 +18,20 @@ package org.springframework.data.mongodb.core.index;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.util.Arrays;
import java.util.List;
import org.hamcrest.Matchers;
import org.junit.After;
import org.junit.ClassRule;
import org.junit.Test;
import org.junit.rules.RuleChain;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.test.util.CleanMongoDB;
import org.springframework.data.mongodb.test.util.MongoVersionRule;
import org.springframework.data.util.Version;
import org.springframework.test.context.ContextConfiguration;
@@ -38,23 +42,22 @@ import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class MongoPersistentEntityIndexCreatorIntegrationTests {
public static @ClassRule MongoVersionRule version = MongoVersionRule.atLeast(new Version(2, 6));
static final String SAMPLE_TYPE_COLLECTION_NAME = "sampleEntity";
static final String RECURSIVE_TYPE_COLLECTION_NAME = "recursiveGenericTypes";
public static @ClassRule RuleChain rules = RuleChain.outerRule(MongoVersionRule.atLeast(new Version(2, 6))).around(
CleanMongoDB.indexes(Arrays.asList(SAMPLE_TYPE_COLLECTION_NAME, RECURSIVE_TYPE_COLLECTION_NAME)));
@Autowired @Qualifier("mongo1") MongoOperations templateOne;
@Autowired @Qualifier("mongo2") MongoOperations templateTwo;
@After
public void cleanUp() {
templateOne.dropCollection(SampleEntity.class);
templateTwo.dropCollection(SampleEntity.class);
}
@Test
public void createsIndexForConfiguredMappingContextOnly() {
@@ -62,7 +65,42 @@ public class MongoPersistentEntityIndexCreatorIntegrationTests {
assertThat(indexInfo, hasSize(greaterThan(0)));
assertThat(indexInfo, Matchers.<IndexInfo> hasItem(hasProperty("name", is("prop"))));
indexInfo = templateTwo.indexOps("sampleEntity").getIndexInfo();
indexInfo = templateTwo.indexOps(SAMPLE_TYPE_COLLECTION_NAME).getIndexInfo();
assertThat(indexInfo, hasSize(0));
}
/**
* @see DATAMONGO-1202
*/
@Test
public void shouldHonorIndexedPropertiesWithRecursiveMappings() {
List<IndexInfo> indexInfo = templateOne.indexOps(RecursiveConcreteType.class).getIndexInfo();
assertThat(indexInfo, hasSize(greaterThan(0)));
assertThat(indexInfo, Matchers.<IndexInfo> hasItem(hasProperty("name", is("firstName"))));
}
@Document(collection = RECURSIVE_TYPE_COLLECTION_NAME)
static abstract class RecursiveGenericType<RGT extends RecursiveGenericType<RGT>> {
@Id Long id;
@org.springframework.data.mongodb.core.mapping.DBRef RGT referrer;
@Indexed String firstName;
public RecursiveGenericType(Long id, String firstName, RGT referrer) {
this.firstName = firstName;
this.id = id;
this.referrer = referrer;
}
}
static class RecursiveConcreteType extends RecursiveGenericType<RecursiveConcreteType> {
public RecursiveConcreteType(Long id, String firstName, RecursiveConcreteType referrer) {
super(id, firstName, referrer);
}
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011-2015 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,17 +16,23 @@
package org.springframework.data.mongodb.core.mapping.event;
import java.util.ArrayList;
import java.util.List;
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.data.mongodb.core.mapping.PersonPojoStringId;
public class PersonBeforeSaveListener implements ApplicationListener<BeforeSaveEvent<PersonPojoStringId>> {
import com.mongodb.DBObject;
public final ArrayList<ApplicationEvent> seenEvents = new ArrayList<ApplicationEvent>();
public class PersonBeforeSaveListener extends AbstractMongoEventListener<PersonPojoStringId> {
public void onApplicationEvent(BeforeSaveEvent<PersonPojoStringId> event) {
this.seenEvents.add(event);
public final List<ApplicationEvent> seenEvents = new ArrayList<ApplicationEvent>();
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener#onBeforeSave(java.lang.Object, com.mongodb.DBObject)
*/
@Override
public void onBeforeSave(PersonPojoStringId source, DBObject dbo) {
seenEvents.add(new BeforeSaveEvent<PersonPojoStringId>(source, dbo));
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2010-2014 the original author or authors.
* Copyright 2010-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -17,12 +17,13 @@ package org.springframework.data.mongodb.core.query;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import org.junit.Test;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.core.geo.GeoJsonLineString;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.test.util.IsBsonObject;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
@@ -176,7 +177,7 @@ public class CriteriaTests {
DBObject dbo = new Criteria("foo").near(new GeoJsonPoint(100, 200)).getCriteriaObject();
assertThat(dbo, IsBsonObject.isBsonObject().containing("foo.$near.$geometry", new GeoJsonPoint(100, 200)));
assertThat(dbo, isBsonObject().containing("foo.$near.$geometry", new GeoJsonPoint(100, 200)));
}
/**
@@ -187,7 +188,7 @@ public class CriteriaTests {
DBObject dbo = new Criteria("foo").near(new Point(100, 200)).getCriteriaObject();
assertThat(dbo, IsBsonObject.isBsonObject().notContaining("foo.$near.$geometry"));
assertThat(dbo, isBsonObject().notContaining("foo.$near.$geometry"));
}
/**
@@ -198,7 +199,7 @@ public class CriteriaTests {
DBObject dbo = new Criteria("foo").near(new GeoJsonPoint(100, 200)).maxDistance(50D).getCriteriaObject();
assertThat(dbo, IsBsonObject.isBsonObject().containing("foo.$near.$maxDistance", 50D));
assertThat(dbo, isBsonObject().containing("foo.$near.$maxDistance", 50D));
}
/**
@@ -209,7 +210,7 @@ public class CriteriaTests {
DBObject dbo = new Criteria("foo").nearSphere(new GeoJsonPoint(100, 200)).maxDistance(50D).getCriteriaObject();
assertThat(dbo, IsBsonObject.isBsonObject().containing("foo.$nearSphere.$maxDistance", 50D));
assertThat(dbo, isBsonObject().containing("foo.$nearSphere.$maxDistance", 50D));
}
/**
@@ -220,7 +221,7 @@ public class CriteriaTests {
DBObject dbo = new Criteria("foo").near(new GeoJsonPoint(100, 200)).minDistance(50D).getCriteriaObject();
assertThat(dbo, IsBsonObject.isBsonObject().containing("foo.$near.$minDistance", 50D));
assertThat(dbo, isBsonObject().containing("foo.$near.$minDistance", 50D));
}
/**
@@ -231,7 +232,7 @@ public class CriteriaTests {
DBObject dbo = new Criteria("foo").nearSphere(new GeoJsonPoint(100, 200)).minDistance(50D).getCriteriaObject();
assertThat(dbo, IsBsonObject.isBsonObject().containing("foo.$nearSphere.$minDistance", 50D));
assertThat(dbo, isBsonObject().containing("foo.$nearSphere.$minDistance", 50D));
}
/**
@@ -243,8 +244,27 @@ public class CriteriaTests {
DBObject dbo = new Criteria("foo").nearSphere(new GeoJsonPoint(100, 200)).minDistance(50D).maxDistance(100D)
.getCriteriaObject();
assertThat(dbo, IsBsonObject.isBsonObject().containing("foo.$nearSphere.$minDistance", 50D));
assertThat(dbo, IsBsonObject.isBsonObject().containing("foo.$nearSphere.$maxDistance", 100D));
assertThat(dbo, isBsonObject().containing("foo.$nearSphere.$minDistance", 50D));
assertThat(dbo, isBsonObject().containing("foo.$nearSphere.$maxDistance", 100D));
}
/**
* @see DATAMONGO-1134
*/
@Test(expected = IllegalArgumentException.class)
public void intersectsShouldThrowExceptionWhenCalledWihtNullValue() {
new Criteria("foo").intersects(null);
}
/**
* @see DATAMONGO-1134
*/
@Test
public void intersectsShouldWrapGeoJsonTypeInGeometryCorrectly() {
GeoJsonLineString lineString = new GeoJsonLineString(new Point(0, 0), new Point(10, 10));
DBObject dbo = new Criteria("foo").intersects(lineString).getCriteriaObject();
assertThat(dbo, isBsonObject().containing("foo.$geoIntersects.$geometry", lineString));
}
}

View File

@@ -52,6 +52,7 @@ import org.springframework.data.geo.Polygon;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.repository.Person.Sex;
import org.springframework.data.mongodb.repository.SampleEvaluationContextExtension.SampleSecurityContextHolder;
import org.springframework.data.querydsl.QSort;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
@@ -174,8 +175,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
@Test
public void executesPagedFinderCorrectly() throws Exception {
Page<Person> page = repository.findByLastnameLike("*a*", new PageRequest(0, 2, Direction.ASC, "lastname",
"firstname"));
Page<Person> page = repository.findByLastnameLike("*a*",
new PageRequest(0, 2, Direction.ASC, "lastname", "firstname"));
assertThat(page.isFirst(), is(true));
assertThat(page.isLast(), is(false));
assertThat(page.getNumberOfElements(), is(2));
@@ -185,8 +186,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
@Test
public void executesPagedFinderWithAnnotatedQueryCorrectly() throws Exception {
Page<Person> page = repository.findByLastnameLikeWithPageable(".*a.*", new PageRequest(0, 2, Direction.ASC,
"lastname", "firstname"));
Page<Person> page = repository.findByLastnameLikeWithPageable(".*a.*",
new PageRequest(0, 2, Direction.ASC, "lastname", "firstname"));
assertThat(page.isFirst(), is(true));
assertThat(page.isLast(), is(false));
assertThat(page.getNumberOfElements(), is(2));
@@ -310,8 +311,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
@Test
public void findsPagedPeopleByPredicate() throws Exception {
Page<Person> page = repository.findAll(person.lastname.contains("a"), new PageRequest(0, 2, Direction.ASC,
"lastname"));
Page<Person> page = repository.findAll(person.lastname.contains("a"),
new PageRequest(0, 2, Direction.ASC, "lastname"));
assertThat(page.isFirst(), is(true));
assertThat(page.isLast(), is(false));
assertThat(page.getNumberOfElements(), is(2));
@@ -397,8 +398,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
dave.setLocation(point);
repository.save(dave);
GeoResults<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73), new Distance(2000,
Metrics.KILOMETERS));
GeoResults<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73),
new Distance(2000, Metrics.KILOMETERS));
assertThat(results.getContent().isEmpty(), is(false));
}
@@ -409,8 +410,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
dave.setLocation(point);
repository.save(dave);
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73), new Distance(2000,
Metrics.KILOMETERS), new PageRequest(0, 20));
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73),
new Distance(2000, Metrics.KILOMETERS), new PageRequest(0, 20));
assertThat(results.getContent().isEmpty(), is(false));
// DATAMONGO-607
@@ -620,8 +621,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
repository.save(Arrays.asList(dave, oliver, carter, boyd, leroi));
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73), new Distance(2000,
Metrics.KILOMETERS), new PageRequest(1, 2));
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73),
new Distance(2000, Metrics.KILOMETERS), new PageRequest(1, 2));
assertThat(results.getContent().isEmpty(), is(false));
assertThat(results.getNumberOfElements(), is(2));
@@ -645,8 +646,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
repository.save(Arrays.asList(dave, oliver, carter));
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73), new Distance(2000,
Metrics.KILOMETERS), new PageRequest(1, 2));
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73),
new Distance(2000, Metrics.KILOMETERS), new PageRequest(1, 2));
assertThat(results.getContent().isEmpty(), is(false));
assertThat(results.getNumberOfElements(), is(1));
assertThat(results.isFirst(), is(false));
@@ -664,8 +665,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
dave.setLocation(point);
repository.save(dave);
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73), new Distance(2000,
Metrics.KILOMETERS), new PageRequest(0, 2));
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73),
new Distance(2000, Metrics.KILOMETERS), new PageRequest(0, 2));
assertThat(results.getContent().isEmpty(), is(false));
assertThat(results.getNumberOfElements(), is(1));
@@ -683,8 +684,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
dave.setLocation(new Point(-73.99171, 40.738868));
repository.save(dave);
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73), new Distance(2000,
Metrics.KILOMETERS), new PageRequest(1, 2));
GeoPage<Person> results = repository.findByLocationNear(new Point(-73.99, 40.73),
new Distance(2000, Metrics.KILOMETERS), new PageRequest(1, 2));
assertThat(results.getContent().isEmpty(), is(true));
assertThat(results.getNumberOfElements(), is(0));
@@ -934,8 +935,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
@Test
public void shouldLimitCollectionQueryToMaxResultsWhenPresent() {
repository.save(Arrays.asList(new Person("Bob-1", "Dylan"), new Person("Bob-2", "Dylan"), new Person("Bob-3",
"Dylan"), new Person("Bob-4", "Dylan"), new Person("Bob-5", "Dylan")));
repository.save(Arrays.asList(new Person("Bob-1", "Dylan"), new Person("Bob-2", "Dylan"),
new Person("Bob-3", "Dylan"), new Person("Bob-4", "Dylan"), new Person("Bob-5", "Dylan")));
List<Person> result = repository.findTop3ByLastnameStartingWith("Dylan");
assertThat(result.size(), is(3));
}
@@ -946,8 +947,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
@Test
public void shouldNotLimitPagedQueryWhenPageRequestWithinBounds() {
repository.save(Arrays.asList(new Person("Bob-1", "Dylan"), new Person("Bob-2", "Dylan"), new Person("Bob-3",
"Dylan"), new Person("Bob-4", "Dylan"), new Person("Bob-5", "Dylan")));
repository.save(Arrays.asList(new Person("Bob-1", "Dylan"), new Person("Bob-2", "Dylan"),
new Person("Bob-3", "Dylan"), new Person("Bob-4", "Dylan"), new Person("Bob-5", "Dylan")));
Page<Person> result = repository.findTop3ByLastnameStartingWith("Dylan", new PageRequest(0, 2));
assertThat(result.getContent().size(), is(2));
}
@@ -958,8 +959,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
@Test
public void shouldLimitPagedQueryWhenPageRequestExceedsUpperBoundary() {
repository.save(Arrays.asList(new Person("Bob-1", "Dylan"), new Person("Bob-2", "Dylan"), new Person("Bob-3",
"Dylan"), new Person("Bob-4", "Dylan"), new Person("Bob-5", "Dylan")));
repository.save(Arrays.asList(new Person("Bob-1", "Dylan"), new Person("Bob-2", "Dylan"),
new Person("Bob-3", "Dylan"), new Person("Bob-4", "Dylan"), new Person("Bob-5", "Dylan")));
Page<Person> result = repository.findTop3ByLastnameStartingWith("Dylan", new PageRequest(1, 2));
assertThat(result.getContent().size(), is(1));
}
@@ -970,8 +971,8 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
@Test
public void shouldReturnEmptyWhenPageRequestedPageIsTotallyOutOfScopeForLimit() {
repository.save(Arrays.asList(new Person("Bob-1", "Dylan"), new Person("Bob-2", "Dylan"), new Person("Bob-3",
"Dylan"), new Person("Bob-4", "Dylan"), new Person("Bob-5", "Dylan")));
repository.save(Arrays.asList(new Person("Bob-1", "Dylan"), new Person("Bob-2", "Dylan"),
new Person("Bob-3", "Dylan"), new Person("Bob-4", "Dylan"), new Person("Bob-5", "Dylan")));
Page<Person> result = repository.findTop3ByLastnameStartingWith("Dylan", new PageRequest(2, 2));
assertThat(result.getContent().size(), is(0));
}
@@ -1183,4 +1184,41 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
GeoResults<Person> results = repository.findPersonByLocationNear(new Point(-73.99, 40.73), range);
assertThat(results.getContent().isEmpty(), is(false));
}
/**
* @see DATAMONGO-990
*/
@Test
public void shouldFindByFirstnameForSpELExpressionWithParameterIndexOnly() {
List<Person> users = repository.findWithSpelByFirstnameForSpELExpressionWithParameterIndexOnly("Dave");
assertThat(users, hasSize(1));
assertThat(users.get(0), is(dave));
}
/**
* @see DATAMONGO-990
*/
@Test
public void shouldFindByFirstnameAndCurrentUserWithCustomQuery() {
SampleSecurityContextHolder.getCurrent().setPrincipal(dave);
List<Person> users = repository.findWithSpelByFirstnameAndCurrentUserWithCustomQuery("Dave");
assertThat(users, hasSize(1));
assertThat(users.get(0), is(dave));
}
/**
* @see DATAMONGO-990
*/
@Test
public void shouldFindByFirstnameForSpELExpressionWithParameterVariableOnly() {
List<Person> users = repository.findWithSpelByFirstnameForSpELExpressionWithParameterVariableOnly("Dave");
assertThat(users, hasSize(1));
assertThat(users.get(0), is(dave));
}
}

View File

@@ -34,6 +34,7 @@ import org.springframework.data.geo.Point;
import org.springframework.data.geo.Polygon;
import org.springframework.data.mongodb.repository.Person.Sex;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;
import org.springframework.data.repository.query.Param;
/**
* Sample repository managing {@link Person} entities.
@@ -333,4 +334,22 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
*/
@Query("{ firstname : { $in : ?0 }}")
Stream<Person> findByCustomQueryWithStreamingCursorByFirstnames(List<String> firstnames);
/**
* @see DATAMONGO-990
*/
@Query("{ firstname : ?#{[0]}}")
List<Person> findWithSpelByFirstnameForSpELExpressionWithParameterIndexOnly(String firstname);
/**
* @see DATAMONGO-990
*/
@Query("{ firstname : ?#{[0]}, email: ?#{principal.email} }")
List<Person> findWithSpelByFirstnameAndCurrentUserWithCustomQuery(String firstname);
/**
* @see DATAMONGO-990
*/
@Query("{ firstname : :#{#firstname}}")
List<Person> findWithSpelByFirstnameForSpELExpressionWithParameterVariableOnly(@Param("firstname") String firstname);
}

View File

@@ -0,0 +1,124 @@
/*
* Copyright 2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository;
import java.util.Collections;
import java.util.Map;
import org.springframework.data.repository.query.spi.EvaluationContextExtension;
import org.springframework.data.repository.query.spi.EvaluationContextExtensionSupport;
/**
* A sample implementation of a custom {@link EvaluationContextExtension}.
*
* @author Thomas Darimont
*/
public class SampleEvaluationContextExtension extends EvaluationContextExtensionSupport {
@Override
public String getExtensionId() {
return "security";
}
@Override
public Map<String, Object> getProperties() {
return Collections.singletonMap("principal", SampleSecurityContextHolder.getCurrent().getPrincipal());
}
/**
* @author Thomas Darimont
*/
public static class SampleSecurityContextHolder {
private static ThreadLocal<SampleAuthentication> auth = new ThreadLocal<SampleAuthentication>() {
protected SampleAuthentication initialValue() {
return new SampleAuthentication(new SampleUser(-1, "anonymous"));
}
};
public static SampleAuthentication getCurrent() {
return auth.get();
}
public static void clear() {
auth.remove();
}
}
/**
* @author Thomas Darimont
*/
public static class SampleAuthentication {
private Object principal;
public SampleAuthentication(Object principal) {
this.principal = principal;
}
public Object getPrincipal() {
return principal;
}
public void setPrincipal(Object principal) {
this.principal = principal;
}
}
/**
* @author Thomas Darimont
*/
public static class SampleUser {
private Object id;
private String name;
public SampleUser(Object id, String name) {
this.id = id;
this.name = name;
}
public Object getId() {
return id;
}
public void setId(Object id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public SampleUser withName(String name) {
this.name = name;
return this;
}
public SampleUser withId(Object id) {
this.id = id;
return this;
}
}
}

View File

@@ -45,8 +45,7 @@ public class MongoNamespaceIntegrationTests extends AbstractPersonRepositoryInte
DefaultListableBeanFactory factory;
BeanDefinitionReader reader;
@Autowired
ApplicationContext context;
@Autowired ApplicationContext context;
@Before
@Override

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014 the original author or authors.
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -64,6 +64,7 @@ import com.mongodb.WriteResult;
*
* @author Christoph Strobl
* @author Oliver Gierke
* @author Thomas Darimont
*/
@RunWith(MockitoJUnitRunner.class)
public class AbstractMongoQueryUnitTests {
@@ -142,8 +143,8 @@ public class AbstractMongoQueryUnitTests {
public void testDeleteExecutionReturnsNrDocumentsDeletedFromWriteResult() {
when(writeResultMock.getN()).thenReturn(100);
when(this.mongoOperationsMock.remove(Matchers.any(Query.class), eq(Person.class), eq("persons"))).thenReturn(
writeResultMock);
when(this.mongoOperationsMock.remove(Matchers.any(Query.class), eq(Person.class), eq("persons")))
.thenReturn(writeResultMock);
MongoQueryFake query = createQueryForMethod("deletePersonByLastname", String.class);
query.setDeleteQuery(true);

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2014 the original author or authors.
* Copyright 2014-2015 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -45,12 +45,14 @@ import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.core.RepositoryMetadata;
import com.mongodb.BasicDBObjectBuilder;
import com.mongodb.util.JSONParseException;
/**
* Unit tests for {@link PartTreeMongoQuery}.
*
* @author Christoph Strobl
* @author Oliver Gierke
* @author Thomas Darimont
*/
@RunWith(MockitoJUnitRunner.class)
public class PartTreeMongoQueryUnitTests {
@@ -135,6 +137,18 @@ public class PartTreeMongoQueryUnitTests {
assertThat(query, isTextQuery().searchingFor("search").where(new Criteria("firstname").is("text")));
}
/**
* @see DATAMONGO-1180
*/
@Test
public void propagatesRootExceptionForInvalidQuery() {
exception.expect(IllegalStateException.class);
exception.expectCause(is(org.hamcrest.Matchers.<Throwable> instanceOf(JSONParseException.class)));
deriveQueryFromMethod("findByAge", new Object[] { 1 });
}
private org.springframework.data.mongodb.core.query.Query deriveQueryFromMethod(String method, Object[] args) {
Class<?>[] types = new Class<?>[args.length];
@@ -179,5 +193,8 @@ public class PartTreeMongoQueryUnitTests {
Person findPersonByFirstnameAndLastname(String firstname, String lastname);
Person findPersonByFirstname(String firstname, TextCriteria fullText);
@Query(fields = "{ 'firstname }")
Person findByAge(Integer age);
}
}

View File

@@ -41,6 +41,8 @@ import org.springframework.data.mongodb.repository.Address;
import org.springframework.data.mongodb.repository.Person;
import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.query.DefaultEvaluationContextProvider;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import com.mongodb.BasicDBObject;
import com.mongodb.BasicDBObjectBuilder;
@@ -57,6 +59,8 @@ import com.mongodb.DBRef;
@RunWith(MockitoJUnitRunner.class)
public class StringBasedMongoQueryUnitTests {
SpelExpressionParser PARSER = new SpelExpressionParser();
@Mock MongoOperations operations;
@Mock RepositoryMetadata metadata;
@Mock DbRefResolver factory;
@@ -76,7 +80,8 @@ public class StringBasedMongoQueryUnitTests {
Method method = SampleRepository.class.getMethod("findByLastname", String.class);
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, converter.getMappingContext());
StringBasedMongoQuery mongoQuery = new StringBasedMongoQuery(queryMethod, operations);
StringBasedMongoQuery mongoQuery = new StringBasedMongoQuery(queryMethod, operations, PARSER,
DefaultEvaluationContextProvider.INSTANCE);
ConvertingParameterAccessor accesor = StubParameterAccessor.getAccessor(converter, "Matthews");
org.springframework.data.mongodb.core.query.Query query = mongoQuery.createQuery(accesor);
@@ -209,8 +214,9 @@ public class StringBasedMongoQueryUnitTests {
org.springframework.data.mongodb.core.query.Query query = mongoQuery.createQuery(accessor);
assertThat(query.getQueryObject(), is(new BasicQuery(
"{$where: 'return this.date.getUTCMonth() == 3 && this.date.getUTCDay() == 4;'}").getQueryObject()));
assertThat(query.getQueryObject(),
is(new BasicQuery("{$where: 'return this.date.getUTCMonth() == 3 && this.date.getUTCDay() == 4;'}")
.getQueryObject()));
}
/**
@@ -289,11 +295,26 @@ public class StringBasedMongoQueryUnitTests {
assertThat(query.getQueryObject(), is(new BasicDBObjectBuilder().add("key", "value").get()));
}
/**
* @see DATAMONGO-990
*/
@Test
public void shouldSupportExpressionsInCustomQueries() throws Exception {
ConvertingParameterAccessor accesor = StubParameterAccessor.getAccessor(converter, "Matthews");
StringBasedMongoQuery mongoQuery = createQueryForMethod("findByQueryWithExpression", String.class);
org.springframework.data.mongodb.core.query.Query query = mongoQuery.createQuery(accesor);
org.springframework.data.mongodb.core.query.Query reference = new BasicQuery("{'lastname' : 'Matthews'}");
assertThat(query.getQueryObject(), is(reference.getQueryObject()));
}
private StringBasedMongoQuery createQueryForMethod(String name, Class<?>... parameters) throws Exception {
Method method = SampleRepository.class.getMethod(name, parameters);
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, converter.getMappingContext());
return new StringBasedMongoQuery(queryMethod, operations);
return new StringBasedMongoQuery(queryMethod, operations, PARSER, DefaultEvaluationContextProvider.INSTANCE);
}
private interface SampleRepository {
@@ -334,5 +355,7 @@ public class StringBasedMongoQueryUnitTests {
@Query("{ ?0 : ?1}")
Object methodWithPlaceholderInKeyOfJsonStructure(String keyReplacement, String valueReplacement);
@Query(value = "{'lastname': ?#{[0]} }")
List<Person> findByQueryWithExpression(String param0);
}
}

View File

@@ -32,6 +32,7 @@ import org.springframework.data.repository.query.ParameterAccessor;
*
* @author Oliver Gierke
* @author Christoph Strobl
* @author Thomas Darimont
*/
class StubParameterAccessor implements MongoParameterAccessor {
@@ -129,4 +130,12 @@ class StubParameterAccessor implements MongoParameterAccessor {
public TextCriteria getFullText() {
return null;
}
/* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.MongoParameterAccessor#getValues()
*/
@Override
public Object[] getValues() {
return this.values;
}
}

View File

@@ -90,6 +90,10 @@ public class IsBsonObject<T extends BSONObject> extends TypeSafeMatcher<T> {
return false;
}
if (o != null && expectation.not) {
return false;
}
}
return true;
}

View File

@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory client-uri="mongodb://username:password@localhost/database" username="username" />
</beans>

View File

@@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<mongo:db-factory client-uri="mongodb://username:password@localhost/database" />
</beans>

View File

@@ -26,6 +26,13 @@
</constructor-arg>
</bean>
</property>
<property name="evaluationContextProvider" ref="extensionAwareEvaluationContextProvider"/>
</bean>
<bean id="extensionAwareEvaluationContextProvider" class="org.springframework.data.repository.query.ExtensionAwareEvaluationContextProvider">
<constructor-arg>
<bean class="org.springframework.data.mongodb.repository.SampleEvaluationContextExtension"/>
</constructor-arg>
</bean>
</beans>

View File

@@ -22,4 +22,6 @@
<repository:exclude-filter type="regex" expression=".*MongoRepository"/>
</mongo:repositories>
<bean class="org.springframework.data.mongodb.repository.SampleEvaluationContextExtension"/>
</beans>

View File

@@ -7,6 +7,7 @@ Import-Package:
Export-Template:
org.springframework.data.mongodb.*;version="${project.version}"
Import-Template:
com.fasterxml.jackson.*;version="${jackson:[=.=.=,+1.0.0)}";resolution:=optional,
com.google.common.base.*;version="[11.0.0,14.0.0)";resolution:=optional,
com.mongodb.*;version="${mongo.osgi:[=.=.=,+1.0.0)}",
com.mysema.query.*;version="[2.1.1, 3.0.0)";resolution:=optional,

View File

@@ -4,9 +4,9 @@ Mark Pollack; Thomas Risberg; Oliver Gierke; Costin Leau; Jon Brisbin; Thomas Da
:revdate: {localdate}
:toc:
:toc-placement!:
:spring-data-commons-docs: https://raw.githubusercontent.com/spring-projects/spring-data-commons/master/src/main/asciidoc
:spring-data-commons-docs: ../../../../spring-data-commons/src/main/asciidoc
(C) 2008-2014 The original authors.
(C) 2008-2015 The original authors.
NOTE: _Copies of this document may be made for your own use and for distribution to others, provided that you do not charge any fee for such copies and further provided that each copy contains this Copyright Notice, whether distributed in print or electronically._
@@ -15,6 +15,8 @@ toc::[]
include::preface.adoc[]
:leveloffset: +1
include::new-features.adoc[]
include::{spring-data-commons-docs}/dependencies.adoc[]
include::{spring-data-commons-docs}/repositories.adoc[]
:leveloffset: -1
@@ -42,4 +44,5 @@ include::reference/mongo-3.adoc[]
include::{spring-data-commons-docs}/repository-namespace-reference.adoc[]
include::{spring-data-commons-docs}/repository-populator-namespace-reference.adoc[]
include::{spring-data-commons-docs}/repository-query-keywords-reference.adoc[]
include::{spring-data-commons-docs}/repository-query-return-types-reference.adoc[]
:leveloffset: -1

View File

@@ -0,0 +1,15 @@
[[new-features]]
= New & Noteworthy
[[new-features.1-7-0]]
== What's new in Spring Data MongoDB 1.7
* Assert compatibility with MongoDB 3.0 and MongoDB Java Driver 3-beta3 (see: <<mongo.mongo-3>>).
* Support JSR-310 and ThreeTen back-port date/time types.
* Allow `Stream` as query method return type (see: <<mongodb.repositories.queries>>).
* Added http://geojson.org/[GeoJSON] support in both domain types and queries (see: <<mongo.geo-json>>).
* `QueryDslPredicateExecutor` now supports `findAll(OrderSpecifier<?>… orders)`.
* Support calling JavaScript functions via <<mongo.server-side-scripts>>.
* Improve support for `CONTAINS` keyword on collection-like properties.
* Support for `$bit`, `$mul` and `$position` operators in `Update`.

View File

@@ -30,9 +30,9 @@ The jumping off ground for learning about MongoDB is http://www.mongodb.org/[www
[[requirements]]
== Requirements
Spring Data MongoDB 1.x binaries requires JDK level 6.0 and above, and http://spring.io/docs[Spring Framework] 3.2.x and above.
Spring Data MongoDB 1.x binaries require JDK level 6.0 and above, and http://spring.io/docs[Spring Framework] 4.0.x and above.
In terms of document stores, http://www.mongodb.org/[MongoDB] at least 2.4, preferably version 2.6.
In terms of document stores, http://www.mongodb.org/[MongoDB] 2.6 or higher is required.
== Additional Help Resources
@@ -46,12 +46,12 @@ There are a few support options available:
[[get-started:help:community]]
==== Community Forum
Spring Data on Stackoverflow http://stackoverflow.com/questions/tagged/spring-data[Stackoverflow ] is a tag for all Spring Data (not just Document) users to share information and help each other. Note that registration is needed *only* for posting.
Spring Data on Stackoverflow http://stackoverflow.com/questions/tagged/spring-data[Stackoverflow] is a tag for all Spring Data (not just Document) users to share information and help each other. Note that registration is needed *only* for posting.
[[get-started:help:professional]]
==== Professional Support
Professional, from-the-source support, with guaranteed response time, is available from http://gopivotal.com/[Pivotal Software, Inc.], the company behind Spring Data and Spring.
Professional, from-the-source support, with guaranteed response time, is available from http://pivotal.io/[Pivotal Software, Inc.], the company behind Spring Data and Spring.
[[get-started:up-to-date]]
=== Following Development

View File

@@ -134,16 +134,22 @@ Most of the data access operations you usually trigger on a repository result a
----
public interface PersonRepository extends PagingAndSortingRepository<Person, String> {
List<Person> findByLastname(String lastname);
List<Person> findByLastname(String lastname); <1>
Page<Person> findByFirstname(String firstname, Pageable pageable);
Page<Person> findByFirstname(String firstname, Pageable pageable); <2>
Person findByShippingAddresses(Address address);
Person findByShippingAddresses(Address address); <3>
Stream<Person> findAllBy(); <4>
}
----
<1> The method shows a query for all people with the given lastname. The query will be derived by parsing the method name for constraints which can be concatenated with `And` and `Or`. Thus the method name results in a query expression of `{"lastname" : lastname}`.
<2> Applies pagination to a query. Just equip your method signature with a `Pageable` parameter and let the method return a `Page` instance and we will automatically page the query accordingly.
<3> Shows that you can query based on properties which are not a primitive type.
<4> Uses a Java 8 `Stream` which reads and converts individual elements while iterating the stream.
====
The first method shows a query for all people with the given lastname. The query will be derived by parsing the method name for constraints which can be concatenated with `And` and `Or`. Thus the method name will result in a query expression of `{"lastname" : lastname}`. The second example shows how pagination is applied to a query. Just equip your method signature with a `Pageable` parameter and let the method return a `Page` instance, and we will automatically page the query accordingly. The third example shows that you can query based on properties which are not a primitive type.
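The following is a minimal usage sketch of the interface above. It assumes an injected `PersonRepository` instance named `repository`; since the streaming method reads documents lazily from the underlying cursor, the `Stream` should be closed, for example via try-with-resources.

====
[source,java]
----
// Illustrative sketch only: repository is an injected PersonRepository.
Page<Person> page = repository.findByFirstname("Dave", new PageRequest(0, 10));

// Close the stream to release the underlying cursor.
try (Stream<Person> stream = repository.findAllBy()) {
    stream.forEach(person -> System.out.println(person));
}
----
====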
NOTE: Note that for version 1.0 we currently don't support referring to parameters that are mapped as `DBRef` in the domain class.
@@ -154,6 +160,10 @@ NOTE: Note that for version 1.0 we currently don't support referring to paramete
| Sample
| Logical result
| `After`
| `findByBirthdateAfter(Date date)`
| `{"birthdate" : {"$gt" : date}}`
| `GreaterThan`
| `findByAgeGreaterThan(int age)`
| `{"age" : {"$gt" : age}}`
@@ -162,6 +172,10 @@ NOTE: Note that for version 1.0 we currently don't support referring to paramete
| `findByAgeGreaterThanEqual(int age)`
| `{"age" : {"$gte" : age}}`
| `Before`
| `findByBirthdateBefore(Date date)`
| `{"birthdate" : {"$lt" : date}}`
| `LessThan`
| `findByAgeLessThan(int age)`
| `{"age" : {"$lt" : age}}`
@@ -190,10 +204,18 @@ NOTE: Note that for version 1.0 we currently don't support referring to paramete
| `findByFirstnameNull()`
| `{"firstname" : null}`
| `Like`
| `Like`, `StartingWith`, `EndingWith`
| `findByFirstnameLike(String name)`
| `{"firstname" : name} ( name as regex)`
| `Containing` on String
| `findByFirstnameContaining(String name)`
| `{"firstname" : name} (name as regex)`
| `Containing` on Collection
| `findByAddressesContaining(Address address)`
| `{"addresses" : { "$in" : address}}`
| `Regex`
| `findByFirstnameRegex(String firstname)`
| `{"firstname" : {"$regex" : firstname }}`

View File

@@ -21,7 +21,7 @@ For most tasks you will find yourself using `MongoTemplate` or the Repository su
[[mongodb-getting-started]]
== Getting Started
Spring MongoDB support requires MongoDB 1.4 or higher and Java SE 5 or higher. The latest production release (2.4.9 as of this writing) is recommended. An easy way to bootstrap setting up a working environment is to create a Spring based project in http://spring.io/tools/sts[STS].
Spring MongoDB support requires MongoDB 2.6 or higher and Java SE 6 or higher. An easy way to bootstrap setting up a working environment is to create a Spring based project in http://spring.io/tools/sts[STS].
First you need to set up a running MongoDB server. Refer to the http://docs.mongodb.org/manual/core/introduction/[MongoDB Quick Start guide] for an explanation on how to start up a MongoDB instance. Once installed, starting MongoDB is typically a matter of executing the following command: `MONGO_HOME/bin/mongod`
@@ -38,7 +38,7 @@ Then add the following to pom.xml dependencies section.
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.4.1.RELEASE</version>
<version>1.7.0.RELEASE</version>
</dependency>
</dependencies>
@@ -48,7 +48,7 @@ Also change the version of Spring in the pom.xml to be
[source,xml]
----
<spring.framework.version>3.2.8.RELEASE</spring.framework.version>
<spring.framework.version>4.0.9.RELEASE</spring.framework.version>
----
You will also need to add the location of the Spring Milestone repository for maven to your pom.xml which is at the same level of your <dependencies/> element
@@ -162,7 +162,7 @@ Even in this simple example, there are few things to take notice of
[[mongo.examples-repo]]
== Examples Repository
There is an https://github.com/spring-projects/spring-data-document-examples[github repository with several examples] that you can download and play around with to get a feel for how the library works.
There is a https://github.com/spring-projects/spring-data-examples[GitHub repository with several examples] that you can download and play around with to get a feel for how the library works.
[[mongodb-connectors]]
== Connecting to MongoDB with Spring
@@ -1156,6 +1156,96 @@ As you can see we use the `NearQuery` builder API to set up a query to return al
The geo near operations return a `GeoResults` wrapper object that encapsulates `GeoResult` instances. The wrapping `GeoResults` allows access to the average distance of all results. A single `GeoResult` object simply carries the entity found plus its distance from the origin.
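A minimal sketch of such a query, assuming a `MongoOperations` instance named `template` and a hypothetical `Venue` domain type:

====
[source,java]
----
// Sketch only: Venue is a stand-in domain type used for illustration.
NearQuery query = NearQuery.near(new Point(-73.99, 40.73))
    .maxDistance(new Distance(10, Metrics.KILOMETERS));

GeoResults<Venue> results = template.geoNear(query, Venue.class);

// GeoResults is Iterable; each GeoResult carries the entity and its distance.
for (GeoResult<Venue> result : results) {
    System.out.println(result.getContent() + " found at " + result.getDistance());
}
----
====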
[[mongo.geo-json]]
=== GeoJSON Support
MongoDB supports http://geojson.org/[GeoJSON] and simple (legacy) coordinate pairs for geospatial data. Those formats can both be used for storing as well as querying data.
NOTE: Please refer to the http://docs.mongodb.org/manual/core/2dsphere/#geospatial-indexes-store-geojson/[MongoDB manual on GeoJSON support] to learn about requirements and restrictions.
==== GeoJSON types in domain classes
Usage of http://geojson.org/[GeoJSON] types in domain classes is straightforward. The `org.springframework.data.mongodb.core.geo` package contains types like `GeoJsonPoint`, `GeoJsonPolygon` and others. Those are extensions to the existing `org.springframework.data.geo` types.
====
[source,java]
----
public class Store {
String id;
/**
* location is stored in GeoJSON format.
* {
* "type" : "Point",
* "coordinates" : [ x, y ]
* }
*/
GeoJsonPoint location;
}
----
====
==== GeoJSON types in repository query methods
Using GeoJSON types as repository query parameters forces usage of the `$geometry` operator when creating the query.
====
[source,java]
----
public interface StoreRepository extends CrudRepository<Store, String> {
List<Store> findByLocationWithin(Polygon polygon); <1>
}
/*
* {
* "location": {
* "$geoWithin": {
* "$geometry": {
* "type": "Polygon",
* "coordinates": [
* [
* [-73.992514,40.758934],
* [-73.961138,40.760348],
* [-73.991658,40.730006],
* [-73.992514,40.758934]
* ]
* ]
* }
* }
* }
* }
*/
repo.findByLocationWithin( <2>
new GeoJsonPolygon(
new Point(-73.992514, 40.758934),
new Point(-73.961138, 40.760348),
new Point(-73.991658, 40.730006),
new Point(-73.992514, 40.758934))); <3>
/*
* {
* "location" : {
* "$geoWithin" : {
* "$polygon" : [ [-73.992514,40.758934] , [-73.961138,40.760348] , [-73.991658,40.730006] ]
* }
* }
* }
*/
repo.findByLocationWithin( <4>
new Polygon(
new Point(-73.992514, 40.758934),
new Point(-73.961138, 40.760348),
new Point(-73.991658, 40.730006));
----
<1> Repository method definition using the Spring Data Commons type allows calling it with both the GeoJSON and the legacy format.
<2> Use the GeoJSON type to make use of the `$geometry` operator.
<3> Please note that GeoJSON polygons need to define a closed ring.
<4> Use the legacy format's `$polygon` operator.
====
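GeoJSON types can also be used with the `Criteria` API. The following sketch mirrors the `$geoIntersects` support added in this change set and assumes the `Store` type shown above, with its GeoJSON `location` property:

====
[source,java]
----
// Sketch only: finds stores whose location intersects the given line string.
GeoJsonLineString lineString = new GeoJsonLineString(new Point(0, 0), new Point(10, 10));

Query query = new Query(Criteria.where("location").intersects(lineString));
List<Store> stores = template.find(query, Store.class);
----
====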
[[mongo.textsearch]]
=== Full Text Queries
@@ -1340,17 +1430,21 @@ MongoDB allows to execute JavaScript functions on the server by either directly
=== Example Usage
====
[source,java]
----
ScriptOperations scriptOps = template.scriptOps();
ServerSideJavaScript echoScript = new ExecutableMongoScript("function(x) { return x; }");
scriptOps.execute(echoScript, "directly execute script");
ExecutableMongoScript echoScript = new ExecutableMongoScript("function(x) { return x; }");
scriptOps.execute(echoScript, "directly execute script"); <1>
scriptOps.register(new CallableMongoScript("echo", echoScript));
scriptOps.call("echo", "execute script via name");
scriptOps.register(new NamedMongoScript("echo", echoScript)); <2>
scriptOps.call("echo", "execute script via name"); <3>
----
<1> Execute the script directly without storing the function on server side.
<2> Store the script using 'echo' as its name. The given name identifies the script and allows calling it later.
<3> Execute the script with name 'echo' using the provided parameters.
====
[[mongo.group]]
== Group Operations

View File

@@ -1,6 +1,36 @@
Spring Data MongoDB Changelog
=============================
Changes in version 1.8.0.M1 (2015-06-02)
----------------------------------------
* DATAMONGO-1228 - Release 1.8 M1 (Gosling).
* DATAMONGO-1224 - Assert Spring Framework 4.2 compatibility.
* DATAMONGO-1221 - Remove relative reference to parent POM to make sure the right Spring version is picked up.
* DATAMONGO-1218 - Deprecate non-MongoClient related configuration options in XML namespace.
* DATAMONGO-1216 - Authentication mechanism PLAIN changes to SCRAM-SHA-1.
* DATAMONGO-1213 - Include new section on Spring Data and Spring Framework dependencies in reference documentation.
* DATAMONGO-1211 - Adapt API changes in Spring Data Commons to simplify custom repository base class registration.
* DATAMONGO-1210 - Inconsistent property order of _class type hint breaks document equality.
* DATAMONGO-1208 - MongoTemplate.stream(…) does not consider limit, order, sort etc.
* DATAMONGO-1207 - MongoTemplate#doInsertAll throws NullPointerException when passed Collection contains a null item.
* DATAMONGO-1202 - Indexed annotation problems under generics.
* DATAMONGO-1196 - Upgrade build profiles after MongoDB 3.0 Java driver release.
* DATAMONGO-1193 - Prevent unnecessary database lookups when resolving DBRefs on 2.x driver.
* DATAMONGO-1192 - Switch back to Spring 4.1's CollectionFactory.
* DATAMONGO-1134 - Add support for $geoIntersects.
* DATAMONGO-990 - Add support for SpEL expressions in @Query.
Changes in version 1.7.0.RELEASE (2015-03-23)
---------------------------------------------
* DATAMONGO-1189 - Release 1.7 GA.
* DATAMONGO-1181 - Add Jackson Module for GeoJSON types.
* DATAMONGO-1180 - Incorrect exception message creation in PartTreeMongoQuery.
* DATAMONGO-1179 - Update reference documentation.
* DATAMONGO-1124 - Switch log level for cyclic reference index warnings from WARN to INFO.
* DATAMONGO-979 - Add support for $size expression in project and group aggregation pipeline.
Changes in version 1.7.0.RC1 (2015-03-05)
-----------------------------------------
* DATAMONGO-1173 - Release 1.7 RC1.

View File

@@ -1,4 +1,4 @@
Spring Data MongoDB 1.7 RC1
Spring Data MongoDB 1.8 M1
Copyright (c) [2010-2015] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").