Compare commits

...

21 Commits

Author SHA1 Message Date
Oliver Gierke
cd35b9ed2a DATAMONGO-1410 - Release version 1.9.2 (Hopper SR2). 2016-06-15 13:45:52 +02:00
Oliver Gierke
e8944a6c3a DATAMONGO-1410 - Prepare 1.9.2 (Hopper SR2). 2016-06-15 13:44:53 +02:00
Oliver Gierke
6fcbc225eb DATAMONGO-1410 - Updated changelog. 2016-06-15 13:44:46 +02:00
Kevin Dosey
f06eda488c DATAMONGO-1449 - Switched to foreach loop in collection handling of MappingMongoConverter.
This should result in a minor to moderate performance improvement for iteration over Collections/Arrays during DBObject-to-object mapping.

Original pull request: #368.
2016-06-11 18:33:16 +02:00
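For illustration (an editor's sketch, not part of the commit itself), the two iteration styles the message contrasts; the class and method names below are made up:

[source,java]
----
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

class IterationStylesSketch {

	// Legacy style: one index-based get(i) call per element.
	static Collection<Object> copyIndexed(List<Object> sourceValue) {
		Collection<Object> items = new ArrayList<Object>(sourceValue.size());
		for (int i = 0; i < sourceValue.size(); i++) {
			items.add(sourceValue.get(i));
		}
		return items;
	}

	// Foreach style: a single Iterator drives the loop, avoiding repeated
	// size() and get(i) calls on the source collection.
	static Collection<Object> copyForeach(Collection<Object> sourceValue) {
		Collection<Object> items = new ArrayList<Object>(sourceValue.size());
		for (Object dbObjItem : sourceValue) {
			items.add(dbObjItem);
		}
		return items;
	}
}
----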
Mark Paluch
0ef910445d DATAMONGO-1437 - Polishing.
Renamed test. Added JavaDoc. Simplified throws declaration.

Original pull request: #367.
2016-06-02 14:15:08 +02:00
Christoph Strobl
b22ee9d27c DATAMONGO-1437 - Preserve non translatable Exception cause when lazily resolving DBRef.
We now preserve the cause of Exceptions that cannot be translated into DataAccessExceptions when an error occurs while lazily loading DBRefs.

Original pull request: #367.
2016-06-02 14:14:55 +02:00
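A hedged sketch of the pattern this commit introduces, simplified from the `DefaultDbRefResolver` diff further down; `Resolver` is a stand-in interface:

[source,java]
----
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.LazyLoadingException;

class CausePreservationSketch {

	interface Resolver {
		Object resolve();
	}

	static Object resolveLazily(PersistenceExceptionTranslator translator, Resolver resolver) {
		try {
			return resolver.resolve();
		} catch (RuntimeException ex) {
			DataAccessException translated = translator.translateExceptionIfPossible(ex);
			// Previously a null 'translated' was passed on and the actual cause
			// was lost; now the original exception is preserved as the fallback.
			throw new LazyLoadingException("Unable to lazily resolve DBRef!",
					translated != null ? translated : ex);
		}
	}
}
----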
Oliver Gierke
859a0e83c8 DATAMONGO-1423 - Polishing.
Original pull request: #365.
2016-05-25 17:33:05 +02:00
Christoph Strobl
b35f151b80 DATAMONGO-1423 - Map keys now get registered conversions applied for Updates.
We now pipe map keys through the potentially registered conversions when mapping Updates.

Original pull request: #365.
2016-05-25 17:33:05 +02:00
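For illustration (mirroring the `UpdateMapperUnitTests` change shown later in this diff, where a registered converter maps `Allocation.AVAILABLE` to the string `"V"`), a sketch of what the fix enables:

[source,java]
----
// With AllocationToStringConverter registered, map keys in an Update are
// now piped through the conversion as well:
Update update = new Update().set("enumAsMapKey",
		Collections.singletonMap(Allocation.AVAILABLE, 100));

// UpdateMapper renders the converted key into the mapped update:
// { "$set" : { "enumAsMapKey.V" : 100 } }
----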
Oliver Gierke
17afb07e45 DATAMONGO-1416 - Polishing.
Just use instanceOf(…) from Hamcrest's Matchers class instead of the dedicated class.

Original pull request: #362.
2016-05-24 16:01:12 +02:00
Christoph Strobl
b407963344 DATAMONGO-1416 - Get rid of the warnings for Atomic… type conversions.
We now use explicit converters instead of a ConverterFactory. This reduces noise in the log when registering converters.

Original pull request: #362.
2016-05-24 16:00:54 +02:00
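A usage sketch of the new explicit converters, as exercised from the same package by the `MongoConvertersUnitTests` below (`MongoConverters` itself is package-private):

[source,java]
----
// Writing: Atomic… values are unwrapped to their plain counterparts.
Long stored = AtomicLongToLongConverter.INSTANCE.convert(new AtomicLong(100L)); // 100L

// Reading: plain numbers are wrapped back into Atomic… instances.
AtomicInteger restored = IntegerToAtomicIntegerConverter.INSTANCE.convert(100); // AtomicInteger(100)
----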
Mark Paluch
8b31ba1836 DATAMONGO-1425 - Polishing.
Add NotContaining to documentation. Add integration test for Containing/NotContaining on collection properties.

Original pull request: #363.
2016-05-09 11:14:03 +02:00
Christoph Strobl
0cf6edae43 DATAMONGO-1425 - Fix query derivation for notContaining on String properties.
We now correctly build up the criteria for derived queries using the notContaining keyword on String properties.

Original pull request: #363.
2016-05-09 11:13:58 +02:00
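For illustration (matching the repository tests and documentation updates below), the derived queries the fixed keyword now produces:

[source,java]
----
// On a String property the regex itself is negated:
// { "firstname" : { "$not" : /.*Boyd.*/ } }
List<Person> people = repository.findByFirstnameNotContains("Boyd");

// On a Collection property the $in clause is negated:
// { "skills" : { "$not" : { "$in" : ["Drums"] } } }
List<Person> nonDrummers = repository.findBySkillsNotContains(Arrays.asList("Drums"));
----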
Mark Paluch
0824105377 DATAMONGO-1412 - Fix backticks and code element highlighting.
Fixed broken highlighting caused by backticks followed by characters/single quotes. Converted single-quote emphasis of id to backtick code fences. Added missing spaces between words and backticks.

Original Pull Request: #359
2016-05-02 14:18:13 +02:00
Mark Paluch
dc936a5b7b DATAMONGO-1412 - Document mapping rules for Java types to MongoDB representation.
Original Pull Request: #359
Related pull request: #353
Related ticket: DATAMONGO-1404
2016-05-02 14:18:07 +02:00
Mark Paluch
c8fe02e48e DATAMONGO-1411 - Enable build on TravisCI.
We now start the MongoDB server via apt-get instead of relying on the TravisCI-managed 2.4.2 installation.
In doing so, we altered the tests to check only the port, not the host part, of the URIs.

Additionally, we upgraded the build profiles, removed promoted snapshot versions, renamed mongo32-next to mongo32, and added a mongo33-next build profile.

Original pull request: #358
2016-04-26 11:17:00 +02:00
Oliver Gierke
32547db306 DATAMONGO-1408 - After release cleanups. 2016-04-06 22:51:44 +02:00
Oliver Gierke
41902154ca DATAMONGO-1408 - Prepare next development iteration. 2016-04-06 22:51:43 +02:00
Oliver Gierke
2354ced1bf DATAMONGO-1408 - Release version 1.9.1 (Hopper SR1). 2016-04-06 22:32:15 +02:00
Oliver Gierke
791cc3a1b8 DATAMONGO-1408 - Prepare 1.9.1 (Hopper SR1). 2016-04-06 22:31:40 +02:00
Oliver Gierke
021c03fbbf DATAMONGO-1408 - Updated changelog. 2016-04-06 22:31:35 +02:00
Oliver Gierke
e4a59f29d0 DATAMONGO-1405 - Prepare next development iteration. 2016-04-06 16:38:18 +02:00
22 changed files with 586 additions and 82 deletions

View File

@@ -3,13 +3,28 @@ language: java
jdk:
- oraclejdk8
services:
- mongodb
before_script:
- mongod --version
env:
matrix:
- PROFILE=ci
- PROFILE=mongo-next
- PROFILE=mongo3
- PROFILE=mongo3-next
- PROFILE=mongo31
- PROFILE=mongo32
- PROFILE=mongo33-next
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script
addons:
apt:
sources:
- mongodb-3.2-precise
packages:
- mongodb-org-server
- mongodb-org-shell
sudo: false

pom.xml
View File

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.0.RELEASE</version>
<version>1.9.2.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>1.8.0.RELEASE</version>
<version>1.8.2.RELEASE</version>
</parent>
<modules>
@@ -28,7 +28,7 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>1.12.0.RELEASE</springdata.commons>
<springdata.commons>1.12.2.RELEASE</springdata.commons>
<mongo>2.14.0</mongo>
<mongo.osgi>2.13.0</mongo.osgi>
</properties>
@@ -107,7 +107,7 @@
<id>mongo-next</id>
<properties>
<mongo>2.14.0-SNAPSHOT</mongo>
<mongo>2.15.0-SNAPSHOT</mongo>
</properties>
<repositories>
@@ -148,16 +148,25 @@
<id>mongo31</id>
<properties>
<mongo>3.1.0</mongo>
<mongo>3.1.1</mongo>
</properties>
</profile>
<profile>
<id>mongo32-next</id>
<id>mongo32</id>
<properties>
<mongo>3.2.0-SNAPSHOT</mongo>
<mongo>3.2.2</mongo>
</properties>
</profile>
<profile>
<id>mongo33-next</id>
<properties>
<mongo>3.3.0-SNAPSHOT</mongo>
</properties>
<repositories>

View File

@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.0.RELEASE</version>
<version>1.9.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -48,7 +48,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>1.9.0.RELEASE</version>
<version>1.9.2.RELEASE</version>
</dependency>
<dependency>

View File

@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.0.RELEASE</version>
<version>1.9.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -5,7 +5,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.0.RELEASE</version>
<version>1.9.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.9.0.RELEASE</version>
<version>1.9.2.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2013-2015 the original author or authors.
* Copyright 2013-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -180,8 +180,8 @@ public class DefaultDbRefResolver implements DbRefResolver {
* @author Oliver Gierke
* @author Christoph Strobl
*/
static class LazyLoadingInterceptor implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor,
Serializable {
static class LazyLoadingInterceptor
implements MethodInterceptor, org.springframework.cglib.proxy.MethodInterceptor, Serializable {
private static final Method INITIALIZE_METHOD, TO_DBREF_METHOD, FINALIZE_METHOD;
@@ -387,7 +387,8 @@ public class DefaultDbRefResolver implements DbRefResolver {
} catch (RuntimeException ex) {
DataAccessException translatedException = this.exceptionTranslator.translateExceptionIfPossible(ex);
throw new LazyLoadingException("Unable to lazily resolve DBRef!", translatedException);
throw new LazyLoadingException("Unable to lazily resolve DBRef!",
translatedException != null ? translatedException : ex);
}
}

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2015 by the original author(s).
* Copyright 2011-2016 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,6 +20,7 @@ import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
@@ -896,9 +897,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Collection<Object> items = targetType.getType().isArray() ? new ArrayList<Object>()
: CollectionFactory.createCollection(collectionType, rawComponentType, sourceValue.size());
for (int i = 0; i < sourceValue.size(); i++) {
Object dbObjItem = sourceValue.get(i);
for (Object dbObjItem : sourceValue) {
if (dbObjItem instanceof DBRef) {
items.add(
@@ -992,20 +991,31 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (obj instanceof DBObject) {
DBObject newValueDbo = new BasicDBObject();
for (String vk : ((DBObject) obj).keySet()) {
Object o = ((DBObject) obj).get(vk);
newValueDbo.put(vk, convertToMongoType(o, typeHint));
}
return newValueDbo;
}
if (obj instanceof Map) {
DBObject result = new BasicDBObject();
for (Map.Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
result.put(entry.getKey().toString(), convertToMongoType(entry.getValue(), typeHint));
Map<Object, Object> converted = new LinkedHashMap<Object, Object>();
for (Entry<Object, Object> entry : ((Map<Object, Object>) obj).entrySet()) {
TypeInformation<? extends Object> valueTypeHint = typeHint != null && typeHint.getMapValueType() != null
? typeHint.getMapValueType() : typeHint;
converted.put(convertToMongoType(entry.getKey()), convertToMongoType(entry.getValue(), valueTypeHint));
}
return result;
return new BasicDBObject(converted);
}
if (obj.getClass().isArray()) {

View File

@@ -81,7 +81,10 @@ abstract class MongoConverters {
converters.add(DBObjectToNamedMongoScriptCoverter.INSTANCE);
converters.add(CurrencyToStringConverter.INSTANCE);
converters.add(StringToCurrencyConverter.INSTANCE);
converters.add(NumberToNumberConverterFactory.INSTANCE);
converters.add(AtomicIntegerToIntegerConverter.INSTANCE);
converters.add(AtomicLongToLongConverter.INSTANCE);
converters.add(LongToAtomicLongConverter.INSTANCE);
converters.add(IntegerToAtomicIntegerConverter.INSTANCE);
return converters;
}
@@ -374,4 +377,68 @@ abstract class MongoConverters {
}
}
}
/**
* {@link Converter} implementation converting {@link AtomicLong} into {@link Long}.
*
* @author Christoph Strobl
* @since 1.10
*/
@WritingConverter
public static enum AtomicLongToLongConverter implements Converter<AtomicLong, Long> {
INSTANCE;
@Override
public Long convert(AtomicLong source) {
return NumberUtils.convertNumberToTargetClass(source, Long.class);
}
}
/**
* {@link Converter} implementation converting {@link AtomicInteger} into {@link Integer}.
*
* @author Christoph Strobl
* @since 1.10
*/
@WritingConverter
public static enum AtomicIntegerToIntegerConverter implements Converter<AtomicInteger, Integer> {
INSTANCE;
@Override
public Integer convert(AtomicInteger source) {
return NumberUtils.convertNumberToTargetClass(source, Integer.class);
}
}
/**
* {@link Converter} implementation converting {@link Long} into {@link AtomicLong}.
*
* @author Christoph Strobl
* @since 1.10
*/
@ReadingConverter
public static enum LongToAtomicLongConverter implements Converter<Long, AtomicLong> {
INSTANCE;
@Override
public AtomicLong convert(Long source) {
return source != null ? new AtomicLong(source) : null;
}
}
/**
* {@link Converter} implementation converting {@link Integer} into {@link AtomicInteger}.
*
* @author Christoph Strobl
* @since 1.10
*/
@ReadingConverter
public static enum IntegerToAtomicIntegerConverter implements Converter<Integer, AtomicInteger> {
INSTANCE;
@Override
public AtomicInteger convert(Integer source) {
return source != null ? new AtomicInteger(source) : null;
}
}
}

View File

@@ -200,7 +200,7 @@ class MongoQueryCreator extends AbstractQueryCreator<Query, Criteria> {
case CONTAINING:
return createContainingCriteria(part, property, criteria, parameters);
case NOT_CONTAINING:
return createContainingCriteria(part, property, criteria, parameters).not();
return createContainingCriteria(part, property, criteria.not(), parameters);
case REGEX:
return criteria.regex(parameters.next().toString());
case EXISTS:

View File

@@ -1,5 +1,5 @@
/*
* Copyright 2011-2012 the original author or authors.
* Copyright 2011-2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -20,6 +20,7 @@ import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.net.InetAddress;
import java.util.ArrayList;
import java.util.List;
import org.junit.Ignore;
@@ -37,6 +38,13 @@ import com.mongodb.CommandResult;
import com.mongodb.Mongo;
import com.mongodb.ServerAddress;
/**
*
* @author Mark Pollack
* @author Oliver Gierke
* @author Thomas Darimont
* @author Mark Paluch
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class MongoNamespaceReplicaSetTests {
@@ -70,10 +78,13 @@ public class MongoNamespaceReplicaSetTests {
assertThat(replicaSetSeeds, is(notNullValue()));
assertThat(replicaSetSeeds, hasSize(3));
assertThat(
replicaSetSeeds,
hasItems(new ServerAddress("192.168.174.130", 27017), new ServerAddress("192.168.174.130", 27018),
new ServerAddress("192.168.174.130", 27019)));
List<Integer> ports = new ArrayList<Integer>();
for (ServerAddress replicaSetSeed : replicaSetSeeds) {
ports.add(replicaSetSeed.getPort());
}
assertThat(ports, hasItems(27017, 27018, 27019));
}
@Test

View File

@@ -0,0 +1,73 @@
/*
* Copyright 2016 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.core.Is.*;
import static org.hamcrest.core.IsEqual.*;
import static org.mockito.Mockito.*;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.runners.MockitoJUnitRunner;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.LazyLoadingException;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver.LazyLoadingInterceptor;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBRef;
/**
* Unit tests for {@link LazyLoadingInterceptor}.
*
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class LazyLoadingInterceptorUnitTests {
public @Rule ExpectedException exception = ExpectedException.none();
@Mock MongoPersistentProperty propertyMock;
@Mock DBRef dbrefMock;
@Mock DbRefResolverCallback callbackMock;
/**
* @see DATAMONGO-1437
*/
@Test
public void shouldPreserveCauseForNonTranslatableExceptions() throws Throwable {
NullPointerException npe = new NullPointerException("Some Exception we did not think about.");
when(callbackMock.resolve(propertyMock)).thenThrow(npe);
exception.expect(LazyLoadingException.class);
exception.expectCause(is(equalTo(npe)));
new LazyLoadingInterceptor(propertyMock, dbrefMock, new NullExceptionTranslator(), callbackMock).intercept(null,
LazyLoadingProxy.class.getMethod("getTarget"), null, null);
}
static class NullExceptionTranslator implements PersistenceExceptionTranslator {
@Override
public DataAccessException translateExceptionIfPossible(RuntimeException ex) {
return null;
}
}
}

View File

@@ -15,11 +15,13 @@
*/
package org.springframework.data.mongodb.core.convert;
import static org.hamcrest.CoreMatchers.*;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import java.math.BigDecimal;
import java.util.Currency;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;
import org.junit.Test;
import org.springframework.data.geo.Box;
@@ -27,8 +29,12 @@ import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Point;
import org.springframework.data.geo.Polygon;
import org.springframework.data.geo.Shape;
import org.springframework.data.mongodb.core.convert.MongoConverters.AtomicIntegerToIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.AtomicLongToLongConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.BigDecimalToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.CurrencyToStringConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.IntegerToAtomicIntegerConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.LongToAtomicLongConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToBigDecimalConverter;
import org.springframework.data.mongodb.core.convert.MongoConverters.StringToCurrencyConverter;
import org.springframework.data.mongodb.core.geo.Sphere;
@@ -140,4 +146,36 @@ public class MongoConvertersUnitTests {
public void convertsStringToCurrencyCorrectly() {
assertThat(StringToCurrencyConverter.INSTANCE.convert("USD"), is(Currency.getInstance("USD")));
}
/**
* @see DATAMONGO-1416
*/
@Test
public void convertsAtomicLongToLongCorrectly() {
assertThat(AtomicLongToLongConverter.INSTANCE.convert(new AtomicLong(100L)), is(100L));
}
/**
* @see DATAMONGO-1416
*/
@Test
public void convertsAtomicIntegerToIntegerCorrectly() {
assertThat(AtomicIntegerToIntegerConverter.INSTANCE.convert(new AtomicInteger(100)), is(100));
}
/**
* @see DATAMONGO-1416
*/
@Test
public void convertsLongToAtomicLongCorrectly() {
assertThat(LongToAtomicLongConverter.INSTANCE.convert(100L), is(instanceOf(AtomicLong.class)));
}
/**
* @see DATAMONGO-1416
*/
@Test
public void convertsIntegerToAtomicIntegerCorrectly() {
assertThat(IntegerToAtomicIntegerConverter.INSTANCE.convert(100), is(instanceOf(AtomicInteger.class)));
}
}

View File

@@ -887,6 +887,33 @@ public class UpdateMapperUnitTests {
assertThat($set.get("primIntValue"), Is.<Object> is(10));
}
/**
* @see DATAMONGO-1423
*/
@Test
@SuppressWarnings("unchecked")
public void mappingShouldConsiderCustomConvertersForEnumMapKeys() {
CustomConversions conversions = new CustomConversions(
Arrays.asList(AllocationToStringConverter.INSTANCE, StringToAllocationConverter.INSTANCE));
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
mappingContext.afterPropertiesSet();
MappingMongoConverter converter = new MappingMongoConverter(mock(DbRefResolver.class), mappingContext);
converter.setCustomConversions(conversions);
converter.afterPropertiesSet();
UpdateMapper mapper = new UpdateMapper(converter);
Update update = new Update().set("enumAsMapKey", Collections.singletonMap(Allocation.AVAILABLE, 100));
DBObject result = mapper.getMappedObject(update.getUpdateObject(),
mappingContext.getPersistentEntity(ClassWithEnum.class));
assertThat(result, isBsonObject().containing("$set.enumAsMapKey.V", 100));
}
static class DomainTypeWrappingConcreteyTypeHavingListOfInterfaceTypeAttributes {
ListModelWrapper concreteTypeWithListAttributeOfInterfaceType;
}
@@ -1113,6 +1140,7 @@ public class UpdateMapperUnitTests {
static class ClassWithEnum {
Allocation allocation;
Map<Allocation, String> enumAsMapKey;
static enum Allocation {

View File

@@ -86,8 +86,10 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
dave = new Person("Dave", "Matthews", 42);
oliver = new Person("Oliver August", "Matthews", 4);
carter = new Person("Carter", "Beauford", 49);
carter.setSkills(Arrays.asList("Drums", "percussion", "vocals"));
Thread.sleep(10);
boyd = new Person("Boyd", "Tinsley", 45);
boyd.setSkills(Arrays.asList("Violin", "Electric Violin", "Viola", "Mandolin", "Vocals", "Guitar"));
stefan = new Person("Stefan", "Lessard", 34);
leroi = new Person("Leroi", "Moore", 41);
@@ -1261,4 +1263,37 @@ public abstract class AbstractPersonRepositoryIntegrationTests {
assertThat(result.size(), is(2));
}
/**
* @see DATAMONGO-1425
*/
@Test
public void findsPersonsByFirstnameNotContains() throws Exception {
List<Person> result = repository.findByFirstnameNotContains("Boyd");
assertThat(result.size(), is((int) (repository.count() - 1)));
assertThat(result, not(hasItem(boyd)));
}
/**
* @see DATAMONGO-1425
*/
@Test
public void findBySkillsContains() throws Exception {
List<Person> result = repository.findBySkillsContains(Arrays.asList("Drums"));
assertThat(result.size(), is(1));
assertThat(result, hasItem(carter));
}
/**
* @see DATAMONGO-1425
*/
@Test
public void findBySkillsNotContains() throws Exception {
List<Person> result = repository.findBySkillsNotContains(Arrays.asList("Drums"));
assertThat(result.size(), is((int) (repository.count() - 1)));
assertThat(result, not(hasItem(carter)));
}
}

View File

@@ -89,8 +89,14 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
*/
List<Person> findByFirstnameLike(String firstname);
List<Person> findByFirstnameNotContains(String firstname);
List<Person> findByFirstnameLikeOrderByLastnameAsc(String firstname, Sort sort);
List<Person> findBySkillsContains(List<String> skills);
List<Person> findBySkillsNotContains(List<String> skills);
@Query("{'age' : { '$lt' : ?0 } }")
List<Person> findByAgeLessThan(int age, Sort sort);
@@ -309,7 +315,8 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
* @see DATAMONGO-745
*/
@Query("{lastname:?0, address.street:{$in:?1}}")
Page<Person> findByCustomQueryLastnameAndAddressStreetInList(String lastname, List<String> streetNames, Pageable page);
Page<Person> findByCustomQueryLastnameAndAddressStreetInList(String lastname, List<String> streetNames,
Pageable page);
/**
* @see DATAMONGO-950
@@ -334,19 +341,19 @@ public interface PersonRepository extends MongoRepository<Person, String>, Query
*/
@Query("{ firstname : { $in : ?0 }}")
Stream<Person> findByCustomQueryWithStreamingCursorByFirstnames(List<String> firstnames);
/**
* @see DATAMONGO-990
*/
@Query("{ firstname : ?#{[0]}}")
List<Person> findWithSpelByFirstnameForSpELExpressionWithParameterIndexOnly(String firstname);
/**
* @see DATAMONGO-990
*/
@Query("{ firstname : ?#{[0]}, email: ?#{principal.email} }")
List<Person> findWithSpelByFirstnameAndCurrentUserWithCustomQuery(String firstname);
/**
* @see DATAMONGO-990
*/

View File

@@ -454,6 +454,7 @@ public class MongoQueryCreatorUnitTests {
/**
* @see DATAMONGO-1075
* @see DATAMONGO-1425
*/
@Test
public void shouldCreateRegexWhenUsingNotContainsOnStringProperty() {
@@ -462,7 +463,7 @@ public class MongoQueryCreatorUnitTests {
MongoQueryCreator creator = new MongoQueryCreator(tree, getAccessor(converter, "thew"), context);
Query query = creator.createQuery();
assertThat(query, is(query(where("username").regex(".*thew.*").not())));
assertThat(query.getQueryObject(), is(query(where("username").not().regex(".*thew.*")).getQueryObject()));
}
/**

View File

@@ -12,25 +12,25 @@ NOTE: `SimpleMongoConverter` has been deprecated in Spring Data MongoDB M3 as al
`MongoMappingConverter` has a few conventions for mapping objects to documents when no additional mapping metadata is provided. The conventions are:
* The short Java class name is mapped to the collection name in the following manner. The class '`com.bigbank.SavingsAccount`' maps to '`savingsAccount`' collection name.
* The short Java class name is mapped to the collection name in the following manner. The class `com.bigbank.SavingsAccount` maps to the `savingsAccount` collection name.
* All nested objects are stored as nested objects in the document and *not* as DBRefs
* The converter will use any Spring Converters registered with it to override the default mapping of object properties to document field/values.
* The fields of an object are used to convert to and from fields in the document. Public JavaBean properties are not used.
* If you have a single non-zero-argument constructor whose argument names match top-level field names of the document, that constructor will be used. Otherwise the zero-argument constructor will be used. If there is more than one non-zero-argument constructor, an exception will be thrown.
[[mapping.conventions.id-field]]
=== How the '_id' field is handled in the mapping layer
=== How the `_id` field is handled in the mapping layer
MongoDB requires that you have an '_id' field for all documents. If you don't provide one the driver will assign a ObjectId with a generated value. The "_id" field can be of any type the, other than arrays, so long as it is unique. The driver naturally supports all primitive types and Dates. When using the `MongoMappingConverter` there are certain rules that govern how properties from the Java class is mapped to this '_id' field.
MongoDB requires that you have an `_id` field for all documents. If you don't provide one, the driver will assign an ObjectId with a generated value. The `_id` field can be of any type, other than arrays, so long as it is unique. The driver naturally supports all primitive types and Dates. When using the `MongoMappingConverter` there are certain rules that govern how properties from the Java class are mapped to this `_id` field.
The following outlines what field will be mapped to the '_id' document field:
The following outlines what field will be mapped to the `_id` document field:
* A field annotated with `@Id` (`org.springframework.data.annotation.Id`) will be mapped to the '_id' field.
* A field without an annotation but named 'id' will be mapped to the '_id' field.
* The default field name for identifiers is '_id' and can be customized via the `@Field` annotation.
* A field annotated with `@Id` (`org.springframework.data.annotation.Id`) will be mapped to the `_id` field.
* A field without an annotation but named `id` will be mapped to the `_id` field.
* The default field name for identifiers is `_id` and can be customized via the `@Field` annotation.
[cols="1,2", options="header"]
.Examples for the translation of '_id'-field definitions
.Examples for the translation of `_id` field definitions
|===
| Field definition
| Resulting Id-Fieldname in MongoDB
@@ -41,24 +41,208 @@ The following outlines what field will be mapped to the '_id' document field:
| `@Field` `String` id
| `_id`
| `@Field('x')` `String` id
| `@Field("x")` `String` id
| `x`
| `@Id` `String` x
| `_id`
| `@Field('x')` `@Id` `String` x
| `@Field("x")` `@Id` `String` x
| `_id`
|===
The following outlines what type conversion, if any, will be done on the property mapped to the _id document field.
* If a field named 'id' is declared as a String or BigInteger in the Java class it will be converted to and stored as an ObjectId if possible. ObjectId as a field type is also valid. If you specify a value for 'id' in your application, the conversion to an ObjectId is detected to the MongoDBdriver. If the specified 'id' value cannot be converted to an ObjectId, then the value will be stored as is in the document's _id field.
* If a field named ' id' id field is not declared as a String, BigInteger, or ObjectID in the Java class then you should assign it a value in your application so it can be stored 'as-is' in the document's _id field.
* If no field named 'id' is present in the Java class then an implicit '_id' file will be generated by the driver but not mapped to a property or field of the Java class.
* If a field named `id` is declared as a String or BigInteger in the Java class, it will be converted to and stored as an ObjectId if possible. ObjectId as a field type is also valid. If you specify a value for `id` in your application, the conversion to an ObjectId is delegated to the MongoDB driver. If the specified `id` value cannot be converted to an ObjectId, then the value will be stored as-is in the document's `_id` field.
* If a field named `id` is not declared as a String, BigInteger, or ObjectId in the Java class, then you should assign it a value in your application so it can be stored as-is in the document's `_id` field.
* If no field named `id` is present in the Java class, then an implicit `_id` field will be generated by the driver but not mapped to a property or field of the Java class.
When querying and updating, `MongoTemplate` will use the converter to handle conversions of the `Query` and `Update` objects according to the above rules for saving documents, so field names and types used in your queries will match what is in your domain classes.
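For example, a minimal illustrative query (assuming a mapped `Person` class and static imports of `Criteria.where` and `Query.query`):

[source,java]
----
// 'id' is translated to '_id' and, where possible, the String value is
// converted to an ObjectId before the query is sent to the database.
Person person = mongoTemplate.findOne(
		query(where("id").is("5707a2690364aba3136ab870")), Person.class);
----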
[[mapping-conversion]]
== Data mapping and type conversion
This section explains how types are mapped to a MongoDB representation and vice versa. Spring Data MongoDB supports all types that can be represented as BSON, MongoDB's internal document format.
In addition to these types, Spring Data MongoDB provides a set of built-in converters to map additional types. You can provide your own converters to adjust type conversion, see <<mapping-explicit-converters>> for further details.
[cols="3,1,6", options="header"]
.Type
|===
| Type
| Type conversion
| Sample
| `String`
| native
| `{"firstname" : "Dave"}`
| `double`, `Double`, `float`, `Float`
| native
| `{"weight" : 42.5}`
| `int`, `Integer`, `short`, `Short`
| native +
32-bit integer
| `{"height" : 42}`
| `long`, `Long`
| native +
64-bit integer
| `{"height" : 42}`
| `Date`, `Timestamp`
| native
| `{"date" : ISODate("2019-11-12T23:00:00.809Z")}`
| `byte[]`
| native
| `{"bin" : { "$binary" : "AQIDBA==", "$type" : "00" }}`
| `java.util.UUID` (Legacy UUID)
| native
| `{"uuid" : { "$binary" : "MEaf1CFQ6lSphaa3b9AtlA==", "$type" : "03" }}`
| `Date`
| native
| `{"date" : ISODate("2019-11-12T23:00:00.809Z")}`
| `ObjectId`
| native
| `{"_id" : ObjectId("5707a2690364aba3136ab870")}`
| Array, `List`, `BasicDBList`
| native
| `{"cookies" : [ … ]}`
| `boolean`, `Boolean`
| native
| `{"active" : true}`
| `null`
| native
| `{"value" : null}`
| `DBObject`
| native
| `{"value" : { … }}`
| `AtomicInteger` +
calling `get()` before the actual conversion
| converter +
32-bit integer
| `{"value" : "741" }`
| `AtomicLong` +
calling `get()` before the actual conversion
| converter +
64-bit integer
| `{"value" : "741" }`
| `BigInteger`
| converter +
`String`
| `{"value" : "741" }`
| `BigDecimal`
| converter +
`String`
| `{"value" : "741.99" }`
| `URL`
| converter
| `{"website" : "http://projects.spring.io/spring-data-mongodb/" }`
| `Locale`
| converter
| `{"locale : "en_US" }`
| `char`, `Character`
| converter
| `{"char" : "a" }`
| `NamedMongoScript`
| converter +
`Code`
| `{"_id" : "script name", value: (some javascript code)`}
| `java.util.Currency`
| converter
| `{"currencyCode" : "EUR"}`
| `LocalDate` +
(Joda, Java 8, JSR310-BackPort)
| converter
| `{"date" : ISODate("2019-11-12T00:00:00.000Z")}`
| `LocalDateTime`, `LocalTime`, `Instant` +
(Joda, Java 8, JSR310-BackPort)
| converter
| `{"date" : ISODate("2019-11-12T23:00:00.809Z")}`
| `DateTime` (Joda)
| converter
| `{"date" : ISODate("2019-11-12T23:00:00.809Z")}`
| `DateMidnight` (Joda)
| converter
| `{"date" : ISODate("2019-11-12T00:00:00.000Z")}`
| `ZoneId` (Java 8, JSR310-BackPort)
| converter
| `{"zoneId" : "ECT - Europe/Paris"}`
| `Box`
| converter
| `{"box" : { "first" : { "x" : 1.0 , "y" : 2.0} , "second" : { "x" : 3.0 , "y" : 4.0}}`
| `Polygon`
| converter
| `{"polygon" : { "points" : [ { "x" : 1.0 , "y" : 2.0} , { "x" : 3.0 , "y" : 4.0} , { "x" : 4.0 , "y" : 5.0}]}}`
| `Circle`
| converter
| `{"circle" : { "center" : { "x" : 1.0 , "y" : 2.0} , "radius" : 3.0 , "metric" : "NEUTRAL"}}`
| `Point`
| converter
| `{"point" : { "x" : 1.0 , "y" : 2.0}}`
| `GeoJsonPoint`
| converter
| `{"point" : { "type" : "Point" , "coordinates" : [3.0 , 4.0] }}`
| `GeoJsonMultiPoint`
| converter
| `{"geoJsonLineString" : {"type":"MultiPoint", "coordinates": [ [ 0 , 0 ], [ 0 , 1 ], [ 1 , 1 ] ] }}`
| `Sphere`
| converter
| `{"sphere" : { "center" : { "x" : 1.0 , "y" : 2.0} , "radius" : 3.0 , "metric" : "NEUTRAL"}}`
| `GeoJsonPolygon`
| converter
| `{"polygon" : { "type" : "Polygon", "coordinates" : [[ [ 0 , 0 ], [ 3 , 6 ], [ 6 , 1 ], [ 0 , 0 ] ]] }}`
| `GeoJsonMultiPolygon`
| converter
| `{"geoJsonMultiPolygon" : { "type" : "MultiPolygon", "coordinates" : [
[ [ [ -73.958 , 40.8003 ] , [ -73.9498 , 40.7968 ] ] ],
[ [ [ -73.973 , 40.7648 ] , [ -73.9588 , 40.8003 ] ] ]
] }}`
| `GeoJsonLineString`
| converter
| `{ "geoJsonLineString" : { "type" : "LineString", "coordinates" : [ [ 40 , 5 ], [ 41 , 6 ] ] }}`
| `GeoJsonMultiLineString`
| converter
| `{"geoJsonLineString" : { "type" : "MultiLineString", coordinates: [
[ [ -73.97162 , 40.78205 ], [ -73.96374 , 40.77715 ] ],
[ [ -73.97880 , 40.77247 ], [ -73.97036 , 40.76811 ] ]
] }}`
|===
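To make the "native" versus "converter" distinction above concrete, a small illustrative domain class (field names made up) and how its properties are stored:

[source,java]
----
import java.math.BigDecimal;
import java.util.concurrent.atomic.AtomicInteger;

class Sample {
	String firstname;      // native:    {"firstname" : "Dave"}
	AtomicInteger counter; // converter: get() is called, stored as a 32-bit integer
	BigDecimal price;      // converter: stored as a String, e.g. {"price" : "741.99"}
}
----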
[[mapping-configuration]]
== Mapping Configuration
@@ -108,11 +292,11 @@ public class GeoSpatialAppConfig extends AbstractMongoConfiguration {
----
====
`AbstractMongoConfiguration` requires you to implement methods that define a `com.mongodb.Mongo` as well as provide a database name. `AbstractMongoConfiguration` also has a method you can override named '`getMappingBasePackage`' which tells the converter where to scan for classes annotated with the `@org.springframework.data.mongodb.core.mapping.Document` annotation.
`AbstractMongoConfiguration` requires you to implement methods that define a `com.mongodb.Mongo` as well as provide a database name. `AbstractMongoConfiguration` also has a method you can override named `getMappingBasePackage(…)` which tells the converter where to scan for classes annotated with the `@Document` annotation.
You can add additional converters to the converter by overriding the method afterMappingMongoConverterCreation. Also shown in the above example is a `LoggingEventListener` which logs `MongoMappingEvent`s that are posted onto Spring's `ApplicationContextEvent` infrastructure.
You can add additional converters to the converter by overriding the method `afterMappingMongoConverterCreation`. Also shown in the above example is a `LoggingEventListener` which logs `MongoMappingEvent` s that are posted onto Spring's `ApplicationContextEvent` infrastructure.
NOTE: AbstractMongoConfiguration will create a MongoTemplate instance and registered with the container under the name 'mongoTemplate'.
NOTE: AbstractMongoConfiguration will create a MongoTemplate instance and register it with the container under the name `mongoTemplate`.
You can also override the method `UserCredentials getUserCredentials()` to provide the username and password information to connect to the database.
@@ -165,7 +349,7 @@ The `base-package` property tells it where to scan for classes annotated with th
[[mapping-usage]]
== Metadata based Mapping
To take full advantage of the object mapping functionality inside the Spring Data/MongoDB support, you should annotate your mapped objects with the `@org.springframework.data.mongodb.core.mapping.Document` annotation. Although it is not necessary for the mapping framework to have this annotation (your POJOs will be mapped correctly, even without any annotations), it allows the classpath scanner to find and pre-process your domain objects to extract the necessary metadata. If you don't use this annotation, your application will take a slight performance hit the first time you store a domain object because the mapping framework needs to build up its internal metadata model so it knows about the properties of your domain object and how to persist them.
To take full advantage of the object mapping functionality inside the Spring Data/MongoDB support, you should annotate your mapped objects with the `@Document` annotation. Although it is not necessary for the mapping framework to have this annotation (your POJOs will be mapped correctly, even without any annotations), it allows the classpath scanner to find and pre-process your domain objects to extract the necessary metadata. If you don't use this annotation, your application will take a slight performance hit the first time you store a domain object because the mapping framework needs to build up its internal metadata model so it knows about the properties of your domain object and how to persist them.
.Example domain object
====
@@ -415,7 +599,7 @@ Simply declaring these beans in your Spring ApplicationContext will cause them t
[[mapping-explicit-converters]]
=== Overriding Mapping with explicit Converters
When storing and querying your objects it is convenient to have a `MongoConverter` instance handle the mapping of all Java types to DBObjects. However, sometimes you may want the `MongoConverter`'s do most of the work but allow you to selectively handle the conversion for a particular type or to optimize performance.
When storing and querying your objects it is convenient to have a `MongoConverter` instance handle the mapping of all Java types to DBObjects. However, sometimes you may want the `MongoConverter` s do most of the work but allow you to selectively handle the conversion for a particular type or to optimize performance.
To selectively handle the conversion yourself, register one or more `org.springframework.core.convert.converter.Converter` instances with the `MongoConverter`.
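A minimal sketch of such a converter (the chosen type and logic are illustrative only):

[source,java]
----
import java.util.Locale;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;

// Writes java.util.Locale as its String form instead of the default mapping.
@WritingConverter
public enum LocaleToStringConverter implements Converter<Locale, String> {

	INSTANCE;

	@Override
	public String convert(Locale source) {
		return source.toString();
	}
}
----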

View File

@@ -28,7 +28,7 @@ public class Person {
----
====
We have a quite simple domain object here. Note that it has a property named `id` of type`ObjectId`. The default serialization mechanism used in `MongoTemplate` (which is backing the repository support) regards properties named id as document id. Currently we support`String`, `ObjectId` and `BigInteger` as id-types.
We have a quite simple domain object here. Note that it has a property named `id` of type `ObjectId`. The default serialization mechanism used in `MongoTemplate` (which is backing the repository support) regards properties named id as document id. Currently we support `String`, `ObjectId` and `BigInteger` as id-types.
.Basic repository interface to persist Person entities
====
@@ -99,7 +99,7 @@ class ApplicationConfig extends AbstractMongoConfiguration {
----
====
As our domain repository extends `PagingAndSortingRepository` it provides you with CRUD operations as well as methods for paginated and sorted access to the entities. Working with the repository instance is just a matter of dependency injecting it into a client. So accessing the second page of `Person`s at a page size of 10 would simply look something like this:
As our domain repository extends `PagingAndSortingRepository` it provides you with CRUD operations as well as methods for paginated and sorted access to the entities. Working with the repository instance is just a matter of dependency injecting it into a client. So accessing the second page of `Person` s at a page size of 10 would simply look something like this:
.Paging access to Person entities
====
@@ -139,17 +139,17 @@ public interface PersonRepository extends PagingAndSortingRepository<Person, Str
Page<Person> findByFirstname(String firstname, Pageable pageable); <2>
Person findByShippingAddresses(Address address); <3>
Stream<Person> findAllBy(); <4>
}
----
<1> The method shows a query for all people with the given lastname. The query will be derived by parsing the method name for constraints which can be concatenated with `And` and `Or`. Thus the method name will result in a query expression of `{"lastname" : lastname}`.
<2> Applies pagination to a query. Just equip your method signature with a `Pageable` parameter and let the method return a `Page` instance and we will automatically page the query accordingly.
<3> Shows that you can query based on properties which are not a primitive type.
<4> Uses a Java 8 `Stream` which reads and converts individual elements while iterating the stream.
====
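For illustration, consuming the streaming method <4> might look as follows (a sketch; `getAge()` is an assumed accessor on `Person`, and the stream is closed via try-with-resources because it is backed by an open cursor):

[source,java]
----
try (Stream<Person> stream = repository.findAllBy()) {
	stream.filter(person -> person.getAge() > 40) // getAge() assumed
			.forEach(System.out::println);
}
----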
NOTE: For version 1.0 we currently don't support referring to parameters that are mapped as `DBRef` in the domain class.
@@ -212,10 +212,18 @@ NOTE: Note that for version 1.0 we currently don't support referring to paramete
| `findByFirstnameContaining(String name)`
| `{"firstname" : name} (name as regex)`
| `NotContaining` on String
| `findByFirstnameNotContaining(String name)`
| `{"firstname" : { "$not" : name}} (name as regex)`
| `Containing` on Collection
| `findByAddressesContaining(Address address)`
| `{"addresses" : { "$in" : address}}`
| `NotContaining` on Collection
| `findByAddressesNotContaining(Address address)`
| `{"addresses" : { "$not" : { "$in" : address}}}`
| `Regex`
| `findByFirstnameRegex(String firstname)`
| `{"firstname" : {"$regex" : firstname }}`
@@ -328,7 +336,7 @@ public interface PersonRepository extends MongoRepository<Person, String>
// Metric: {'geoNear' : 'person', 'near' : [x, y], 'maxDistance' : distance,
// 'distanceMultiplier' : metric.multiplier, 'spherical' : true }
GeoResults<Person> findByLocationNear(Point location, Distance distance);
// Metric: {'geoNear' : 'person', 'near' : [x, y], 'minDistance' : min,
// 'maxDistance' : max, 'distanceMultiplier' : metric.multiplier,
// 'spherical' : true }

View File

@@ -366,7 +366,7 @@ public class MongoConfiguration {
[[mongo.mongo-db-factory-xml]]
=== Registering a MongoDbFactory instance using XML based metadata
The mongo namespace provides a convient way to create a `SimpleMongoDbFactory` as compared to using the`<beans/>` namespace. Simple usage is shown below
The mongo namespace provides a convenient way to create a `SimpleMongoDbFactory` as compared to using the `<beans/>` namespace. Simple usage is shown below
[source,xml]
----
@@ -643,21 +643,21 @@ NOTE: This example is meant to show the use of save, update and remove operation
The query syntax used in the example is explained in more detail in the section <<mongo.query,Querying Documents>>.
[[mongo-template.id-handling]]
=== How the '_id' field is handled in the mapping layer
=== How the `_id` field is handled in the mapping layer
MongoDB requires that you have an '_id' field for all documents. If you don't provide one the driver will assign a `ObjectId` with a generated value. When using the `MongoMappingConverter` there are certain rules that govern how properties from the Java class is mapped to this '_id' field.
MongoDB requires that you have an `_id` field for all documents. If you don't provide one, the driver will assign an `ObjectId` with a generated value. When using the `MongoMappingConverter` there are certain rules that govern how properties from the Java class are mapped to this `_id` field.
The following outlines what property will be mapped to the '_id' document field:
The following outlines what property will be mapped to the `_id` document field:
* A property or field annotated with `@Id` (`org.springframework.data.annotation.Id`) will be mapped to the '_id' field.
* A property or field without an annotation but named `id` will be mapped to the '_id' field.
* A property or field annotated with `@Id` (`org.springframework.data.annotation.Id`) will be mapped to the `_id` field.
* A property or field without an annotation but named `id` will be mapped to the `_id` field.
The following outlines what type conversion, if any, will be done on the property mapped to the _id document field when using the `MappingMongoConverter`, the default for `MongoTemplate`.
* An id property or field declared as a String in the Java class will be converted to and stored as an `ObjectId` if possible using a Spring `Converter<String, ObjectId>`. Valid conversion rules are delegated to the MongoDB Java driver. If it cannot be converted to an ObjectId, then the value will be stored as a string in the database.
* An id property or field declared as `BigInteger` in the Java class will be converted to and stored as an `ObjectId` using a Spring `Converter<BigInteger, ObjectId>`.
If no field or property specified above is present in the Java class then an implicit '_id' file will be generated by the driver but not mapped to a property or field of the Java class.
If no field or property specified above is present in the Java class, then an implicit `_id` field will be generated by the driver but not mapped to a property or field of the Java class.
When querying and updating, `MongoTemplate` will use the converter to handle conversions of the `Query` and `Update` objects according to the above rules for saving documents, so field names and types used in your queries will match what is in your domain classes.
@@ -815,7 +815,7 @@ There are two ways to manage the collection name that is used for operating on t
The MongoDB driver supports inserting a collection of documents in one operation. The methods in the MongoOperations interface that support this functionality are listed below
* *insert* inserts an object. If there is an existing document with the same id then an error is generated.
* *insertAll* takes a `Collection `of objects as the first parameter. This method inspects each object and inserts it to the appropriate collection based on the rules specified above.
* *insertAll* takes a `Collection` of objects as the first parameter. This method inspects each object and inserts it to the appropriate collection based on the rules specified above.
* *save* saves the object overwriting any object that might exist with the same id.
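A usage sketch of these operations (illustrative; `template` is a configured `MongoOperations` and the objects are mapped entities):

[source,java]
----
template.insert(person);                         // error if a document with the same id exists
template.insertAll(Arrays.asList(dave, carter)); // each object routed to its mapped collection
template.save(person);                           // overwrites any existing document with that id
----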
[[mongo-template.save-insert.batch]]
@@ -823,12 +823,12 @@ The MongoDB driver supports inserting a collection of documents in one operation
The MongoDB driver supports inserting a collection of documents in one operation. The methods in the MongoOperations interface that support this functionality are listed below
* *insert*` methods that take a `Collection` as the first argument. This inserts a list of objects in a single batch write to the database.
* *insert* methods that take a `Collection` as the first argument. This inserts a list of objects in a single batch write to the database.
[[mongodb-template-update]]
=== Updating documents in a collection
For updates we can elect to update the first document found using `MongoOperation`'s method `updateFirst` or we can update all documents that were found to match the query using the method `updateMulti`. Here is an example of an update of all SAVINGS accounts where we are adding a one time $50.00 bonus to the balance using the `$inc` operator.
For updates we can elect to update the first document found using `MongoOperation` 's method `updateFirst` or we can update all documents that were found to match the query using the method `updateMulti`. Here is an example of an update of all SAVINGS accounts where we are adding a one time $50.00 bonus to the balance using the `$inc` operator.
.Updating documents using the MongoTemplate
====
@@ -1045,7 +1045,7 @@ There are also methods on the Criteria class for geospatial queries. Here is a l
* `Criteria` *within* `(Circle circle)` Creates a geospatial criterion using `$geoWithin $center` operators.
* `Criteria` *within* `(Box box)` Creates a geospatial criterion using a `$geoWithin $box` operation.
* `Criteria` *withinSphere* `(Circle circle)` Creates a geospatial criterion using `$geoWithin $center` operators.
* `Criteria` *near* `(Point point)` Creates a geospatial criterion using a `$near `operation
* `Criteria` *near* `(Point point)` Creates a geospatial criterion using a `$near` operation
* `Criteria` *nearSphere* `(Point point)` Creates a geospatial criterion using `$nearSphere$center` operations. This is only available for MongoDB 1.7 and higher.
* `Criteria` *minDistance* `(double minDistance)` Creates a geospatial criterion using the `$minDistance` operation, for use with $near.
* `Criteria` *maxDistance* `(double maxDistance)` Creates a geospatial criterion using the `$maxDistance` operation, for use with $near.
@@ -1385,7 +1385,7 @@ Executing this will result in a collection as shown below.
{ "_id" : "d", "value" : 1 }
----
Assuming that the map and reduce functions are located in map.js and reduce.js and bundled in your jar so they are available on the classpath, you can execute a map-reduce operation and obtain the results as shown below
Assuming that the map and reduce functions are located in `map.js` and `reduce.js` and bundled in your jar so they are available on the classpath, you can execute a map-reduce operation and obtain the results as shown below
[source,java]
----
@@ -1493,7 +1493,7 @@ Spring provides integration with MongoDB's group operation by providing methods
[[mongo.group.example]]
=== Example Usage
In order to understand how group operations work the following example is used, which is somewhat artificial. For a more realistic example consult the book 'MongoDB - The definitive guide'. A collection named "group_test_collection" created with the following rows.
In order to understand how group operations work the following example is used, which is somewhat artificial. For a more realistic example consult the book 'MongoDB - The definitive guide'. A collection named `group_test_collection` is created with the following rows.
[source]
----
@@ -1505,7 +1505,7 @@ In order to understand how group operations work the following example is used,
{ "_id" : ObjectId("4ec1d25d41421e2015da64f6"), "x" : 3 }
----
We would like to group by the only field in each row, the 'x' field and aggregate the number of times each specific value of 'x' occurs. To do this we need to create an initial document that contains our count variable and also a reduce function which will increment it each time it is encountered. The Java code to execute the group operation is shown below
We would like to group by the only field in each row, the `x` field and aggregate the number of times each specific value of `x` occurs. To do this we need to create an initial document that contains our count variable and also a reduce function which will increment it each time it is encountered. The Java code to execute the group operation is shown below
[source,java]
----
@@ -1664,7 +1664,7 @@ Note that the aggregation operations not listed here are currently not supported
[[mongo.aggregation.projection]]
=== Projection Expressions
Projection expressions are used to define the fields that are the outcome of a particular aggregation step. Projection expressions can be defined via the `project` method of the `Aggregate` class either by passing a list of `String`s or an aggregation framework `Fields` object. The projection can be extended with additional fields through a fluent API via the `and(String)` method and aliased via the `as(String)` method.
Projection expressions are used to define the fields that are the outcome of a particular aggregation step. Projection expressions can be defined via the `project` method of the `Aggregate` class either by passing a list of `String` s or an aggregation framework `Fields` object. The projection can be extended with additional fields through a fluent API via the `and(String)` method and aliased via the `as(String)` method.
Note that one can also define fields with aliases via the static factory method `Fields.field` of the aggregation framework that can then be used to construct a new `Fields` instance.
.Projection expression examples
@@ -1808,7 +1808,7 @@ ZipInfoStats firstZipInfoStats = result.getMappedResults().get(0);
----
* The class `ZipInfo` maps the structure of the given input-collection. The class `ZipInfoStats` defines the structure in the desired output format.
* As a first step we use the `group` operation to define a group from the input-collection. The grouping criteria is the combination of the fields `"state"` and `"city" `which forms the id structure of the group. We aggregate the value of the `"population"` property from the grouped elements with by using the `sum` operator saving the result in the field `"pop"`.
* As a first step we use the `group` operation to define a group from the input-collection. The grouping criteria is the combination of the fields `"state"` and `"city"`, which forms the id structure of the group. We aggregate the value of the `"population"` property from the grouped elements by using the `sum` operator, saving the result in the field `"pop"`.
* In a second step we use the `sort` operation to sort the intermediate-result by the fields `"pop"`, `"state"` and `"city"` in ascending order, such that the smallest city is at the top and the biggest city is at the bottom of the result. Note that the sorting on `"state"` and `"city"` is implicitly performed against the group id fields which Spring Data MongoDB took care of.
* In the third step we use a `group` operation again to group the intermediate result by `"state"`. Note that `"state"` again implicitly references a group-id field. We select the name and the population count of the biggest and smallest city with calls to the `last(…)` and `first(…)` operator respectively via the `project` operation.
* As the fourth step we select the `"state"` field from the previous `group` operation. Note that `"state"` again implicitly references a group-id field. As we do not want an implicitly generated id to appear, we exclude the id from the previous operation via `and(previousOperation()).exclude()`. As we want to populate the nested `City` structures in our output-class accordingly, we have to emit appropriate sub-documents with the nested method.
@@ -2005,7 +2005,7 @@ public class PersonReadConverter implements Converter<DBObject, Person> {
[[mongo.custom-converters.xml]]
=== Registering Spring Converters with the MongoConverter
The Mongo Spring namespace provides a convenience way to register Spring `Converter`s with the `MappingMongoConverter`. The configuration snippet below shows how to manually register converter beans as well as configuring the wrapping `MappingMongoConverter` into a `MongoTemplate`.
The Mongo Spring namespace provides a convenience way to register Spring `Converter` s with the `MappingMongoConverter`. The configuration snippet below shows how to manually register converter beans as well as configuring the wrapping `MappingMongoConverter` into a `MongoTemplate`.
[source,xml]
----
@@ -2215,7 +2215,7 @@ Here is a list of execute callback methods.
* `<T> T` *execute* `(String collectionName, DbCallback<T> action)` Executes a DbCallback on the collection of the given name translating any exceptions as necessary.
* `<T> T` *executeInSession* `(DbCallback<T> action) ` Executes the given DbCallback within the same connection to the database so as to ensure consistency in a write heavy environment where you may read the data that you wrote.
* `<T> T` *executeInSession* `(DbCallback<T> action)` Executes the given DbCallback within the same connection to the database so as to ensure consistency in a write heavy environment where you may read the data that you wrote.
Here is an example that uses the `CollectionCallback` to return information about an index

View File

@@ -1,6 +1,23 @@
Spring Data MongoDB Changelog
=============================
Changes in version 1.9.2.RELEASE (2016-06-15)
---------------------------------------------
* DATAMONGO-1449 - Replace legacy for loop with foreach in MappingMongoConverter.
* DATAMONGO-1437 - DefaultDbRefResolver swallows cause of non DataAccessException translatable Exception.
* DATAMONGO-1425 - NOT_CONTAINS keyword issues CONTAINS query.
* DATAMONGO-1423 - Nested document update doesn't apply converters on embedded maps.
* DATAMONGO-1416 - Standard bootstrap issues warning in converter registration.
* DATAMONGO-1412 - Document mapping rules for Java types to MongoDB representation.
* DATAMONGO-1411 - Enable MongoDB build on TravisCI.
* DATAMONGO-1410 - Release 1.9.2 (Hopper SR2).
Changes in version 1.9.1.RELEASE (2016-04-06)
---------------------------------------------
* DATAMONGO-1408 - Release 1.9.1 (Hopper SR1).
Changes in version 1.9.0.RELEASE (2016-04-06)
---------------------------------------------
* DATAMONGO-1407 - Add pull request template.

View File

@@ -1,4 +1,4 @@
Spring Data MongoDB 1.9 GA
Spring Data MongoDB 1.9.2
Copyright (c) [2010-2015] Pivotal Software, Inc.
This product is licensed to you under the Apache License, Version 2.0 (the "License").