Compare commits


40 Commits

Author SHA1 Message Date
Christoph Strobl
24417c4c99 DATAMONGO-1927 - Release version 2.1 M3 (Lovelace). 2018-05-17 09:51:42 +02:00
Christoph Strobl
eec36b791a DATAMONGO-1927 - Prepare 2.1 M3 (Lovelace). 2018-05-17 09:50:48 +02:00
Christoph Strobl
512fa036bb DATAMONGO-1927 - Updated changelog. 2018-05-17 09:50:42 +02:00
Sébastien Deleuze
ea8df26eee DATAMONGO-1980 - Fix a typo in CriteriaExtensions.kt.
Original pull request: #563.
2018-05-16 09:43:53 +02:00
Oliver Gierke
43d821aab0 DATAMONGO-1874 - Polishing.
Cleanups in test case. Moved assertions to AssertJ.
2018-05-15 14:39:35 +02:00
Oliver Gierke
9bb8211ed7 DATAMONGO-1874 - @Document's collection attribute now supports EvaluationContextExtensions.
We now use the API newly introduced with DATACMNS-1260 to expose EvaluationContextExtensions to the SpEL evaluation in case the collection attribute of @Document uses SpEL.

Related tickets: DATACMNS-1260.
2018-05-15 14:39:35 +02:00
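A minimal sketch of what this change enables (the `tenantPrefix` property, extension id, and class names below are hypothetical, not taken from the change set): a SpEL expression in `@Document`'s collection attribute can resolve properties contributed by an `EvaluationContextExtension` registered as a Spring bean.

```java
// Hypothetical example: the collection name is resolved via SpEL,
// drawing on properties exposed by an EvaluationContextExtension.
@Document(collection = "#{tenantPrefix + '_orders'}")
class Order {}

// Extension bean contributing the 'tenantPrefix' property to SpEL evaluation.
class TenantExtension implements EvaluationContextExtension {

	@Override
	public String getExtensionId() {
		return "tenant";
	}

	@Override
	public Map<String, Object> getProperties() {
		return Collections.singletonMap("tenantPrefix", "acme");
	}
}
```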
Victor
5a24e04226 DATAMONGO-1978 - Fix minor typo in Field.positionKey field name.
Original pull request: #558.
2018-05-15 12:30:22 +02:00
Mark Paluch
521d28ff3f DATAMONGO-1466 - Polishing.
Switch conditionals to Map-based Function registry to pick the appropriate converter. Fix typos in method names.

Original pull request: #561.
2018-05-15 11:27:04 +02:00
Christoph Strobl
e38e7d89f4 DATAMONGO-1466 - Polishing.
Just some minor code style improvements.

Original pull request: #561.
2018-05-15 11:27:04 +02:00
Christoph Strobl
fff69b9ed7 DATAMONGO-1466 - Add embedded typeinformation-based reading GeoJSON converter.
Original pull request: #561.
2018-05-15 11:27:04 +02:00
Oliver Gierke
e2e9e92563 DATAMONGO-1880 - Improve test execution in SessionBoundMongoTemplateTests.
We now also require a replica set for the test execution.
2018-05-15 10:39:36 +02:00
Oliver Gierke
3e040d283b DATAMONGO-1976 - Adapt to SpEL extension API changes in Spring Data Commons.
Related tickets: DATACMNS-1260.
2018-05-15 10:36:18 +02:00
Mark Paluch
a66b87118e DATAMONGO-1970 - Polishing.
ReactiveMongoOperations.withSession(…) no longer commits transactions if a transaction is active. ReactiveSessionScoped obtained through inTransaction() solely manages transactions and participates in ongoing transactions if a given ClientSession already has an active transaction. Remove ReactiveSessionScoped.executeSingle methods to align with ReactiveMongoOperations.

Add tests. Switch reactive tests to .as(StepVerifier::create) form. Extend documentation.

Original pull request: #560.
2018-05-14 13:18:15 +02:00
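The distinction this polishing draws can be sketched as follows (a hedged illustration only: variable names and the exact return types are assumptions, not confirmed by the change set):

```java
// withSession(...): binds operations to the given session; transaction
// begin/commit is left entirely to the caller.
Flux<Step> found = template.withSession(Mono.just(session))
		.execute(action -> action.find(query(where("state").is("active")), Step.class));

// inTransaction(): session-scoped operations with managed transaction
// semantics; participates if the session already has an active transaction.
Flux<Step> inserted = template.inTransaction()
		.execute(action -> action.insert(step));
```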
Christoph Strobl
f296a499e5 DATAMONGO-1970 - Add support for MongoDB 4.0 transactions (reactive).
We now support Mongo Transactions through the reactive Template API. However, there's no reactive repository transaction support yet.

Mono<DeleteResult> result = template.inTransaction()
                              .execute(action -> action.remove(query(where("id").is("step-1")), Step.class));

Original pull request: #560.
2018-05-14 13:16:22 +02:00
Mark Paluch
1acf00b039 DATAMONGO-1974 - Polishing.
Fix typos, links, and code fences.

Original pull request: #559.
2018-05-11 15:30:05 +02:00
Jay Bryant
e23c861a39 DATAMONGO-1974 - Full editing pass for Spring Data MongoDB.
Full editing pass of the Spring Data MongoDB reference guide. I also adjusted index.adoc to work with the changes I made to the build project, so that we get Epub and PDF as well as HTML.

Original pull request: #559.
2018-05-11 15:30:05 +02:00
Mark Paluch
85aef4836d DATAMONGO-1968 - Polishing.
Rename MongoDbFactoryBase to MongoDbFactorySupport. Add constructor to MongoTemplate accepting the new MongoClient type. Extend Javadoc. Switch tests to use the new MongoTemplate constructor.

Original pull request: #557.
2018-05-11 10:29:58 +02:00
Christoph Strobl
5470486d8d DATAMONGO-1968 - Add configuration support for com.mongodb.client.MongoClient.
We now accept MongoDB's new com.mongodb.client.MongoClient object to set up Spring Data MongoDB infrastructure through AbstractMongoClientConfiguration. The new MongoClient no longer supports DBObject, hence it cannot be used with Querydsl.

@Configuration
public class MongoClientConfiguration extends AbstractMongoClientConfiguration {

	@Override
	protected String getDatabaseName() {
		return "database";
	}

	@Override
	public MongoClient mongoClient() {
		return MongoClients.create("mongodb://localhost:27017/?replicaSet=rs0&w=majority");
	}
}

Original pull request: #557.
2018-05-11 10:27:30 +02:00
Mark Paluch
42c02c9b70 DATAMONGO-1971 - Polishing.
Remove outdated profiles.

Original pull request: #554.
2018-05-09 16:35:21 +02:00
Mark Paluch
ee9bca4856 DATAMONGO-1971 - Install MongoDB 3.7.9 on TravisCI.
We now download and unpack MongoDB directly instead of using TravisCI's outdated MongoDB version.

Original pull request: #554.
2018-05-09 16:35:18 +02:00
Mark Paluch
0d823df7f3 DATAMONGO-1920 - Polishing.
Slightly tweak method names. Document MongoDatabaseUtils usage in the context of MongoTransactionManager. Rename SessionSynchronization constants to align with AbstractPlatformTransactionManager. Slightly tweak Javadoc and reference docs for typos.

Original pull request: #554.
2018-05-09 16:31:58 +02:00
Christoph Strobl
4cd2935087 DATAMONGO-1920 - Add support for MongoDB 4.0 transactions (synchronous driver).
MongoTransactionManager is the gateway to the well-known Spring transaction support. It allows applications to use the managed transaction features of Spring.
The MongoTransactionManager binds a ClientSession to the thread. MongoTemplate automatically detects those and operates on them accordingly.

static class Config extends AbstractMongoConfiguration {

	// ...

	@Bean
	MongoTransactionManager transactionManager(MongoDbFactory dbFactory) {
		return new MongoTransactionManager(dbFactory);
	}
}

@Component
public class StateService {

	@Transactional
	void someBusinessFunction(Step step) {

		template.insert(step);

		process(step);

		template.update(Step.class).apply(update.set("state", // ...
	}
}

Original pull request: #554.
2018-05-09 16:31:06 +02:00
Mark Paluch
6fbb7cec22 DATAMONGO-1918 - Updated changelog. 2018-05-08 15:27:17 +02:00
Mark Paluch
44ea579b69 DATAMONGO-1917 - Updated changelog. 2018-05-08 12:22:51 +02:00
Mark Paluch
30af34f80a DATAMONGO-1943 - Polishing.
Reduce visibility. Use List interface instead of concrete type.

Original pull request: #556.
2018-05-07 16:20:41 +02:00
Christoph Strobl
247f30143b DATAMONGO-1943 - Fix ClassCastException caused by SpringDataMongodbSerializer.
We now convert List-typed predicates to BasicDBList to meet MongodbSerializer's expectations for top-level lists used for the $and operator.

Original pull request: #556.
2018-05-07 16:18:52 +02:00
Mark Paluch
364f266a3a DATAMONGO-1914 - Polishing.
Throw FileNotFoundException on inherited methods throwing IOException if resource is absent. Retain filename for absent resources to provide context through GridFsResource.getFilename(). Switch exists() to determine presence/absence based on GridFSFile presence. Extend tests.

Original pull request: #555.
2018-05-07 14:53:27 +02:00
Christoph Strobl
f92bd20384 DATAMONGO-1914 - Return an empty GridFsResource instead of null when resource does not exist.
Original pull request: #555.
2018-05-07 14:53:21 +02:00
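In client code, this change replaces null checks with presence checks (a sketch under assumptions: the filename and call pattern below are illustrative, not from the change set):

```java
// getResource(...) now returns an absent GridFsResource instead of null.
GridFsResource resource = gridFsOperations.getResource("unknown.txt");

if (resource.exists()) {
	try (InputStream in = resource.getInputStream()) {
		// consume the content
	}
} else {
	// the absent resource still reports its filename for context
	String name = resource.getFilename();
}
```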
Mark Paluch
304e1c607f DATAMONGO-1929 - Polishing.
Add missing Nullable annotations and missing diamond operators. Slightly reorder null guards.

Use Lombok to generate required constructors.

Original pull request: #551.
2018-04-23 10:34:37 +02:00
Christoph Strobl
449780573e DATAMONGO-1929 - Add Kotlin extensions for Executable and ReactiveMapReduceOperation.
Original pull request: #551.
2018-04-23 10:34:31 +02:00
Christoph Strobl
e424573f0d DATAMONGO-1929 - Add fluent mapReduce template API.
Original pull request: #551.
2018-04-23 10:34:10 +02:00
Christoph Strobl
31630c0dcc DATAMONGO-1928 - Polishing.
Use native driver operations to avoid potential unwanted template index interaction.

Original Pull Request: #550
2018-04-20 13:17:38 +02:00
Mark Paluch
7cdc3d00c1 DATAMONGO-1928 - Polishing.
Migrate test to AssertJ and as-style for StepVerifier.

Original Pull Request: #550
2018-04-20 13:02:00 +02:00
Mark Paluch
a9f5d7bd3d DATAMONGO-1928 - Use non-blocking index creation through ReactiveMongoTemplate.
We now use ReactiveIndexOperationsProvider to inspect and create indexes for MongoDB collections without using blocking methods. Indexes are created for initial entities and whenever a MongoPersistentEntity is registered in MongoMappingContext.

Index creation is now decoupled from the actual ReactiveMongoTemplate call, causing indexes to be created asynchronously. Mongo commands no longer depend on the completion of index creation commands. A further consequence of this decoupling is that ReactiveMongoTemplate creation and command invocation no longer fail if the actual index creation fails. Previously, blocking index creation caused the actual ReactiveMongoTemplate call to fail.

ReactiveMongoTemplate objects can be created with a Consumer<Throwable> callback that is notified if an index creation fails.

Original Pull Request: #550
2018-04-20 13:00:54 +02:00
Christoph Strobl
b5f18468db DATAMONGO-1808 - Polishing.
Move individual methods under BitwiseCriteriaOperators, update Javadoc and split tests.

Original Pull Request: #507
2018-04-18 13:15:05 +02:00
Andreas Zink
ef872d2527 DATAMONGO-1808 - Add support for bitwise query operators.
Add support for $bitsAllClear, $bitsAllSet, $bitsAnyClear and $bitsAnySet.

Original Pull Request: #507
2018-04-18 13:14:45 +02:00
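The server-side matching rules behind these four operators can be sketched in plain Java. This mirrors the documented MongoDB semantics as an illustration only; it is not the driver or Spring Data implementation.

```java
public class BitwiseSemantics {

	// $bitsAllSet: every bit of the mask is set in the value
	static boolean allSet(long value, long mask) {
		return (value & mask) == mask;
	}

	// $bitsAllClear: every bit of the mask is clear in the value
	static boolean allClear(long value, long mask) {
		return (value & mask) == 0;
	}

	// $bitsAnySet: at least one bit of the mask is set in the value
	static boolean anySet(long value, long mask) {
		return (value & mask) != 0;
	}

	// $bitsAnyClear: at least one bit of the mask is clear in the value
	static boolean anyClear(long value, long mask) {
		return (value & mask) != mask;
	}

	public static void main(String[] args) {
		long value = 0b0101; // decimal 5
		System.out.println(allSet(value, 0b0100));   // true: bit 2 is set
		System.out.println(allClear(value, 0b1010)); // true: bits 1 and 3 are clear
		System.out.println(anySet(value, 0b0110));   // true: bit 2 is set
		System.out.println(anyClear(value, 0b0111)); // true: bit 1 is clear
	}
}
```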
Mark Paluch
c2516946e9 DATAMONGO-1890 - Polishing.
Remove mapReduce default methods in favor of adding variants through a fluent API at a later stage. Assert mapReduce arguments and remove subsequent null guards. Adapt tests.

Original pull request: #548.
2018-04-17 10:12:46 +02:00
Christoph Strobl
857add7349 DATAMONGO-1890 - Add support for mapReduce to ReactiveMongoOperations.
We now support mapReduce via ReactiveMongoTemplate, returning a Flux<T> as the operation's result.

Original pull request: #548.
2018-04-17 10:11:29 +02:00
Mark Paluch
2ec3f219c8 DATAMONGO-1869 - After release cleanups. 2018-04-13 15:08:33 +02:00
Mark Paluch
5c8701f79c DATAMONGO-1869 - Prepare next development iteration. 2018-04-13 15:08:32 +02:00
126 changed files with 8185 additions and 1950 deletions


@@ -3,44 +3,33 @@ language: java
jdk:
- oraclejdk8
before_script:
- mongod --version
before_install:
- mkdir -p downloads
- mkdir -p var/db var/log
- if [[ ! -d downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION} ]] ; then cd downloads && wget https://fastdl.mongodb.org/linux/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}.tgz && tar xzf mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}.tgz && cd ..; fi
- downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongod --version
- downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongod --dbpath var/db --replSet rs0 --fork --logpath var/log/mongod.log
- sleep 10
- |-
echo "replication:
replSetName: rs0" | sudo tee -a /etc/mongod.conf
- sudo service mongod restart
- sleep 20
- |-
mongo --eval "rs.initiate({_id: 'rs0', members:[{_id: 0, host: '127.0.0.1:27017'}]});"
- sleep 15
services:
- mongodb
downloads/mongodb-linux-x86_64-ubuntu1604-${MONGO_VERSION}/bin/mongo --eval "rs.initiate({_id: 'rs0', members:[{_id: 0, host: '127.0.0.1:27017'}]});"
sleep 15
env:
matrix:
- PROFILE=ci
- PROFILE=mongo36-next
global:
- MONGO_VERSION=3.7.9
# Current MongoDB version is 2.4.2 as of 2016-04, see https://github.com/travis-ci/travis-ci/issues/3694
# apt-get starts a MongoDB instance so it's not started using before_script
addons:
apt:
sources:
- mongodb-3.4-precise
packages:
- mongodb-org-server
- mongodb-org-shell
- oracle-java8-installer
- oracle-java8-installer
sudo: false
cache:
directories:
- $HOME/.m2
install:
- |-
mongo admin --eval "db.adminCommand({setFeatureCompatibilityVersion: '3.4'});"
- downloads
script: "mvn clean dependency:list test -P${PROFILE} -Dsort"


@@ -138,6 +138,42 @@ public class MyService {
}
```
### MongoDB 4.0 Transactions
As of version 4, MongoDB supports [Transactions](https://www.mongodb.com/transactions). Transactions are built on top of
`ClientSessions` and therefore require an active session.
`MongoTransactionManager` is the gateway to the well-known Spring transaction support. It allows applications to use
[managed transaction features of Spring](http://docs.spring.io/spring/docs/current/spring-framework-reference/html/transaction.html).
The `MongoTransactionManager` binds a `ClientSession` to the thread. `MongoTemplate` automatically detects those and operates on them accordingly.
```java
@Configuration
static class Config extends AbstractMongoConfiguration {
@Bean
MongoTransactionManager transactionManager(MongoDbFactory dbFactory) {
return new MongoTransactionManager(dbFactory);
}
// ...
}
@Component
public class StateService {
@Transactional
void someBusinessFunction(Step step) {
template.insert(step);
process(step);
template.update(Step.class).apply(Update.set("state", // ...
	}
}
```
## Contributing to Spring Data
Here are some ways for you to get involved in the community:

pom.xml

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.1.0.M3</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.1.0.M3</version>
</parent>
<modules>
@@ -27,9 +27,9 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.1.0.M2</springdata.commons>
<mongo>3.6.3</mongo>
<mongo.reactivestreams>1.7.1</mongo.reactivestreams>
<springdata.commons>2.1.0.M3</springdata.commons>
<mongo>3.8.0-beta2</mongo>
<mongo.reactivestreams>1.9.0-beta1</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -115,38 +115,6 @@
<profiles>
<!-- not-yet available profile>
<id>mongo35-next</id>
<properties>
<mongo>3.5.1-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile -->
<profile>
<id>mongo36-next</id>
<properties>
<mongo>3.6.0-SNAPSHOT</mongo>
</properties>
<repositories>
<repository>
<id>mongo-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
</profile>
<profile>
<id>release</id>
<build>


@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.1.0.M3</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -6,7 +6,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.1.0.M3</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -50,7 +50,7 @@
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.1.0.M2</version>
<version>2.1.0.M3</version>
</dependency>
<!-- reactive -->


@@ -13,7 +13,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.1.0.M3</version>
<relativePath>../pom.xml</relativePath>
</parent>


@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>2.1.0.M2</version>
<version>2.1.0.M3</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -253,6 +253,13 @@
<scope>test</scope>
</dependency>
<dependency>
<groupId>javax.transaction</groupId>
<artifactId>jta</artifactId>
<version>1.1</version>
<scope>test</scope>
</dependency>
<!-- Kotlin extension -->
<dependency>
<groupId>org.jetbrains.kotlin</groupId>


@@ -0,0 +1,230 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.lang.Nullable;
import org.springframework.transaction.support.ResourceHolderSynchronization;
import org.springframework.transaction.support.TransactionSynchronization;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoDatabase;
/**
* Helper class for managing {@link MongoDatabase} instances via {@link MongoDbFactory}. Used for obtaining
* {@link ClientSession session bound} resources, such as {@link MongoDatabase} and
* {@link com.mongodb.client.MongoCollection} suitable for transactional usage.
* <p />
* <strong>Note:</strong> Intended for internal usage only.
*
* @author Christoph Strobl
* @author Mark Paluch
* @currentRead Shadow's Edge - Brent Weeks
* @since 2.1
*/
public class MongoDatabaseUtils {
/**
* Obtain the default {@link MongoDatabase database} from the given {@link MongoDbFactory factory} using
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(MongoDbFactory factory) {
return doGetMongoDatabase(null, factory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
}
/**
* Obtain the default {@link MongoDatabase database} from the given {@link MongoDbFactory factory}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @param sessionSynchronization the synchronization to use. Must not be {@literal null}.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(MongoDbFactory factory, SessionSynchronization sessionSynchronization) {
return doGetMongoDatabase(null, factory, sessionSynchronization);
}
/**
* Obtain the {@link MongoDatabase database} with the given name from the given {@link MongoDbFactory factory} using
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION native session synchronization}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param dbName the name of the {@link MongoDatabase} to get.
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(String dbName, MongoDbFactory factory) {
return doGetMongoDatabase(dbName, factory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
}
/**
* Obtain the {@link MongoDatabase database} with the given name from the given {@link MongoDbFactory factory}.
* <p />
* Registers a {@link MongoSessionSynchronization MongoDB specific transaction synchronization} within the current
* {@link Thread} if {@link TransactionSynchronizationManager#isSynchronizationActive() synchronization is active}.
*
* @param dbName the name of the {@link MongoDatabase} to get.
* @param factory the {@link MongoDbFactory} to get the {@link MongoDatabase} from.
* @param sessionSynchronization the synchronization to use. Must not be {@literal null}.
* @return the {@link MongoDatabase} that is potentially associated with a transactional {@link ClientSession}.
*/
public static MongoDatabase getDatabase(String dbName, MongoDbFactory factory,
SessionSynchronization sessionSynchronization) {
return doGetMongoDatabase(dbName, factory, sessionSynchronization);
}
private static MongoDatabase doGetMongoDatabase(@Nullable String dbName, MongoDbFactory factory,
SessionSynchronization sessionSynchronization) {
Assert.notNull(factory, "Factory must not be null!");
if (!TransactionSynchronizationManager.isSynchronizationActive()) {
return StringUtils.hasText(dbName) ? factory.getDb(dbName) : factory.getDb();
}
ClientSession session = doGetSession(factory, sessionSynchronization);
if(session == null) {
return StringUtils.hasText(dbName) ? factory.getDb(dbName) : factory.getDb();
}
MongoDbFactory factoryToUse = factory.withSession(session);
return StringUtils.hasText(dbName) ? factoryToUse.getDb(dbName) : factoryToUse.getDb();
}
@Nullable
private static ClientSession doGetSession(MongoDbFactory dbFactory, SessionSynchronization sessionSynchronization) {
MongoResourceHolder resourceHolder = (MongoResourceHolder) TransactionSynchronizationManager.getResource(dbFactory);
// check for native MongoDB transaction
if (resourceHolder != null && (resourceHolder.hasSession() || resourceHolder.isSynchronizedWithTransaction())) {
if (!resourceHolder.hasSession()) {
resourceHolder.setSession(createClientSession(dbFactory));
}
return resourceHolder.getSession();
}
if (SessionSynchronization.ON_ACTUAL_TRANSACTION.equals(sessionSynchronization)) {
return null;
}
// init a non native MongoDB transaction by registering a MongoSessionSynchronization
resourceHolder = new MongoResourceHolder(createClientSession(dbFactory), dbFactory);
resourceHolder.getSession().startTransaction();
TransactionSynchronizationManager
.registerSynchronization(new MongoSessionSynchronization(resourceHolder, dbFactory));
resourceHolder.setSynchronizedWithTransaction(true);
TransactionSynchronizationManager.bindResource(dbFactory, resourceHolder);
return resourceHolder.getSession();
}
private static ClientSession createClientSession(MongoDbFactory dbFactory) {
return dbFactory.getSession(ClientSessionOptions.builder().causallyConsistent(true).build());
}
/**
* MongoDB specific {@link ResourceHolderSynchronization} for resource cleanup at the end of a transaction when
* participating in a non-native MongoDB transaction, such as a Jta or JDBC transaction.
*
* @author Christoph Strobl
* @since 2.1
*/
private static class MongoSessionSynchronization extends ResourceHolderSynchronization<MongoResourceHolder, Object> {
private final MongoResourceHolder resourceHolder;
MongoSessionSynchronization(MongoResourceHolder resourceHolder, MongoDbFactory dbFactory) {
super(resourceHolder, dbFactory);
this.resourceHolder = resourceHolder;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#shouldReleaseBeforeCompletion()
*/
@Override
protected boolean shouldReleaseBeforeCompletion() {
return false;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#processResourceAfterCommit(java.lang.Object)
*/
@Override
protected void processResourceAfterCommit(MongoResourceHolder resourceHolder) {
if (isTransactionActive(resourceHolder)) {
resourceHolder.getSession().commitTransaction();
}
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#afterCompletion(int)
*/
@Override
public void afterCompletion(int status) {
if (status == TransactionSynchronization.STATUS_ROLLED_BACK && isTransactionActive(this.resourceHolder)) {
resourceHolder.getSession().abortTransaction();
}
super.afterCompletion(status);
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceHolderSynchronization#releaseResource(java.lang.Object, java.lang.Object)
*/
@Override
protected void releaseResource(MongoResourceHolder resourceHolder, Object resourceKey) {
if (resourceHolder.hasActiveSession()) {
resourceHolder.getSession().close();
}
}
private boolean isTransactionActive(MongoResourceHolder resourceHolder) {
if (!resourceHolder.hasSession()) {
return false;
}
return resourceHolder.getSession().hasActiveTransaction();
}
}
}


@@ -22,8 +22,8 @@ import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.ClientSessionOptions;
import com.mongodb.DB;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoDatabase;
import com.mongodb.session.ClientSession;
/**
* Interface for factories creating {@link MongoDatabase} instances.
@@ -32,7 +32,7 @@ import com.mongodb.session.ClientSession;
* @author Thomas Darimont
* @author Christoph Strobl
*/
public interface MongoDbFactory extends CodecRegistryProvider {
public interface MongoDbFactory extends CodecRegistryProvider, MongoSessionProvider {
/**
* Creates a default {@link MongoDatabase} instance.
@@ -58,6 +58,11 @@ public interface MongoDbFactory extends CodecRegistryProvider {
*/
PersistenceExceptionTranslator getExceptionTranslator();
/**
* Get the legacy database entry point. Please consider {@link #getDb()} instead.
*
* @return
*/
DB getLegacyDb();
/**


@@ -0,0 +1,122 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.lang.Nullable;
import org.springframework.transaction.TransactionDefinition;
import org.springframework.transaction.support.ResourceHolderSupport;
import com.mongodb.client.ClientSession;
/**
* MongoDB specific {@link ResourceHolderSupport resource holder}, wrapping a {@link ClientSession}.
* {@link MongoTransactionManager} binds instances of this class to the thread.
* <p />
* <strong>Note:</strong> Intended for internal usage only.
*
* @author Christoph Strobl
* @since 2.1
* @see MongoTransactionManager
* @see org.springframework.data.mongodb.core.MongoTemplate
*/
class MongoResourceHolder extends ResourceHolderSupport {
private @Nullable ClientSession session;
private MongoDbFactory dbFactory;
/**
* Create a new {@link MongoResourceHolder} for a given {@link ClientSession session}.
*
* @param session the associated {@link ClientSession}. Can be {@literal null}.
* @param dbFactory the associated {@link MongoDbFactory}. must not be {@literal null}.
*/
MongoResourceHolder(@Nullable ClientSession session, MongoDbFactory dbFactory) {
this.session = session;
this.dbFactory = dbFactory;
}
/**
* @return the associated {@link ClientSession}. Can be {@literal null}.
*/
@Nullable
ClientSession getSession() {
return session;
}
/**
* @return the associated {@link MongoDbFactory}.
*/
public MongoDbFactory getDbFactory() {
return dbFactory;
}
/**
* Set the {@link ClientSession} to guard.
*
* @param session can be {@literal null}.
*/
public void setSession(@Nullable ClientSession session) {
this.session = session;
}
/**
* Only set the timeout if it does not match the {@link TransactionDefinition#TIMEOUT_DEFAULT default timeout}.
*
* @param seconds
*/
void setTimeoutIfNotDefaulted(int seconds) {
if (seconds != TransactionDefinition.TIMEOUT_DEFAULT) {
setTimeoutInSeconds(seconds);
}
}
/**
* @return {@literal true} if session is not {@literal null}.
*/
boolean hasSession() {
return session != null;
}
/**
* @return {@literal true} if the session is active and has not been closed.
*/
boolean hasActiveSession() {
if (!hasSession()) {
return false;
}
return hasServerSession() && !getSession().getServerSession().isClosed();
}
/**
* @return {@literal true} if the {@link ClientSession} has a {@link com.mongodb.session.ServerSession} associated
* that is accessible via {@link ClientSession#getServerSession()}.
*/
boolean hasServerSession() {
try {
return getSession().getServerSession() != null;
} catch (IllegalStateException serverSessionClosed) {
// ignore
}
return false;
}
}


@@ -0,0 +1,41 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import com.mongodb.ClientSessionOptions;
import com.mongodb.client.ClientSession;
/**
* A simple interface for obtaining a {@link ClientSession} to be consumed by
* {@link org.springframework.data.mongodb.core.MongoOperations} and MongoDB native operations that support causal
* consistency and transactions.
*
* @author Christoph Strobl
* @currentRead Shadow's Edge - Brent Weeks
* @since 2.1
*/
@FunctionalInterface
public interface MongoSessionProvider {
/**
* Obtain a {@link ClientSession} with the given options.
*
* @param options must not be {@literal null}.
* @return never {@literal null}.
* @throws org.springframework.dao.DataAccessException
*/
ClientSession getSession(ClientSessionOptions options);
}


@@ -0,0 +1,490 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.lang.Nullable;
import org.springframework.transaction.TransactionDefinition;
import org.springframework.transaction.TransactionException;
import org.springframework.transaction.TransactionSystemException;
import org.springframework.transaction.support.AbstractPlatformTransactionManager;
import org.springframework.transaction.support.DefaultTransactionStatus;
import org.springframework.transaction.support.ResourceTransactionManager;
import org.springframework.transaction.support.SmartTransactionObject;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.transaction.support.TransactionSynchronizationUtils;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.MongoException;
import com.mongodb.TransactionOptions;
import com.mongodb.client.ClientSession;
/**
* A {@link org.springframework.transaction.PlatformTransactionManager} implementation that manages
* {@link ClientSession} based transactions for a single {@link MongoDbFactory}.
* <p />
* Binds a {@link ClientSession} from the specified {@link MongoDbFactory} to the thread.
* <p />
* {@link TransactionDefinition#isReadOnly() Readonly} transactions operate on a {@link ClientSession} and enable causal
* consistency, and also {@link ClientSession#startTransaction() start}, {@link ClientSession#commitTransaction()
* commit} or {@link ClientSession#abortTransaction() abort} a transaction.
* <p />
* Application code is required to retrieve the {@link com.mongodb.client.MongoDatabase} via
* {@link MongoDatabaseUtils#getDatabase(MongoDbFactory)} instead of a standard {@link MongoDbFactory#getDb()} call.
* Spring classes such as {@link org.springframework.data.mongodb.core.MongoTemplate} use this strategy implicitly.
*
* @author Christoph Strobl
* @author Mark Paluch
* @currentRead Shadow's Edge - Brent Weeks
* @since 2.1
* @see <a href="https://www.mongodb.com/transactions">MongoDB Transaction Documentation</a>
* @see MongoDatabaseUtils#getDatabase(MongoDbFactory, SessionSynchronization)
*/
public class MongoTransactionManager extends AbstractPlatformTransactionManager
implements ResourceTransactionManager, InitializingBean {
private @Nullable MongoDbFactory dbFactory;
private @Nullable TransactionOptions options;
/**
* Create a new {@link MongoTransactionManager} for bean-style usage.
* <p />
* <strong>Note:</strong> The {@link MongoDbFactory db factory} has to be {@link #setDbFactory(MongoDbFactory) set}
* before using the instance. Use this constructor to prepare a {@link MongoTransactionManager} via a
* {@link org.springframework.beans.factory.BeanFactory}.
* <p />
* Optionally it is possible to set default {@link TransactionOptions transaction options} defining
* {@link com.mongodb.ReadConcern} and {@link com.mongodb.WriteConcern}.
*
* @see #setDbFactory(MongoDbFactory)
* @see #setTransactionSynchronization(int)
*/
public MongoTransactionManager() {}
/**
* Create a new {@link MongoTransactionManager} obtaining sessions from the given {@link MongoDbFactory}.
*
* @param dbFactory must not be {@literal null}.
*/
public MongoTransactionManager(MongoDbFactory dbFactory) {
this(dbFactory, null);
}
/**
* Create a new {@link MongoTransactionManager} obtaining sessions from the given {@link MongoDbFactory} applying the
* given {@link TransactionOptions options}, if present, when starting a new transaction.
*
* @param dbFactory must not be {@literal null}.
* @param options can be {@literal null}.
*/
public MongoTransactionManager(MongoDbFactory dbFactory, @Nullable TransactionOptions options) {
Assert.notNull(dbFactory, "DbFactory must not be null!");
this.dbFactory = dbFactory;
this.options = options;
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doGetTransaction()
*/
@Override
protected Object doGetTransaction() throws TransactionException {
MongoResourceHolder resourceHolder = (MongoResourceHolder) TransactionSynchronizationManager
.getResource(getRequiredDbFactory());
return new MongoTransactionObject(resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#isExistingTransaction(java.lang.Object)
*/
@Override
protected boolean isExistingTransaction(Object transaction) throws TransactionException {
return extractMongoTransaction(transaction).hasResourceHolder();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doBegin(java.lang.Object, org.springframework.transaction.TransactionDefinition)
*/
@Override
protected void doBegin(Object transaction, TransactionDefinition definition) throws TransactionException {
MongoTransactionObject mongoTransactionObject = extractMongoTransaction(transaction);
MongoResourceHolder resourceHolder = newResourceHolder(definition,
ClientSessionOptions.builder().causallyConsistent(true).build());
mongoTransactionObject.setResourceHolder(resourceHolder);
if (logger.isDebugEnabled()) {
logger
.debug(String.format("About to start transaction for session %s.", debugString(resourceHolder.getSession())));
}
try {
mongoTransactionObject.startTransaction(options);
} catch (MongoException ex) {
throw new TransactionSystemException(String.format("Could not start Mongo transaction for session %s.",
debugString(mongoTransactionObject.getSession())), ex);
}
if (logger.isDebugEnabled()) {
logger.debug(String.format("Started transaction for session %s.", debugString(resourceHolder.getSession())));
}
resourceHolder.setSynchronizedWithTransaction(true);
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), resourceHolder);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSuspend(java.lang.Object)
*/
@Override
protected Object doSuspend(Object transaction) throws TransactionException {
MongoTransactionObject mongoTransactionObject = extractMongoTransaction(transaction);
mongoTransactionObject.setResourceHolder(null);
return TransactionSynchronizationManager.unbindResource(getRequiredDbFactory());
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doResume(java.lang.Object, java.lang.Object)
*/
@Override
protected void doResume(@Nullable Object transaction, Object suspendedResources) {
TransactionSynchronizationManager.bindResource(getRequiredDbFactory(), suspendedResources);
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCommit(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doCommit(DefaultTransactionStatus status) throws TransactionException {
MongoTransactionObject mongoTransactionObject = extractMongoTransaction(status);
if (logger.isDebugEnabled()) {
logger.debug(String.format("About to commit transaction for session %s.",
debugString(mongoTransactionObject.getSession())));
}
try {
mongoTransactionObject.commitTransaction();
} catch (MongoException ex) {
throw new TransactionSystemException(String.format("Could not commit Mongo transaction for session %s.",
debugString(mongoTransactionObject.getSession())), ex);
}
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doRollback(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doRollback(DefaultTransactionStatus status) throws TransactionException {
MongoTransactionObject mongoTransactionObject = extractMongoTransaction(status);
if (logger.isDebugEnabled()) {
logger.debug(String.format("About to abort transaction for session %s.",
debugString(mongoTransactionObject.getSession())));
}
try {
mongoTransactionObject.abortTransaction();
} catch (MongoException ex) {
throw new TransactionSystemException(String.format("Could not abort Mongo transaction for session %s.",
debugString(mongoTransactionObject.getSession())), ex);
}
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doSetRollbackOnly(org.springframework.transaction.support.DefaultTransactionStatus)
*/
@Override
protected void doSetRollbackOnly(DefaultTransactionStatus status) throws TransactionException {
MongoTransactionObject transactionObject = extractMongoTransaction(status);
transactionObject.getRequiredResourceHolder().setRollbackOnly();
}
/*
* (non-Javadoc)
* org.springframework.transaction.support.AbstractPlatformTransactionManager#doCleanupAfterCompletion(java.lang.Object)
*/
@Override
protected void doCleanupAfterCompletion(Object transaction) {
Assert.isInstanceOf(MongoTransactionObject.class, transaction,
() -> String.format("Expected to find a %s but it turned out to be %s.", MongoTransactionObject.class,
transaction.getClass()));
MongoTransactionObject mongoTransactionObject = (MongoTransactionObject) transaction;
// Remove the connection holder from the thread.
TransactionSynchronizationManager.unbindResource(getRequiredDbFactory());
mongoTransactionObject.getRequiredResourceHolder().clear();
if (logger.isDebugEnabled()) {
logger.debug(String.format("About to release Session %s after transaction.",
debugString(mongoTransactionObject.getSession())));
}
mongoTransactionObject.closeSession();
}
/**
* Set the {@link MongoDbFactory} that this instance should manage transactions for.
*
* @param dbFactory must not be {@literal null}.
*/
public void setDbFactory(MongoDbFactory dbFactory) {
Assert.notNull(dbFactory, "DbFactory must not be null!");
this.dbFactory = dbFactory;
}
/**
* Set the {@link TransactionOptions} to be applied when starting transactions.
*
* @param options can be {@literal null}.
*/
public void setOptions(@Nullable TransactionOptions options) {
this.options = options;
}
/**
* Get the {@link MongoDbFactory} that this instance manages transactions for.
*
* @return can be {@literal null}.
*/
@Nullable
public MongoDbFactory getDbFactory() {
return dbFactory;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.ResourceTransactionManager#getResourceFactory()
*/
@Override
public MongoDbFactory getResourceFactory() {
return getRequiredDbFactory();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.InitializingBean#afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
getRequiredDbFactory();
}
private MongoResourceHolder newResourceHolder(TransactionDefinition definition, ClientSessionOptions options) {
MongoDbFactory dbFactory = getResourceFactory();
MongoResourceHolder resourceHolder = new MongoResourceHolder(dbFactory.getSession(options), dbFactory);
resourceHolder.setTimeoutIfNotDefaulted(determineTimeout(definition));
return resourceHolder;
}
/**
* @throws IllegalStateException if {@link #dbFactory} is {@literal null}.
*/
private MongoDbFactory getRequiredDbFactory() {
Assert.state(dbFactory != null,
"MongoTransactionManager operates upon a MongoDbFactory. Did you forget to provide one? It's required.");
return dbFactory;
}
private static MongoTransactionObject extractMongoTransaction(Object transaction) {
Assert.isInstanceOf(MongoTransactionObject.class, transaction,
() -> String.format("Expected to find a %s but it turned out to be %s.", MongoTransactionObject.class,
transaction.getClass()));
return (MongoTransactionObject) transaction;
}
private static MongoTransactionObject extractMongoTransaction(DefaultTransactionStatus status) {
Assert.isInstanceOf(MongoTransactionObject.class, status.getTransaction(),
() -> String.format("Expected to find a %s but it turned out to be %s.", MongoTransactionObject.class,
status.getTransaction().getClass()));
return (MongoTransactionObject) status.getTransaction();
}
private static String debugString(@Nullable ClientSession session) {
if (session == null) {
return "null";
}
String debugString = String.format("[%s@%s ", ClassUtils.getShortName(session.getClass()),
Integer.toHexString(session.hashCode()));
try {
if (session.getServerSession() != null) {
debugString += String.format("id = %s, ", session.getServerSession().getIdentifier());
debugString += String.format("causallyConsistent = %s, ", session.isCausallyConsistent());
debugString += String.format("txActive = %s, ", session.hasActiveTransaction());
debugString += String.format("txNumber = %d, ", session.getServerSession().getTransactionNumber());
debugString += String.format("closed = %s, ", session.getServerSession().isClosed());
debugString += String.format("clusterTime = %s", session.getClusterTime());
} else {
debugString += "id = n/a, ";
debugString += String.format("causallyConsistent = %s, ", session.isCausallyConsistent());
debugString += String.format("txActive = %s, ", session.hasActiveTransaction());
debugString += String.format("clusterTime = %s", session.getClusterTime());
}
} catch (RuntimeException e) {
debugString += String.format("error = %s", e.getMessage());
}
debugString += "]";
return debugString;
}
/**
* MongoDB specific transaction object, representing a {@link MongoResourceHolder}. Used as transaction object by
* {@link MongoTransactionManager}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
* @see MongoResourceHolder
*/
static class MongoTransactionObject implements SmartTransactionObject {
private @Nullable MongoResourceHolder resourceHolder;
MongoTransactionObject(@Nullable MongoResourceHolder resourceHolder) {
this.resourceHolder = resourceHolder;
}
/**
* Set the {@link MongoResourceHolder}.
*
* @param resourceHolder can be {@literal null}.
*/
void setResourceHolder(@Nullable MongoResourceHolder resourceHolder) {
this.resourceHolder = resourceHolder;
}
/**
* @return {@literal true} if a {@link MongoResourceHolder} is set.
*/
boolean hasResourceHolder() {
return resourceHolder != null;
}
/**
* Start a MongoDB transaction, optionally with the given {@link TransactionOptions}.
*
* @param options can be {@literal null}
*/
void startTransaction(@Nullable TransactionOptions options) {
ClientSession session = getRequiredSession();
if (options != null) {
session.startTransaction(options);
} else {
session.startTransaction();
}
}
/**
* Commit the transaction.
*/
void commitTransaction() {
getRequiredSession().commitTransaction();
}
/**
* Rollback (abort) the transaction.
*/
void abortTransaction() {
getRequiredSession().abortTransaction();
}
/**
* Close a {@link ClientSession} without regard to its transactional state.
*/
void closeSession() {
ClientSession session = getRequiredSession();
if (session.getServerSession() != null && !session.getServerSession().isClosed()) {
session.close();
}
}
@Nullable
ClientSession getSession() {
return resourceHolder != null ? resourceHolder.getSession() : null;
}
private MongoResourceHolder getRequiredResourceHolder() {
Assert.state(resourceHolder != null, "MongoResourceHolder is required but not present. o_O");
return resourceHolder;
}
private ClientSession getRequiredSession() {
ClientSession session = getSession();
Assert.state(session != null, "A Session is required but it turned out to be null.");
return session;
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#isRollbackOnly()
*/
@Override
public boolean isRollbackOnly() {
return this.resourceHolder != null && this.resourceHolder.isRollbackOnly();
}
/*
* (non-Javadoc)
* @see org.springframework.transaction.support.SmartTransactionObject#flush()
*/
@Override
public void flush() {
TransactionSynchronizationUtils.triggerFlush();
}
}
}
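Wiring this transaction manager follows the usual Spring pattern. A hedged configuration sketch (the {@code MongoDbFactory} bean is assumed to be defined elsewhere; MongoDB transactions additionally require a replica set deployment):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.MongoTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement
class TransactionConfig {

    // Exposes the manager so that @Transactional methods participate in
    // ClientSession-backed MongoDB transactions.
    @Bean
    MongoTransactionManager transactionManager(MongoDbFactory dbFactory) {
        return new MongoTransactionManager(dbFactory);
    }
}
```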


@@ -24,8 +24,8 @@ import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import com.mongodb.ClientSessionOptions;
import com.mongodb.reactivestreams.client.ClientSession;
import com.mongodb.reactivestreams.client.MongoDatabase;
import com.mongodb.session.ClientSession;
/**
* Interface for factories creating reactive {@link MongoDatabase} instances.


@@ -57,6 +57,7 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
private final Class<?> targetType;
private final Class<?> collectionType;
private final Class<?> databaseType;
private final Class<? extends ClientSession> sessionType;
/**
* Create a new SessionAwareMethodInterceptor for given target.
@@ -71,12 +72,13 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
* {@code MongoCollection}.
* @param <T> target object type.
*/
public <T> SessionAwareMethodInterceptor(ClientSession session, T target, Class<D> databaseType,
ClientSessionOperator<D> databaseDecorator, Class<C> collectionType,
public <T> SessionAwareMethodInterceptor(ClientSession session, T target, Class<? extends ClientSession> sessionType,
Class<D> databaseType, ClientSessionOperator<D> databaseDecorator, Class<C> collectionType,
ClientSessionOperator<C> collectionDecorator) {
Assert.notNull(session, "ClientSession must not be null!");
Assert.notNull(target, "Target must not be null!");
Assert.notNull(sessionType, "SessionType must not be null!");
Assert.notNull(databaseType, "Database type must not be null!");
Assert.notNull(databaseDecorator, "Database ClientSessionOperator must not be null!");
Assert.notNull(collectionType, "Collection type must not be null!");
@@ -90,6 +92,7 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
this.databaseDecorator = databaseDecorator;
this.targetType = ClassUtils.isAssignable(databaseType, target.getClass()) ? databaseType : collectionType;
this.sessionType = sessionType;
}
/*
@@ -114,7 +117,7 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
return methodInvocation.proceed();
}
Optional<Method> targetMethod = METHOD_CACHE.lookup(methodInvocation.getMethod(), targetType);
Optional<Method> targetMethod = METHOD_CACHE.lookup(methodInvocation.getMethod(), targetType, sessionType);
return !targetMethod.isPresent() ? methodInvocation.proceed()
: ReflectionUtils.invokeMethod(targetMethod.get(), target,
@@ -171,18 +174,19 @@ public class SessionAwareMethodInterceptor<D, C> implements MethodInterceptor {
* @param targetClass
* @return
*/
Optional<Method> lookup(Method method, Class<?> targetClass) {
Optional<Method> lookup(Method method, Class<?> targetClass, Class<? extends ClientSession> sessionType) {
return cache.computeIfAbsent(new MethodClassKey(method, targetClass),
val -> Optional.ofNullable(findTargetWithSession(method, targetClass)));
val -> Optional.ofNullable(findTargetWithSession(method, targetClass, sessionType)));
}
@Nullable
private Method findTargetWithSession(Method sourceMethod, Class<?> targetType) {
private Method findTargetWithSession(Method sourceMethod, Class<?> targetType,
Class<? extends ClientSession> sessionType) {
Class<?>[] argTypes = sourceMethod.getParameterTypes();
Class<?>[] args = new Class<?>[argTypes.length + 1];
args[0] = ClientSession.class;
args[0] = sessionType;
System.arraycopy(argTypes, 0, args, 1, argTypes.length);
return ReflectionUtils.findMethod(targetType, sourceMethod.getName(), args);


@@ -0,0 +1,38 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
/**
* {@link SessionSynchronization} is used along with {@link org.springframework.data.mongodb.core.MongoTemplate} to
* define in which type of transactions to participate, if any.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
*/
public enum SessionSynchronization {
/**
* Synchronize with any transaction, even empty ones, and initiate a MongoDB transaction when doing so by
* registering a MongoDB specific {@link org.springframework.transaction.support.ResourceHolderSynchronization}.
*/
ALWAYS,
/**
* Synchronize with native MongoDB transactions initiated via {@link MongoTransactionManager}.
*/
ON_ACTUAL_TRANSACTION;
}
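The enum is consumed by {@code MongoTemplate}, as noted in the Javadoc above. Assuming the template exposes a setter for it (an assumption about the 2.1 API), usage might look like:

```java
import org.springframework.data.mongodb.SessionSynchronization;
import org.springframework.data.mongodb.core.MongoTemplate;

class SessionSynchronizationSketch {

    void configure(MongoTemplate template) {
        // Only join transactions actually started via MongoTransactionManager;
        // do not register synchronizations for empty transactions.
        template.setSessionSynchronization(SessionSynchronization.ON_ACTUAL_TRANSACTION);
    }
}
```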


@@ -0,0 +1,111 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoClientDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.lang.Nullable;
import com.mongodb.client.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig with {@link com.mongodb.client.MongoClient}.
*
* @author Christoph Strobl
* @since 2.1
* @see MongoConfigurationSupport
* @see AbstractMongoConfiguration
*/
@Configuration
public abstract class AbstractMongoClientConfiguration extends MongoConfigurationSupport {
/**
* Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
* {@link MongoClient} instance to the {@link org.springframework.context.ApplicationContext}.
*
* @return
*/
public abstract MongoClient mongoClient();
/**
* Creates a {@link MongoTemplate}.
*
* @return
*/
@Bean
public MongoTemplate mongoTemplate() throws Exception {
return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
/**
* Creates a {@link SimpleMongoClientDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link MongoClient}
* instance configured in {@link #mongoClient()}.
*
* @see #mongoClient()
* @see #mongoTemplate()
* @return
*/
@Bean
public MongoDbFactory mongoDbFactory() {
return new SimpleMongoClientDbFactory(mongoClient(), getDatabaseName());
}
/**
* Return the base package to scan for mapped {@link Document}s. By default, returns the package name of the concrete
* configuration class (not this one). So if you have a {@code com.acme.AppConfig} extending
* {@link AbstractMongoClientConfiguration}, the base package will be {@code com.acme} unless the method is overridden
* to implement alternate behavior.
*
* @return the base package to scan for mapped {@link Document} classes or {@literal null} to not enable scanning for
* entities.
* @deprecated use {@link #getMappingBasePackages()} instead.
*/
@Deprecated
@Nullable
protected String getMappingBasePackage() {
Package mappingBasePackage = getClass().getPackage();
return mappingBasePackage == null ? null : mappingBasePackage.getName();
}
/**
* Creates a {@link MappingMongoConverter} using the configured {@link #mongoDbFactory()} and
* {@link #mongoMappingContext()}. Will get {@link #customConversions()} applied.
*
* @see #customConversions()
* @see #mongoMappingContext()
* @see #mongoDbFactory()
* @return
* @throws Exception
*/
@Bean
public MappingMongoConverter mappingMongoConverter() throws Exception {
DbRefResolver dbRefResolver = new DefaultDbRefResolver(mongoDbFactory());
MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, mongoMappingContext());
converter.setCustomConversions(customConversions());
return converter;
}
}
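A concrete configuration extends this class and supplies the client and database name. A sketch under placeholder values (connection string and database name are illustrative only):

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;

@Configuration
class AppMongoConfig extends AbstractMongoClientConfiguration {

    @Override
    public MongoClient mongoClient() {
        // Placeholder connection string; adjust for your environment.
        return MongoClients.create("mongodb://localhost:27017");
    }

    @Override
    protected String getDatabaseName() {
        // Placeholder database name.
        return "app-db";
    }
}
```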


@@ -29,7 +29,10 @@ import org.springframework.lang.Nullable;
import com.mongodb.MongoClient;
/**
* Base class for Spring Data MongoDB configuration using JavaConfig.
* Base class for Spring Data MongoDB configuration using JavaConfig with {@link com.mongodb.MongoClient}.
* <p />
* <strong>INFO:</strong> In case you want to use {@link com.mongodb.client.MongoClients} for configuration, please refer
* to {@link AbstractMongoClientConfiguration}.
*
* @author Mark Pollack
* @author Oliver Gierke
@@ -38,10 +41,10 @@ import com.mongodb.MongoClient;
* @author Christoph Strobl
* @author Mark Paluch
* @see MongoConfigurationSupport
* @see AbstractMongoClientConfiguration
*/
@Configuration
public abstract class
AbstractMongoConfiguration extends MongoConfigurationSupport {
public abstract class AbstractMongoConfiguration extends MongoConfigurationSupport {
/**
* Return the {@link MongoClient} instance to connect to. Annotate with {@link Bean} in case you want to expose a
@@ -63,7 +66,7 @@ AbstractMongoConfiguration extends MongoConfigurationSupport {
/**
* Creates a {@link SimpleMongoDbFactory} to be used by the {@link MongoTemplate}. Will use the {@link MongoClient}
* instance configured in {@link #mongo()}.
* instance configured in {@link #mongoClient()}.
*
* @see #mongoClient()
* @see #mongoTemplate()
@@ -111,4 +114,5 @@ AbstractMongoConfiguration extends MongoConfigurationSupport {
return converter;
}
}


@@ -35,22 +35,10 @@ import org.springframework.util.StringUtils;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableAggregationOperationSupport implements ExecutableAggregationOperation {
private final MongoTemplate template;
/**
* Create new instance of {@link ExecutableAggregationOperationSupport}.
*
* @param template must not be {@literal null}.
* @throws IllegalArgumentException if template is {@literal null}.
*/
ExecutableAggregationOperationSupport(MongoTemplate template) {
Assert.notNull(template, "Template must not be null!");
this.template = template;
}
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)


@@ -45,24 +45,12 @@ import com.mongodb.client.FindIterable;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableFindOperationSupport implements ExecutableFindOperation {
private static final Query ALL_QUERY = new Query();
private final MongoTemplate template;
/**
* Create new {@link ExecutableFindOperationSupport}.
*
* @param template must not be {@literal null}.
* @throws IllegalArgumentException if template is {@literal null}.
*/
ExecutableFindOperationSupport(MongoTemplate template) {
Assert.notNull(template, "Template must not be null!");
this.template = template;
}
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)


@@ -37,22 +37,10 @@ import com.mongodb.bulk.BulkWriteResult;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
private final MongoTemplate template;
/**
* Create new {@link ExecutableInsertOperationSupport}.
*
* @param template must not be {@literal null}.
* @throws IllegalArgumentException if template is {@literal null}.
*/
ExecutableInsertOperationSupport(MongoTemplate template) {
Assert.notNull(template, "Template must not be null!");
this.template = template;
}
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)


@@ -0,0 +1,199 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import org.springframework.data.mongodb.core.ExecutableFindOperation.ExecutableFind;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.Query;
/**
* {@link ExecutableMapReduceOperation} allows creation and execution of MongoDB mapReduce operations in a fluent API
* style. The starting {@literal domainType} is used for mapping an optional {@link Query} provided via {@code matching}
* into the MongoDB specific representation. By default, the originating {@literal domainType} is also used for mapping
* back the results from the {@link org.bson.Document}. However, it is possible to define a different
* {@literal returnType} via {@code as} for mapping the result.<br />
* The collection to operate on is by default derived from the initial {@literal domainType} and can be defined there
* via {@link org.springframework.data.mongodb.core.mapping.Document}. Using {@code inCollection} allows overriding the
* collection name for the execution.
*
* <pre>
* <code>
* mapReduce(Human.class)
* .map("function() { emit(this.id, this.firstname) }")
* .reduce("function(id, name) { return sum(id, name); }")
* .inCollection("star-wars")
* .as(Jedi.class)
* .matching(query(where("lastname").is("skywalker")))
* .all();
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 2.1
*/
public interface ExecutableMapReduceOperation {
/**
* Start creating a mapReduce operation for the given {@literal domainType}.
*
* @param domainType must not be {@literal null}.
* @return new instance of {@link MapReduceWithMapFunction}.
* @throws IllegalArgumentException if domainType is {@literal null}.
*/
<T> MapReduceWithMapFunction<T> mapReduce(Class<T> domainType);
/**
* Trigger mapReduce execution by calling one of the terminating methods.
*
* @author Christoph Strobl
* @since 2.1
*/
interface TerminatingMapReduce<T> {
/**
* Get the mapReduce results.
*
* @return never {@literal null}.
*/
List<T> all();
}
/**
* Provide the Javascript {@code function()} used to map matching documents.
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithMapFunction<T> {
/**
* Set the Javascript map {@code function()}.
*
* @param mapFunction must not be {@literal null} nor empty.
* @return new instance of {@link MapReduceWithReduceFunction}.
* @throws IllegalArgumentException if {@literal mapFunction} is {@literal null} or empty.
*/
MapReduceWithReduceFunction<T> map(String mapFunction);
}
/**
* Provide the Javascript {@code function()} used to reduce matching documents.
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithReduceFunction<T> {
/**
* Set the Javascript reduce {@code function()}.
*
* @param reduceFunction must not be {@literal null} nor empty.
* @return new instance of {@link ExecutableMapReduce}.
* @throws IllegalArgumentException if {@literal reduceFunction} is {@literal null} or empty.
*/
ExecutableMapReduce<T> reduce(String reduceFunction);
}
/**
* Collection override (Optional).
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithCollection<T> extends MapReduceWithQuery<T> {
/**
* Explicitly set the name of the collection to perform the mapReduce operation on. <br />
* Skip this step to use the default collection derived from the domain type.
*
* @param collection must not be {@literal null} nor {@literal empty}.
* @return new instance of {@link MapReduceWithProjection}.
* @throws IllegalArgumentException if collection is {@literal null}.
*/
MapReduceWithProjection<T> inCollection(String collection);
}
/**
* Input document filter query (Optional).
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithQuery<T> extends TerminatingMapReduce<T> {
/**
* Set the filter query to be used.
*
* @param query must not be {@literal null}.
* @return new instance of {@link TerminatingMapReduce}.
* @throws IllegalArgumentException if query is {@literal null}.
*/
TerminatingMapReduce<T> matching(Query query);
}
/**
* Result type override (Optional).
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithProjection<T> extends MapReduceWithQuery<T> {
/**
* Define the target type fields should be mapped to. <br />
* Skip this step if you are only interested in the original domain type.
*
* @param resultType must not be {@literal null}.
* @param <R> result type.
* @return new instance of {@link TerminatingMapReduce}.
* @throws IllegalArgumentException if resultType is {@literal null}.
*/
<R> MapReduceWithQuery<R> as(Class<R> resultType);
}
/**
* Additional mapReduce options (Optional).
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithOptions<T> {
/**
* Set additional options to apply to the mapReduce operation.
*
* @param options must not be {@literal null}.
* @return new instance of {@link ExecutableMapReduce}.
* @throws IllegalArgumentException if options is {@literal null}.
*/
ExecutableMapReduce<T> with(MapReduceOptions options);
}
/**
* {@link ExecutableMapReduce} provides methods for constructing mapReduce operations in a fluent way.
*
* @author Christoph Strobl
* @since 2.1
*/
interface ExecutableMapReduce<T> extends MapReduceWithMapFunction<T>, MapReduceWithReduceFunction<T>,
MapReduceWithCollection<T>, MapReduceWithProjection<T>, MapReduceWithOptions<T> {
}
}
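The step interfaces above form a staged builder: each stage exposes only the next legal call (map, then reduce, then execution). A minimal self-contained sketch of the same pattern, using hypothetical `MapStep`/`ReduceStep`/`Terminal` names rather than the Spring Data types:

```java
// Staged-builder sketch: each interface exposes only the next legal call,
// so the compiler enforces map -> reduce -> execute ordering.
interface MapStep {
    ReduceStep map(String mapFunction);
}

interface ReduceStep {
    Terminal reduce(String reduceFunction);
}

interface Terminal {
    String describe();
}

class MapReduceSketch implements MapStep, ReduceStep, Terminal {

    private String mapFunction;
    private String reduceFunction;

    @Override
    public ReduceStep map(String mapFunction) {
        this.mapFunction = mapFunction;
        return this;
    }

    @Override
    public Terminal reduce(String reduceFunction) {
        this.reduceFunction = reduceFunction;
        return this;
    }

    @Override
    public String describe() {
        return "map=" + mapFunction + ", reduce=" + reduceFunction;
    }
}
```

Because `map(…)` returns the reduce-stage type, a caller cannot invoke `reduce(…)` or execute before supplying the map function.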

View File

@@ -0,0 +1,175 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import java.util.List;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
* Implementation of {@link ExecutableMapReduceOperation}.
*
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor
class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableMapReduceOperation#mapReduce(java.lang.Class)
*/
@Override
public <T> ExecutableMapReduceSupport<T> mapReduce(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
return new ExecutableMapReduceSupport<>(template, domainType, domainType, null, ALL_QUERY, null, null, null);
}
/**
* @author Christoph Strobl
* @since 2.1
*/
static class ExecutableMapReduceSupport<T>
implements ExecutableMapReduce<T>, MapReduceWithOptions<T>, MapReduceWithCollection<T>,
MapReduceWithProjection<T>, MapReduceWithQuery<T>, MapReduceWithReduceFunction<T>, MapReduceWithMapFunction<T> {
private final MongoTemplate template;
private final Class<?> domainType;
private final Class<T> returnType;
private final @Nullable String collection;
private final Query query;
private final @Nullable String mapFunction;
private final @Nullable String reduceFunction;
private final @Nullable MapReduceOptions options;
ExecutableMapReduceSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType, @Nullable String collection,
Query query, @Nullable String mapFunction, @Nullable String reduceFunction, @Nullable MapReduceOptions options) {
this.template = template;
this.domainType = domainType;
this.returnType = returnType;
this.collection = collection;
this.query = query;
this.mapFunction = mapFunction;
this.reduceFunction = reduceFunction;
this.options = options;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableMapReduceOperation.TerminatingMapReduce#all()
*/
@Override
public List<T> all() {
return template.mapReduce(query, domainType, getCollectionName(), mapFunction, reduceFunction, options,
returnType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableMapReduceOperation.MapReduceWithCollection#inCollection(java.lang.String)
*/
@Override
public MapReduceWithProjection<T> inCollection(String collection) {
Assert.hasText(collection, "Collection name must not be null nor empty!");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableMapReduceOperation.MapReduceWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingMapReduce<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableMapReduceOperation.MapReduceWithProjection#as(java.lang.Class)
*/
@Override
public <R> MapReduceWithQuery<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
return new ExecutableMapReduceSupport<>(template, domainType, resultType, collection, query, mapFunction,
reduceFunction, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableMapReduceOperation.MapReduceWithOptions#with(org.springframework.data.mongodb.core.mapreduce.MapReduceOptions)
*/
@Override
public ExecutableMapReduce<T> with(MapReduceOptions options) {
Assert.notNull(options, "Options must not be null! Please consider empty MapReduceOptions#options() instead.");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableMapReduceOperation.MapReduceWithMapFunction#map(java.lang.String)
*/
@Override
public MapReduceWithReduceFunction<T> map(String mapFunction) {
Assert.hasText(mapFunction, "MapFunction must not be null nor empty!");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableMapReduceOperation.MapReduceWithReduceFunction#reduce(java.lang.String)
*/
@Override
public ExecutableMapReduce<T> reduce(String reduceFunction) {
Assert.hasText(reduceFunction, "ReduceFunction must not be null nor empty!");
return new ExecutableMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
}
private String getCollectionName() {
return StringUtils.hasText(collection) ? collection : template.determineCollectionName(domainType);
}
}
}
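Each fluent step of `ExecutableMapReduceSupport` above returns a fresh instance carrying all previously collected values instead of mutating fields, and `getCollectionName()` prefers the explicit override over the collection derived from the domain type. A small self-contained sketch of that copy-on-step style (hypothetical `StepState`, not the Spring class):

```java
// Copy-on-step sketch: fluent calls return a new immutable instance, so an
// earlier step object stays unchanged and can be safely reused or shared.
final class StepState {

    final String collection; // optional override, may be null
    final String mapFunction;

    StepState(String collection, String mapFunction) {
        this.collection = collection;
        this.mapFunction = mapFunction;
    }

    StepState inCollection(String collection) {
        return new StepState(collection, this.mapFunction); // original untouched
    }

    // Mirrors getCollectionName(): explicit override wins, else the derived default.
    String collectionOrDefault(String derived) {
        return (collection != null && !collection.isEmpty()) ? collection : derived;
    }
}
```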

View File

@@ -86,6 +86,13 @@ public interface ExecutableRemoveOperation {
*/
DeleteResult all();
/**
* Remove the first matching document.
*
* @return the {@link DeleteResult}. Never {@literal null}.
*/
DeleteResult one();
/**
* Remove and return all matching documents. <br/>
* <strong>NOTE:</strong> The entire list of documents will be fetched before sending the actual delete commands.

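The new `one()` terminator deletes only the first matching document, whereas `all()` deletes every match. A plain-Java analogy of the two semantics using an in-memory list and a predicate (illustrative only; the real methods issue `deleteOne`/`deleteMany` commands against MongoDB):

```java
import java.util.List;
import java.util.function.Predicate;

// In-memory analogy for TerminatingRemove: removeOne deletes only the first
// match, removeAll deletes every match. Both return the number of removals.
class RemoveSemantics {

    static <T> int removeOne(List<T> docs, Predicate<T> query) {
        for (int i = 0; i < docs.size(); i++) {
            if (query.test(docs.get(i))) {
                docs.remove(i);
                return 1;
            }
        }
        return 0;
    }

    static <T> int removeAll(List<T> docs, Predicate<T> query) {
        int before = docs.size();
        docs.removeIf(query);
        return before - docs.size();
    }
}
```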
View File

@@ -36,24 +36,12 @@ import com.mongodb.client.result.DeleteResult;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)
@@ -110,10 +98,16 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
*/
@Override
public DeleteResult all() {
return template.doRemove(getCollectionName(), query, domainType, true);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableRemoveOperation.TerminatingRemove#one()
*/
@Override
public DeleteResult one() {
return template.doRemove(getCollectionName(), query, domainType, false);
}
/*

View File

@@ -35,23 +35,12 @@ import com.mongodb.client.result.UpdateResult;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)

View File

@@ -22,4 +22,4 @@ package org.springframework.data.mongodb.core;
* @since 2.0
*/
public interface FluentMongoOperations extends ExecutableFindOperation, ExecutableInsertOperation,
ExecutableUpdateOperation, ExecutableRemoveOperation, ExecutableAggregationOperation {}
ExecutableUpdateOperation, ExecutableRemoveOperation, ExecutableAggregationOperation, ExecutableMapReduceOperation {}

View File

@@ -0,0 +1,262 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.Value;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.SessionAwareMethodInterceptor;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.ClientSessionOptions;
import com.mongodb.DB;
import com.mongodb.WriteConcern;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
/**
* Common base class for usage with both {@link com.mongodb.client.MongoClients} and {@link com.mongodb.MongoClient}
* defining common properties such as database name and exception translator.
* <p/>
* Not intended to be used directly.
*
* @author Christoph Strobl
* @author Mark Paluch
* @param <C> Client type.
* @since 2.1
* @see SimpleMongoDbFactory
* @see SimpleMongoClientDbFactory
*/
public abstract class MongoDbFactorySupport<C> implements MongoDbFactory {
private final C mongoClient;
private final String databaseName;
private final boolean mongoInstanceCreated;
private final PersistenceExceptionTranslator exceptionTranslator;
private @Nullable WriteConcern writeConcern;
/**
* Create a new {@link MongoDbFactorySupport} object given {@code mongoClient}, {@code databaseName},
* {@code mongoInstanceCreated} and {@link PersistenceExceptionTranslator}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @param mongoInstanceCreated {@literal true} if the client instance was created by a subclass of
* {@link MongoDbFactorySupport} to close the client on {@link #destroy()}.
* @param exceptionTranslator must not be {@literal null}.
*/
protected MongoDbFactorySupport(C mongoClient, String databaseName, boolean mongoInstanceCreated,
PersistenceExceptionTranslator exceptionTranslator) {
Assert.notNull(mongoClient, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
Assert.isTrue(databaseName.matches("[^/\\\\.$\"\\s]+"),
"Database name must not contain slashes, dots, spaces, quotes, or dollar signs!");
this.mongoClient = mongoClient;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.exceptionTranslator = exceptionTranslator;
}
/**
* Configures the {@link WriteConcern} to be used on the {@link MongoDatabase} instance being created.
*
* @param writeConcern the writeConcern to set
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb()
*/
public MongoDatabase getDb() throws DataAccessException {
return getDb(databaseName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/
@Override
public MongoDatabase getDb(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty!");
MongoDatabase db = doGetMongoDatabase(dbName);
if (writeConcern == null) {
return db;
}
return db.withWriteConcern(writeConcern);
}
/**
* Get the actual {@link MongoDatabase} from the client.
*
* @param dbName must not be {@literal null} or empty.
* @return the {@link MongoDatabase}, never {@literal null}.
*/
protected abstract MongoDatabase doGetMongoDatabase(String dbName);
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.DisposableBean#destroy()
*/
public void destroy() throws Exception {
if (mongoInstanceCreated) {
closeClient();
}
}
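`destroy()` above closes the underlying client only when `mongoInstanceCreated` is set, i.e. when the factory created the client itself rather than receiving an externally managed one. The ownership flag in isolation, as a hypothetical `OwnedClient` sketch:

```java
// Ownership-flag sketch mirroring destroy(): the wrapper closes the client
// only if it created the client instance itself; externally supplied clients
// remain the caller's responsibility.
class OwnedClient {

    private final boolean instanceCreated;
    private boolean closed;

    OwnedClient(boolean instanceCreated) {
        this.instanceCreated = instanceCreated;
    }

    void destroy() {
        if (instanceCreated) {
            closeClient();
        }
    }

    private void closeClient() {
        closed = true;
    }

    boolean isClosed() {
        return closed;
    }
}
```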
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.Session)
*/
public MongoDbFactory withSession(ClientSession session) {
return new MongoDbFactorySupport.ClientSessionBoundMongoDbFactory(session, this);
}
/**
* Close the client instance.
*/
protected abstract void closeClient();
/**
* @return the Mongo client object.
*/
protected C getMongoClient() {
return mongoClient;
}
/**
* @return the database name.
*/
protected String getDefaultDatabaseName() {
return databaseName;
}
/**
* {@link ClientSession} bound {@link MongoDbFactory} decorating the database with a
* {@link SessionAwareMethodInterceptor}.
*
* @author Christoph Strobl
* @since 2.1
*/
@Value
static class ClientSessionBoundMongoDbFactory implements MongoDbFactory {
ClientSession session;
MongoDbFactory delegate;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb()
*/
@Override
public MongoDatabase getDb() throws DataAccessException {
return proxyMongoDatabase(delegate.getDb());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/
@Override
public MongoDatabase getDb(String dbName) throws DataAccessException {
return proxyMongoDatabase(delegate.getDb(dbName));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return delegate.getExceptionTranslator();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getLegacyDb()
*/
@Override
public DB getLegacyDb() {
return delegate.getLegacyDb();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public ClientSession getSession(ClientSessionOptions options) {
return delegate.getSession(options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.ClientSession)
*/
@Override
public MongoDbFactory withSession(ClientSession session) {
return delegate.withSession(session);
}
private MongoDatabase proxyMongoDatabase(MongoDatabase database) {
return createProxyInstance(session, database, MongoDatabase.class);
}
private MongoDatabase proxyDatabase(com.mongodb.session.ClientSession session, MongoDatabase database) {
return createProxyInstance(session, database, MongoDatabase.class);
}
private MongoCollection<?> proxyCollection(com.mongodb.session.ClientSession session, MongoCollection<?> collection) {
return createProxyInstance(session, collection, MongoCollection.class);
}
private <T> T createProxyInstance(com.mongodb.session.ClientSession session, T target, Class<T> targetType) {
ProxyFactory factory = new ProxyFactory();
factory.setTarget(target);
factory.setInterfaces(targetType);
factory.setOpaque(true);
factory.addAdvice(new SessionAwareMethodInterceptor<>(session, target, ClientSession.class, MongoDatabase.class,
this::proxyDatabase, MongoCollection.class, this::proxyCollection));
return targetType.cast(factory.getProxy());
}
}
}
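`ClientSessionBoundMongoDbFactory` decorates every returned `MongoDatabase` with a proxy so that calls can be routed through the bound `ClientSession`. A simplified sketch of the same idea using a plain JDK dynamic proxy and a hypothetical `Db` interface (the real code uses Spring AOP's `ProxyFactory` with `SessionAwareMethodInterceptor` to proxy `MongoDatabase` and `MongoCollection`):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

// Hypothetical target interface; the real code proxies MongoDatabase.
interface Db {
    String name();
}

class SessionBinding {

    // Returns a proxy that records the bound session for every call before
    // delegating to the target, mimicking session-aware method interception.
    static Db bind(Db target, String session, StringBuilder log) {

        InvocationHandler handler = (proxy, method, args) -> {
            log.append(session).append(':').append(method.getName()).append(';');
            return method.invoke(target, args);
        };

        return (Db) Proxy.newProxyInstance(Db.class.getClassLoader(), new Class<?>[] { Db.class }, handler);
    }
}
```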

View File

@@ -46,10 +46,10 @@ import org.springframework.util.Assert;
import com.mongodb.ClientSessionOptions;
import com.mongodb.Cursor;
import com.mongodb.ReadPreference;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
/**
* Interface that specifies a basic set of MongoDB operations. Implemented by {@link MongoTemplate}. Not often used but
@@ -1016,7 +1016,7 @@ public interface MongoOperations extends FluentMongoOperations {
* Insert a mixed Collection of objects into a database collection determining the collection name to use based on the
* class.
*
* @param objectsToSave the list of objects to save. Must not be {@literal null}.
*/
void insertAll(Collection<? extends Object> objectsToSave);

View File

@@ -61,7 +61,9 @@ import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mongodb.MongoDatabaseUtils;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.SessionSynchronization;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.DefaultBulkOperations.BulkOperationContext;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
@@ -71,7 +73,16 @@ import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.Fields;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.JsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;
import org.springframework.data.mongodb.core.convert.MongoJsonSchemaMapper;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.index.IndexOperations;
import org.springframework.data.mongodb.core.index.IndexOperationsProvider;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
@@ -125,6 +136,7 @@ import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.client.AggregateIterable;
import com.mongodb.client.ClientSession;
import com.mongodb.client.DistinctIterable;
import com.mongodb.client.FindIterable;
import com.mongodb.client.MapReduceIterable;
@@ -135,7 +147,6 @@ import com.mongodb.client.MongoIterable;
import com.mongodb.client.model.*;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
import com.mongodb.util.JSONParseException;
/**
@@ -196,8 +207,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private @Nullable ResourceLoader resourceLoader;
private @Nullable MongoPersistentEntityIndexCreator indexCreator;
private SessionSynchronization sessionSynchronization = SessionSynchronization.ON_ACTUAL_TRANSACTION;
/**
* Constructor used for a basic template configuration.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
@@ -205,6 +218,17 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
public MongoTemplate(MongoClient mongoClient, String databaseName) {
this(new SimpleMongoDbFactory(mongoClient, databaseName), (MongoConverter) null);
}
/**
* Constructor used for a basic template configuration.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @since 2.1
*/
public MongoTemplate(com.mongodb.client.MongoClient mongoClient, String databaseName) {
this(new SimpleMongoClientDbFactory(mongoClient, databaseName), (MongoConverter) null);
}
/**
* Constructor used for a basic template configuration.
@@ -249,6 +273,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
this.mongoDbFactory = dbFactory;
this.exceptionTranslator = that.exceptionTranslator;
this.sessionSynchronization = that.sessionSynchronization;
this.mongoConverter = that.mongoConverter instanceof MappingMongoConverter ? getDefaultMongoConverter(dbFactory)
: that.mongoConverter;
this.queryMapper = that.queryMapper;
@@ -564,6 +589,17 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return new SessionBoundMongoTemplate(session, MongoTemplate.this);
}
/**
* Define if {@link MongoTemplate} should participate in transactions. Default is set to
* {@link SessionSynchronization#ON_ACTUAL_TRANSACTION}.<br />
* <strong>NOTE:</strong> MongoDB transactions require at least MongoDB 4.0.
*
* @since 2.1
*/
public void setSessionSynchronization(SessionSynchronization sessionSynchronization) {
this.sessionSynchronization = sessionSynchronization;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#createCollection(java.lang.Class)
@@ -669,7 +705,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
public Void doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {
collection.drop();
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Dropped collection [{}]",
collection.getNamespace() != null ? collection.getNamespace().getCollectionName() : collectionName);
}
return null;
}
@@ -1140,7 +1177,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* In case of using MongoDB Java driver version 3 the returned {@link WriteConcern} will be defaulted to
* {@link WriteConcern#ACKNOWLEDGED} when {@link WriteResultChecking} is set to {@link WriteResultChecking#EXCEPTION}.
*
* @param mongoAction any MongoAction already configured or null
* @return The prepared WriteConcern or null
*/
@Nullable
@@ -1456,10 +1493,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
collection.withWriteConcern(writeConcernToUse).insertOne(dbDoc);
}
} else if (writeConcernToUse == null) {
collection.replaceOne(Filters.eq(ID_FIELD, dbDoc.get(ID_FIELD)), dbDoc, new ReplaceOptions().upsert(true));
} else {
collection.withWriteConcern(writeConcernToUse).replaceOne(Filters.eq(ID_FIELD, dbDoc.get(ID_FIELD)), dbDoc,
new ReplaceOptions().upsert(true));
}
return dbDoc.get(ID_FIELD);
}
@@ -1565,7 +1602,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
collection = writeConcernToUse != null ? collection.withWriteConcern(writeConcernToUse) : collection;
if (!UpdateMapper.isUpdateObject(updateObj)) {
ReplaceOptions replaceOptions = new ReplaceOptions();
replaceOptions.collation(opts.getCollation());
replaceOptions.upsert(opts.isUpsert());
return collection.replaceOne(queryObj, updateObj, replaceOptions);
} else {
if (multi) {
return collection.updateMany(queryObj, updateObj, opts);
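The branch above issues a `replaceOne` when the mapped update document carries no update operators, and an `updateMany`/`updateOne` otherwise. A sketch of that decision under the assumption that any top-level key starting with `$` marks an update document (the actual check lives in `UpdateMapper.isUpdateObject`):

```java
import java.util.Map;

// Sketch of the replace-vs-update decision: a document whose top-level keys
// contain no "$" update operators (e.g. "$set") is treated as a full
// replacement of the matched document rather than a partial update.
class UpdateCheck {

    static boolean isUpdateObject(Map<String, ?> doc) {
        for (String key : doc.keySet()) {
            if (key.startsWith("$")) {
                return true;
            }
        }
        return false;
    }
}
```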
@@ -1601,7 +1643,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(object, "Object must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
return doRemove(collectionName, getIdQueryFor(object), object.getClass(), false);
}
/**
@@ -1690,7 +1732,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
@Override
public DeleteResult remove(Query query, String collectionName) {
return doRemove(collectionName, query, null, true);
}
@Override
@@ -1702,11 +1744,11 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
public DeleteResult remove(Query query, Class<?> entityClass, String collectionName) {
Assert.notNull(entityClass, "EntityClass must not be null!");
return doRemove(collectionName, query, entityClass, true);
}
protected <T> DeleteResult doRemove(final String collectionName, final Query query,
@Nullable final Class<T> entityClass, boolean multi) {
Assert.notNull(query, "Query must not be null!");
Assert.hasText(collectionName, "Collection name must not be null or empty!");
@@ -1731,7 +1773,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
WriteConcern writeConcernToUse = prepareWriteConcern(mongoAction);
DeleteResult dr = null;
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Remove using query: {} in collection: {}.",
new Object[] { serializeToJsonSafely(removeQuery), collectionName });
@@ -1750,15 +1791,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
removeQuery = new Document(ID_FIELD, new Document("$in", ids));
}
MongoCollection<Document> collectionToUse = writeConcernToUse != null
? collection.withWriteConcern(writeConcernToUse) : collection;
DeleteResult result = multi ? collectionToUse.deleteMany(removeQuery, options)
: collectionToUse.deleteOne(removeQuery, options);
maybeEmitEvent(new AfterDeleteEvent<T>(queryObject, entityClass, collectionName));
return result;
}
});
}
@@ -1798,30 +1839,48 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
public <T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction,
String reduceFunction, @Nullable MapReduceOptions mapReduceOptions, Class<T> entityClass) {
Assert.notNull(query, "Query must not be null!");
Assert.notNull(inputCollectionName, "InputCollectionName must not be null!");
Assert.notNull(entityClass, "EntityClass must not be null!");
Assert.notNull(reduceFunction, "ReduceFunction must not be null!");
Assert.notNull(mapFunction, "MapFunction must not be null!");
return new MapReduceResults<>(
mapReduce(query, entityClass, inputCollectionName, mapFunction, reduceFunction, mapReduceOptions, entityClass),
new Document());
}
/**
* Execute a mapReduce operation over the given input collection and map the results onto {@code resultType}.
*
* @param query the filter query, must not be {@literal null}.
* @param domainType the domain type used for query and sort mapping, must not be {@literal null}.
* @param inputCollectionName the collection to read input documents from, must not be {@literal null}.
* @param mapFunction the Javascript map {@code function()}, must not be {@literal null}.
* @param reduceFunction the Javascript reduce {@code function()}, must not be {@literal null}.
* @param mapReduceOptions additional options, can be {@literal null}.
* @param resultType the type to map results to, must not be {@literal null}.
* @return the mapped results, never {@literal null}.
* @since 2.1
*/
public <T> List<T> mapReduce(Query query, Class<?> domainType, String inputCollectionName, String mapFunction,
String reduceFunction, @Nullable MapReduceOptions mapReduceOptions, Class<T> resultType) {
Assert.notNull(domainType, "Domain type must not be null!");
Assert.notNull(inputCollectionName, "Input collection name must not be null!");
Assert.notNull(resultType, "Result type must not be null!");
Assert.notNull(mapFunction, "Map function must not be null!");
Assert.notNull(reduceFunction, "Reduce function must not be null!");
String mapFunc = replaceWithResourceIfNecessary(mapFunction);
String reduceFunc = replaceWithResourceIfNecessary(reduceFunction);
MongoCollection<Document> inputCollection = getAndPrepareCollection(doGetDatabase(), inputCollectionName);
MapReduceIterable<Document> mapReduce = inputCollection.mapReduce(mapFunc, reduceFunc, Document.class);
if (query.getLimit() > 0 && mapReduceOptions != null && mapReduceOptions.getLimit() == null) {
mapReduce = mapReduce.limit(query.getLimit());
}
if (query.getMeta().getMaxTimeMsec() != null) {
mapReduce = mapReduce.maxTime(query.getMeta().getMaxTimeMsec(), TimeUnit.MILLISECONDS);
}
mapReduce = mapReduce.sort(getMappedSortObject(query, domainType));
mapReduce = mapReduce
.filter(queryMapper.getMappedObject(query.getQueryObject(), mappingContext.getPersistentEntity(domainType)));
Optional<Collation> collation = query.getCollation();
@@ -1837,32 +1896,32 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
if (!CollectionUtils.isEmpty(mapReduceOptions.getScopeVariables())) {
mapReduce = mapReduce.scope(new Document(mapReduceOptions.getScopeVariables()));
}
if (mapReduceOptions.getLimit() != null && mapReduceOptions.getLimit() > 0) {
mapReduce = mapReduce.limit(mapReduceOptions.getLimit());
}
if (mapReduceOptions.getFinalizeFunction().filter(StringUtils::hasText).isPresent()) {
mapReduce = mapReduce.finalizeFunction(mapReduceOptions.getFinalizeFunction().get());
}
if (mapReduceOptions.getJavaScriptMode() != null) {
mapReduce = mapReduce.jsMode(mapReduceOptions.getJavaScriptMode());
}
if (mapReduceOptions.getOutputSharded().isPresent()) {
mapReduce = mapReduce.sharded(mapReduceOptions.getOutputSharded().get());
}
}
mapReduce = collation.map(Collation::toMongoCollation).map(mapReduce::collation).orElse(mapReduce);
List<T> mappedResults = new ArrayList<T>();
DocumentCallback<T> callback = new ReadDocumentCallback<T>(mongoConverter, entityClass, inputCollectionName);
List<T> mappedResults = new ArrayList<>();
DocumentCallback<T> callback = new ReadDocumentCallback<>(mongoConverter, resultType, inputCollectionName);
for (Document document : result) {
for (Document document : mapReduce) {
mappedResults.add(callback.doWith(document));
}
return new MapReduceResults<T>(mappedResults, new Document());
return mappedResults;
}
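The `Optional`-based option handling above (e.g. `collation.map(Collation::toMongoCollation).map(mapReduce::collation).orElse(mapReduce)`) is worth a closer look: it applies a builder method only when the option is present, otherwise keeps the builder untouched. A minimal, self-contained sketch of the pattern (the `Builder` class below is hypothetical and stands in for `MapReducePublisher`, not part of any driver API):

```java
import java.util.Optional;

// Sketch of the conditional-option pattern used when preparing the mapReduce
// publisher: map the Optional onto the builder method, and fall back to the
// unchanged builder instance when the option is absent.
public class OptionalOption {

	public static final class Builder {
		public final String collation;

		public Builder(String collation) {
			this.collation = collation;
		}

		public Builder collation(String collation) {
			return new Builder(collation);
		}
	}

	public static void main(String[] args) {
		Builder builder = new Builder(null);

		Optional<String> present = Optional.of("en_US");
		Optional<String> absent = Optional.empty();

		// Equivalent shape to: mapReduce = collation.map(...).map(mapReduce::collation).orElse(mapReduce);
		Builder withCollation = present.map(builder::collation).orElse(builder);
		Builder unchanged = absent.map(builder::collation).orElse(builder);

		System.out.println(withCollation.collation); // en_US
		System.out.println(unchanged == builder);    // true
	}
}
```

The same one-liner replaces an `if (collation.isPresent()) { … }` block without introducing a mutable intermediate state.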
public <T> GroupByResults<T> group(String inputCollectionName, GroupBy groupBy, Class<T> entityClass) {
@@ -2184,6 +2243,15 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
return new ExecutableAggregationOperationSupport(this).aggregateAndReturn(domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableAggregationOperation#aggregateAndReturn(java.lang.Class)
*/
@Override
public <T> ExecutableMapReduce<T> mapReduce(Class<T> domainType) {
return new ExecutableMapReduceOperationSupport(this).mapReduce(domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ExecutableInsertOperation#insert(java.lang.Class)
@@ -2243,7 +2311,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
protected MongoDatabase doGetDatabase() {
return mongoDbFactory.getDb();
return MongoDatabaseUtils.getDatabase(mongoDbFactory, sessionSynchronization);
}
protected MongoDatabase prepareDatabase(MongoDatabase database) {
@@ -2305,7 +2373,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
// TODO: Emit a collection created event
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Created collection [{}]", coll.getNamespace().getCollectionName());
LOGGER.debug("Created collection [{}]",
coll.getNamespace() != null ? coll.getNamespace().getCollectionName() : collectionName);
}
return coll;
}
@@ -2816,7 +2885,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findOne using query: {} fields: {} in db.collection: {}", serializeToJsonSafely(query),
serializeToJsonSafely(fields.orElseGet(Document::new)), collection.getNamespace().getFullName());
serializeToJsonSafely(fields.orElseGet(Document::new)),
collection.getNamespace() != null ? collection.getNamespace().getFullName() : "n/a");
}
if (fields.isPresent()) {
@@ -2959,7 +3029,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
/**
* Simple {@link DocumentCallback} that will transform {@link Document} into the given target type using the given
* {@link MongoReader}.
* {@link EntityReader}.
*
* @author Oliver Gierke
* @author Christoph Strobl


@@ -19,7 +19,8 @@ package org.springframework.data.mongodb.core;
* Stripped down interface providing access to a fluent API that specifies a basic set of reactive MongoDB operations.
*
* @author Mark Paluch
* @author Christoph Strobl
* @since 2.0
*/
public interface ReactiveFluentMongoOperations extends ReactiveFindOperation, ReactiveInsertOperation,
ReactiveUpdateOperation, ReactiveRemoveOperation, ReactiveAggregationOperation {}
ReactiveUpdateOperation, ReactiveRemoveOperation, ReactiveAggregationOperation, ReactiveMapReduceOperation {}


@@ -0,0 +1,199 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import reactor.core.publisher.Flux;
import org.springframework.data.mongodb.core.ExecutableFindOperation.ExecutableFind;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.Query;
/**
* {@link ReactiveMapReduceOperation} allows creation and execution of MongoDB mapReduce operations in a fluent API
* style. The starting {@literal domainType} is used for mapping an optional {@link Query} provided via {@code matching}
* into the MongoDB specific representation. By default, the originating {@literal domainType} is also used for mapping
* back the results from the {@link org.bson.Document}. However, it is possible to define a different
* {@literal returnType} via {@code as} for mapping the result.<br />
* The collection to operate on is by default derived from the initial {@literal domainType} and can be defined there
* via {@link org.springframework.data.mongodb.core.mapping.Document}. Using {@code inCollection} allows overriding the
* collection name for the execution.
*
* <pre>
* <code>
* mapReduce(Human.class)
* .map("function() { emit(this.id, this.firstname) }")
* .reduce("function(id, name) { return sum(id, name); }")
* .inCollection("star-wars")
* .as(Jedi.class)
* .matching(query(where("lastname").is("skywalker")))
* .all();
* </code>
* </pre>
*
* @author Christoph Strobl
* @since 2.1
*/
public interface ReactiveMapReduceOperation {
/**
* Start creating a mapReduce operation for the given {@literal domainType}.
*
* @param domainType must not be {@literal null}.
* @return new instance of {@link MapReduceWithMapFunction}.
* @throws IllegalArgumentException if domainType is {@literal null}.
*/
<T> MapReduceWithMapFunction<T> mapReduce(Class<T> domainType);
/**
* Trigger mapReduce execution by calling one of the terminating methods.
*
* @author Christoph Strobl
* @since 2.1
*/
interface TerminatingMapReduce<T> {
/**
* Get the {@link Flux} emitting mapReduce results.
*
* @return a {@link Flux} emitting the already mapped operation results.
*/
Flux<T> all();
}
/**
* Provide the Javascript {@code function()} used to map matching documents.
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithMapFunction<T> {
/**
* Set the Javascript map {@code function()}.
*
* @param mapFunction must not be {@literal null} nor empty.
* @return new instance of {@link MapReduceWithReduceFunction}.
* @throws IllegalArgumentException if {@literal mapFunction} is {@literal null} or empty.
*/
MapReduceWithReduceFunction<T> map(String mapFunction);
}
/**
* Provide the Javascript {@code function()} used to reduce matching documents.
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithReduceFunction<T> {
/**
* Set the Javascript reduce {@code function()}.
*
* @param reduceFunction must not be {@literal null} nor empty.
* @return new instance of {@link ReactiveMapReduce}.
* @throws IllegalArgumentException if {@literal reduceFunction} is {@literal null} or empty.
*/
ReactiveMapReduce<T> reduce(String reduceFunction);
}
/**
* Collection override (Optional).
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithCollection<T> extends MapReduceWithQuery<T> {
/**
* Explicitly set the name of the collection to perform the mapReduce operation on. <br />
* Skip this step to use the default collection derived from the domain type.
*
* @param collection must not be {@literal null} nor {@literal empty}.
* @return new instance of {@link MapReduceWithProjection}.
* @throws IllegalArgumentException if collection is {@literal null}.
*/
MapReduceWithProjection<T> inCollection(String collection);
}
/**
* Input document filter query (Optional).
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithQuery<T> extends TerminatingMapReduce<T> {
/**
* Set the filter query to be used.
*
* @param query must not be {@literal null}.
* @return new instance of {@link TerminatingMapReduce}.
* @throws IllegalArgumentException if query is {@literal null}.
*/
TerminatingMapReduce<T> matching(Query query);
}
/**
* Result type override (Optional).
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithProjection<T> extends MapReduceWithQuery<T> {
/**
* Define the target type fields should be mapped to. <br />
* Skip this step if you are only interested in the original domain type anyway.
*
* @param resultType must not be {@literal null}.
* @param <R> result type.
* @return new instance of {@link TerminatingMapReduce}.
* @throws IllegalArgumentException if resultType is {@literal null}.
*/
<R> MapReduceWithQuery<R> as(Class<R> resultType);
}
/**
* Additional mapReduce options (Optional).
*
* @author Christoph Strobl
* @since 2.1
*/
interface MapReduceWithOptions<T> {
/**
* Set additional options to apply to the mapReduce operation.
*
* @param options must not be {@literal null}.
* @return new instance of {@link ReactiveMapReduce}.
* @throws IllegalArgumentException if options is {@literal null}.
*/
ReactiveMapReduce<T> with(MapReduceOptions options);
}
/**
* {@link ReactiveMapReduce} provides methods for constructing reactive mapReduce operations in a fluent way.
*
* @author Christoph Strobl
* @since 2.1
*/
interface ReactiveMapReduce<T> extends MapReduceWithMapFunction<T>, MapReduceWithReduceFunction<T>,
MapReduceWithCollection<T>, MapReduceWithProjection<T>, MapReduceWithOptions<T> {
}
}
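The interface hierarchy above is a step-builder: each call returns a narrower interface, so the compiler enforces that `map(…)` precedes `reduce(…)` and that only then a terminating method is reachable. A minimal, self-contained sketch of the idea (no MongoDB or reactive types involved; `FluentSteps` and its members are illustrative names, not Spring Data API):

```java
import java.util.List;

// Sketch of the step-interface pattern behind ReactiveMapReduceOperation:
// every step exposes only the calls that are legal next, and one immutable
// implementation backs all steps.
public class FluentSteps {

	public interface WithMap {
		WithReduce map(String mapFunction);
	}

	public interface WithReduce {
		Terminating reduce(String reduceFunction);
	}

	public interface Terminating {
		List<String> all();
	}

	// A record keeps the accumulated state immutable; each step copies it.
	public record Support(String mapFunction, String reduceFunction) implements WithMap, WithReduce, Terminating {

		public WithReduce map(String mapFunction) {
			return new Support(mapFunction, reduceFunction);
		}

		public Terminating reduce(String reduceFunction) {
			return new Support(mapFunction, reduceFunction);
		}

		public List<String> all() {
			return List.of(mapFunction, reduceFunction);
		}
	}

	public static void main(String[] args) {
		List<String> functions = new Support(null, null)
				.map("function() { emit(this.id, 1) }")
				.reduce("function(k, v) { return Array.sum(v) }")
				.all();
		System.out.println(functions);
	}
}
```

Calling `reduce(…)` before `map(…)` simply does not compile, which is the point of splitting the builder into single-method interfaces.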


@@ -0,0 +1,177 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import reactor.core.publisher.Flux;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
* Implementation of {@link ReactiveMapReduceOperation}.
*
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor
class ReactiveMapReduceOperationSupport implements ReactiveMapReduceOperation {
private static final Query ALL_QUERY = new Query();
private final @NonNull ReactiveMongoTemplate template;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMapReduceOperation#mapReduce(java.lang.Class)
*/
@Override
public <T> ReactiveMapReduceSupport<T> mapReduce(Class<T> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
return new ReactiveMapReduceSupport<>(template, domainType, domainType, null, ALL_QUERY, null, null, null);
}
/**
* @author Christoph Strobl
* @since 2.1
*/
static class ReactiveMapReduceSupport<T>
implements ReactiveMapReduce<T>, MapReduceWithOptions<T>, MapReduceWithCollection<T>, MapReduceWithProjection<T>,
MapReduceWithQuery<T>, MapReduceWithReduceFunction<T>, MapReduceWithMapFunction<T> {
private final ReactiveMongoTemplate template;
private final Class<?> domainType;
private final Class<T> returnType;
private final @Nullable String collection;
private final Query query;
private final @Nullable String mapFunction;
private final @Nullable String reduceFunction;
private final @Nullable MapReduceOptions options;
ReactiveMapReduceSupport(ReactiveMongoTemplate template, Class<?> domainType, Class<T> returnType,
@Nullable String collection, Query query, @Nullable String mapFunction, @Nullable String reduceFunction,
@Nullable MapReduceOptions options) {
this.template = template;
this.domainType = domainType;
this.returnType = returnType;
this.collection = collection;
this.query = query;
this.mapFunction = mapFunction;
this.reduceFunction = reduceFunction;
this.options = options;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMapReduceOperation.TerminatingMapReduce#all()
*/
@Override
public Flux<T> all() {
return template.mapReduce(query, domainType, getCollectionName(), returnType, mapFunction, reduceFunction,
options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMapReduceOperation.MapReduceWithCollection#inCollection(java.lang.String)
*/
@Override
public MapReduceWithProjection<T> inCollection(String collection) {
Assert.hasText(collection, "Collection name must not be null nor empty!");
return new ReactiveMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMapReduceOperation.MapReduceWithQuery#matching(org.springframework.data.mongodb.core.query.Query)
*/
@Override
public TerminatingMapReduce<T> matching(Query query) {
Assert.notNull(query, "Query must not be null!");
return new ReactiveMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMapReduceOperation.MapReduceWithProjection#as(java.lang.Class)
*/
@Override
public <R> MapReduceWithQuery<R> as(Class<R> resultType) {
Assert.notNull(resultType, "ResultType must not be null!");
return new ReactiveMapReduceSupport<>(template, domainType, resultType, collection, query, mapFunction,
reduceFunction, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMapReduceOperation.MapReduceWithOptions#with(org.springframework.data.mongodb.core.mapreduce.MapReduceOptions)
*/
@Override
public ReactiveMapReduce<T> with(MapReduceOptions options) {
Assert.notNull(options, "Options must not be null! Consider using MapReduceOptions#options() for empty options instead.");
return new ReactiveMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMapReduceOperation.MapReduceWithMapFunction#map(java.lang.String)
*/
@Override
public MapReduceWithReduceFunction<T> map(String mapFunction) {
Assert.hasText(mapFunction, "MapFunction must not be null nor empty!");
return new ReactiveMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMapReduceOperation.MapReduceWithReduceFunction#reduce(java.lang.String)
*/
@Override
public ReactiveMapReduce<T> reduce(String reduceFunction) {
Assert.hasText(reduceFunction, "ReduceFunction must not be null nor empty!");
return new ReactiveMapReduceSupport<>(template, domainType, returnType, collection, query, mapFunction,
reduceFunction, options);
}
private String getCollectionName() {
return StringUtils.hasText(collection) ? collection : template.determineCollectionName(domainType);
}
}
}
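Note that every mutator in `ReactiveMapReduceSupport` copies the full state into a new instance instead of changing a field, so a partially configured operation can be shared and branched safely. A self-contained sketch of that "wither" style (the `MapReduceSpec` class below is hypothetical, reduced to two of the carried fields):

```java
// Sketch of the immutable "wither" style used by ReactiveMapReduceSupport:
// each configuration step returns a new instance carrying all prior state,
// leaving the original untouched.
public class ImmutableWither {

	public static final class MapReduceSpec {
		public final String collection;
		public final String mapFunction;

		public MapReduceSpec(String collection, String mapFunction) {
			this.collection = collection;
			this.mapFunction = mapFunction;
		}

		public MapReduceSpec inCollection(String collection) {
			return new MapReduceSpec(collection, this.mapFunction);
		}

		public MapReduceSpec map(String mapFunction) {
			return new MapReduceSpec(this.collection, mapFunction);
		}
	}

	public static void main(String[] args) {
		MapReduceSpec base = new MapReduceSpec("people", null);
		MapReduceSpec withMap = base.map("function() { emit(this.id, 1) }");

		// The original instance is untouched; only the copy carries the map function.
		System.out.println(base.mapFunction);   // null
		System.out.println(withMap.mapFunction);
		System.out.println(withMap.collection); // people
	}
}
```

This is why the support class has only `final` fields and an all-arguments constructor: thread safety and reuse come for free.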


@@ -0,0 +1,68 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.reactivestreams.Publisher;
import org.springframework.util.Assert;
import reactor.core.publisher.Mono;
import reactor.util.context.Context;
import com.mongodb.reactivestreams.client.ClientSession;
/**
* {@link ReactiveMongoContext} utilizes and enriches the Reactor {@link Context} with information potentially required
* for e.g. {@link ClientSession} handling and transactions.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
* @see Mono#subscriberContext()
* @see Context
*/
public class ReactiveMongoContext {
private static final Class<?> SESSION_KEY = ClientSession.class;
/**
* Gets the {@code Mono<ClientSession>} from Reactor {@link reactor.util.context.Context}. The resulting {@link Mono}
* emits the {@link ClientSession} if a session is associated with the current {@link reactor.util.context.Context
* subscriber context}. If the context does not contain a session, the resulting {@link Mono} terminates empty (i.e.
* without emitting a value).
*
* @return the {@link Mono} emitting the client session if present; otherwise the {@link Mono} terminates empty.
*/
public static Mono<ClientSession> getSession() {
return Mono.subscriberContext().filter(ctx -> ctx.hasKey(SESSION_KEY))
.flatMap(ctx -> ctx.<Mono<ClientSession>> get(SESSION_KEY));
}
/**
* Sets the {@link ClientSession} into the Reactor {@link reactor.util.context.Context}.
*
* @param context must not be {@literal null}.
* @param session must not be {@literal null}.
* @return a new {@link Context}.
* @see Context#put(Object, Object)
*/
public static Context setSession(Context context, Publisher<ClientSession> session) {
Assert.notNull(context, "Context must not be null!");
Assert.notNull(session, "Session publisher must not be null!");
return context.put(SESSION_KEY, Mono.from(session));
}
}
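The Reactor `Context` that `ReactiveMongoContext` writes into is an immutable key/value store: `put(…)` returns a new context, so the session travels with the subscription without shared mutable state. A minimal plain-Java sketch of that contract (the `Context` class below is a stand-in for `reactor.util.context.Context`, not the real type):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Sketch of the immutable-context contract ReactiveMongoContext relies on:
// put(...) never mutates, it derives a new context with the extra entry.
public class ContextSketch {

	public static final class Context {
		private final Map<Object, Object> values;

		public Context(Map<Object, Object> values) {
			this.values = Map.copyOf(values);
		}

		public Context put(Object key, Object value) {
			Map<Object, Object> copy = new HashMap<>(values);
			copy.put(key, value);
			return new Context(copy);
		}

		public Optional<Object> get(Object key) {
			return Optional.ofNullable(values.get(key));
		}
	}

	public static void main(String[] args) {
		Context empty = new Context(Map.of());
		Context withSession = empty.put("session", "client-session-1");

		// Mirrors getSession(): empty when no session is bound, present otherwise.
		System.out.println(empty.get("session").isPresent()); // false
		System.out.println(withSession.get("session").orElseThrow());
	}
}
```

In the real API the same shape appears as `context.put(SESSION_KEY, Mono.from(session))` on write and an empty-completing `Mono` on read when no session is bound.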


@@ -33,6 +33,7 @@ import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.index.ReactiveIndexOperations;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
@@ -45,8 +46,8 @@ import com.mongodb.ClientSessionOptions;
import com.mongodb.ReadPreference;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
import com.mongodb.reactivestreams.client.ClientSession;
import com.mongodb.reactivestreams.client.MongoCollection;
import com.mongodb.session.ClientSession;
/**
* Interface that specifies a basic set of MongoDB operations executed in a reactive way.
@@ -155,7 +156,7 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
* {@link ClientSession} when done.
*
* @param sessionProvider must not be {@literal null}.
* @return new instance of {@link SessionScoped}. Never {@literal null}.
* @return new instance of {@link ReactiveSessionScoped}. Never {@literal null}.
* @since 2.1
*/
default ReactiveSessionScoped withSession(Supplier<ClientSession> sessionProvider) {
@@ -168,40 +169,30 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding a new {@link ClientSession}
* with given {@literal sessionOptions} to each and every command issued against MongoDB.
* <p />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use
* {@link ReactiveSessionScoped#execute(ReactiveSessionCallback, Consumer)} to provide a hook for processing the
* {@link ClientSession} when done.
*
* @param sessionOptions must not be {@literal null}.
* @return new instance of {@link SessionScoped}. Never {@literal null}.
* @return new instance of {@link ReactiveSessionScoped}. Never {@literal null}.
* @since 2.1
*/
ReactiveSessionScoped withSession(ClientSessionOptions sessionOptions);
/**
* Obtain a {@link ClientSession session} bound instance of {@link SessionScoped} binding the {@link ClientSession}
* provided by the given {@link Supplier} to each and every command issued against MongoDB.
* Obtain a {@link ClientSession session} bound instance of {@link ReactiveSessionScoped} binding the
* {@link ClientSession} provided by the given {@link Publisher} to each and every command issued against MongoDB.
* <p />
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use the
* {@literal onComplete} hook to potentially close the {@link ClientSession}.
* <strong>Note:</strong> It is up to the caller to manage the {@link ClientSession} lifecycle. Use
* {@link ReactiveSessionScoped#execute(ReactiveSessionCallback, Consumer)} to provide a hook for processing the
* {@link ClientSession} when done.
*
* @param sessionProvider must not be {@literal null}.
* @return new instance of {@link SessionScoped}. Never {@literal null}.
* @return new instance of {@link ReactiveSessionScoped}. Never {@literal null}.
* @since 2.1
*/
default ReactiveSessionScoped withSession(Publisher<ClientSession> sessionProvider) {
return new ReactiveSessionScoped() {
private final Mono<ClientSession> cachedSession = Mono.from(sessionProvider).cache();
@Override
public <T> Flux<T> execute(ReactiveSessionCallback<T> action, Consumer<ClientSession> doFinally) {
return cachedSession.flatMapMany(session -> {
return Flux.from(action.doInSession(ReactiveMongoOperations.this.withSession(session)))
.doFinally((signalType) -> doFinally.accept(session));
});
}
};
}
ReactiveSessionScoped withSession(Publisher<ClientSession> sessionProvider);
/**
* Obtain a {@link ClientSession} bound instance of {@link ReactiveMongoOperations}.
@@ -214,6 +205,34 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*/
ReactiveMongoOperations withSession(ClientSession session);
/**
* Initiate a new {@link ClientSession} and obtain a {@link ClientSession session} bound instance of
* {@link ReactiveSessionScoped}. Starts the transaction and adds the {@link ClientSession} to each and every command
* issued against MongoDB.
* <p/>
* Each {@link ReactiveSessionScoped#execute(ReactiveSessionCallback) execution} initiates a new managed transaction
* that is {@link ClientSession#commitTransaction() committed} on success. Transactions are
* {@link ClientSession#abortTransaction() rolled back} upon errors.
*
* @return new instance of {@link ReactiveSessionScoped}. Never {@literal null}.
*/
ReactiveSessionScoped inTransaction();
/**
* Obtain a {@link ClientSession session} bound instance of {@link ReactiveSessionScoped}, start the transaction and
* bind the {@link ClientSession} provided by the given {@link Publisher} to each and every command issued against
* MongoDB.
* <p/>
* Each {@link ReactiveSessionScoped#execute(ReactiveSessionCallback) execution} initiates a new managed transaction
* that is {@link ClientSession#commitTransaction() committed} on success. Transactions are
* {@link ClientSession#abortTransaction() rolled back} upon errors.
*
* @param sessionProvider must not be {@literal null}.
* @return new instance of {@link ReactiveSessionScoped}. Never {@literal null}.
* @since 2.1
*/
ReactiveSessionScoped inTransaction(Publisher<ClientSession> sessionProvider);
/**
* Create an uncapped collection with a name based on the provided entity class.
*
@@ -1226,6 +1245,42 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
<T> Flux<ChangeStreamEvent<T>> changeStream(List<Document> filter, Class<T> resultType, ChangeStreamOptions options,
String collectionName);
/**
* Execute a map-reduce operation. Use {@link MapReduceOptions} to optionally specify an output collection and other
* args.
*
* @param filterQuery the selection criteria for the documents used as input to the map function. Must not be
* {@literal null}.
* @param domainType source type used to determine the input collection name and map the filter {@link Query} against.
* Must not be {@literal null}.
* @param resultType the mapping target of the operations result documents. Must not be {@literal null}.
* @param mapFunction the JavaScript map function. Must not be {@literal null}.
* @param reduceFunction the JavaScript reduce function. Must not be {@literal null}.
* @param options additional options like output collection. Must not be {@literal null}.
* @return a {@link Flux} emitting the result document sequence. Never {@literal null}.
* @since 2.1
*/
<T> Flux<T> mapReduce(Query filterQuery, Class<?> domainType, Class<T> resultType, String mapFunction,
String reduceFunction, MapReduceOptions options);
/**
* Execute a map-reduce operation. Use {@link MapReduceOptions} to optionally specify an output collection and other
* args.
*
* @param filterQuery the selection criteria for the documents used as input to the map function. Must not be
* {@literal null}.
* @param domainType source type used to map the filter {@link Query} against. Must not be {@literal null}.
* @param inputCollectionName the input collection.
* @param resultType the mapping target of the operations result documents. Must not be {@literal null}.
* @param mapFunction the JavaScript map function. Must not be {@literal null}.
* @param reduceFunction the JavaScript reduce function. Must not be {@literal null}.
* @param options additional options like output collection. Must not be {@literal null}.
* @return a {@link Flux} emitting the result document sequence. Never {@literal null}.
* @since 2.1
*/
<T> Flux<T> mapReduce(Query filterQuery, Class<?> domainType, String inputCollectionName, Class<T> resultType,
String mapFunction, String reduceFunction, MapReduceOptions options);
/**
* Returns the underlying {@link MongoConverter}.
*


@@ -26,6 +26,7 @@ import reactor.util.function.Tuple2;
import java.util.*;
import java.util.Map.Entry;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.stream.Collectors;
@@ -38,6 +39,7 @@ import org.bson.codecs.Codec;
import org.bson.conversions.Bson;
import org.bson.types.ObjectId;
import org.reactivestreams.Publisher;
import org.reactivestreams.Subscriber;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
@@ -58,10 +60,12 @@ import org.springframework.data.geo.Distance;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.Metric;
import org.springframework.data.mapping.MappingException;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.PersistentPropertyAccessor;
import org.springframework.data.mapping.PropertyPath;
import org.springframework.data.mapping.PropertyReferenceException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mapping.model.ConvertingPropertyAccessor;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
@@ -72,10 +76,9 @@ import org.springframework.data.mongodb.core.aggregation.PrefixingDelegatingAggr
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.*;
import org.springframework.data.mongodb.core.index.IndexOperationsAdapter;
import org.springframework.data.mongodb.core.index.MongoMappingEventPublisher;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.index.ReactiveIndexOperations;
import org.springframework.data.mongodb.core.index.ReactiveMongoPersistentEntityIndexCreator;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
@@ -88,6 +91,7 @@ import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeDeleteEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.data.mongodb.core.mapping.event.MongoMappingEvent;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.NearQuery;
@@ -102,16 +106,28 @@ import org.springframework.data.util.Pair;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.ResourceUtils;
import org.springframework.util.StringUtils;
import com.mongodb.*;
import com.mongodb.BasicDBObject;
import com.mongodb.ClientSessionOptions;
import com.mongodb.CursorType;
import com.mongodb.DBCollection;
import com.mongodb.DBCursor;
import com.mongodb.DBRef;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.client.model.CountOptions;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.FindOneAndDeleteOptions;
import com.mongodb.client.model.FindOneAndUpdateOptions;
import com.mongodb.client.model.ReplaceOptions;
import com.mongodb.client.model.ReturnDocument;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.ValidationOptions;
@@ -120,13 +136,14 @@ import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.UpdateResult;
import com.mongodb.reactivestreams.client.AggregatePublisher;
import com.mongodb.reactivestreams.client.ChangeStreamPublisher;
import com.mongodb.reactivestreams.client.ClientSession;
import com.mongodb.reactivestreams.client.DistinctPublisher;
import com.mongodb.reactivestreams.client.FindPublisher;
import com.mongodb.reactivestreams.client.MapReducePublisher;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoCollection;
import com.mongodb.reactivestreams.client.MongoDatabase;
import com.mongodb.reactivestreams.client.Success;
import com.mongodb.session.ClientSession;
import com.mongodb.util.JSONParseException;
/**
@@ -171,13 +188,14 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final UpdateMapper updateMapper;
private final JsonSchemaMapper schemaMapper;
private final SpelAwareProxyProjectionFactory projectionFactory;
private final ApplicationListener<MappingContextEvent<?, ?>> indexCreatorListener;
private @Nullable WriteConcern writeConcern;
private WriteConcernResolver writeConcernResolver = DefaultWriteConcernResolver.INSTANCE;
private WriteResultChecking writeResultChecking = WriteResultChecking.NONE;
private @Nullable ReadPreference readPreference;
private @Nullable ApplicationEventPublisher eventPublisher;
private @Nullable MongoPersistentEntityIndexCreator indexCreator;
private @Nullable ReactiveMongoPersistentEntityIndexCreator indexCreator;
/**
* Constructor used for a basic template configuration.
@@ -206,6 +224,21 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*/
public ReactiveMongoTemplate(ReactiveMongoDatabaseFactory mongoDatabaseFactory,
@Nullable MongoConverter mongoConverter) {
this(mongoDatabaseFactory, mongoConverter, ReactiveMongoTemplate::handleSubscriptionException);
}
/**
* Constructor used for a basic template configuration.
*
* @param mongoDatabaseFactory must not be {@literal null}.
* @param mongoConverter can be {@literal null}.
* @param subscriptionExceptionHandler exception handler called by {@link Flux#doOnError(Consumer)} on reactive type
* materialization via {@link Publisher#subscribe(Subscriber)}. This callback is used during non-blocking
* subscription of e.g. index creation {@link Publisher}s. Must not be {@literal null}.
* @since 2.1
*/
public ReactiveMongoTemplate(ReactiveMongoDatabaseFactory mongoDatabaseFactory,
@Nullable MongoConverter mongoConverter, Consumer<Throwable> subscriptionExceptionHandler) {
Assert.notNull(mongoDatabaseFactory, "ReactiveMongoDatabaseFactory must not be null!");
@@ -216,18 +249,21 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
this.updateMapper = new UpdateMapper(this.mongoConverter);
this.schemaMapper = new MongoJsonSchemaMapper(this.mongoConverter);
this.projectionFactory = new SpelAwareProxyProjectionFactory();
this.indexCreatorListener = new IndexCreatorEventListener(subscriptionExceptionHandler);
// We always have a mapping context in the converter, whether it's a simple one or not
mappingContext = this.mongoConverter.getMappingContext();
// We create indexes based on mapping events
this.mappingContext = this.mongoConverter.getMappingContext();
if (mappingContext instanceof MongoMappingContext) {
indexCreator = new MongoPersistentEntityIndexCreator((MongoMappingContext) mappingContext,
(collectionName) -> IndexOperationsAdapter.blocking(indexOps(collectionName)));
eventPublisher = new MongoMappingEventPublisher(indexCreator);
if (mappingContext instanceof ApplicationEventPublisherAware) {
((ApplicationEventPublisherAware) mappingContext).setApplicationEventPublisher(eventPublisher);
}
// We create indexes based on mapping events
if (this.mappingContext instanceof MongoMappingContext) {
MongoMappingContext mongoMappingContext = (MongoMappingContext) this.mappingContext;
this.indexCreator = new ReactiveMongoPersistentEntityIndexCreator(mongoMappingContext, this::indexOps);
this.eventPublisher = new MongoMappingEventPublisher(this.indexCreatorListener);
mongoMappingContext.setApplicationEventPublisher(this.eventPublisher);
this.mappingContext.getPersistentEntities()
.forEach(entity -> onCheckForIndexes(entity, subscriptionExceptionHandler));
}
}
@@ -240,10 +276,22 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
this.updateMapper = that.updateMapper;
this.schemaMapper = that.schemaMapper;
this.projectionFactory = that.projectionFactory;
this.indexCreator = that.indexCreator;
this.indexCreatorListener = that.indexCreatorListener;
this.mappingContext = that.mappingContext;
}
private void onCheckForIndexes(MongoPersistentEntity<?> entity, Consumer<Throwable> subscriptionExceptionHandler) {
if (indexCreator != null) {
indexCreator.checkForIndexes(entity).subscribe(v -> {}, subscriptionExceptionHandler);
}
}
private static void handleSubscriptionException(Throwable t) {
LOGGER.error("Unexpected exception during asynchronous execution", t);
}
/**
* Configures the {@link WriteResultChecking} to be used with the template. Setting {@literal null} will reset the
* default of {@link ReactiveMongoTemplate#DEFAULT_WRITE_RESULT_CHECKING}.
@@ -275,7 +323,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
/**
* Used by @{link {@link #prepareCollection(MongoCollection)} to set the {@link ReadPreference} before any operations
* Used by {@link #prepareCollection(MongoCollection)} to set the {@link ReadPreference} before any operations
* are performed.
*
* @param readPreference
@@ -302,26 +350,28 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
/**
* Inspects the given {@link ApplicationContext} for {@link MongoPersistentEntityIndexCreator} and those in turn if
* they were registered for the current {@link MappingContext}. If no creator for the current {@link MappingContext}
* can be found we manually add the internally created one as {@link ApplicationListener} to make sure indexes get
* created appropriately for entity types persisted through this {@link ReactiveMongoTemplate} instance.
* Inspects the given {@link ApplicationContext} for {@link ReactiveMongoPersistentEntityIndexCreator} and those in
* turn if they were registered for the current {@link MappingContext}. If no creator for the current
* {@link MappingContext} can be found we manually add the internally created one as {@link ApplicationListener} to
* make sure indexes get created appropriately for entity types persisted through this {@link ReactiveMongoTemplate}
* instance.
*
* @param context must not be {@literal null}.
*/
private void prepareIndexCreator(ApplicationContext context) {
String[] indexCreators = context.getBeanNamesForType(MongoPersistentEntityIndexCreator.class);
String[] indexCreators = context.getBeanNamesForType(ReactiveMongoPersistentEntityIndexCreator.class);
for (String creator : indexCreators) {
MongoPersistentEntityIndexCreator creatorBean = context.getBean(creator, MongoPersistentEntityIndexCreator.class);
ReactiveMongoPersistentEntityIndexCreator creatorBean = context.getBean(creator,
ReactiveMongoPersistentEntityIndexCreator.class);
if (creatorBean.isIndexCreatorFor(mappingContext)) {
return;
}
}
if (context instanceof ConfigurableApplicationContext) {
((ConfigurableApplicationContext) context).addApplicationListener(indexCreator);
((ConfigurableApplicationContext) context).addApplicationListener(indexCreatorListener);
}
}
@@ -411,6 +461,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
public <T> Flux<T> execute(String collectionName, ReactiveCollectionCallback<T> callback) {
Assert.notNull(callback, "ReactiveCollectionCallback must not be null!");
return createFlux(collectionName, callback);
}
@@ -429,14 +480,75 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
public <T> Flux<T> execute(ReactiveSessionCallback<T> action, Consumer<ClientSession> doFinally) {
return cachedSession.flatMapMany(session -> {
return Flux
.from(action.doInSession(new ReactiveSessionBoundMongoTemplate(session, ReactiveMongoTemplate.this)))
.doFinally((signalType) -> doFinally.accept(session));
return ReactiveMongoTemplate.this.withSession(action, session) //
.doFinally(signalType -> {
doFinally.accept(session);
});
});
}
};
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#inTransaction()
*/
@Override
public ReactiveSessionScoped inTransaction() {
return inTransaction(mongoDatabaseFactory
.getSession(ClientSessionOptions.builder().causallyConsistent(true).build()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#inTransaction(org.reactivestreams.Publisher)
*/
@Override
public ReactiveSessionScoped inTransaction(Publisher<ClientSession> sessionProvider) {
Mono<ClientSession> cachedSession = Mono.from(sessionProvider).cache();
return new ReactiveSessionScoped() {
@Override
public <T> Flux<T> execute(ReactiveSessionCallback<T> action, Consumer<ClientSession> doFinally) {
return cachedSession.flatMapMany(session -> {
if (!session.hasActiveTransaction()) {
session.startTransaction();
}
return ReactiveMongoTemplate.this.withSession(action, session) //
.materialize() //
.flatMap(signal -> {
if (session.hasActiveTransaction()) {
if (signal.isOnComplete()) {
return Mono.from(session.commitTransaction()).thenReturn(signal);
}
if (signal.isOnError()) {
return Mono.from(session.abortTransaction()).thenReturn(signal);
}
}
return Mono.just(signal);
}) //
.<T> dematerialize() //
.doFinally(signalType -> {
doFinally.accept(session);
});
});
}
};
}
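The `inTransaction` variant above materializes the stream so that the terminal signal decides whether to commit or abort before the signal is re-emitted downstream. Stripped of Reactor, the decision logic looks roughly like this sketch (the `TxSession` and `TxCoordinator` names are illustrative stand-ins, not driver API):

```java
// Hypothetical stand-in for a driver session carrying transaction state.
class TxSession {

	private boolean active;
	String outcome = "none"; // records what the coordinator decided

	void startTransaction() { active = true; }
	boolean hasActiveTransaction() { return active; }
	void commitTransaction() { active = false; outcome = "committed"; }
	void abortTransaction() { active = false; outcome = "aborted"; }
}

class TxCoordinator {

	/**
	 * Mirrors the signal handling above: commit on completion, abort on error,
	 * and leave any other signal (e.g. onNext) untouched.
	 */
	static void onTerminalSignal(TxSession session, boolean completed, Throwable error) {

		if (!session.hasActiveTransaction()) {
			return; // nothing to finish
		}
		if (completed) {
			session.commitTransaction();
		} else if (error != null) {
			session.abortTransaction();
		}
	}
}
```

The `doFinally` callback in the real template then runs regardless of the outcome, which is why the commit/abort decision has to happen before the signal is dematerialized.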
private <T> Flux<T> withSession(ReactiveSessionCallback<T> action, ClientSession session) {
return Flux.from(action.doInSession(new ReactiveSessionBoundMongoTemplate(session, ReactiveMongoTemplate.this))) //
.subscriberContext(ctx -> ReactiveMongoContext.setSession(ctx, Mono.just(session)));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#withSession(com.mongodb.session.ClientSession)
@@ -1421,10 +1533,10 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
} else if (writeConcernToUse == null) {
publisher = collection.replaceOne(Filters.eq(ID_FIELD, document.get(ID_FIELD)), document,
new UpdateOptions().upsert(true));
new ReplaceOptions().upsert(true));
} else {
publisher = collection.withWriteConcern(writeConcernToUse)
.replaceOne(Filters.eq(ID_FIELD, document.get(ID_FIELD)), document, new UpdateOptions().upsert(true));
.replaceOne(Filters.eq(ID_FIELD, document.get(ID_FIELD)), document, new ReplaceOptions().upsert(true));
}
return Mono.from(publisher).map(o -> document.get(ID_FIELD));
@@ -1531,7 +1643,12 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
query.getCollation().map(Collation::toMongoCollation).ifPresent(updateOptions::collation);
if (!UpdateMapper.isUpdateObject(updateObj)) {
return collectionToUse.replaceOne(queryObj, updateObj, updateOptions);
ReplaceOptions replaceOptions = new ReplaceOptions();
replaceOptions.upsert(updateOptions.isUpsert());
replaceOptions.collation(updateOptions.getCollation());
return collectionToUse.replaceOne(queryObj, updateObj, replaceOptions);
}
if (multi) {
return collectionToUse.updateMany(queryObj, updateObj, updateOptions);
@@ -1904,6 +2021,115 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return Flux.from(publisher).map(document -> new ChangeStreamEvent<>(document, resultType, getConverter()));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#mapReduce(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.Class, java.lang.String, java.lang.String, org.springframework.data.mongodb.core.mapreduce.MapReduceOptions)
*/
public <T> Flux<T> mapReduce(Query filterQuery, Class<?> domainType, Class<T> resultType, String mapFunction,
String reduceFunction, MapReduceOptions options) {
return mapReduce(filterQuery, domainType, determineCollectionName(domainType), resultType, mapFunction,
reduceFunction, options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#mapReduce(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String, java.lang.Class, java.lang.String, java.lang.String, org.springframework.data.mongodb.core.mapreduce.MapReduceOptions)
*/
public <T> Flux<T> mapReduce(Query filterQuery, Class<?> domainType, String inputCollectionName, Class<T> resultType,
String mapFunction, String reduceFunction, MapReduceOptions options) {
Assert.notNull(filterQuery, "Filter query must not be null!");
Assert.notNull(domainType, "Domain type must not be null!");
Assert.hasText(inputCollectionName, "Input collection name must not be null or empty!");
Assert.notNull(resultType, "Result type must not be null!");
Assert.notNull(mapFunction, "Map function must not be null!");
Assert.notNull(reduceFunction, "Reduce function must not be null!");
Assert.notNull(options, "MapReduceOptions must not be null!");
assertLocalFunctionNames(mapFunction, reduceFunction);
return createFlux(inputCollectionName, collection -> {
Document mappedQuery = queryMapper.getMappedObject(filterQuery.getQueryObject(),
mappingContext.getPersistentEntity(domainType));
MapReducePublisher<Document> publisher = collection.mapReduce(mapFunction, reduceFunction, Document.class);
if (StringUtils.hasText(options.getOutputCollection())) {
publisher = publisher.collectionName(options.getOutputCollection());
}
publisher.filter(mappedQuery);
publisher.sort(getMappedSortObject(filterQuery, domainType));
if (filterQuery.getMeta().getMaxTimeMsec() != null) {
publisher.maxTime(filterQuery.getMeta().getMaxTimeMsec(), TimeUnit.MILLISECONDS);
}
if (filterQuery.getLimit() > 0 || (options.getLimit() != null)) {
if (filterQuery.getLimit() > 0 && (options.getLimit() != null)) {
throw new IllegalArgumentException(
"Both Query and MapReduceOptions define a limit. Please provide the limit only via one of the two.");
}
if (filterQuery.getLimit() > 0) {
publisher.limit(filterQuery.getLimit());
}
if (options.getLimit() != null) {
publisher.limit(options.getLimit());
}
}
Optional<Collation> collation = filterQuery.getCollation();
Optionals.ifAllPresent(filterQuery.getCollation(), options.getCollation(), (l, r) -> {
throw new IllegalArgumentException(
"Both Query and MapReduceOptions define a collation. Please provide the collation only via one of the two.");
});
if (options.getCollation().isPresent()) {
collation = options.getCollation();
}
if (!CollectionUtils.isEmpty(options.getScopeVariables())) {
publisher = publisher.scope(new Document(options.getScopeVariables()));
}
if (options.getLimit() != null && options.getLimit() > 0) {
publisher = publisher.limit(options.getLimit());
}
if (options.getFinalizeFunction().filter(StringUtils::hasText).isPresent()) {
publisher = publisher.finalizeFunction(options.getFinalizeFunction().get());
}
if (options.getJavaScriptMode() != null) {
publisher = publisher.jsMode(options.getJavaScriptMode());
}
if (options.getOutputSharded().isPresent()) {
publisher = publisher.sharded(options.getOutputSharded().get());
}
publisher = collation.map(Collation::toMongoCollation).map(publisher::collation).orElse(publisher);
return Flux.from(publisher)
.map(new ReadDocumentCallback<>(mongoConverter, resultType, inputCollectionName)::doWith);
});
}
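The limit handling in `mapReduce` above treats a limit supplied through both the `Query` and the `MapReduceOptions` as an error, and otherwise applies whichever one is present. A minimal standalone sketch of that precedence rule (the `resolveLimit` helper is ours for illustration, not part of the template):

```java
class MapReduceLimits {

	/**
	 * Picks the effective limit, refusing ambiguous input the way the
	 * template does: a limit may come from the query or the options, not both.
	 */
	static Integer resolveLimit(int queryLimit, Integer optionsLimit) {

		if (queryLimit > 0 && optionsLimit != null) {
			throw new IllegalArgumentException(
					"Both Query and MapReduceOptions define a limit. Please provide the limit only via one of the two.");
		}
		if (queryLimit > 0) {
			return queryLimit;
		}
		return optionsLimit; // may be null: no limit at all
	}
}
```

The same mutual-exclusion guard is applied to the collation a few lines further down.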
private static void assertLocalFunctionNames(String... functions) {
for (String function : functions) {
if (ResourceUtils.isUrl(function)) {
throw new IllegalArgumentException(String.format(
"Blocking access to resource %s is not allowed using reactive infrastructure. You may load the resource at startup and cache its value.",
function));
}
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveFindOperation#query(java.lang.Class)
@@ -1949,6 +2175,15 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return new ReactiveAggregationOperationSupport(this).aggregateAndReturn(domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMapReduceOperation#mapReduce(java.lang.Class)
*/
@Override
public <T> ReactiveMapReduce<T> mapReduce(Class<T> domainType) {
return new ReactiveMapReduceOperationSupport(this).mapReduce(domainType);
}
/**
* Retrieve and remove all documents matching the given {@code query} by calling {@link #find(Query, Class, String)}
* and {@link #remove(Query, Class, String)}, whereas the {@link Query} for {@link #remove(Query, Class, String)} is
@@ -2829,7 +3064,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final Metric metric;
/**
* Creates a new {@link GeoNearResultDbObjectCallback} using the given {@link DbObjectCallback} delegate for
* Creates a new {@link GeoNearResultDbObjectCallback} using the given {@link DocumentCallback} delegate for
* {@link GeoResult} content unmarshalling.
*
* @param delegate must not be {@literal null}.
@@ -3028,4 +3263,25 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
return delegate.getMongoDatabase();
}
}
@RequiredArgsConstructor
class IndexCreatorEventListener implements ApplicationListener<MappingContextEvent<?, ?>> {
final Consumer<Throwable> subscriptionExceptionHandler;
@Override
public void onApplicationEvent(MappingContextEvent<?, ?> event) {
if (!event.wasEmittedBy(mappingContext)) {
return;
}
PersistentEntity<?, ?> entity = event.getPersistentEntity();
// Double check type as Spring infrastructure does not consider nested generics
if (entity instanceof MongoPersistentEntity) {
onCheckForIndexes((MongoPersistentEntity<?>) entity, subscriptionExceptionHandler);
}
}
}
}


@@ -19,12 +19,12 @@ import org.reactivestreams.Publisher;
import org.springframework.data.mongodb.core.query.Query;
/**
* Callback interface for executing operations within a {@link com.mongodb.session.ClientSession} using reactive
* infrastructure.
* Callback interface for executing operations within a {@link com.mongodb.reactivestreams.client.ClientSession} using
* reactive infrastructure.
*
* @author Christoph Strobl
* @since 2.1
* @see com.mongodb.session.ClientSession
* @see com.mongodb.reactivestreams.client.ClientSession
*/
@FunctionalInterface
public interface ReactiveSessionCallback<T> {


@@ -19,7 +19,7 @@ import reactor.core.publisher.Flux;
import java.util.function.Consumer;
import com.mongodb.session.ClientSession;
import com.mongodb.reactivestreams.client.ClientSession;
/**
* Gateway interface to execute {@link ClientSession} bound operations against MongoDB via a
@@ -39,7 +39,7 @@ public interface ReactiveSessionScoped {
*
* @param action callback object that specifies the MongoDB action. Must not be {@literal null}.
* @param <T> return type.
* @return a result object returned by the action. Can be {@literal null}.
* @return a result object returned by the action, can be {@link Flux#empty()}.
*/
default <T> Flux<T> execute(ReactiveSessionCallback<T> action) {
return execute(action, (session) -> {});
@@ -56,7 +56,7 @@ public interface ReactiveSessionScoped {
* This {@link Consumer} is guaranteed to be notified in any case (successful and exceptional outcome of
* {@link ReactiveSessionCallback}).
* @param <T> return type.
* @return a result object returned by the action. Can be {@literal null}.
* @return a result object returned by the action, can be {@link Flux#empty()}.
*/
<T> Flux<T> execute(ReactiveSessionCallback<T> action, Consumer<ClientSession> doFinally);
}


@@ -25,6 +25,7 @@ import org.springframework.lang.Nullable;
* @since 2.1
* @see com.mongodb.session.ClientSession
*/
@FunctionalInterface
public interface SessionCallback<T> {
/**


@@ -19,7 +19,7 @@ import java.util.function.Consumer;
import org.springframework.lang.Nullable;
import com.mongodb.session.ClientSession;
import com.mongodb.client.ClientSession;
/**
* Gateway interface to execute {@link ClientSession} bound operations against MongoDB via a {@link SessionCallback}.


@@ -0,0 +1,116 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import org.springframework.beans.factory.DisposableBean;
import com.mongodb.ClientSessionOptions;
import com.mongodb.ConnectionString;
import com.mongodb.DB;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoDatabase;
/**
* Factory to create {@link MongoDatabase} instances from a {@link MongoClient} instance.
*
* @author Christoph Strobl
* @since 2.1
*/
public class SimpleMongoClientDbFactory extends MongoDbFactorySupport<MongoClient> implements DisposableBean {
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance for the given {@code connectionString}.
*
* @param connectionString connection coordinates for a database connection. Must contain a database name and must not
* be {@literal null} or empty.
* @see <a href="https://docs.mongodb.com/manual/reference/connection-string/">MongoDB Connection String reference</a>
*/
public SimpleMongoClientDbFactory(String connectionString) {
this(new ConnectionString(connectionString));
}
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance for the given {@link ConnectionString}.
*
* @param connectionString connection coordinates for a database connection. Must contain a database name and must
* not be {@literal null}.
*/
public SimpleMongoClientDbFactory(ConnectionString connectionString) {
this(MongoClients.create(connectionString), connectionString.getDatabase(), true);
}
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
*/
public SimpleMongoClientDbFactory(MongoClient mongoClient, String databaseName) {
this(mongoClient, databaseName, false);
}
/**
* Creates a new {@link SimpleMongoClientDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @param mongoInstanceCreated {@literal true} if the given {@link MongoClient} was created by this factory and
* should be closed when the factory is destroyed.
*/
private SimpleMongoClientDbFactory(MongoClient mongoClient, String databaseName, boolean mongoInstanceCreated) {
super(mongoClient, databaseName, mongoInstanceCreated, new MongoExceptionTranslator());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getLegacyDb()
*/
@Override
public DB getLegacyDb() {
throw new UnsupportedOperationException(String.format(
"%s does not support legacy DBObject API! Please consider using SimpleMongoDbFactory for that purpose.",
MongoClient.class));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public ClientSession getSession(ClientSessionOptions options) {
return getMongoClient().startSession(options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoDbFactoryBase#closeClient()
*/
@Override
protected void closeClient() {
getMongoClient().close();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoDbFactoryBase#doGetMongoDatabase(java.lang.String)
*/
@Override
protected MongoDatabase doGetMongoDatabase(String dbName) {
return getMongoClient().getDatabase(dbName);
}
}


@@ -15,25 +15,16 @@
*/
package org.springframework.data.mongodb.core;
import lombok.Value;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.SessionAwareMethodInterceptor;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.ClientSessionOptions;
import com.mongodb.DB;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.WriteConcern;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import com.mongodb.session.ClientSession;
/**
* Factory to create {@link MongoDatabase} instances from a {@link MongoClient} instance.
@@ -45,19 +36,12 @@ import com.mongodb.session.ClientSession;
* @author George Moraitis
* @author Mark Paluch
*/
public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
private final MongoClient mongoClient;
private final String databaseName;
private final boolean mongoInstanceCreated;
private final PersistenceExceptionTranslator exceptionTranslator;
private @Nullable WriteConcern writeConcern;
public class SimpleMongoDbFactory extends MongoDbFactorySupport<MongoClient> implements DisposableBean {
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClientURI}.
*
* @param uri must not be {@literal null}.
* @param uri coordinates for a database connection. Must contain a database name and must not be {@literal null}.
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClientURI uri) {
@@ -68,7 +52,7 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoClient}.
*
* @param mongoClient must not be {@literal null}.
* @param databaseName must not be {@literal null}.
* @param databaseName must not be {@literal null} or empty.
* @since 1.7
*/
public SimpleMongoDbFactory(MongoClient mongoClient, String databaseName) {
@@ -82,76 +66,16 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
* @since 1.7
*/
private SimpleMongoDbFactory(MongoClient mongoClient, String databaseName, boolean mongoInstanceCreated) {
Assert.notNull(mongoClient, "MongoClient must not be null!");
Assert.hasText(databaseName, "Database name must not be empty!");
Assert.isTrue(databaseName.matches("[^/\\\\.$\"\\s]+"),
"Database name must not contain slashes, dots, spaces, quotes, or dollar signs!");
this.mongoClient = mongoClient;
this.databaseName = databaseName;
this.mongoInstanceCreated = mongoInstanceCreated;
this.exceptionTranslator = new MongoExceptionTranslator();
super(mongoClient, databaseName, mongoInstanceCreated, new MongoExceptionTranslator());
}
/**
* Configures the {@link WriteConcern} to be used on the {@link DB} instance being created.
*
* @param writeConcern the writeConcern to set
*/
public void setWriteConcern(WriteConcern writeConcern) {
this.writeConcern = writeConcern;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb()
* @see org.springframework.data.mongodb.MongoDbFactory#getLegacyDb()
*/
public MongoDatabase getDb() throws DataAccessException {
return getDb(databaseName);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/
public MongoDatabase getDb(String dbName) throws DataAccessException {
Assert.hasText(dbName, "Database name must not be empty.");
MongoDatabase db = mongoClient.getDatabase(dbName);
if (writeConcern == null) {
return db;
}
return db.withWriteConcern(writeConcern);
}
/**
* Clean up the Mongo instance if it was created by the factory itself.
*
* @see DisposableBean#destroy()
*/
public void destroy() throws Exception {
if (mongoInstanceCreated) {
mongoClient.close();
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return this.exceptionTranslator;
}
@SuppressWarnings("deprecation")
@Override
public DB getLegacyDb() {
return mongoClient.getDB(databaseName);
return getMongoClient().getDB(getDefaultDatabaseName());
}
/*
@@ -160,108 +84,24 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
*/
@Override
public ClientSession getSession(ClientSessionOptions options) {
return mongoClient.startSession(options);
return getMongoClient().startSession(options);
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.ClientSession)
* @see org.springframework.data.mongodb.core.MongoDbFactoryBase#closeClient()
*/
@Override
public MongoDbFactory withSession(ClientSession session) {
return new ClientSessionBoundMongoDbFactory(session, this);
protected void closeClient() {
getMongoClient().close();
}
/**
* {@link ClientSession} bound {@link MongoDbFactory} decorating the database with a
* {@link SessionAwareMethodInterceptor}.
*
* @author Christoph Strobl
* @since 2.1
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoDbFactoryBase#doGetMongoDatabase(java.lang.String)
*/
@Value
static class ClientSessionBoundMongoDbFactory implements MongoDbFactory {
ClientSession session;
MongoDbFactory delegate;
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb()
*/
@Override
public MongoDatabase getDb() throws DataAccessException {
return proxyMongoDatabase(delegate.getDb());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getDb(java.lang.String)
*/
@Override
public MongoDatabase getDb(String dbName) throws DataAccessException {
return proxyMongoDatabase(delegate.getDb(dbName));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getExceptionTranslator()
*/
@Override
public PersistenceExceptionTranslator getExceptionTranslator() {
return delegate.getExceptionTranslator();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getLegacyDb()
*/
@Override
public DB getLegacyDb() {
return delegate.getLegacyDb();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#getSession(com.mongodb.ClientSessionOptions)
*/
@Override
public ClientSession getSession(ClientSessionOptions options) {
return delegate.getSession(options);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.MongoDbFactory#withSession(com.mongodb.session.ClientSession)
*/
@Override
public MongoDbFactory withSession(ClientSession session) {
return delegate.withSession(session);
}
private MongoDatabase proxyMongoDatabase(MongoDatabase database) {
return createProxyInstance(session, database, MongoDatabase.class);
}
private MongoDatabase proxyDatabase(ClientSession session, MongoDatabase database) {
return createProxyInstance(session, database, MongoDatabase.class);
}
private MongoCollection proxyCollection(ClientSession session, MongoCollection collection) {
return createProxyInstance(session, collection, MongoCollection.class);
}
private <T> T createProxyInstance(ClientSession session, T target, Class<T> targetType) {
ProxyFactory factory = new ProxyFactory();
factory.setTarget(target);
factory.setInterfaces(targetType);
factory.setOpaque(true);
factory.addAdvice(new SessionAwareMethodInterceptor<>(session, target, MongoDatabase.class, this::proxyDatabase,
MongoCollection.class, this::proxyCollection));
return targetType.cast(factory.getProxy());
}
@Override
protected MongoDatabase doGetMongoDatabase(String dbName) {
return getMongoClient().getDatabase(dbName);
}
}


@@ -30,11 +30,11 @@ import org.springframework.util.Assert;
import com.mongodb.ClientSessionOptions;
import com.mongodb.ConnectionString;
import com.mongodb.WriteConcern;
import com.mongodb.reactivestreams.client.ClientSession;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;
import com.mongodb.reactivestreams.client.MongoCollection;
import com.mongodb.reactivestreams.client.MongoDatabase;
import com.mongodb.session.ClientSession;
/**
* Factory to create {@link MongoDatabase} instances from a {@link MongoClient} instance.
@@ -215,23 +215,23 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
return createProxyInstance(session, database, MongoDatabase.class);
}
private MongoDatabase proxyDatabase(ClientSession session, MongoDatabase database) {
private MongoDatabase proxyDatabase(com.mongodb.session.ClientSession session, MongoDatabase database) {
return createProxyInstance(session, database, MongoDatabase.class);
}
private MongoCollection proxyCollection(ClientSession session, MongoCollection collection) {
private MongoCollection proxyCollection(com.mongodb.session.ClientSession session, MongoCollection collection) {
return createProxyInstance(session, collection, MongoCollection.class);
}
private <T> T createProxyInstance(ClientSession session, T target, Class<T> targetType) {
private <T> T createProxyInstance(com.mongodb.session.ClientSession session, T target, Class<T> targetType) {
ProxyFactory factory = new ProxyFactory();
factory.setTarget(target);
factory.setInterfaces(targetType);
factory.setOpaque(true);
factory.addAdvice(new SessionAwareMethodInterceptor<>(session, target, MongoDatabase.class, this::proxyDatabase,
MongoCollection.class, this::proxyCollection));
factory.addAdvice(new SessionAwareMethodInterceptor<>(session, target, ClientSession.class, MongoDatabase.class,
this::proxyDatabase, MongoCollection.class, this::proxyCollection));
return targetType.cast(factory.getProxy());
}
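The session-aware proxying above relies on Spring's `ProxyFactory` plus a `SessionAwareMethodInterceptor`. The core idea can be sketched with a plain JDK dynamic proxy: every call on the proxy passes through an interceptor that has the session in scope before delegating to the target. This is an illustrative stand-in (the `Database` interface, `proxy` helper, and call counter are invented for the sketch); the real code additionally re-proxies returned `MongoDatabase`/`MongoCollection` instances so the session follows nested calls.

```java
import java.lang.reflect.Proxy;
import java.util.concurrent.atomic.AtomicInteger;

public class SessionProxySketch {

    public interface Database {
        String name();
    }

    @SuppressWarnings("unchecked")
    public static <T> T proxy(T target, Class<T> type, AtomicInteger callsSeen) {
        return (T) Proxy.newProxyInstance(type.getClassLoader(), new Class<?>[] { type },
                (p, method, args) -> {
                    callsSeen.incrementAndGet(); // stand-in for binding the ClientSession to the call
                    return method.invoke(target, args); // delegate to the unproxied target
                });
    }

    public static void main(String[] args) {
        AtomicInteger calls = new AtomicInteger();
        Database db = proxy(() -> "test-db", Database.class, calls);
        System.out.println(db.name());   // delegates to the target
        System.out.println(calls.get()); // the interceptor observed the call
    }
}
```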


@@ -15,10 +15,13 @@
*/
package org.springframework.data.mongodb.core.convert;
import java.text.Collator;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import org.bson.Document;
import org.springframework.core.convert.converter.Converter;
@@ -41,11 +44,13 @@ import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.core.geo.GeoJsonPolygon;
import org.springframework.data.mongodb.core.geo.Sphere;
import org.springframework.data.mongodb.core.query.GeoCommand;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.NumberUtils;
import org.springframework.util.ObjectUtils;
import com.mongodb.BasicDBList;
import com.mongodb.Function;
/**
* Wrapper class to contain useful geo structure converters for the usage with Mongo.
@@ -58,6 +63,26 @@ import com.mongodb.BasicDBList;
*/
abstract class GeoConverters {
private final static Map<String, Function<Document, GeoJson<?>>> converters;
static {
Collator caseInsensitive = Collator.getInstance();
caseInsensitive.setStrength(Collator.PRIMARY);
Map<String, Function<Document, GeoJson<?>>> geoConverters = new TreeMap<>(caseInsensitive);
geoConverters.put("point", DocumentToGeoJsonPointConverter.INSTANCE::convert);
geoConverters.put("multipoint", DocumentToGeoJsonMultiPointConverter.INSTANCE::convert);
geoConverters.put("linestring", DocumentToGeoJsonLineStringConverter.INSTANCE::convert);
geoConverters.put("multilinestring", DocumentToGeoJsonMultiLineStringConverter.INSTANCE::convert);
geoConverters.put("polygon", DocumentToGeoJsonPolygonConverter.INSTANCE::convert);
geoConverters.put("multipolygon", DocumentToGeoJsonMultiPolygonConverter.INSTANCE::convert);
geoConverters.put("geometrycollection", DocumentToGeoJsonGeometryCollectionConverter.INSTANCE::convert);
converters = geoConverters;
}
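The static registry above keys converters by GeoJSON type name in a `TreeMap` ordered by a PRIMARY-strength `Collator`, so `"Point"`, `"point"`, and `"POINT"` all resolve to the same entry. A minimal, self-contained sketch of that dispatch technique (the registry and converter names here are illustrative stand-ins, not the actual `GeoConverters` types):

```java
import java.text.Collator;
import java.util.Map;
import java.util.TreeMap;
import java.util.function.Function;

public class GeoTypeRegistry {

    static final Map<String, Function<String, String>> CONVERTERS;

    static {
        // PRIMARY strength ignores case (and accents), making lookups case-insensitive.
        Collator caseInsensitive = Collator.getInstance();
        caseInsensitive.setStrength(Collator.PRIMARY);
        Map<String, Function<String, String>> map = new TreeMap<>(caseInsensitive);
        map.put("point", source -> "GeoJsonPoint(" + source + ")");
        map.put("polygon", source -> "GeoJsonPolygon(" + source + ")");
        CONVERTERS = map;
    }

    public static String convert(String type, String source) {
        Function<String, String> converter = CONVERTERS.get(type);
        if (converter == null) {
            throw new IllegalArgumentException(
                    String.format("No converter found capable of converting GeoJson type %s.", type));
        }
        return converter.apply(source);
    }

    public static void main(String[] args) {
        System.out.println(convert("Point", "[10, 20]")); // matched despite the capital P
        System.out.println(convert("POLYGON", "[...]"));  // matched despite upper case
    }
}
```

This mirrors why the registry keys are stored in lower case: the collator-backed map does the case folding at lookup time instead of normalizing the incoming `type` string.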
/**
* Private constructor to prevent instantiation.
*/
@@ -91,7 +116,8 @@ abstract class GeoConverters {
, DocumentToGeoJsonMultiLineStringConverter.INSTANCE //
, DocumentToGeoJsonMultiPointConverter.INSTANCE //
, DocumentToGeoJsonMultiPolygonConverter.INSTANCE //
, DocumentToGeoJsonGeometryCollectionConverter.INSTANCE);
, DocumentToGeoJsonGeometryCollectionConverter.INSTANCE //
, DocumentToGeoJsonConverter.INSTANCE);
}
/**
@@ -101,7 +127,7 @@ abstract class GeoConverters {
* @since 1.5
*/
@ReadingConverter
static enum DocumentToPointConverter implements Converter<Document, Point> {
enum DocumentToPointConverter implements Converter<Document, Point> {
INSTANCE;
@@ -132,7 +158,7 @@ abstract class GeoConverters {
* @author Thomas Darimont
* @since 1.5
*/
static enum PointToDocumentConverter implements Converter<Point, Document> {
enum PointToDocumentConverter implements Converter<Point, Document> {
INSTANCE;
@@ -147,13 +173,13 @@ abstract class GeoConverters {
}
/**
* Converts a {@link Box} into a {@link BasicDBList}.
* Converts a {@link Box} into a {@link Document}.
*
* @author Thomas Darimont
* @since 1.5
*/
@WritingConverter
static enum BoxToDocumentConverter implements Converter<Box, Document> {
enum BoxToDocumentConverter implements Converter<Box, Document> {
INSTANCE;
@@ -176,13 +202,13 @@ abstract class GeoConverters {
}
/**
* Converts a {@link BasicDBList} into a {@link Box}.
* Converts a {@link Document} into a {@link Box}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
static enum DocumentToBoxConverter implements Converter<Document, Box> {
enum DocumentToBoxConverter implements Converter<Document, Box> {
INSTANCE;
@@ -205,12 +231,12 @@ abstract class GeoConverters {
}
/**
* Converts a {@link Circle} into a {@link BasicDBList}.
* Converts a {@link Circle} into a {@link Document}.
*
* @author Thomas Darimont
* @since 1.5
*/
static enum CircleToDocumentConverter implements Converter<Circle, Document> {
enum CircleToDocumentConverter implements Converter<Circle, Document> {
INSTANCE;
@@ -276,7 +302,7 @@ abstract class GeoConverters {
}
/**
* Converts a {@link Sphere} into a {@link BasicDBList}.
* Converts a {@link Sphere} into a {@link Document}.
*
* @author Thomas Darimont
* @since 1.5
@@ -305,13 +331,13 @@ abstract class GeoConverters {
}
/**
* Converts a {@link BasicDBList} into a {@link Sphere}.
* Converts a {@link Document} into a {@link Sphere}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
static enum DocumentToSphereConverter implements Converter<Document, Sphere> {
enum DocumentToSphereConverter implements Converter<Document, Sphere> {
INSTANCE;
@@ -347,12 +373,12 @@ abstract class GeoConverters {
}
/**
* Converts a {@link Polygon} into a {@link BasicDBList}.
* Converts a {@link Polygon} into a {@link Document}.
*
* @author Thomas Darimont
* @since 1.5
*/
static enum PolygonToDocumentConverter implements Converter<Polygon, Document> {
enum PolygonToDocumentConverter implements Converter<Polygon, Document> {
INSTANCE;
@@ -368,7 +394,7 @@ abstract class GeoConverters {
}
List<Point> points = source.getPoints();
List<Document> pointTuples = new ArrayList<Document>(points.size());
List<Document> pointTuples = new ArrayList<>(points.size());
for (Point point : points) {
pointTuples.add(PointToDocumentConverter.INSTANCE.convert(point));
@@ -381,13 +407,13 @@ abstract class GeoConverters {
}
/**
* Converts a {@link BasicDBList} into a {@link Polygon}.
* Converts a {@link Document} into a {@link Polygon}.
*
* @author Thomas Darimont
* @since 1.5
*/
@ReadingConverter
static enum DocumentToPolygonConverter implements Converter<Document, Polygon> {
enum DocumentToPolygonConverter implements Converter<Document, Polygon> {
INSTANCE;
@@ -404,7 +430,7 @@ abstract class GeoConverters {
}
List<Document> points = (List<Document>) source.get("points");
List<Point> newPoints = new ArrayList<Point>(points.size());
List<Point> newPoints = new ArrayList<>(points.size());
for (Document element : points) {
@@ -417,12 +443,12 @@ abstract class GeoConverters {
}
/**
* Converts a {@link Sphere} into a {@link BasicDBList}.
* Converts a {@link Sphere} into a {@link Document}.
*
* @author Thomas Darimont
* @since 1.5
*/
static enum GeoCommandToDocumentConverter implements Converter<GeoCommand, Document> {
enum GeoCommandToDocumentConverter implements Converter<GeoCommand, Document> {
INSTANCE;
@@ -482,7 +508,7 @@ abstract class GeoConverters {
* @since 1.7
*/
@SuppressWarnings("rawtypes")
static enum GeoJsonToDocumentConverter implements Converter<GeoJson, Document> {
enum GeoJsonToDocumentConverter implements Converter<GeoJson, Document> {
INSTANCE;
@@ -545,7 +571,7 @@ abstract class GeoConverters {
* @author Christoph Strobl
* @since 1.7
*/
static enum GeoJsonPointToDocumentConverter implements Converter<GeoJsonPoint, Document> {
enum GeoJsonPointToDocumentConverter implements Converter<GeoJsonPoint, Document> {
INSTANCE;
@@ -563,7 +589,7 @@ abstract class GeoConverters {
* @author Christoph Strobl
* @since 1.7
*/
static enum GeoJsonPolygonToDocumentConverter implements Converter<GeoJsonPolygon, Document> {
enum GeoJsonPolygonToDocumentConverter implements Converter<GeoJsonPolygon, Document> {
INSTANCE;
@@ -581,7 +607,7 @@ abstract class GeoConverters {
* @author Christoph Strobl
* @since 1.7
*/
static enum DocumentToGeoJsonPointConverter implements Converter<Document, GeoJsonPoint> {
enum DocumentToGeoJsonPointConverter implements Converter<Document, GeoJsonPoint> {
INSTANCE;
@@ -609,7 +635,7 @@ abstract class GeoConverters {
* @author Christoph Strobl
* @since 1.7
*/
static enum DocumentToGeoJsonPolygonConverter implements Converter<Document, GeoJsonPolygon> {
enum DocumentToGeoJsonPolygonConverter implements Converter<Document, GeoJsonPolygon> {
INSTANCE;
@@ -654,7 +680,7 @@ abstract class GeoConverters {
String.format("Cannot convert type '%s' to MultiPolygon.", source.get("type")));
List dbl = (List) source.get("coordinates");
List<GeoJsonPolygon> polygones = new ArrayList<GeoJsonPolygon>();
List<GeoJsonPolygon> polygones = new ArrayList<>();
for (Object polygon : dbl) {
polygones.add(toGeoJsonPolygon((List) polygon));
@@ -668,7 +694,7 @@ abstract class GeoConverters {
* @author Christoph Strobl
* @since 1.7
*/
static enum DocumentToGeoJsonLineStringConverter implements Converter<Document, GeoJsonLineString> {
enum DocumentToGeoJsonLineStringConverter implements Converter<Document, GeoJsonLineString> {
INSTANCE;
@@ -696,7 +722,7 @@ abstract class GeoConverters {
* @author Christoph Strobl
* @since 1.7
*/
static enum DocumentToGeoJsonMultiPointConverter implements Converter<Document, GeoJsonMultiPoint> {
enum DocumentToGeoJsonMultiPointConverter implements Converter<Document, GeoJsonMultiPoint> {
INSTANCE;
@@ -724,7 +750,7 @@ abstract class GeoConverters {
* @author Christoph Strobl
* @since 1.7
*/
static enum DocumentToGeoJsonMultiLineStringConverter implements Converter<Document, GeoJsonMultiLineString> {
enum DocumentToGeoJsonMultiLineStringConverter implements Converter<Document, GeoJsonMultiLineString> {
INSTANCE;
@@ -756,7 +782,7 @@ abstract class GeoConverters {
* @author Christoph Strobl
* @since 1.7
*/
static enum DocumentToGeoJsonGeometryCollectionConverter implements Converter<Document, GeoJsonGeometryCollection> {
enum DocumentToGeoJsonGeometryCollectionConverter implements Converter<Document, GeoJsonGeometryCollection> {
INSTANCE;
@@ -775,41 +801,12 @@ abstract class GeoConverters {
Assert.isTrue(ObjectUtils.nullSafeEquals(source.get("type"), "GeometryCollection"),
String.format("Cannot convert type '%s' to GeometryCollection.", source.get("type")));
List<GeoJson<?>> geometries = new ArrayList<GeoJson<?>>();
List<GeoJson<?>> geometries = new ArrayList<>();
for (Object o : (List) source.get("geometries")) {
geometries.add(convertGeometries((Document) o));
geometries.add(toGenericGeoJson((Document) o));
}
return new GeoJsonGeometryCollection(geometries);
}
private static GeoJson<?> convertGeometries(Document source) {
Object type = source.get("type");
if (ObjectUtils.nullSafeEquals(type, "Point")) {
return DocumentToGeoJsonPointConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiPoint")) {
return DocumentToGeoJsonMultiPointConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "LineString")) {
return DocumentToGeoJsonLineStringConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiLineString")) {
return DocumentToGeoJsonMultiLineStringConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "Polygon")) {
return DocumentToGeoJsonPolygonConverter.INSTANCE.convert(source);
}
if (ObjectUtils.nullSafeEquals(type, "MultiPolygon")) {
return DocumentToGeoJsonMultiPolygonConverter.INSTANCE.convert(source);
}
throw new IllegalArgumentException(String.format("Cannot convert unknown GeoJson type %s", type));
}
}
@@ -827,7 +824,7 @@ abstract class GeoConverters {
@SuppressWarnings("unchecked")
static List<Point> toListOfPoint(List listOfCoordinatePairs) {
List<Point> points = new ArrayList<Point>();
List<Point> points = new ArrayList<>();
for (Object point : listOfCoordinatePairs) {
@@ -852,6 +849,46 @@ abstract class GeoConverters {
return new GeoJsonPolygon(toListOfPoint((List) dbList.get(0)));
}
/**
* Converter implementation transforming a {@link Document} into a concrete {@link GeoJson} based on the embedded
* {@literal type} information.
*
* @since 2.1
* @author Christoph Strobl
*/
@ReadingConverter
enum DocumentToGeoJsonConverter implements Converter<Document, GeoJson> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
@Nullable
@Override
public GeoJson convert(Document source) {
return toGenericGeoJson(source);
}
}
private static GeoJson<?> toGenericGeoJson(Document source) {
String type = source.get("type", String.class);
if(type != null) {
Function<Document, GeoJson<?>> converter = converters.get(type);
if(converter != null){
return converter.apply(source);
}
}
throw new IllegalArgumentException(
String.format("No converter found capable of converting GeoJson type %s.", type));
}
private static double toPrimitiveDoubleValue(Object value) {
Assert.isInstanceOf(Number.class, value, "Argument must be a Number.");


@@ -13,21 +13,22 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
/**
* TODO: Revisit for a better pattern.
* Provider interface to obtain {@link IndexOperations} by MongoDB collection name.
*
* @author Mark Paluch
* @author Jens Schauder
* @since 2.0
*/
@FunctionalInterface
public interface IndexOperationsProvider {
/**
* Returns the operations that can be performed on indexes.
*
* @param collectionName name of the MongoDB collection, must not be {@literal null}.
* @return index operations on the named collection
*/
IndexOperations indexOps(String collectionName);


@@ -18,6 +18,7 @@ package org.springframework.data.mongodb.core.index;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationEvent;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.context.ApplicationListener;
import org.springframework.data.mapping.context.MappingContextEvent;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
@@ -38,7 +39,20 @@ import org.springframework.util.Assert;
*/
public class MongoMappingEventPublisher implements ApplicationEventPublisher {
private final MongoPersistentEntityIndexCreator indexCreator;
private final ApplicationListener<MappingContextEvent<?, ?>> indexCreator;
/**
* Creates a new {@link MongoMappingEventPublisher} for the given {@link ApplicationListener}.
*
* @param indexCreator must not be {@literal null}.
* @since 2.1
*/
public MongoMappingEventPublisher(ApplicationListener<MappingContextEvent<?, ?>> indexCreator) {
Assert.notNull(indexCreator, "ApplicationListener must not be null!");
this.indexCreator = indexCreator;
}
/**
* Creates a new {@link MongoMappingEventPublisher} for the given {@link MongoPersistentEntityIndexCreator}.
@@ -48,6 +62,7 @@ public class MongoMappingEventPublisher implements ApplicationEventPublisher {
public MongoMappingEventPublisher(MongoPersistentEntityIndexCreator indexCreator) {
Assert.notNull(indexCreator, "MongoPersistentEntityIndexCreator must not be null!");
this.indexCreator = indexCreator;
}


@@ -0,0 +1,34 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
/**
* Provider interface to obtain {@link ReactiveIndexOperations} by MongoDB collection name.
*
* @author Mark Paluch
* @since 2.1
*/
@FunctionalInterface
public interface ReactiveIndexOperationsProvider {
/**
* Returns the operations that can be performed on indexes.
*
* @param collectionName name of the MongoDB collection, must not be {@literal null}.
* @return index operations on the named collection
*/
ReactiveIndexOperations indexOps(String collectionName);
}


@@ -0,0 +1,186 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.UncategorizedMongoDbException;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexResolver.IndexDefinitionHolder;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.util.MongoDbErrorCodes;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.MongoException;
/**
* Component that inspects {@link MongoPersistentEntity} instances contained in the given {@link MongoMappingContext}
* for indexing metadata and ensures the indexes to be available using reactive infrastructure.
*
* @author Mark Paluch
* @since 2.1
*/
public class ReactiveMongoPersistentEntityIndexCreator {
private static final Logger LOGGER = LoggerFactory.getLogger(ReactiveMongoPersistentEntityIndexCreator.class);
private final Map<Class<?>, Boolean> classesSeen = new ConcurrentHashMap<Class<?>, Boolean>();
private final MongoMappingContext mappingContext;
private final ReactiveIndexOperationsProvider operationsProvider;
private final IndexResolver indexResolver;
/**
* Creates a new {@link ReactiveMongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext},
* {@link ReactiveIndexOperationsProvider}.
*
* @param mappingContext must not be {@literal null}.
* @param operationsProvider must not be {@literal null}.
*/
public ReactiveMongoPersistentEntityIndexCreator(MongoMappingContext mappingContext,
ReactiveIndexOperationsProvider operationsProvider) {
this(mappingContext, operationsProvider, new MongoPersistentEntityIndexResolver(mappingContext));
}
/**
* Creates a new {@link ReactiveMongoPersistentEntityIndexCreator} for the given {@link MongoMappingContext},
* {@link ReactiveIndexOperationsProvider}, and {@link IndexResolver}.
*
* @param mappingContext must not be {@literal null}.
* @param operationsProvider must not be {@literal null}.
* @param indexResolver must not be {@literal null}.
*/
public ReactiveMongoPersistentEntityIndexCreator(MongoMappingContext mappingContext,
ReactiveIndexOperationsProvider operationsProvider, IndexResolver indexResolver) {
Assert.notNull(mappingContext, "MongoMappingContext must not be null!");
Assert.notNull(operationsProvider, "ReactiveIndexOperations must not be null!");
Assert.notNull(indexResolver, "IndexResolver must not be null!");
this.mappingContext = mappingContext;
this.operationsProvider = operationsProvider;
this.indexResolver = indexResolver;
}
/**
* Returns whether the current index creator was registered for the given {@link MappingContext}.
*
* @param context
* @return
*/
public boolean isIndexCreatorFor(MappingContext<?, ?> context) {
return this.mappingContext.equals(context);
}
/**
* Inspect entities for index creation.
*
* @return a {@link Mono} that completes without value after indexes were created.
*/
public Mono<Void> checkForIndexes(MongoPersistentEntity<?> entity) {
Class<?> type = entity.getType();
if (!classesSeen.containsKey(type)) {
if (this.classesSeen.put(type, Boolean.TRUE) == null) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("Analyzing class " + type + " for index information.");
}
return checkForAndCreateIndexes(entity);
}
}
return Mono.empty();
}
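The `checkForIndexes(...)` method above uses a first-writer-wins guard: only the thread whose `put(...)` on the `ConcurrentHashMap` returns `null` (i.e. the first to register the entity class) proceeds to create indexes; concurrent callers see a non-null previous value and skip the work. A stripped-down sketch of just that guard (the `SeenGuard` name is an illustrative stand-in for the creator class):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SeenGuard {

    private final Map<Class<?>, Boolean> classesSeen = new ConcurrentHashMap<>();

    // true only for the caller that registers the class first;
    // put(...) returning null is the atomic "I won" signal.
    public boolean firstTimeSeen(Class<?> type) {
        return !classesSeen.containsKey(type) && classesSeen.put(type, Boolean.TRUE) == null;
    }

    public static void main(String[] args) {
        SeenGuard guard = new SeenGuard();
        System.out.println(guard.firstTimeSeen(String.class)); // first call wins
        System.out.println(guard.firstTimeSeen(String.class)); // already seen, skipped
    }
}
```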
private Mono<Void> checkForAndCreateIndexes(MongoPersistentEntity<?> entity) {
List<Mono<?>> publishers = new ArrayList<>();
if (entity.isAnnotationPresent(Document.class)) {
for (IndexDefinitionHolder indexToCreate : indexResolver.resolveIndexFor(entity.getTypeInformation())) {
publishers.add(createIndex(indexToCreate));
}
}
return publishers.isEmpty() ? Mono.empty() : Flux.merge(publishers).then();
}
Mono<String> createIndex(IndexDefinitionHolder indexDefinition) {
return operationsProvider.indexOps(indexDefinition.getCollection()).ensureIndex(indexDefinition) //
.onErrorResume(ReactiveMongoPersistentEntityIndexCreator::isDataIntegrityViolation,
e -> translateException(e, indexDefinition));
}
private Mono<? extends String> translateException(Throwable e, IndexDefinitionHolder indexDefinition) {
Mono<IndexInfo> existingIndex = fetchIndexInformation(indexDefinition);
Mono<String> defaultError = Mono.error(new DataIntegrityViolationException(
String.format("Cannot create index for '%s' in collection '%s' with keys '%s' and options '%s'.",
indexDefinition.getPath(), indexDefinition.getCollection(), indexDefinition.getIndexKeys(),
indexDefinition.getIndexOptions()),
e.getCause()));
return existingIndex.flatMap(it -> {
return Mono.<String> error(new DataIntegrityViolationException(
String.format("Index already defined as '%s'.", indexDefinition.getPath()), e.getCause()));
}).switchIfEmpty(defaultError);
}
private Mono<IndexInfo> fetchIndexInformation(IndexDefinitionHolder indexDefinition) {
Object indexNameToLookUp = indexDefinition.getIndexOptions().get("name");
Flux<IndexInfo> existingIndexes = operationsProvider.indexOps(indexDefinition.getCollection()).getIndexInfo();
return existingIndexes //
.filter(indexInfo -> ObjectUtils.nullSafeEquals(indexNameToLookUp, indexInfo.getName())) //
.next() //
.doOnError(e -> {
LOGGER.debug(
String.format("Failed to load index information for collection '%s'.", indexDefinition.getCollection()),
e);
});
}
private static boolean isDataIntegrityViolation(Throwable t) {
if (t instanceof UncategorizedMongoDbException) {
return t.getCause() instanceof MongoException
&& MongoDbErrorCodes.isDataIntegrityViolationCode(((MongoException) t.getCause()).getCode());
}
return false;
}
}


@@ -21,11 +21,6 @@ import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.context.expression.BeanFactoryAccessor;
import org.springframework.context.expression.BeanFactoryResolver;
import org.springframework.data.annotation.Id;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
@@ -38,7 +33,6 @@ import org.springframework.expression.Expression;
import org.springframework.expression.ParserContext;
import org.springframework.expression.common.LiteralExpression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
@@ -55,7 +49,7 @@ import org.springframework.util.StringUtils;
* @author Mark Paluch
*/
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty>
implements MongoPersistentEntity<T>, ApplicationContextAware {
implements MongoPersistentEntity<T> {
private static final String AMBIGUOUS_FIELD_MAPPING = "Ambiguous field mapping detected! Both %s and %s map to the same field name %s! Disambiguate using @Field annotation!";
private static final SpelExpressionParser PARSER = new SpelExpressionParser();
@@ -63,7 +57,6 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
private final String collection;
private final String language;
private final StandardEvaluationContext context;
private final @Nullable Expression expression;
/**
@@ -79,8 +72,6 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
Class<?> rawType = typeInformation.getType();
String fallback = MongoCollectionUtils.getPreferredCollectionName(rawType);
this.context = new StandardEvaluationContext();
if (this.isAnnotationPresent(Document.class)) {
Document document = this.getRequiredAnnotation(Document.class);
@@ -95,23 +86,15 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
}
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationContextAware#setApplicationContext(org.springframework.context.ApplicationContext)
*/
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
context.addPropertyAccessor(new BeanFactoryAccessor());
context.setBeanResolver(new BeanFactoryResolver(applicationContext));
context.setRootObject(applicationContext);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.MongoPersistentEntity#getCollection()
*/
public String getCollection() {
return expression == null ? collection : expression.getValue(context, String.class);
return expression == null //
? collection //
: expression.getValue(getEvaluationContext(null), String.class);
}
/*


@@ -87,14 +87,7 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
*/
@Override
protected <T> BasicMongoPersistentEntity<T> createPersistentEntity(TypeInformation<T> typeInformation) {
BasicMongoPersistentEntity<T> entity = new BasicMongoPersistentEntity<T>(typeInformation);
if (context != null) {
entity.setApplicationContext(context);
}
return entity;
return new BasicMongoPersistentEntity<T>(typeInformation);
}
/*
@@ -103,6 +96,9 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
*/
@Override
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
super.setApplicationContext(applicationContext);
this.context = applicationContext;
}
}


@@ -37,8 +37,8 @@ public class MapReduceOptions {
private Optional<String> outputDatabase = Optional.empty();
private MapReduceCommand.OutputType outputType = MapReduceCommand.OutputType.REPLACE;
private Map<String, Object> scopeVariables = new HashMap<String, Object>();
private Map<String, Object> extraOptions = new HashMap<String, Object>();
private Map<String, Object> scopeVariables = new HashMap<>();
private Map<String, Object> extraOptions = new HashMap<>();
private @Nullable Boolean jsMode;
private Boolean verbose = Boolean.TRUE;
private @Nullable Integer limit;


@@ -29,6 +29,7 @@ import java.util.stream.Collectors;
import org.bson.BSON;
import org.bson.BsonRegularExpression;
import org.bson.Document;
import org.bson.types.Binary;
import org.springframework.data.domain.Example;
import org.springframework.data.geo.Circle;
import org.springframework.data.geo.Point;
@@ -41,6 +42,7 @@ import org.springframework.data.mongodb.core.schema.JsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.Base64Utils;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
@@ -56,6 +58,7 @@ import com.mongodb.BasicDBList;
* @author Thomas Darimont
* @author Christoph Strobl
* @author Mark Paluch
* @author Andreas Zink
*/
public class Criteria implements CriteriaDefinition {
@@ -356,7 +359,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the {@literal $type} operator.
*
* @param type must not be {@literal null}.
* @param types must not be {@literal null}.
* @return this
* @since 2.1
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/type/">MongoDB Query operator: $type</a>
@@ -622,6 +625,18 @@ public class Criteria implements CriteriaDefinition {
return registerCriteriaChainElement(schemaCriteria);
}
/**
* Use {@link BitwiseCriteriaOperators} as gateway to create a criterion using one of the
* <a href="https://docs.mongodb.com/manual/reference/operator/query-bitwise/">bitwise operators</a> like
* {@code $bitsAllClear}.
*
* @return new instance of {@link BitwiseCriteriaOperators}. Never {@literal null}.
* @since 2.1
*/
public BitwiseCriteriaOperators bits() {
return new BitwiseCriteriaOperatorsImpl(this);
}
/**
* Creates an 'or' criteria using the $or operator for all of the provided criteria
* <p>
@@ -880,4 +895,324 @@ public class Criteria implements CriteriaDefinition {
return value instanceof GeoJson
|| (value instanceof GeoCommand && ((GeoCommand) value).getShape() instanceof GeoJson);
}
/**
* MongoDB specific <a href="https://docs.mongodb.com/manual/reference/operator/query-bitwise/">bitwise query
* operators</a> like {@code $bitsAllClear, $bitsAllSet,...} for usage with {@link Criteria#bits()} and {@link Query}.
*
* @author Christoph Strobl
* @since 2.1
* @see <a href=
* "https://docs.mongodb.com/manual/reference/operator/query-bitwise/">https://docs.mongodb.com/manual/reference/operator/query-bitwise/</a>
* @currentRead Beyond the Shadows - Brent Weeks
*/
public interface BitwiseCriteriaOperators {
/**
* Creates a criterion using {@literal $bitsAllClear} matching documents where all given bit positions are clear
* (i.e. 0).
*
* @param numericBitmask non-negative numeric bitmask.
* @return target {@link Criteria}.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAllClear/">MongoDB Query operator:
* $bitsAllClear</a>
* @since 2.1
*/
Criteria allClear(int numericBitmask);
/**
* Creates a criterion using {@literal $bitsAllClear} matching documents where all given bit positions are clear
* (i.e. 0).
*
* @param bitmask string representation of a bitmask that will be converted to its base64 encoded {@link Binary}
* representation. Must not be {@literal null} nor empty.
* @return target {@link Criteria}.
* @throws IllegalArgumentException when bitmask is {@literal null} or empty.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAllClear/">MongoDB Query operator:
* $bitsAllClear</a>
* @since 2.1
*/
Criteria allClear(String bitmask);
/**
* Creates a criterion using {@literal $bitsAllClear} matching documents where all given bit positions are clear
* (i.e. 0).
*
* @param positions list of non-negative integer positions. Positions start at 0 from the least significant bit.
* Must not be {@literal null} nor contain {@literal null} elements.
* @return target {@link Criteria}.
* @throws IllegalArgumentException when positions is {@literal null} or contains {@literal null} elements.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAllClear/">MongoDB Query operator:
* $bitsAllClear</a>
* @since 2.1
*/
Criteria allClear(List<Integer> positions);
/**
* Creates a criterion using {@literal $bitsAllSet} matching documents where all given bit positions are set (i.e.
* 1).
*
* @param numericBitmask non-negative numeric bitmask.
* @return target {@link Criteria}.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAllSet/">MongoDB Query operator:
* $bitsAllSet</a>
* @since 2.1
*/
Criteria allSet(int numericBitmask);
/**
* Creates a criterion using {@literal $bitsAllSet} matching documents where all given bit positions are set (i.e.
* 1).
*
* @param bitmask string representation of a bitmask that will be converted to its base64 encoded {@link Binary}
* representation. Must not be {@literal null} nor empty.
* @return target {@link Criteria}.
* @throws IllegalArgumentException when bitmask is {@literal null} or empty.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAllSet/">MongoDB Query operator:
* $bitsAllSet</a>
* @since 2.1
*/
Criteria allSet(String bitmask);
/**
* Creates a criterion using {@literal $bitsAllSet} matching documents where all given bit positions are set (i.e.
* 1).
*
* @param positions list of non-negative integer positions. Positions start at 0 from the least significant bit.
* Must not be {@literal null} nor contain {@literal null} elements.
* @return target {@link Criteria}.
* @throws IllegalArgumentException when positions is {@literal null} or contains {@literal null} elements.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAllSet/">MongoDB Query operator:
* $bitsAllSet</a>
* @since 2.1
*/
Criteria allSet(List<Integer> positions);
/**
* Creates a criterion using {@literal $bitsAnyClear} matching documents where any given bit positions are clear
* (i.e. 0).
*
* @param numericBitmask non-negative numeric bitmask.
* @return target {@link Criteria}.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAnyClear/">MongoDB Query operator:
* $bitsAnyClear</a>
* @since 2.1
*/
Criteria anyClear(int numericBitmask);
/**
* Creates a criterion using {@literal $bitsAnyClear} matching documents where any given bit positions are clear
* (i.e. 0).
*
* @param bitmask string representation of a bitmask that will be converted to its base64 encoded {@link Binary}
* representation. Must not be {@literal null} nor empty.
* @return target {@link Criteria}.
* @throws IllegalArgumentException when bitmask is {@literal null} or empty.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAnyClear/">MongoDB Query operator:
* $bitsAnyClear</a>
* @since 2.1
*/
Criteria anyClear(String bitmask);
/**
* Creates a criterion using {@literal $bitsAnyClear} matching documents where any given bit positions are clear
* (i.e. 0).
*
* @param positions list of non-negative integer positions. Positions start at 0 from the least significant bit.
* Must not be {@literal null} nor contain {@literal null} elements.
* @return target {@link Criteria}.
* @throws IllegalArgumentException when positions is {@literal null} or contains {@literal null} elements.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAnyClear/">MongoDB Query operator:
* $bitsAnyClear</a>
* @since 2.1
*/
Criteria anyClear(List<Integer> positions);
/**
* Creates a criterion using {@literal $bitsAnySet} matching documents where any given bit positions are set (i.e.
* 1).
*
* @param numericBitmask non-negative numeric bitmask.
* @return target {@link Criteria}.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAnySet/">MongoDB Query operator:
* $bitsAnySet</a>
* @since 2.1
*/
Criteria anySet(int numericBitmask);
/**
* Creates a criterion using {@literal $bitsAnySet} matching documents where any given bit positions are set (i.e.
* 1).
*
* @param bitmask string representation of a bitmask that will be converted to its base64 encoded {@link Binary}
* representation. Must not be {@literal null} nor empty.
* @return target {@link Criteria}.
* @throws IllegalArgumentException when bitmask is {@literal null} or empty.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAnySet/">MongoDB Query operator:
* $bitsAnySet</a>
* @since 2.1
*/
Criteria anySet(String bitmask);
/**
* Creates a criterion using {@literal $bitsAnySet} matching documents where any given bit positions are set (i.e.
* 1).
*
* @param positions list of non-negative integer positions. Positions start at 0 from the least significant bit.
* Must not be {@literal null} nor contain {@literal null} elements.
* @return target {@link Criteria}.
* @throws IllegalArgumentException when positions is {@literal null} or contains {@literal null} elements.
* @see <a href="https://docs.mongodb.com/manual/reference/operator/query/bitsAnySet/">MongoDB Query operator:
* $bitsAnySet</a>
* @since 2.1
*/
Criteria anySet(List<Integer> positions);
}
/**
* Default implementation of {@link BitwiseCriteriaOperators}.
*
* @author Christoph Strobl
* @currentRead Beyond the Shadows - Brent Weeks
*/
private static class BitwiseCriteriaOperatorsImpl implements BitwiseCriteriaOperators {
private final Criteria target;
BitwiseCriteriaOperatorsImpl(Criteria target) {
this.target = target;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#allClear(int)
*/
@Override
public Criteria allClear(int numericBitmask) {
return numericBitmask("$bitsAllClear", numericBitmask);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#allClear(java.lang.String)
*/
@Override
public Criteria allClear(String bitmask) {
return stringBitmask("$bitsAllClear", bitmask);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#allClear(java.util.List)
*/
@Override
public Criteria allClear(List<Integer> positions) {
return positions("$bitsAllClear", positions);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#allSet(int)
*/
@Override
public Criteria allSet(int numericBitmask) {
return numericBitmask("$bitsAllSet", numericBitmask);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#allSet(java.lang.String)
*/
@Override
public Criteria allSet(String bitmask) {
return stringBitmask("$bitsAllSet", bitmask);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#allSet(java.util.List)
*/
@Override
public Criteria allSet(List<Integer> positions) {
return positions("$bitsAllSet", positions);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#anyClear(int)
*/
@Override
public Criteria anyClear(int numericBitmask) {
return numericBitmask("$bitsAnyClear", numericBitmask);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#anyClear(java.lang.String)
*/
@Override
public Criteria anyClear(String bitmask) {
return stringBitmask("$bitsAnyClear", bitmask);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#anyClear(java.util.List)
*/
@Override
public Criteria anyClear(List<Integer> positions) {
return positions("$bitsAnyClear", positions);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#anySet(int)
*/
@Override
public Criteria anySet(int numericBitmask) {
return numericBitmask("$bitsAnySet", numericBitmask);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#anySet(java.lang.String)
*/
@Override
public Criteria anySet(String bitmask) {
return stringBitmask("$bitsAnySet", bitmask);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.query.BitwiseCriteriaOperators#anySet(java.util.List)
*/
@Override
public Criteria anySet(List<Integer> positions) {
return positions("$bitsAnySet", positions);
}
private Criteria positions(String operator, List<Integer> positions) {
Assert.notNull(positions, "Positions must not be null!");
Assert.noNullElements(positions.toArray(), "Positions must not contain null values!");
target.criteria.put(operator, positions);
return target;
}
private Criteria stringBitmask(String operator, String bitmask) {
Assert.hasText(bitmask, "Bitmask must not be null or empty!");
target.criteria.put(operator, new Binary(Base64Utils.decodeFromString(bitmask)));
return target;
}
private Criteria numericBitmask(String operator, int bitmask) {
target.criteria.put(operator, bitmask);
return target;
}
}
}
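For readers unfamiliar with MongoDB's bitwise query operators, the four variants above reduce to simple mask arithmetic, and the `List<Integer>` overloads are just another way to spell a mask. A minimal, Spring-Data-free sketch of the matching semantics (class and helper names are illustrative only, not part of the API above):

```java
import java.util.List;

public class BitwiseSemantics {

	// $bitsAllClear: every bit selected by the mask is 0 in the value.
	static boolean allClear(long value, long mask) {
		return (value & mask) == 0;
	}

	// $bitsAllSet: every bit selected by the mask is 1 in the value.
	static boolean allSet(long value, long mask) {
		return (value & mask) == mask;
	}

	// $bitsAnyClear: at least one bit selected by the mask is 0 in the value.
	static boolean anyClear(long value, long mask) {
		return (value & mask) != mask;
	}

	// $bitsAnySet: at least one bit selected by the mask is 1 in the value.
	static boolean anySet(long value, long mask) {
		return (value & mask) != 0;
	}

	// Positions start at 0 from the least significant bit, mirroring the List<Integer> overloads.
	static long maskFromPositions(List<Integer> positions) {
		long mask = 0;
		for (int position : positions) {
			mask |= 1L << position;
		}
		return mask;
	}
}
```

For a stored value of `0b0110`, positions `[1, 2]` build the mask `0b0110`, so `allSet` matches while `allClear` on the same mask does not; this is the relationship the `int`, `String` and `List<Integer>` overloads share.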

View File

@@ -38,7 +38,7 @@ public class Field {
private final Map<String, Integer> criteria = new HashMap<String, Integer>();
private final Map<String, Object> slices = new HashMap<String, Object>();
private final Map<String, Criteria> elemMatchs = new HashMap<String, Criteria>();
- private @Nullable String postionKey;
+ private @Nullable String positionKey;
private int positionValue;
public Field include(String key) {
@@ -78,7 +78,7 @@ public class Field {
Assert.hasText(field, "DocumentField must not be null or empty!");
- postionKey = field;
+ positionKey = field;
positionValue = value;
return this;
@@ -97,8 +97,8 @@ public class Field {
document.put(entry.getKey(), new Document("$elemMatch", entry.getValue().getCriteriaObject()));
}
- if (postionKey != null) {
- document.put(postionKey + ".$", positionValue);
+ if (positionKey != null) {
+ document.put(positionKey + ".$", positionValue);
}
return document;

View File

@@ -153,7 +153,8 @@ public interface GridFsOperations extends ResourcePatternResolver {
* Returns the {@link GridFsResource} with the given file name.
*
* @param filename must not be {@literal null}.
- * @return the resource if it exists or {@literal null}.
+ * @return the resource. Use {@link org.springframework.core.io.Resource#exists()} to check if the returned
+ * {@link GridFsResource} is actually present.
* @see ResourcePatternResolver#getResource(String)
*/
GridFsResource getResource(String filename);
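The revised contract trades a `null` return for an always-present resource object whose `exists()` reports absence, a null-object pattern. A self-contained sketch of that shape, using made-up stand-in classes rather than the actual Spring types:

```java
import java.io.FileNotFoundException;
import java.util.Map;
import java.util.Optional;

// Stand-in for an absent-capable resource (names here are illustrative, not the Spring API).
class StoredResource {

	private final String filename;
	private final byte[] content; // null marks an absent resource

	private StoredResource(String filename, byte[] content) {
		this.filename = filename;
		this.content = content;
	}

	static StoredResource of(String filename, byte[] content) {
		return new StoredResource(filename, content);
	}

	static StoredResource absent(String filename) {
		return new StoredResource(filename, null);
	}

	boolean exists() {
		return content != null;
	}

	String getFilename() {
		return filename;
	}

	byte[] getContent() throws FileNotFoundException {
		if (!exists()) { // guard content access, as verifyExists() does below
			throw new FileNotFoundException(filename + " does not exist.");
		}
		return content;
	}
}

class ResourceStore {

	private final Map<String, byte[]> files;

	ResourceStore(Map<String, byte[]> files) {
		this.files = files;
	}

	// Callers always receive a non-null resource and probe exists() instead of null-checking.
	StoredResource getResource(String filename) {
		return Optional.ofNullable(files.get(filename))
				.map(bytes -> StoredResource.of(filename, bytes))
				.orElseGet(() -> StoredResource.absent(filename));
	}
}
```

The benefit is that metadata such as the filename stays available even for a missing file, while content access fails loudly instead of with a `NullPointerException`.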

View File

@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.gridfs;
import java.io.ByteArrayInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.util.Optional;
@@ -23,6 +24,8 @@ import java.util.Optional;
import org.springframework.core.io.InputStreamResource;
import org.springframework.core.io.Resource;
import org.springframework.data.util.Optionals;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.MongoGridFSException;
import com.mongodb.client.gridfs.model.GridFSFile;
@@ -38,8 +41,24 @@ import com.mongodb.client.gridfs.model.GridFSFile;
public class GridFsResource extends InputStreamResource {
static final String CONTENT_TYPE_FIELD = "_contentType";
private static final ByteArrayInputStream EMPTY_INPUT_STREAM = new ByteArrayInputStream(new byte[0]);
- private final GridFSFile file;
+ private final @Nullable GridFSFile file;
private final String filename;
/**
* Creates a new, absent {@link GridFsResource}.
*
* @param filename filename of the absent resource.
* @since 2.1
*/
private GridFsResource(String filename) {
super(EMPTY_INPUT_STREAM, String.format("GridFs resource [%s]", filename));
this.file = null;
this.filename = filename;
}
/**
* Creates a new {@link GridFsResource} from the given {@link GridFSFile}.
@@ -58,8 +77,35 @@ public class GridFsResource extends InputStreamResource {
*/
public GridFsResource(GridFSFile file, InputStream inputStream) {
- super(inputStream);
+ super(inputStream, String.format("GridFs resource [%s]", file.getFilename()));
this.file = file;
this.filename = file.getFilename();
}
/**
* Obtain an absent {@link GridFsResource}.
*
* @param filename filename of the absent resource, must not be {@literal null}.
* @return never {@literal null}.
* @since 2.1
*/
public static GridFsResource absent(String filename) {
Assert.notNull(filename, "Filename must not be null");
return new GridFsResource(filename);
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.InputStreamResource#getInputStream()
*/
@Override
public InputStream getInputStream() throws IOException, IllegalStateException {
verifyExists();
return super.getInputStream();
}
/*
@@ -68,6 +114,8 @@ public class GridFsResource extends InputStreamResource {
*/
@Override
public long contentLength() throws IOException {
verifyExists();
return file.getLength();
}
@@ -77,7 +125,16 @@ public class GridFsResource extends InputStreamResource {
*/
@Override
public String getFilename() throws IllegalStateException {
- return file.getFilename();
+ return filename;
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#exists()
*/
@Override
public boolean exists() {
return file != null;
}
/*
@@ -86,15 +143,30 @@ public class GridFsResource extends InputStreamResource {
*/
@Override
public long lastModified() throws IOException {
verifyExists();
return file.getUploadDate().getTime();
}
/*
* (non-Javadoc)
* @see org.springframework.core.io.AbstractResource#getDescription()
*/
@Override
public String getDescription() {
return String.format("GridFs resource [%s]", this.getFilename());
}
/**
* Returns the {@link Resource}'s id.
*
* @return never {@literal null}.
* @throws IllegalStateException if the file does not {@link #exists()}.
*/
public Object getId() {
Assert.state(exists(), () -> String.format("%s does not exist.", getDescription()));
return file.getId();
}
@@ -104,14 +176,24 @@ public class GridFsResource extends InputStreamResource {
* @return never {@literal null}.
* @throws com.mongodb.MongoGridFSException in case no content type declared on {@link GridFSFile#getMetadata()} nor
* provided via {@link GridFSFile#getContentType()}.
* @throws IllegalStateException if the file does not {@link #exists()}.
*/
@SuppressWarnings("deprecation")
public String getContentType() {
Assert.state(exists(), () -> String.format("%s does not exist.", getDescription()));
return Optionals
.firstNonEmpty(
() -> Optional.ofNullable(file.getMetadata()).map(it -> it.get(CONTENT_TYPE_FIELD, String.class)),
() -> Optional.ofNullable(file.getContentType()))
.orElseThrow(() -> new MongoGridFSException("No contentType data for this GridFS file"));
}
private void verifyExists() throws FileNotFoundException {
if (!exists()) {
throw new FileNotFoundException(String.format("%s does not exist.", getDescription()));
}
}
}

View File

@@ -227,7 +227,9 @@ public class GridFsTemplate implements GridFsOperations, ResourcePatternResolver
* @see org.springframework.core.io.ResourceLoader#getResource(java.lang.String)
*/
public GridFsResource getResource(String location) {
- return Optional.ofNullable(findOne(query(whereFilename().is(location)))).map(this::getResource).orElse(null);
+ return Optional.ofNullable(findOne(query(whereFilename().is(location)))).map(this::getResource)
+ .orElseGet(() -> GridFsResource.absent(location));
}
/*

View File

@@ -39,7 +39,7 @@ import org.bson.json.JsonWriter;
import org.bson.types.Binary;
import org.springframework.data.mongodb.CodecRegistryProvider;
import org.springframework.data.mongodb.repository.query.StringBasedMongoQuery.ParameterBinding;
- import org.springframework.data.repository.query.EvaluationContextProvider;
+ import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.Expression;
import org.springframework.expression.spel.standard.SpelExpressionParser;
@@ -65,7 +65,7 @@ import com.mongodb.util.JSON;
class ExpressionEvaluatingParameterBinder {
private final SpelExpressionParser expressionParser;
- private final EvaluationContextProvider evaluationContextProvider;
+ private final QueryMethodEvaluationContextProvider evaluationContextProvider;
private final CodecRegistryProvider codecRegistryProvider;
/**
@@ -75,7 +75,7 @@ class ExpressionEvaluatingParameterBinder {
* @param evaluationContextProvider must not be {@literal null}.
*/
public ExpressionEvaluatingParameterBinder(SpelExpressionParser expressionParser,
- EvaluationContextProvider evaluationContextProvider) {
+ QueryMethodEvaluationContextProvider evaluationContextProvider) {
Assert.notNull(expressionParser, "ExpressionParser must not be null!");
Assert.notNull(evaluationContextProvider, "EvaluationContextProvider must not be null!");

View File

@@ -27,7 +27,8 @@ import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.repository.query.ExpressionEvaluatingParameterBinder.BindingContext;
import org.springframework.data.mongodb.repository.query.StringBasedMongoQuery.ParameterBinding;
import org.springframework.data.mongodb.repository.query.StringBasedMongoQuery.ParameterBindingParser;
- import org.springframework.data.repository.query.EvaluationContextProvider;
+ import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
+ import org.springframework.data.spel.EvaluationContextProvider;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert;
@@ -62,13 +63,13 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
* @param evaluationContextProvider must not be {@literal null}.
*/
public ReactiveStringBasedMongoQuery(ReactiveMongoQueryMethod method, ReactiveMongoOperations mongoOperations,
- SpelExpressionParser expressionParser, EvaluationContextProvider evaluationContextProvider) {
+ SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
this(method.getAnnotatedQuery(), method, mongoOperations, expressionParser, evaluationContextProvider);
}
/**
* Creates a new {@link ReactiveStringBasedMongoQuery} for the given {@link String}, {@link MongoQueryMethod},
- * {@link MongoOperations}, {@link SpelExpressionParser} and {@link EvaluationContextProvider}.
+ * {@link MongoOperations}, {@link SpelExpressionParser} and {@link QueryMethodEvaluationContextProvider}.
*
* @param query must not be {@literal null}.
* @param method must not be {@literal null}.
@@ -77,7 +78,7 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
*/
public ReactiveStringBasedMongoQuery(String query, ReactiveMongoQueryMethod method,
ReactiveMongoOperations mongoOperations, SpelExpressionParser expressionParser,
- EvaluationContextProvider evaluationContextProvider) {
+ QueryMethodEvaluationContextProvider evaluationContextProvider) {
super(method, mongoOperations);

View File

@@ -28,7 +28,7 @@ import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.repository.query.ExpressionEvaluatingParameterBinder.BindingContext;
- import org.springframework.data.repository.query.EvaluationContextProvider;
+ import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
@@ -62,7 +62,8 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
private final ExpressionEvaluatingParameterBinder parameterBinder;
/**
- * Creates a new {@link StringBasedMongoQuery} for the given {@link MongoQueryMethod} and {@link MongoOperations}.
+ * Creates a new {@link StringBasedMongoQuery} for the given {@link MongoQueryMethod}, {@link MongoOperations},
+ * {@link SpelExpressionParser} and {@link QueryMethodEvaluationContextProvider}.
*
* @param method must not be {@literal null}.
* @param mongoOperations must not be {@literal null}.
@@ -70,13 +71,13 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
* @param evaluationContextProvider must not be {@literal null}.
*/
public StringBasedMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations,
- SpelExpressionParser expressionParser, EvaluationContextProvider evaluationContextProvider) {
+ SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
this(method.getAnnotatedQuery(), method, mongoOperations, expressionParser, evaluationContextProvider);
}
/**
* Creates a new {@link StringBasedMongoQuery} for the given {@link String}, {@link MongoQueryMethod},
- * {@link MongoOperations}, {@link SpelExpressionParser} and {@link EvaluationContextProvider}.
+ * {@link MongoOperations}, {@link SpelExpressionParser} and {@link QueryMethodEvaluationContextProvider}.
*
* @param query must not be {@literal null}.
* @param method must not be {@literal null}.
@@ -84,7 +85,7 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
* @param expressionParser must not be {@literal null}.
*/
public StringBasedMongoQuery(String query, MongoQueryMethod method, MongoOperations mongoOperations,
- SpelExpressionParser expressionParser, EvaluationContextProvider evaluationContextProvider) {
+ SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
super(method, mongoOperations);

View File

@@ -39,9 +39,9 @@ import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.core.support.RepositoryComposition.RepositoryFragments;
import org.springframework.data.repository.core.support.RepositoryFactorySupport;
import org.springframework.data.repository.core.support.RepositoryFragment;
- import org.springframework.data.repository.query.EvaluationContextProvider;
import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
+ import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
@@ -131,7 +131,7 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
*/
@Override
protected Optional<QueryLookupStrategy> getQueryLookupStrategy(@Nullable Key key,
- EvaluationContextProvider evaluationContextProvider) {
+ QueryMethodEvaluationContextProvider evaluationContextProvider) {
return Optional.of(new MongoQueryLookupStrategy(operations, evaluationContextProvider, mappingContext));
}
@@ -160,10 +160,11 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
private static class MongoQueryLookupStrategy implements QueryLookupStrategy {
private final MongoOperations operations;
- private final EvaluationContextProvider evaluationContextProvider;
+ private final QueryMethodEvaluationContextProvider evaluationContextProvider;
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
- public MongoQueryLookupStrategy(MongoOperations operations, EvaluationContextProvider evaluationContextProvider,
+ public MongoQueryLookupStrategy(MongoOperations operations,
+ QueryMethodEvaluationContextProvider evaluationContextProvider,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this.operations = operations;

View File

@@ -36,9 +36,9 @@ import org.springframework.data.repository.core.NamedQueries;
import org.springframework.data.repository.core.RepositoryInformation;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.core.support.ReactiveRepositoryFactorySupport;
- import org.springframework.data.repository.query.EvaluationContextProvider;
import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
+ import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
@@ -99,7 +99,7 @@ public class ReactiveMongoRepositoryFactory extends ReactiveRepositoryFactorySup
*/
@Override
protected Optional<QueryLookupStrategy> getQueryLookupStrategy(@Nullable Key key,
- EvaluationContextProvider evaluationContextProvider) {
+ QueryMethodEvaluationContextProvider evaluationContextProvider) {
return Optional.of(new MongoQueryLookupStrategy(operations, evaluationContextProvider, mappingContext));
}
@@ -130,7 +130,7 @@ public class ReactiveMongoRepositoryFactory extends ReactiveRepositoryFactorySup
private static class MongoQueryLookupStrategy implements QueryLookupStrategy {
private final ReactiveMongoOperations operations;
- private final EvaluationContextProvider evaluationContextProvider;
+ private final QueryMethodEvaluationContextProvider evaluationContextProvider;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
/*

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.repository.support;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Optional;
import java.util.Set;
import java.util.regex.Pattern;
@@ -27,10 +28,12 @@ import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
@@ -93,7 +96,7 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
return super.visit(expr, context);
}
- return converter.convertToMongoType(expr.getConstant());
+ return toQuerydslMongoType(expr.getConstant());
}
/*
@@ -128,7 +131,8 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
Document mappedIdValue = mapper.getMappedObject((BasicDBObject) superIdValue, Optional.empty());
return (DBObject) JSON.parse(mappedIdValue.toJson());
}
- return super.asDBObject(key, value instanceof Pattern ? value : converter.convertToMongoType(value));
+ return super.asDBObject(key, value instanceof Pattern ? value : toQuerydslMongoType(value));
}
/*
@@ -231,4 +235,25 @@ class SpringDataMongodbSerializer extends MongodbSerializer {
return property;
}
private Object toQuerydslMongoType(Object source) {
Object target = converter.convertToMongoType(source);
if (target instanceof List) {
List<Object> newList = new BasicDBList();
for (Object item : (List) target) {
if (item instanceof Document) {
newList.add(new BasicDBObject(BsonUtils.asMap((Document) item)));
} else {
newList.add(item);
}
}
return newList;
}
return target;
}
}
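`toQuerydslMongoType` above defends against `convertToMongoType` returning a `List` of `Document` values: Querydsl's legacy-driver serializer expects `BasicDBList`/`BasicDBObject`, so map-shaped elements are re-wrapped while copying. The same copy-and-wrap shape, sketched with plain collections standing in for the driver types (all names here are illustrative):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class LegacyTypeAdapter {

	// Stand-ins for the legacy driver types (BasicDBList / BasicDBObject in the real code).
	static class LegacyList extends ArrayList<Object> {}

	static class LegacyMap extends LinkedHashMap<String, Object> {
		LegacyMap(Map<String, Object> source) {
			super(source);
		}
	}

	// Copy a converted list, re-wrapping any map-shaped element into the legacy map type.
	static List<Object> adapt(List<Object> converted) {
		LegacyList target = new LegacyList();
		for (Object item : converted) {
			if (item instanceof Map) {
				@SuppressWarnings("unchecked")
				Map<String, Object> map = (Map<String, Object>) item;
				target.add(new LegacyMap(map));
			} else {
				target.add(item);
			}
		}
		return target;
	}
}
```

Scalars pass through untouched; only the container and its map-shaped elements change type, which is exactly what lets the surrounding serializer keep working against the legacy API.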

View File

@@ -0,0 +1,54 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core
import kotlin.reflect.KClass
/**
* Extension for [ExecutableMapReduceOperation.mapReduce] providing a [KClass] based variant.
*
* @author Christoph Strobl
* @since 2.1
*/
fun <T : Any> ExecutableMapReduceOperation.mapReduce(entityClass: KClass<T>): ExecutableMapReduceOperation.MapReduceWithMapFunction<T> =
mapReduce(entityClass.java)
/**
* Extension for [ExecutableMapReduceOperation.mapReduce] leveraging reified type parameters.
*
* @author Christoph Strobl
* @since 2.1
*/
inline fun <reified T : Any> ExecutableMapReduceOperation.mapReduce(): ExecutableMapReduceOperation.MapReduceWithMapFunction<T> =
mapReduce(T::class.java)
/**
* Extension for [ExecutableMapReduceOperation.MapReduceWithProjection.as] providing a [KClass] based variant.
*
* @author Christoph Strobl
* @since 2.1
*/
fun <T : Any> ExecutableMapReduceOperation.MapReduceWithProjection<T>.asType(resultType: KClass<T>): ExecutableMapReduceOperation.MapReduceWithQuery<T> =
`as`(resultType.java)
/**
* Extension for [ExecutableMapReduceOperation.MapReduceWithProjection.as] leveraging reified type parameters.
*
* @author Christoph Strobl
* @since 2.1
*/
inline fun <reified T : Any> ExecutableMapReduceOperation.MapReduceWithProjection<T>.asType(): ExecutableMapReduceOperation.MapReduceWithQuery<T> =
`as`(T::class.java)

View File

@@ -0,0 +1,54 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core
import kotlin.reflect.KClass
/**
* Extension for [ReactiveMapReduceOperation.mapReduce] providing a [KClass] based variant.
*
* @author Christoph Strobl
* @since 2.1
*/
fun <T : Any> ReactiveMapReduceOperation.mapReduce(entityClass: KClass<T>): ReactiveMapReduceOperation.MapReduceWithMapFunction<T> =
mapReduce(entityClass.java)
/**
* Extension for [ReactiveMapReduceOperation.mapReduce] leveraging reified type parameters.
*
* @author Christoph Strobl
* @since 2.1
*/
inline fun <reified T : Any> ReactiveMapReduceOperation.mapReduce(): ReactiveMapReduceOperation.MapReduceWithMapFunction<T> =
mapReduce(T::class.java)
/**
* Extension for [ReactiveMapReduceOperation.MapReduceWithProjection.as] providing a [KClass] based variant.
*
* @author Christoph Strobl
* @since 2.1
*/
fun <T : Any> ReactiveMapReduceOperation.MapReduceWithProjection<T>.asType(resultType: KClass<T>): ReactiveMapReduceOperation.MapReduceWithQuery<T> =
`as`(resultType.java)
/**
* Extension for [ReactiveMapReduceOperation.MapReduceWithProjection.as] leveraging reified type parameters.
*
* @author Christoph Strobl
* @since 2.1
*/
inline fun <reified T : Any> ReactiveMapReduceOperation.MapReduceWithProjection<T>.asType(): ReactiveMapReduceOperation.MapReduceWithQuery<T> =
`as`(T::class.java)

View File

@@ -16,7 +16,7 @@
package org.springframework.data.mongodb.core.query
/**
- * Extension for [Criteria.is] providing an `isEqualTo` alias since `in` is a reserved keyword in Kotlin.
+ * Extension for [Criteria.is] providing an `isEqualTo` alias since `is` is a reserved keyword in Kotlin.
*
* @author Sebastien Deleuze
* @since 2.0

View File

@@ -0,0 +1,239 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.Assert.*;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.*;
import javax.transaction.Status;
import javax.transaction.UserTransaction;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.jta.JtaTransactionManager;
import org.springframework.transaction.support.TransactionCallbackWithoutResult;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoDatabase;
import com.mongodb.session.ServerSession;
/**
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoDatabaseUtilsUnitTests {
@Mock ClientSession session;
@Mock ServerSession serverSession;
@Mock MongoDbFactory dbFactory;
@Mock MongoDatabase db;
@Mock UserTransaction userTransaction;
@Before
public void setUp() {
when(dbFactory.getSession(any())).thenReturn(session);
when(dbFactory.withSession(session)).thenReturn(dbFactory);
when(dbFactory.getDb()).thenReturn(db);
when(session.getServerSession()).thenReturn(serverSession);
when(session.hasActiveTransaction()).thenReturn(true);
when(serverSession.isClosed()).thenReturn(false);
}
@After
public void verifyTransactionSynchronizationManagerState() {
assertTrue(TransactionSynchronizationManager.getResourceMap().isEmpty());
assertFalse(TransactionSynchronizationManager.isSynchronizationActive());
assertNull(TransactionSynchronizationManager.getCurrentTransactionName());
assertFalse(TransactionSynchronizationManager.isCurrentTransactionReadOnly());
assertNull(TransactionSynchronizationManager.getCurrentTransactionIsolationLevel());
assertFalse(TransactionSynchronizationManager.isActualTransactionActive());
}
@Test // DATAMONGO-1920
public void shouldNotStartSessionWhenNoTransactionOngoing() {
MongoDatabaseUtils.getDatabase(dbFactory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
verify(dbFactory, never()).getSession(any());
verify(dbFactory, never()).withSession(any(ClientSession.class));
}
@Test // DATAMONGO-1920
public void shouldParticipateInOngoingJtaTransactionWithCommitWhenSessionSynchronizationIsAny() throws Exception {
when(userTransaction.getStatus()).thenReturn(Status.STATUS_NO_TRANSACTION, Status.STATUS_ACTIVE,
Status.STATUS_ACTIVE);
JtaTransactionManager txManager = new JtaTransactionManager(userTransaction);
TransactionTemplate txTemplate = new TransactionTemplate(txManager);
txTemplate.execute(new TransactionCallbackWithoutResult() {
@Override
protected void doInTransactionWithoutResult(TransactionStatus transactionStatus) {
assertThat(TransactionSynchronizationManager.isSynchronizationActive()).isTrue();
assertThat(transactionStatus.isNewTransaction()).isTrue();
assertThat(TransactionSynchronizationManager.hasResource(dbFactory)).isFalse();
MongoDatabaseUtils.getDatabase(dbFactory, SessionSynchronization.ALWAYS);
assertThat(TransactionSynchronizationManager.hasResource(dbFactory)).isTrue();
}
});
verify(userTransaction).begin();
verify(session).startTransaction();
verify(session).commitTransaction();
verify(session).close();
}
@Test // DATAMONGO-1920
public void shouldParticipateInOngoingJtaTransactionWithRollbackWhenSessionSynchronizationIsAny() throws Exception {
when(userTransaction.getStatus()).thenReturn(Status.STATUS_NO_TRANSACTION, Status.STATUS_ACTIVE,
Status.STATUS_ACTIVE);
JtaTransactionManager txManager = new JtaTransactionManager(userTransaction);
TransactionTemplate txTemplate = new TransactionTemplate(txManager);
txTemplate.execute(new TransactionCallbackWithoutResult() {
@Override
protected void doInTransactionWithoutResult(TransactionStatus transactionStatus) {
assertThat(TransactionSynchronizationManager.isSynchronizationActive()).isTrue();
assertThat(transactionStatus.isNewTransaction()).isTrue();
assertThat(TransactionSynchronizationManager.hasResource(dbFactory)).isFalse();
MongoDatabaseUtils.getDatabase(dbFactory, SessionSynchronization.ALWAYS);
assertThat(TransactionSynchronizationManager.hasResource(dbFactory)).isTrue();
transactionStatus.setRollbackOnly();
}
});
verify(userTransaction).rollback();
verify(session).startTransaction();
verify(session).abortTransaction();
verify(session).close();
}
@Test // DATAMONGO-1920
public void shouldNotParticipateInOngoingJtaTransactionWithRollbackWhenSessionSynchronizationIsNative()
throws Exception {
when(userTransaction.getStatus()).thenReturn(Status.STATUS_NO_TRANSACTION, Status.STATUS_ACTIVE,
Status.STATUS_ACTIVE);
JtaTransactionManager txManager = new JtaTransactionManager(userTransaction);
TransactionTemplate txTemplate = new TransactionTemplate(txManager);
txTemplate.execute(new TransactionCallbackWithoutResult() {
@Override
protected void doInTransactionWithoutResult(TransactionStatus transactionStatus) {
assertThat(TransactionSynchronizationManager.isSynchronizationActive()).isTrue();
assertThat(transactionStatus.isNewTransaction()).isTrue();
assertThat(TransactionSynchronizationManager.hasResource(dbFactory)).isFalse();
MongoDatabaseUtils.getDatabase(dbFactory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
assertThat(TransactionSynchronizationManager.hasResource(dbFactory)).isFalse();
transactionStatus.setRollbackOnly();
}
});
verify(userTransaction).rollback();
verify(session, never()).startTransaction();
verify(session, never()).abortTransaction();
verify(session, never()).close();
}
@Test // DATAMONGO-1920
public void shouldParticipateInOngoingMongoTransactionWhenSessionSynchronizationIsNative() {
MongoTransactionManager txManager = new MongoTransactionManager(dbFactory);
TransactionTemplate txTemplate = new TransactionTemplate(txManager);
txTemplate.execute(new TransactionCallbackWithoutResult() {
@Override
protected void doInTransactionWithoutResult(TransactionStatus transactionStatus) {
assertThat(TransactionSynchronizationManager.isSynchronizationActive()).isTrue();
assertThat(transactionStatus.isNewTransaction()).isTrue();
assertThat(TransactionSynchronizationManager.hasResource(dbFactory)).isTrue();
MongoDatabaseUtils.getDatabase(dbFactory, SessionSynchronization.ON_ACTUAL_TRANSACTION);
transactionStatus.setRollbackOnly();
}
});
verify(session).startTransaction();
verify(session).abortTransaction();
verify(session).close();
}
@Test // DATAMONGO-1920
public void shouldParticipateInOngoingMongoTransactionWhenSessionSynchronizationIsAny() {
MongoTransactionManager txManager = new MongoTransactionManager(dbFactory);
TransactionTemplate txTemplate = new TransactionTemplate(txManager);
txTemplate.execute(new TransactionCallbackWithoutResult() {
@Override
protected void doInTransactionWithoutResult(TransactionStatus transactionStatus) {
assertThat(TransactionSynchronizationManager.isSynchronizationActive()).isTrue();
assertThat(transactionStatus.isNewTransaction()).isTrue();
assertThat(TransactionSynchronizationManager.hasResource(dbFactory)).isTrue();
MongoDatabaseUtils.getDatabase(dbFactory, SessionSynchronization.ALWAYS);
transactionStatus.setRollbackOnly();
}
});
verify(session).startTransaction();
verify(session).abortTransaction();
verify(session).close();
}
}


@@ -0,0 +1,333 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb;
import static org.assertj.core.api.Assertions.*;
import static org.junit.Assert.*;
import static org.mockito.Mockito.*;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.transaction.TransactionDefinition;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.UnexpectedRollbackException;
import org.springframework.transaction.support.DefaultTransactionDefinition;
import org.springframework.transaction.support.TransactionCallbackWithoutResult;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoDatabase;
import com.mongodb.session.ServerSession;
/**
* @author Christoph Strobl
*/
@RunWith(MockitoJUnitRunner.class)
public class MongoTransactionManagerUnitTests {
@Mock ClientSession session;
@Mock ClientSession session2;
@Mock ServerSession serverSession;
@Mock MongoDbFactory dbFactory;
@Mock MongoDbFactory dbFactory2;
@Mock MongoDatabase db;
@Mock MongoDatabase db2;
@Before
public void setUp() {
when(dbFactory.getSession(any())).thenReturn(session, session2);
when(dbFactory.withSession(session)).thenReturn(dbFactory);
when(dbFactory.withSession(session2)).thenReturn(dbFactory2);
when(dbFactory.getDb()).thenReturn(db);
when(dbFactory2.getDb()).thenReturn(db2);
when(session.getServerSession()).thenReturn(serverSession);
when(session2.getServerSession()).thenReturn(serverSession);
when(serverSession.isClosed()).thenReturn(false);
}
@After
public void verifyTransactionSynchronizationManager() {
assertTrue(TransactionSynchronizationManager.getResourceMap().isEmpty());
assertFalse(TransactionSynchronizationManager.isSynchronizationActive());
}
@Test // DATAMONGO-1920
public void triggerCommitCorrectly() {
MongoTransactionManager txManager = new MongoTransactionManager(dbFactory);
TransactionStatus txStatus = txManager.getTransaction(new DefaultTransactionDefinition());
MongoTemplate template = new MongoTemplate(dbFactory);
template.execute(db -> {
db.drop();
return null;
});
verify(dbFactory).withSession(eq(session));
txManager.commit(txStatus);
verify(session).startTransaction();
verify(session).commitTransaction();
verify(session).close();
}
@Test // DATAMONGO-1920
public void participateInOnGoingTransactionWithCommit() {
MongoTransactionManager txManager = new MongoTransactionManager(dbFactory);
TransactionStatus txStatus = txManager.getTransaction(new DefaultTransactionDefinition());
MongoTemplate template = new MongoTemplate(dbFactory);
template.execute(db -> {
db.drop();
return null;
});
TransactionTemplate txTemplate = new TransactionTemplate(txManager);
txTemplate.execute(new TransactionCallbackWithoutResult() {
@Override
protected void doInTransactionWithoutResult(TransactionStatus status) {
template.execute(db -> {
db.drop();
return null;
});
}
});
verify(dbFactory, times(2)).withSession(eq(session));
txManager.commit(txStatus);
verify(session).startTransaction();
verify(session).commitTransaction();
verify(session).close();
}
@Test // DATAMONGO-1920
public void participateInOnGoingTransactionWithRollbackOnly() {
MongoTransactionManager txManager = new MongoTransactionManager(dbFactory);
TransactionStatus txStatus = txManager.getTransaction(new DefaultTransactionDefinition());
MongoTemplate template = new MongoTemplate(dbFactory);
template.execute(db -> {
db.drop();
return null;
});
TransactionTemplate txTemplate = new TransactionTemplate(txManager);
txTemplate.execute(new TransactionCallbackWithoutResult() {
@Override
protected void doInTransactionWithoutResult(TransactionStatus status) {
template.execute(db -> {
db.drop();
return null;
});
status.setRollbackOnly();
}
});
verify(dbFactory, times(2)).withSession(eq(session));
assertThatExceptionOfType(UnexpectedRollbackException.class).isThrownBy(() -> txManager.commit(txStatus));
verify(session).startTransaction();
verify(session).abortTransaction();
verify(session).close();
}
@Test // DATAMONGO-1920
public void triggerRollbackCorrectly() {
MongoTransactionManager txManager = new MongoTransactionManager(dbFactory);
TransactionStatus txStatus = txManager.getTransaction(new DefaultTransactionDefinition());
MongoTemplate template = new MongoTemplate(dbFactory);
template.execute(db -> {
db.drop();
return null;
});
verify(dbFactory).withSession(eq(session));
txManager.rollback(txStatus);
verify(session).startTransaction();
verify(session).abortTransaction();
verify(session).close();
}
@Test // DATAMONGO-1920
public void suspendTransactionWhilePropagationNotSupported() {
MongoTransactionManager txManager = new MongoTransactionManager(dbFactory);
TransactionStatus txStatus = txManager.getTransaction(new DefaultTransactionDefinition());
MongoTemplate template = new MongoTemplate(dbFactory);
template.execute(db -> {
db.drop();
return null;
});
TransactionTemplate txTemplate = new TransactionTemplate(txManager);
txTemplate.setPropagationBehavior(TransactionDefinition.PROPAGATION_NOT_SUPPORTED);
txTemplate.execute(new TransactionCallbackWithoutResult() {
@Override
protected void doInTransactionWithoutResult(TransactionStatus status) {
template.execute(db -> {
db.drop();
return null;
});
}
});
template.execute(MongoDatabase::listCollections);
txManager.commit(txStatus);
verify(session).startTransaction();
verify(session2, never()).startTransaction();
verify(dbFactory, times(2)).withSession(eq(session));
verify(dbFactory, never()).withSession(eq(session2));
verify(db, times(2)).drop();
verify(db).listCollections();
verify(session).close();
verify(session2, never()).close();
}
@Test // DATAMONGO-1920
public void suspendTransactionWhilePropagationRequiresNew() {
MongoTransactionManager txManager = new MongoTransactionManager(dbFactory);
TransactionStatus txStatus = txManager.getTransaction(new DefaultTransactionDefinition());
MongoTemplate template = new MongoTemplate(dbFactory);
template.execute(db -> {
db.drop();
return null;
});
TransactionTemplate txTemplate = new TransactionTemplate(txManager);
txTemplate.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRES_NEW);
txTemplate.execute(new TransactionCallbackWithoutResult() {
@Override
protected void doInTransactionWithoutResult(TransactionStatus status) {
template.execute(db -> {
db.drop();
return null;
});
}
});
template.execute(MongoDatabase::listCollections);
txManager.commit(txStatus);
verify(session).startTransaction();
verify(session2).startTransaction();
verify(dbFactory, times(2)).withSession(eq(session));
verify(dbFactory).withSession(eq(session2));
verify(db).drop();
verify(db2).drop();
verify(db).listCollections();
verify(session).close();
verify(session2).close();
}
@Test // DATAMONGO-1920
public void readonlyShouldInitiateASessionStartAndCommitTransaction() {
MongoTransactionManager txManager = new MongoTransactionManager(dbFactory);
DefaultTransactionDefinition readonlyTxDefinition = new DefaultTransactionDefinition();
readonlyTxDefinition.setReadOnly(true);
TransactionStatus txStatus = txManager.getTransaction(readonlyTxDefinition);
MongoTemplate template = new MongoTemplate(dbFactory);
template.execute(db -> {
db.drop();
return null;
});
verify(dbFactory).withSession(eq(session));
txManager.commit(txStatus);
verify(session).startTransaction();
verify(session).commitTransaction();
verify(session).close();
}
@Test // DATAMONGO-1920
public void readonlyShouldInitiateASessionStartAndRollbackTransaction() {
MongoTransactionManager txManager = new MongoTransactionManager(dbFactory);
DefaultTransactionDefinition readonlyTxDefinition = new DefaultTransactionDefinition();
readonlyTxDefinition.setReadOnly(true);
TransactionStatus txStatus = txManager.getTransaction(readonlyTxDefinition);
MongoTemplate template = new MongoTemplate(dbFactory);
template.execute(db -> {
db.drop();
return null;
});
verify(dbFactory).withSession(eq(session));
txManager.rollback(txStatus);
verify(session).startTransaction();
verify(session).abortTransaction();
verify(session).close();
}
}


@@ -35,9 +35,9 @@ import org.springframework.test.util.ReflectionTestUtils;
import org.springframework.util.ClassUtils;
import com.mongodb.MongoClient;
import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import com.mongodb.session.ClientSession;
/**
* Unit tests for {@link SessionAwareMethodInterceptor}.
@@ -129,7 +129,7 @@ public class SessionAwareMethodInterceptorUnitTests {
collection.getReadConcern();
assertThat(cache.contains(readConcernMethod, MongoCollection.class)).isTrue();
assertThat(cache.lookup(readConcernMethod, MongoCollection.class)).isEmpty();
assertThat(cache.lookup(readConcernMethod, MongoCollection.class, ClientSession.class)).isEmpty();
}
@Test // DATAMONGO-1880
@@ -160,23 +160,23 @@ public class SessionAwareMethodInterceptorUnitTests {
verify(otherCollection).drop(eq(session));
}
private MongoDatabase proxyDatabase(ClientSession session, MongoDatabase database) {
private MongoDatabase proxyDatabase(com.mongodb.session.ClientSession session, MongoDatabase database) {
return createProxyInstance(session, database, MongoDatabase.class);
}
private MongoCollection proxyCollection(ClientSession session, MongoCollection collection) {
private MongoCollection proxyCollection(com.mongodb.session.ClientSession session, MongoCollection collection) {
return createProxyInstance(session, collection, MongoCollection.class);
}
private <T> T createProxyInstance(ClientSession session, T target, Class<T> targetType) {
private <T> T createProxyInstance(com.mongodb.session.ClientSession session, T target, Class<T> targetType) {
ProxyFactory factory = new ProxyFactory();
factory.setTarget(target);
factory.setInterfaces(targetType);
factory.setOpaque(true);
factory.addAdvice(new SessionAwareMethodInterceptor<>(session, target, MongoDatabase.class, this::proxyDatabase,
MongoCollection.class, this::proxyCollection));
factory.addAdvice(new SessionAwareMethodInterceptor<>(session, target, ClientSession.class, MongoDatabase.class,
this::proxyDatabase, MongoCollection.class, this::proxyCollection));
return targetType.cast(factory.getProxy());
}


@@ -30,9 +30,9 @@ import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoException;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
/**
@@ -43,7 +43,7 @@ import com.mongodb.client.MongoCollection;
public abstract class AbstractIntegrationTests {
@Configuration
static class TestConfig extends AbstractMongoConfiguration {
static class TestConfig extends AbstractMongoClientConfiguration {
@Override
protected String getDatabaseName() {
@@ -52,7 +52,7 @@ public abstract class AbstractIntegrationTests {
@Override
public MongoClient mongoClient() {
return new MongoClient();
return MongoClients.create();
}
}


@@ -40,7 +40,8 @@ import org.springframework.data.mongodb.core.convert.MongoTypeMapper;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.data.spel.EvaluationContextProvider;
import org.springframework.data.spel.ExtensionAwareEvaluationContextProvider;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.MongoClient;
@@ -106,9 +107,10 @@ public class AbstractMongoConfigurationUnitTests {
AbstractApplicationContext context = new AnnotationConfigApplicationContext(SampleMongoConfiguration.class);
MongoMappingContext mappingContext = context.getBean(MongoMappingContext.class);
BasicMongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(Entity.class);
StandardEvaluationContext spElContext = (StandardEvaluationContext) ReflectionTestUtils.getField(entity, "context");
EvaluationContextProvider provider = (EvaluationContextProvider) ReflectionTestUtils.getField(entity,
"evaluationContextProvider");
assertThat(spElContext.getBeanResolver(), is(notNullValue()));
assertThat(provider, is(instanceOf(ExtensionAwareEvaluationContextProvider.class)));
context.close();
}


@@ -18,6 +18,9 @@ package org.springframework.data.mongodb.config;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import example.first.First;
import example.second.Second;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
@@ -37,16 +40,14 @@ import org.springframework.data.mongodb.core.convert.MongoTypeMapper;
import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.data.spel.EvaluationContextProvider;
import org.springframework.data.spel.ExtensionAwareEvaluationContextProvider;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.Mongo;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;
import example.first.First;
import example.second.Second;
/**
* Unit tests for {@link AbstractReactiveMongoConfiguration}.
*
@@ -106,9 +107,10 @@ public class AbstractReactiveMongoConfigurationUnitTests {
AbstractApplicationContext context = new AnnotationConfigApplicationContext(SampleMongoConfiguration.class);
MongoMappingContext mappingContext = context.getBean(MongoMappingContext.class);
BasicMongoPersistentEntity<?> entity = mappingContext.getRequiredPersistentEntity(Entity.class);
StandardEvaluationContext spElContext = (StandardEvaluationContext) ReflectionTestUtils.getField(entity, "context");
EvaluationContextProvider provider = (EvaluationContextProvider) ReflectionTestUtils.getField(entity,
"evaluationContextProvider");
assertThat(spElContext.getBeanResolver(), is(notNullValue()));
assertThat(provider, is(instanceOf(ExtensionAwareEvaluationContextProvider.class)));
context.close();
}


@@ -16,20 +16,30 @@
package org.springframework.data.mongodb.core;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import com.mongodb.MongoClient;
import lombok.AllArgsConstructor;
import lombok.Data;
import org.bson.Document;
import org.junit.After;
import org.junit.Before;
import org.junit.ClassRule;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TestRule;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.test.util.MongoTestUtils;
import org.springframework.data.mongodb.test.util.MongoVersion;
import org.springframework.data.mongodb.test.util.MongoVersionRule;
import org.springframework.data.mongodb.test.util.ReplicaSet;
import org.springframework.data.util.Version;
import com.mongodb.ClientSessionOptions;
import com.mongodb.MongoClient;
import com.mongodb.session.ClientSession;
import com.mongodb.client.ClientSession;
/**
* @author Christoph Strobl
@@ -37,8 +47,11 @@ import com.mongodb.session.ClientSession;
*/
public class ClientSessionTests {
public static @ClassRule MongoVersionRule REQUIRES_AT_LEAST_3_6_0 = MongoVersionRule.atLeast(Version.parse("3.6.0"));
public static @ClassRule TestRule replSet = ReplicaSet.required();
public @Rule MongoVersionRule REQUIRES_AT_LEAST_3_6_0 = MongoVersionRule.atLeast(Version.parse("3.6.0"));
private static final String DB_NAME = "client-session-tests";
private static final String COLLECTION_NAME = "test";
MongoTemplate template;
MongoClient client;
@@ -46,11 +59,17 @@ public class ClientSessionTests {
@Before
public void setUp() {
client = new MongoClient();
template = new MongoTemplate(client, "reflective-client-session-tests");
template.getDb().getCollection("test").drop();
client = MongoTestUtils.replSetClient();
template.getDb().getCollection("test").insertOne(new Document("_id", "id-1").append("value", "spring"));
MongoTestUtils.createOrReplaceCollection(DB_NAME, COLLECTION_NAME, client);
template = new MongoTemplate(client, DB_NAME);
template.getDb().getCollection(COLLECTION_NAME).insertOne(new Document("_id", "id-1").append("value", "spring"));
}
@After
public void tearDown() {
client.close();
}
@Test // DATAMONGO-1880
@@ -69,4 +88,66 @@ public class ClientSessionTests {
session.close();
}
@Test // DATAMONGO-1920
@MongoVersion(asOf = "3.7.3")
public void withCommittedTransaction() {
ClientSession session = client.startSession(ClientSessionOptions.builder().causallyConsistent(true).build());
assertThat(session.getOperationTime()).isNull();
session.startTransaction();
SomeDoc saved = template.withSession(() -> session).execute(action -> {
SomeDoc doc = new SomeDoc("id-2", "value2");
action.insert(doc);
return doc;
});
session.commitTransaction();
session.close();
assertThat(saved).isNotNull();
assertThat(session.getOperationTime()).isNotNull();
assertThat(template.exists(query(where("id").is(saved.getId())), SomeDoc.class)).isTrue();
}
@Test // DATAMONGO-1920
@MongoVersion(asOf = "3.7.3")
public void withAbortedTransaction() {
ClientSession session = client.startSession(ClientSessionOptions.builder().causallyConsistent(true).build());
assertThat(session.getOperationTime()).isNull();
session.startTransaction();
SomeDoc saved = template.withSession(() -> session).execute(action -> {
SomeDoc doc = new SomeDoc("id-2", "value2");
action.insert(doc);
return doc;
});
session.abortTransaction();
session.close();
assertThat(saved).isNotNull();
assertThat(session.getOperationTime()).isNotNull();
assertThat(template.exists(query(where("id").is(saved.getId())), SomeDoc.class)).isFalse();
}
@Data
@AllArgsConstructor
@org.springframework.data.mongodb.core.mapping.Document(COLLECTION_NAME)
static class SomeDoc {
@Id String id;
String value;
}
}


@@ -0,0 +1,141 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.ArgumentMatchers.isNull;
import static org.mockito.Mockito.*;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
/**
* Unit tests for {@link ExecutableMapReduceOperationSupport}.
*
* @author Christoph Strobl
* @currentRead Beyond the Shadows - Brent Weeks
*/
@RunWith(MockitoJUnitRunner.class)
public class ExecutableMapReduceOperationSupportUnitTests {
private static final String STAR_WARS = "star-wars";
private static final String MAP_FUNCTION = "function() { emit(this.id, this.firstname) }";
private static final String REDUCE_FUNCTION = "function(id, name) { return sum(id, name); }";
@Mock MongoTemplate template;
ExecutableMapReduceOperationSupport mapReduceOpsSupport;
@Before
public void setUp() {
when(template.determineCollectionName(eq(Person.class))).thenReturn(STAR_WARS);
mapReduceOpsSupport = new ExecutableMapReduceOperationSupport(template);
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-1929
public void throwsExceptionOnNullTemplate() {
new ExecutableMapReduceOperationSupport(null);
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-1929
public void throwsExceptionOnNullDomainType() {
mapReduceOpsSupport.mapReduce(null);
}
@Test // DATAMONGO-1929
public void usesExtractedCollectionName() {
mapReduceOpsSupport.mapReduce(Person.class).map(MAP_FUNCTION).reduce(REDUCE_FUNCTION).all();
verify(template).mapReduce(any(Query.class), eq(Person.class), eq(STAR_WARS), eq(MAP_FUNCTION), eq(REDUCE_FUNCTION),
isNull(), eq(Person.class));
}
@Test // DATAMONGO-1929
public void usesExplicitCollectionName() {
mapReduceOpsSupport.mapReduce(Person.class).map(MAP_FUNCTION).reduce(REDUCE_FUNCTION)
.inCollection("the-night-angel").all();
verify(template).mapReduce(any(Query.class), eq(Person.class), eq("the-night-angel"), eq(MAP_FUNCTION),
eq(REDUCE_FUNCTION), isNull(), eq(Person.class));
}
@Test // DATAMONGO-1929
public void usesMapReduceOptionsWhenPresent() {
MapReduceOptions options = MapReduceOptions.options();
mapReduceOpsSupport.mapReduce(Person.class).map(MAP_FUNCTION).reduce(REDUCE_FUNCTION).with(options).all();
verify(template).mapReduce(any(Query.class), eq(Person.class), eq(STAR_WARS), eq(MAP_FUNCTION), eq(REDUCE_FUNCTION),
eq(options), eq(Person.class));
}
@Test // DATAMONGO-1929
public void usesQueryWhenPresent() {
Query query = new BasicQuery("{ 'lastname' : 'skywalker' }");
mapReduceOpsSupport.mapReduce(Person.class).map(MAP_FUNCTION).reduce(REDUCE_FUNCTION).matching(query).all();
verify(template).mapReduce(eq(query), eq(Person.class), eq(STAR_WARS), eq(MAP_FUNCTION), eq(REDUCE_FUNCTION),
isNull(), eq(Person.class));
}
@Test // DATAMONGO-1929
public void usesProjectionWhenPresent() {
mapReduceOpsSupport.mapReduce(Person.class).map(MAP_FUNCTION).reduce(REDUCE_FUNCTION).as(Jedi.class).all();
verify(template).mapReduce(any(Query.class), eq(Person.class), eq(STAR_WARS), eq(MAP_FUNCTION), eq(REDUCE_FUNCTION),
isNull(), eq(Jedi.class));
}
interface Contact {}
@Data
@org.springframework.data.mongodb.core.mapping.Document(collection = STAR_WARS)
static class Person implements Contact {
@Id String id;
String firstname;
String lastname;
Object ability;
Person father;
}
@Data
@AllArgsConstructor
@NoArgsConstructor
static class Jedi {
@Field("firstname") String name;
}
}


@@ -23,7 +23,9 @@ import static org.springframework.data.mongodb.core.schema.JsonSchemaProperty.*;
import lombok.Data;
import reactor.test.StepVerifier;
import org.junit.AfterClass;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.ClassRule;
import org.junit.Test;
import org.springframework.data.annotation.Id;
@@ -32,11 +34,13 @@ import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.test.util.MongoVersionRule;
import org.springframework.data.util.Version;
import com.mongodb.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
/**
* @author Christoph Strobl
* @author Mark Paluch
*/
public class JsonSchemaQueryTests {
@@ -44,13 +48,19 @@ public class JsonSchemaQueryTests {
public static @ClassRule MongoVersionRule REQUIRES_AT_LEAST_3_6_0 = MongoVersionRule.atLeast(Version.parse("3.6.0"));
static MongoClient client = MongoClients.create();
MongoTemplate template;
Person jellyBelly, roseSpringHeart, kazmardBoombub;
@BeforeClass
public static void beforeClass() {
client = MongoClients.create();
}
@Before
public void setUp() {
template = new MongoTemplate(new MongoClient(), DATABASE_NAME);
template = new MongoTemplate(client, DATABASE_NAME);
jellyBelly = new Person();
jellyBelly.id = "1";
@@ -80,6 +90,13 @@ public class JsonSchemaQueryTests {
template.save(roseSpringHeart);
template.save(kazmardBoombub);
}
@AfterClass
public static void afterClass() {
if (client != null) {
client.close();
}
}
@Test // DATAMONGO-1835
public void findsDocumentsWithRequiredFieldsCorrectly() {
@@ -95,8 +112,12 @@ public class JsonSchemaQueryTests {
MongoJsonSchema schema = MongoJsonSchema.builder().required("address").build();
StepVerifier.create(new ReactiveMongoTemplate(MongoClients.create(), DATABASE_NAME)
com.mongodb.reactivestreams.client.MongoClient mongoClient = com.mongodb.reactivestreams.client.MongoClients.create();
StepVerifier.create(new ReactiveMongoTemplate(mongoClient, DATABASE_NAME)
.find(query(matchingDocumentStructure(schema)), Person.class)).expectNextCount(2).verifyComplete();
mongoClient.close();
}
@Test // DATAMONGO-1835
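The tests above query with a `MongoJsonSchema` built via `required("address")` and match documents against it. As a plain-Java sketch (no MongoDB driver involved; class and method names here are hypothetical), the `required` part of `$jsonSchema` matching reduces to a presence check on the document's fields:

```java
import java.util.List;
import java.util.Map;

// Minimal sketch of MongoDB's $jsonSchema "required" semantics:
// a document matches when every required field is present.
public class RequiredFieldsMatcher {

    static boolean matches(Map<String, Object> document, List<String> requiredFields) {
        return document.keySet().containsAll(requiredFields);
    }

    public static void main(String[] args) {
        Map<String, Object> withAddress = Map.of("_id", "1", "address", Map.of("city", "Linz"));
        Map<String, Object> withoutAddress = Map.of("_id", "2");

        System.out.println(matches(withAddress, List.of("address")));    // true
        System.out.println(matches(withoutAddress, List.of("address"))); // false
    }
}
```

This mirrors why `expectNextCount(2)` holds in the reactive test: only the persons carrying an `address` field satisfy the schema.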


@@ -0,0 +1,182 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import static org.springframework.data.mongodb.test.util.MongoTestUtils.*;
import lombok.AllArgsConstructor;
import lombok.Data;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import org.bson.Document;
import org.junit.Before;
import org.junit.ClassRule;
import org.junit.Test;
import org.junit.rules.RuleChain;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Persistable;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.MongoTransactionManager;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.test.util.AfterTransactionAssertion;
import org.springframework.data.mongodb.test.util.MongoTestUtils;
import org.springframework.data.mongodb.test.util.MongoVersionRule;
import org.springframework.data.mongodb.test.util.ReplicaSet;
import org.springframework.data.util.Version;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.context.transaction.AfterTransaction;
import org.springframework.test.context.transaction.BeforeTransaction;
import org.springframework.transaction.annotation.Transactional;
import com.mongodb.MongoClient;
import com.mongodb.ReadPreference;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
/**
* @author Christoph Strobl
* @currentRead Shadow's Edge - Brent Weeks
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
@Transactional(transactionManager = "txManager")
public class MongoTemplateTransactionTests {
public static @ClassRule RuleChain TEST_RULES = RuleChain.outerRule(MongoVersionRule.atLeast(Version.parse("3.7.3")))
.around(ReplicaSet.required());
static final String DB_NAME = "template-tx-tests";
static final String COLLECTION_NAME = "assassins";
@Configuration
static class Config extends AbstractMongoConfiguration {
@Bean
public MongoClient mongoClient() {
return MongoTestUtils.replSetClient();
}
@Override
protected String getDatabaseName() {
return DB_NAME;
}
@Bean
MongoTransactionManager txManager(MongoDbFactory dbFactory) {
return new MongoTransactionManager(dbFactory);
}
}
@Autowired MongoTemplate template;
@Autowired MongoClient client;
List<AfterTransactionAssertion<? extends Persistable<?>>> assertionList;
@Before
public void setUp() {
template.setReadPreference(ReadPreference.primary());
assertionList = new CopyOnWriteArrayList<>();
}
@BeforeTransaction
public void beforeTransaction() {
createOrReplaceCollection(DB_NAME, COLLECTION_NAME, client);
}
@AfterTransaction
public void verifyDbState() {
MongoCollection<Document> collection = client.getDatabase(DB_NAME).withReadPreference(ReadPreference.primary())
.getCollection(COLLECTION_NAME);
assertionList.forEach(it -> {
boolean isPresent = collection.count(Filters.eq("_id", it.getId())) != 0;
assertThat(isPresent).isEqualTo(it.shouldBePresent())
.withFailMessage(String.format("After transaction entity %s should %s.", it.getPersistable(),
it.shouldBePresent() ? "be present" : "NOT be present"));
});
}
@Rollback(false)
@Test // DATAMONGO-1920
public void shouldOperateCommitCorrectly() {
Assassin hu = new Assassin("hu", "Hu Gibbet");
template.save(hu);
assertAfterTransaction(hu).isPresent();
}
@Test // DATAMONGO-1920
public void shouldOperateRollbackCorrectly() {
Assassin vi = new Assassin("vi", "Viridiana Sovari");
template.save(vi);
assertAfterTransaction(vi).isNotPresent();
}
@Test // DATAMONGO-1920
public void shouldBeAbleToViewChangesDuringTransaction() throws InterruptedException {
Assassin durzo = new Assassin("durzo", "Durzo Blint");
template.save(durzo);
Thread.sleep(100);
Assassin retrieved = template.findOne(query(where("id").is(durzo.getId())), Assassin.class);
assertThat(retrieved).isEqualTo(durzo);
assertAfterTransaction(durzo).isNotPresent();
}
// --- Just some helpers and test entities
private AfterTransactionAssertion assertAfterTransaction(Assassin assassin) {
AfterTransactionAssertion<Assassin> assertion = new AfterTransactionAssertion<>(assassin);
assertionList.add(assertion);
return assertion;
}
@Data
@AllArgsConstructor
@org.springframework.data.mongodb.core.mapping.Document(COLLECTION_NAME)
static class Assassin implements Persistable<String> {
@Id String id;
String name;
@Override
public boolean isNew() {
return id == null;
}
}
}
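The `assertAfterTransaction(..)` helper above records an expectation per entity during the transaction and checks them all in the `@AfterTransaction` callback. A stand-alone plain-Java sketch of that pattern (a `Set` of ids stands in for the MongoDB collection; names are hypothetical, not the actual `AfterTransactionAssertion` API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Sketch of the after-transaction assertion pattern: collect expectations
// while the transaction runs, verify them once it has committed or rolled back.
public class AfterTxAssertions {

    record Expectation(String id, boolean shouldBePresent) {}

    private final List<Expectation> expectations = new ArrayList<>();

    void expectPresent(String id) { expectations.add(new Expectation(id, true)); }
    void expectAbsent(String id) { expectations.add(new Expectation(id, false)); }

    // Returns the ids whose post-transaction state does not match the expectation.
    List<String> verify(Set<String> idsInCollection) {
        List<String> failures = new ArrayList<>();
        for (Expectation e : expectations) {
            if (idsInCollection.contains(e.id()) != e.shouldBePresent()) {
                failures.add(e.id());
            }
        }
        return failures;
    }

    public static void main(String[] args) {
        AfterTxAssertions assertions = new AfterTxAssertions();
        assertions.expectPresent("hu"); // saved in a committed transaction
        assertions.expectAbsent("vi");  // saved in a rolled-back transaction
        System.out.println(assertions.verify(Set.of("hu"))); // []
    }
}
```

Deferring the check to `@AfterTransaction` is what lets `shouldOperateRollbackCorrectly()` assert that `vi` is gone: inside the transaction the save succeeds, and only after the rollback is the collection state observable.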


@@ -22,6 +22,7 @@ import static org.mockito.Mockito.any;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import com.mongodb.client.model.ReplaceOptions;
import lombok.Data;
import java.math.BigInteger;
@@ -160,9 +161,14 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
new MongoTemplate(mongo, null);
}
@Test(expected = IllegalArgumentException.class)
public void rejectsNullMongo() throws Exception {
new MongoTemplate(null, "database");
@Test(expected = IllegalArgumentException.class) // DATAMONGO-1968
public void rejectsNullMongo() {
new MongoTemplate((MongoClient) null, "database");
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-1968
public void rejectsNullMongoClient() {
new MongoTemplate((com.mongodb.client.MongoClient) null, "database");
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-1870
@@ -716,7 +722,7 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
@Test // DATAMONGO-1518
public void findAndRemoveManyShouldUseCollationWhenPresent() {
template.doRemove("collection-1", new BasicQuery("{}").collation(Collation.of("fr")), AutogenerateableId.class);
template.doRemove("collection-1", new BasicQuery("{}").collation(Collation.of("fr")), AutogenerateableId.class, true);
ArgumentCaptor<DeleteOptions> options = ArgumentCaptor.forClass(DeleteOptions.class);
verify(collection).deleteMany(any(), options.capture());
@@ -754,7 +760,7 @@ public class MongoTemplateUnitTests extends MongoOperationsUnitTests {
template.updateFirst(new BasicQuery("{}").collation(Collation.of("fr")), new Update(), AutogenerateableId.class);
ArgumentCaptor<UpdateOptions> options = ArgumentCaptor.forClass(UpdateOptions.class);
ArgumentCaptor<ReplaceOptions> options = ArgumentCaptor.forClass(ReplaceOptions.class);
verify(collection).replaceOne(any(), any(), options.capture());
assertThat(options.getValue().getCollation().getLocale(), is("fr"));


@@ -29,38 +29,48 @@ import org.junit.ClassRule;
import org.junit.Test;
import org.junit.rules.TestRule;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.test.util.MongoTestUtils;
import org.springframework.data.mongodb.test.util.MongoVersionRule;
import org.springframework.data.mongodb.test.util.ReplicaSet;
import org.springframework.data.util.Version;
import com.mongodb.ClientSessionOptions;
import com.mongodb.reactivestreams.client.ClientSession;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoClients;
import com.mongodb.session.ClientSession;
import com.mongodb.reactivestreams.client.Success;
/**
* @author Christoph Strobl
* @author Mark Paluch
* @currentRead Beyond the Shadows - Brent Weeks
*/
public class ReactiveClientSessionTests {
public static @ClassRule MongoVersionRule REQUIRES_AT_LEAST_3_6_0 = MongoVersionRule.atLeast(Version.parse("3.6.0"));
public static @ClassRule TestRule replSet = ReplicaSet.required();
static final String DATABASE_NAME = "reflective-client-session-tests";
static final String COLLECTION_NAME = "test";
MongoClient client;
ReactiveMongoTemplate template;
@Before
public void setUp() {
client = MongoClients.create();
client = MongoTestUtils.reactiveReplSetClient();
template = new ReactiveMongoTemplate(client, "reflective-client-session-tests");
template = new ReactiveMongoTemplate(client, DATABASE_NAME);
StepVerifier.create(template.dropCollection("test")).verifyComplete();
MongoTestUtils.createOrReplaceCollection(DATABASE_NAME, COLLECTION_NAME, client) //
.as(StepVerifier::create) //
.expectNext(Success.SUCCESS) //
.verifyComplete();
StepVerifier.create(template.insert(new Document("_id", "id-1").append("value", "spring"), "test"))
.expectNextCount(1).verifyComplete();
template.insert(new Document("_id", "id-1").append("value", "spring"), COLLECTION_NAME) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
}
@Test // DATAMONGO-1880
@@ -71,7 +81,9 @@ public class ReactiveClientSessionTests {
assertThat(session.getOperationTime()).isNull();
StepVerifier.create(template.withSession(() -> session).execute(action -> action.findAll(Document.class, "test")))
template.withSession(() -> session) //
.execute(action -> action.findAll(Document.class, COLLECTION_NAME)) //
.as(StepVerifier::create) //
.expectNextCount(1).verifyComplete();
assertThat(session.getOperationTime()).isNotNull();
@@ -88,10 +100,11 @@ public class ReactiveClientSessionTests {
assertThat(session.getOperationTime()).isNull();
StepVerifier
.create(
template.withSession(() -> session).execute(action -> action.findOne(new Query(), Document.class, "test")))
.expectNextCount(1).verifyComplete();
template.withSession(() -> session)
.execute(action -> action.findOne(new Query(), Document.class, COLLECTION_NAME)) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
assertThat(session.getOperationTime()).isNotNull();
assertThat(session.getServerSession().isClosed()).isFalse();
@@ -108,13 +121,23 @@ public class ReactiveClientSessionTests {
ReactiveSessionScoped sessionScoped = template.withSession(sessionSupplier);
sessionScoped.execute(action -> action.findOne(new Query(), Document.class, "test")).blockFirst();
sessionScoped.execute(action -> action.findOne(new Query(), Document.class, COLLECTION_NAME)).blockFirst();
assertThat(sessionSupplier.getInvocationCount()).isEqualTo(1);
sessionScoped.execute(action -> action.findOne(new Query(), Document.class, "test")).blockFirst();
sessionScoped.execute(action -> action.findOne(new Query(), Document.class, COLLECTION_NAME)).blockFirst();
assertThat(sessionSupplier.getInvocationCount()).isEqualTo(1);
}
@Test // DATAMONGO-1970
public void addsClientSessionToContext() {
template.withSession(client.startSession(ClientSessionOptions.builder().causallyConsistent(true).build()))
.execute(action -> ReactiveMongoContext.getSession()) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
}
static class CountingSessionSupplier implements Supplier<ClientSession> {
AtomicInteger invocationCount = new AtomicInteger(0);
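The `CountingSessionSupplier` whose declaration the excerpt ends on backs the assertion that a session-scoped template calls its session supplier exactly once and reuses the session for subsequent `execute(..)` calls. A generic stdlib-only sketch of that counting wrapper (not the actual test class, just the idea):

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

// Wraps a Supplier so tests can assert how often it was actually invoked.
public class CountingSupplier<T> implements Supplier<T> {

    private final Supplier<T> delegate;
    private final AtomicInteger invocationCount = new AtomicInteger();

    public CountingSupplier(Supplier<T> delegate) {
        this.delegate = delegate;
    }

    @Override
    public T get() {
        invocationCount.incrementAndGet();
        return delegate.get();
    }

    public int getInvocationCount() {
        return invocationCount.get();
    }

    public static void main(String[] args) {
        CountingSupplier<String> sessions = new CountingSupplier<>(() -> "session-1");
        sessions.get();
        System.out.println(sessions.getInvocationCount()); // 1
    }
}
```

In the test above, two `sessionScoped.execute(..)` calls still leave the count at 1, which is the observable proof that the session is cached rather than re-acquired.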


@@ -0,0 +1,141 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.ArgumentMatchers.isNull;
import static org.mockito.Mockito.*;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
/**
* Unit tests for {@link ReactiveMapReduceOperationSupport}.
*
* @author Christoph Strobl
* @currentRead Beyond the Shadows - Brent Weeks
*/
@RunWith(MockitoJUnitRunner.class)
public class ReactiveMapReduceOperationSupportUnitTests {
private static final String STAR_WARS = "star-wars";
private static final String MAP_FUNCTION = "function() { emit(this.id, this.firstname) }";
private static final String REDUCE_FUNCTION = "function(id, name) { return sum(id, name); }";
@Mock ReactiveMongoTemplate template;
ReactiveMapReduceOperationSupport mapReduceOpsSupport;
@Before
public void setUp() {
when(template.determineCollectionName(eq(Person.class))).thenReturn(STAR_WARS);
mapReduceOpsSupport = new ReactiveMapReduceOperationSupport(template);
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-1929
public void throwsExceptionOnNullTemplate() {
new ExecutableMapReduceOperationSupport(null);
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-1929
public void throwsExceptionOnNullDomainType() {
mapReduceOpsSupport.mapReduce(null);
}
@Test // DATAMONGO-1929
public void usesExtractedCollectionName() {
mapReduceOpsSupport.mapReduce(Person.class).map(MAP_FUNCTION).reduce(REDUCE_FUNCTION).all();
verify(template).mapReduce(any(Query.class), eq(Person.class), eq(STAR_WARS), eq(Person.class), eq(MAP_FUNCTION),
eq(REDUCE_FUNCTION), isNull());
}
@Test // DATAMONGO-1929
public void usesExplicitCollectionName() {
mapReduceOpsSupport.mapReduce(Person.class).map(MAP_FUNCTION).reduce(REDUCE_FUNCTION)
.inCollection("the-night-angel").all();
verify(template).mapReduce(any(Query.class), eq(Person.class), eq("the-night-angel"), eq(Person.class),
eq(MAP_FUNCTION), eq(REDUCE_FUNCTION), isNull());
}
@Test // DATAMONGO-1929
public void usesMapReduceOptionsWhenPresent() {
MapReduceOptions options = MapReduceOptions.options();
mapReduceOpsSupport.mapReduce(Person.class).map(MAP_FUNCTION).reduce(REDUCE_FUNCTION).with(options).all();
verify(template).mapReduce(any(Query.class), eq(Person.class), eq(STAR_WARS), eq(Person.class), eq(MAP_FUNCTION),
eq(REDUCE_FUNCTION), eq(options));
}
@Test // DATAMONGO-1929
public void usesQueryWhenPresent() {
Query query = new BasicQuery("{ 'lastname' : 'skywalker' }");
mapReduceOpsSupport.mapReduce(Person.class).map(MAP_FUNCTION).reduce(REDUCE_FUNCTION).matching(query).all();
verify(template).mapReduce(eq(query), eq(Person.class), eq(STAR_WARS), eq(Person.class), eq(MAP_FUNCTION),
eq(REDUCE_FUNCTION), isNull());
}
@Test // DATAMONGO-1929
public void usesProjectionWhenPresent() {
mapReduceOpsSupport.mapReduce(Person.class).map(MAP_FUNCTION).reduce(REDUCE_FUNCTION).as(Jedi.class).all();
verify(template).mapReduce(any(Query.class), eq(Person.class), eq(STAR_WARS), eq(Jedi.class), eq(MAP_FUNCTION),
eq(REDUCE_FUNCTION), isNull());
}
interface Contact {}
@Data
@org.springframework.data.mongodb.core.mapping.Document(collection = STAR_WARS)
static class Person implements Contact {
@Id String id;
String firstname;
String lastname;
Object ability;
Person father;
}
@Data
@AllArgsConstructor
@NoArgsConstructor
static class Jedi {
@Field("firstname") String name;
}
}
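The unit tests above mock the template, so the JavaScript `MAP_FUNCTION`/`REDUCE_FUNCTION` strings are never executed; they only get passed through and verified. For readers unfamiliar with the model those strings express, here is a plain-Java sketch of map/reduce semantics (generic helper, hypothetical names): emit one key/value pair per document, then fold the values per key.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.BinaryOperator;
import java.util.function.Function;

// Plain-Java sketch of map-reduce: emit (key, value) per document,
// then reduce all values that share a key into one result.
public class MapReduceSketch {

    static <D, K, V> Map<K, V> mapReduce(List<D> docs, Function<D, K> keyFn,
            Function<D, V> valueFn, BinaryOperator<V> reduceFn) {
        Map<K, V> result = new LinkedHashMap<>();
        for (D doc : docs) {
            // merge() applies reduceFn only when the key was emitted before
            result.merge(keyFn.apply(doc), valueFn.apply(doc), reduceFn);
        }
        return result;
    }

    public static void main(String[] args) {
        record Person(String lastname, String firstname) {}
        List<Person> people = List.of(new Person("skywalker", "luke"),
                new Person("skywalker", "anakin"), new Person("solo", "han"));

        // emit(this.lastname, 1) reduced by summation -> counts per lastname
        Map<String, Integer> counts = mapReduce(people, Person::lastname, p -> 1, Integer::sum);
        System.out.println(counts); // {skywalker=2, solo=1}
    }
}
```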


@@ -15,14 +15,16 @@
*/
package org.springframework.data.mongodb.core;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
import static org.assertj.core.data.Index.atIndex;
import static org.springframework.data.mongodb.test.util.Assertions.*;
import lombok.Data;
import reactor.core.publisher.Flux;
import reactor.test.StepVerifier;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import org.bson.Document;
import org.junit.After;
@@ -32,19 +34,22 @@ import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Sort.Direction;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.index.IndexField;
import org.springframework.data.mongodb.core.index.IndexInfo;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.reactivestreams.client.ListIndexesPublisher;
import com.mongodb.reactivestreams.client.MongoCollection;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.Success;
/**
* Integration test for {@link MongoTemplate}.
* Integration test for index creation via {@link ReactiveMongoTemplate}.
*
* @author Mark Paluch
* @author Christoph Strobl
@@ -57,10 +62,14 @@ public class ReactiveMongoTemplateIndexTests {
@Autowired SimpleReactiveMongoDatabaseFactory factory;
@Autowired ReactiveMongoTemplate template;
@Autowired MongoClient client;
@Before
public void setUp() {
StepVerifier.create(template.dropCollection(Person.class)).verifyComplete();
StepVerifier.create(template.getCollection("person").drop()).expectNext(Success.SUCCESS).verifyComplete();
StepVerifier.create(template.getCollection("indexfail").drop()).expectNext(Success.SUCCESS).verifyComplete();
StepVerifier.create(template.getCollection("indexedSample").drop()).expectNext(Success.SUCCESS).verifyComplete();
}
@After
@@ -76,28 +85,29 @@ public class ReactiveMongoTemplateIndexTests {
p2.setAge(40);
template.insert(p2);
StepVerifier
.create(template.indexOps(Person.class) //
.ensureIndex(new Index().on("age", Direction.DESC).unique())) //
template.indexOps(Person.class) //
.ensureIndex(new Index().on("age", Direction.DESC).unique()) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
MongoCollection<Document> coll = template.getCollection(template.getCollectionName(Person.class));
StepVerifier.create(Flux.from(coll.listIndexes()).collectList()).consumeNextWith(indexInfo -> {
Flux.from(template.getCollection(template.getCollectionName(Person.class)).listIndexes()).collectList() //
.as(StepVerifier::create) //
.consumeNextWith(indexInfo -> {
assertThat(indexInfo.size(), is(2));
Object indexKey = null;
boolean unique = false;
for (Document ix : indexInfo) {
assertThat(indexInfo).hasSize(2);
Object indexKey = null;
boolean unique = false;
for (Document ix : indexInfo) {
if ("age_-1".equals(ix.get("name"))) {
indexKey = ix.get("key");
unique = (Boolean) ix.get("unique");
}
}
assertThat(((Document) indexKey), hasEntry("age", -1));
assertThat(unique, is(true));
}).verifyComplete();
if ("age_-1".equals(ix.get("name"))) {
indexKey = ix.get("key");
unique = (Boolean) ix.get("unique");
}
}
assertThat((Document) indexKey).containsEntry("age", -1);
assertThat(unique).isTrue();
}).verifyComplete();
}
@Test // DATAMONGO-1444
@@ -105,27 +115,29 @@ public class ReactiveMongoTemplateIndexTests {
Person p1 = new Person("Oliver");
p1.setAge(25);
StepVerifier.create(template.insert(p1)).expectNextCount(1).verifyComplete();
StepVerifier
.create(template.indexOps(Person.class) //
.ensureIndex(new Index().on("age", Direction.DESC).unique())) //
template.insert(p1) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
StepVerifier.create(template.indexOps(Person.class).getIndexInfo().collectList()).consumeNextWith(indexInfos -> {
template.indexOps(Person.class) //
.ensureIndex(new Index().on("age", Direction.DESC).unique()) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
assertThat(indexInfos.size(), is(2));
template.indexOps(Person.class).getIndexInfo().collectList() //
.as(StepVerifier::create) //
.consumeNextWith(indexInfos -> {
IndexInfo ii = indexInfos.get(1);
assertThat(ii.isUnique(), is(true));
assertThat(ii.isSparse(), is(false));
assertThat(indexInfos).hasSize(2);
List<IndexField> indexFields = ii.getIndexFields();
IndexField field = indexFields.get(0);
IndexInfo ii = indexInfos.get(1);
assertThat(ii.isUnique()).isTrue();
assertThat(ii.isSparse()).isFalse();
assertThat(field, is(IndexField.create("age", Direction.DESC)));
}).verifyComplete();
assertThat(ii.getIndexFields()).contains(IndexField.create("age", Direction.DESC), atIndex(0));
}).verifyComplete();
}
@Test // DATAMONGO-1444
@@ -133,48 +145,88 @@ public class ReactiveMongoTemplateIndexTests {
String command = "db." + template.getCollectionName(Person.class)
+ ".createIndex({'age':-1}, {'unique':true, 'sparse':true}), 1";
StepVerifier.create(template.indexOps(Person.class).dropAllIndexes()).verifyComplete();
StepVerifier.create(template.indexOps(Person.class).getIndexInfo()).verifyComplete();
template.indexOps(Person.class).dropAllIndexes() //
.as(StepVerifier::create) //
.verifyComplete();
StepVerifier.create(factory.getMongoDatabase().runCommand(new org.bson.Document("eval", command))) //
template.indexOps(Person.class).getIndexInfo() //
.as(StepVerifier::create) //
.verifyComplete();
Flux.from(factory.getMongoDatabase().runCommand(new org.bson.Document("eval", command))) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
ListIndexesPublisher<Document> listIndexesPublisher = template
.getCollection(template.getCollectionName(Person.class)).listIndexes();
StepVerifier.create(Flux.from(listIndexesPublisher).collectList()).consumeNextWith(indexInfos -> {
Flux.from(listIndexesPublisher).collectList() //
.as(StepVerifier::create) //
.consumeNextWith(indexInfos -> {
Document indexKey = null;
boolean unique = false;
Document indexKey = null;
boolean unique = false;
for (Document document : indexInfos) {
for (Document document : indexInfos) {
if ("age_-1".equals(document.get("name"))) {
indexKey = (org.bson.Document) document.get("key");
unique = (Boolean) document.get("unique");
}
}
if ("age_-1".equals(document.get("name"))) {
indexKey = (org.bson.Document) document.get("key");
unique = (Boolean) document.get("unique");
}
}
assertThat(indexKey, hasEntry("age", -1D));
assertThat(unique, is(true));
}).verifyComplete();
assertThat(indexKey).containsEntry("age", -1D);
assertThat(unique).isTrue();
}).verifyComplete();
StepVerifier.create(Flux.from(template.indexOps(Person.class).getIndexInfo().collectList()))
Flux.from(template.indexOps(Person.class).getIndexInfo().collectList()) //
.as(StepVerifier::create) //
.consumeNextWith(indexInfos -> {
IndexInfo info = indexInfos.get(1);
assertThat(info.isUnique(), is(true));
assertThat(info.isSparse(), is(true));
assertThat(info.isUnique()).isTrue();
assertThat(info.isSparse()).isTrue();
List<IndexField> indexFields = info.getIndexFields();
IndexField field = indexFields.get(0);
assertThat(field, is(IndexField.create("age", Direction.DESC)));
assertThat(info.getIndexFields()).contains(IndexField.create("age", Direction.DESC), atIndex(0));
}).verifyComplete();
}
@Test // DATAMONGO-1928
public void shouldCreateIndexOnAccess() {
StepVerifier.create(template.getCollection("indexedSample").listIndexes(Document.class)).expectNextCount(0)
.verifyComplete();
template.findAll(IndexedSample.class) //
.as(StepVerifier::create) //
.verifyComplete();
StepVerifier.create(template.getCollection("indexedSample").listIndexes(Document.class)).expectNextCount(2)
.verifyComplete();
}
@Test // DATAMONGO-1928
public void indexCreationShouldFail() throws InterruptedException {
String command = "db.indexfail" + ".createIndex({'field':1}, {'name':'foo', 'unique':true, 'sparse':true}), 1";
Flux.from(factory.getMongoDatabase().runCommand(new org.bson.Document("eval", command))) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
BlockingQueue<Throwable> queue = new LinkedBlockingQueue<>();
ReactiveMongoTemplate template = new ReactiveMongoTemplate(factory, this.template.getConverter(), queue::add);
template.findAll(IndexCreationShouldFail.class).subscribe();
Throwable failure = queue.poll(10, TimeUnit.SECONDS);
assertThat(failure).isNotNull().isInstanceOf(DataIntegrityViolationException.class);
}
@Data
static class Sample {
@@ -188,4 +240,20 @@ public class ReactiveMongoTemplateIndexTests {
this.field = field;
}
}
@Data
@org.springframework.data.mongodb.core.mapping.Document
static class IndexedSample {
@Id String id;
@Indexed String field;
}
@Data
@org.springframework.data.mongodb.core.mapping.Document("indexfail")
static class IndexCreationShouldFail {
@Id String id;
@Indexed(name = "foo") String field;
}
}
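Several assertions above scan the output of `listIndexes()` for the entry named `age_-1` and then read its `key` document and `unique` flag. As a stdlib-only sketch of that lookup (plain `Map`s stand in for `org.bson.Document`; class name hypothetical): MongoDB names a single-field index `<field>_<direction>`, and each index appears as one document in the `listIndexes()` result.

```java
import java.util.List;
import java.util.Map;
import java.util.Optional;

// Sketch of the index inspection done in the tests: find the index
// document by name, then read its key definition and flags.
public class IndexInfoLookup {

    static Optional<Map<String, Object>> findIndex(List<Map<String, Object>> indexes, String name) {
        return indexes.stream().filter(ix -> name.equals(ix.get("name"))).findFirst();
    }

    public static void main(String[] args) {
        List<Map<String, Object>> indexes = List.of(
                Map.of("name", "_id_", "key", Map.of("_id", 1)),
                Map.of("name", "age_-1", "key", Map.of("age", -1), "unique", true));

        Map<String, Object> ageIndex = findIndex(indexes, "age_-1").orElseThrow();
        System.out.println(ageIndex.get("key"));    // {age=-1}
        System.out.println(ageIndex.get("unique")); // true
    }
}
```

The size-2 assertions in the tests follow the same model: the implicit `_id_` index plus the one created on `age` yield exactly two entries.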


@@ -0,0 +1,273 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import reactor.core.publisher.Mono;
import reactor.test.StepVerifier;
import java.util.Arrays;
import java.util.stream.Collectors;
import org.bson.Document;
import org.junit.Before;
import org.junit.ClassRule;
import org.junit.Test;
import org.junit.rules.TestRule;
import org.reactivestreams.Publisher;
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.test.util.MongoTestUtils;
import org.springframework.data.mongodb.test.util.MongoVersionRule;
import org.springframework.data.mongodb.test.util.ReplicaSet;
import org.springframework.data.util.Version;
import com.mongodb.ClientSessionOptions;
import com.mongodb.reactivestreams.client.ClientSession;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.Success;
/**
* Integration tests for Mongo Transactions using {@link ReactiveMongoTemplate}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @currentRead The Core - Peter V. Brett
*/
public class ReactiveMongoTemplateTransactionTests {
public static @ClassRule MongoVersionRule REQUIRES_AT_LEAST_3_7_5 = MongoVersionRule.atLeast(Version.parse("3.7.5"));
public static @ClassRule TestRule replSet = ReplicaSet.required();
static final String DATABASE_NAME = "reactive-template-tx-tests";
static final String COLLECTION_NAME = "test";
static final Document DOCUMENT = new Document("_id", "id-1").append("value", "spring");
static final Query ID_QUERY = query(where("_id").is("id-1"));
static final Person AHMANN = new Person("ahmann", 32);
static final Person ARLEN = new Person("arlen", 24);
static final Person LEESHA = new Person("leesha", 22);
static final Person RENNA = new Person("renna", 22);
MongoClient client;
ReactiveMongoTemplate template;
@Before
public void setUp() {
client = MongoTestUtils.reactiveReplSetClient();
template = new ReactiveMongoTemplate(client, DATABASE_NAME);
StepVerifier.create(MongoTestUtils.createOrReplaceCollection(DATABASE_NAME, COLLECTION_NAME, client)) //
.expectNext(Success.SUCCESS) //
.verifyComplete();
StepVerifier.create(MongoTestUtils.createOrReplaceCollection(DATABASE_NAME, "person", client))
.expectNext(Success.SUCCESS) //
.verifyComplete();
StepVerifier.create(template.insert(DOCUMENT, COLLECTION_NAME)).expectNextCount(1).verifyComplete();
template.insertAll(Arrays.asList(AHMANN, ARLEN, LEESHA, RENNA)) //
.as(StepVerifier::create) //
.expectNextCount(4) //
.verifyComplete();
}
@Test // DATAMONGO-1970
public void reactiveTransactionWithExplicitTransactionStart() {
Publisher<ClientSession> sessionPublisher = client
.startSession(ClientSessionOptions.builder().causallyConsistent(true).build());
ClientSession clientSession = Mono.from(sessionPublisher).block();
template.withSession(Mono.just(clientSession))
.execute(action -> ReactiveMongoContext.getSession().flatMap(session -> {
session.startTransaction();
return action.remove(ID_QUERY, Document.class, COLLECTION_NAME);
})).as(StepVerifier::create).expectNextCount(1).verifyComplete();
template.exists(ID_QUERY, COLLECTION_NAME) //
.as(StepVerifier::create) //
.expectNext(true) //
.verifyComplete();
assertThat(clientSession.hasActiveTransaction()).isTrue();
StepVerifier.create(clientSession.commitTransaction()).verifyComplete();
template.exists(ID_QUERY, COLLECTION_NAME) //
.as(StepVerifier::create) //
.expectNext(false) //
.verifyComplete();
}
@Test // DATAMONGO-1970
public void reactiveTransactionsCommitOnComplete() {
template.inTransaction().execute(action -> action.remove(ID_QUERY, Document.class, COLLECTION_NAME)) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
template.exists(ID_QUERY, COLLECTION_NAME) //
.as(StepVerifier::create) //
.expectNext(false) //
.verifyComplete();
}
@Test // DATAMONGO-1970
public void reactiveTransactionsAbortOnError() {
template.inTransaction().execute(action -> {
return action.remove(ID_QUERY, Document.class, COLLECTION_NAME).flatMap(result -> Mono.fromSupplier(() -> {
throw new RuntimeException("¯\\_(ツ)_/¯");
}));
}).as(StepVerifier::create) //
.expectError() //
.verify();
template.exists(ID_QUERY, COLLECTION_NAME) //
.as(StepVerifier::create) //
.expectNext(true) //
.verifyComplete();
}
@Test // DATAMONGO-1970
public void withSessionDoesNotManageTransactions() {
Mono.from(client.startSession()).flatMap(session -> {
session.startTransaction();
return template.withSession(session).remove(ID_QUERY, Document.class, COLLECTION_NAME);
}).as(StepVerifier::create).expectNextCount(1).verifyComplete();
template.exists(ID_QUERY, COLLECTION_NAME) //
.as(StepVerifier::create) //
.expectNext(true) //
.verifyComplete();
}
@Test // DATAMONGO-1970
public void inTransactionCommitsProvidedTransactionalSession() {
ClientSession session = Mono.from(client.startSession()).block();
session.startTransaction();
template.inTransaction(Mono.just(session)).execute(action -> {
return action.remove(ID_QUERY, Document.class, COLLECTION_NAME);
}) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
assertThat(session.hasActiveTransaction()).isFalse();
}
@Test // DATAMONGO-1970
public void changesNotVisibleOutsideTransaction() {
template.inTransaction().execute(action -> {
return action.remove(ID_QUERY, Document.class, COLLECTION_NAME).flatMap(val -> {
// once we use the collection directly we're no longer participating in the tx
return Mono.from(template.getCollection(COLLECTION_NAME).find(ID_QUERY.getQueryObject()));
});
}).as(StepVerifier::create).expectNext(DOCUMENT).verifyComplete();
template.exists(ID_QUERY, COLLECTION_NAME) //
.as(StepVerifier::create) //
.expectNext(false) //
.verifyComplete();
}
@Test // DATAMONGO-1970
public void executeCreatesNewTransaction() {
ReactiveSessionScoped sessionScoped = template.inTransaction();
sessionScoped.execute(action -> {
return action.remove(ID_QUERY, Document.class, COLLECTION_NAME);
}) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
template.exists(ID_QUERY, COLLECTION_NAME) //
.as(StepVerifier::create) //
.expectNext(false) //
.verifyComplete();
sessionScoped.execute(action -> {
return action.insert(DOCUMENT, COLLECTION_NAME);
}) //
.as(StepVerifier::create) //
.expectNextCount(1) //
.verifyComplete();
template.exists(ID_QUERY, COLLECTION_NAME) //
.as(StepVerifier::create) //
.expectNext(true) //
.verifyComplete();
}
@Test // DATAMONGO-1970
public void takeDoesNotAbortTransaction() {
template.inTransaction().execute(action -> {
return action.find(query(where("age").exists(true)).with(Sort.by("age")), Person.class).take(3)
.flatMap(action::remove);
}) //
.as(StepVerifier::create) //
.expectNextCount(3) //
.verifyComplete();
template.count(query(where("age").exists(true)), Person.class) //
.as(StepVerifier::create) //
.expectNext(1L) //
.verifyComplete();
}
@Test // DATAMONGO-1970
public void errorInFlowOutsideTransactionDoesNotAbortIt() {
template.inTransaction().execute(action -> {
return action.find(query(where("age").is(22)).with(Sort.by("age")), Person.class).buffer(2).flatMap(values -> {
return action.remove(query(where("id").in(values.stream().map(Person::getId).collect(Collectors.toList()))),
Person.class).then(Mono.just(values));
});
}).flatMap(deleted -> {
throw new RuntimeException("error outside the transaction does not influence it.");
}) //
.as(StepVerifier::create) //
.expectError() //
.verify();
template.count(query(where("age").exists(true)), Person.class) //
.as(StepVerifier::create) //
.expectNext(2L) //
.verifyComplete();
}
}

View File

@@ -21,6 +21,7 @@ import static org.mockito.Mockito.*;
import static org.mockito.Mockito.any;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;
+ import com.mongodb.client.model.ReplaceOptions;
import lombok.Data;
import reactor.core.publisher.Mono;
import reactor.test.StepVerifier;
@@ -233,12 +234,12 @@ public class ReactiveMongoTemplateUnitTests {
@Test // DATAMONGO-1518
public void replaceOneShouldUseCollationWhenPresent() {
- when(collection.replaceOne(any(Bson.class), any(), any())).thenReturn(Mono.empty());
+ when(collection.replaceOne(any(Bson.class), any(), any(ReplaceOptions.class))).thenReturn(Mono.empty());
template.updateFirst(new BasicQuery("{}").collation(Collation.of("fr")), new Update(), AutogenerateableId.class)
.subscribe();
- ArgumentCaptor<UpdateOptions> options = ArgumentCaptor.forClass(UpdateOptions.class);
+ ArgumentCaptor<ReplaceOptions> options = ArgumentCaptor.forClass(ReplaceOptions.class);
verify(collection).replaceOne(any(Bson.class), any(), options.capture());
assertThat(options.getValue().getCollation().getLocale(), is("fr"));

View File

@@ -53,12 +53,12 @@ import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.FindOneAndUpdateOptions;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.reactivestreams.client.AggregatePublisher;
+ import com.mongodb.reactivestreams.client.ClientSession;
import com.mongodb.reactivestreams.client.DistinctPublisher;
import com.mongodb.reactivestreams.client.FindPublisher;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoCollection;
import com.mongodb.reactivestreams.client.MongoDatabase;
- import com.mongodb.session.ClientSession;
/**
* Unit tests for {@link ReactiveSessionBoundMongoTemplate}.

View File

@@ -16,8 +16,7 @@
package org.springframework.data.mongodb.core;
import static org.assertj.core.api.Assertions.*;
- import static org.mockito.ArgumentMatchers.any;
- import static org.mockito.ArgumentMatchers.eq;
+ import static org.mockito.ArgumentMatchers.*;
import static org.mockito.Mockito.*;
import lombok.Data;
@@ -36,6 +35,7 @@ import org.junit.ClassRule;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
+ import org.junit.rules.TestRule;
import org.mockito.Mockito;
import org.springframework.aop.Advisor;
import org.springframework.aop.framework.Advised;
@@ -56,14 +56,15 @@ import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.test.util.MongoVersionRule;
+ import org.springframework.data.mongodb.test.util.ReplicaSet;
import org.springframework.data.util.Version;
import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.MongoClient;
+ import com.mongodb.client.ClientSession;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
- import com.mongodb.session.ClientSession;
/**
* Integration tests for {@link SessionBoundMongoTemplate} operating upon an active {@link ClientSession}.
@@ -73,6 +74,7 @@ import com.mongodb.session.ClientSession;
public class SessionBoundMongoTemplateTests {
public static @ClassRule MongoVersionRule REQUIRES_AT_LEAST_3_6_0 = MongoVersionRule.atLeast(Version.parse("3.6.0"));
+ public static @ClassRule TestRule replSet = ReplicaSet.required();
public @Rule ExpectedException exception = ExpectedException.none();

View File

@@ -56,7 +56,7 @@ import com.mongodb.client.model.CountOptions;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.FindOneAndUpdateOptions;
import com.mongodb.client.model.UpdateOptions;
- import com.mongodb.session.ClientSession;
+ import com.mongodb.client.ClientSession;
/**
* Unit test for {@link SessionBoundMongoTemplate} making sure a proxied {@link MongoCollection} and

View File

@@ -37,7 +37,7 @@ import org.springframework.test.util.ReflectionTestUtils;
import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.client.MongoDatabase;
- import com.mongodb.session.ClientSession;
+ import com.mongodb.client.ClientSession;
/**
* Unit tests for {@link SimpleMongoDbFactory}.

View File

@@ -31,9 +31,9 @@ import org.springframework.aop.framework.AopProxyUtils;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.test.util.ReflectionTestUtils;
+ import com.mongodb.reactivestreams.client.ClientSession;
import com.mongodb.reactivestreams.client.MongoClient;
import com.mongodb.reactivestreams.client.MongoDatabase;
- import com.mongodb.session.ClientSession;
/**
* Unit tests for {@link SimpleReactiveMongoDatabaseFactory}.

View File

@@ -20,10 +20,11 @@ import static org.junit.Assert.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
+ import lombok.Data;
import java.util.Arrays;
import java.util.List;
- import com.mongodb.client.MongoCollection;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
@@ -43,16 +44,17 @@ import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexType;
import org.springframework.data.mongodb.core.index.GeoSpatialIndexed;
import org.springframework.data.mongodb.core.index.GeospatialIndex;
+ import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.test.util.BasicDbListBuilder;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
- import com.mongodb.Mongo;
import com.mongodb.MongoClient;
import com.mongodb.MongoException;
import com.mongodb.WriteConcern;
+ import com.mongodb.client.MongoCollection;
/**
* @author Christoph Strobl
@@ -140,7 +142,7 @@ public class GeoJsonTests {
}
@Test // DATAMONGO-1137
- public void shouleSaveAndRetrieveDocumentWithGeoJsonPointTypeCorrectly() {
+ public void shouldSaveAndRetrieveDocumentWithGeoJsonPointTypeCorrectly() {
DocumentWithPropertyUsingGeoJsonType obj = new DocumentWithPropertyUsingGeoJsonType();
obj.id = "geoJsonPoint";
@@ -155,7 +157,7 @@ public class GeoJsonTests {
}
@Test // DATAMONGO-1137
- public void shouleSaveAndRetrieveDocumentWithGeoJsonPolygonTypeCorrectly() {
+ public void shouldSaveAndRetrieveDocumentWithGeoJsonPolygonTypeCorrectly() {
DocumentWithPropertyUsingGeoJsonType obj = new DocumentWithPropertyUsingGeoJsonType();
obj.id = "geoJsonPolygon";
@@ -171,7 +173,7 @@ public class GeoJsonTests {
}
@Test // DATAMONGO-1137
- public void shouleSaveAndRetrieveDocumentWithGeoJsonLineStringTypeCorrectly() {
+ public void shouldSaveAndRetrieveDocumentWithGeoJsonLineStringTypeCorrectly() {
DocumentWithPropertyUsingGeoJsonType obj = new DocumentWithPropertyUsingGeoJsonType();
obj.id = "geoJsonLineString";
@@ -186,12 +188,13 @@ public class GeoJsonTests {
}
@Test // DATAMONGO-1137
- public void shouleSaveAndRetrieveDocumentWithGeoJsonMultiLineStringTypeCorrectly() {
+ public void shouldSaveAndRetrieveDocumentWithGeoJsonMultiLineStringTypeCorrectly() {
DocumentWithPropertyUsingGeoJsonType obj = new DocumentWithPropertyUsingGeoJsonType();
obj.id = "geoJsonMultiLineString";
- obj.geoJsonMultiLineString = new GeoJsonMultiLineString(Arrays.asList(new GeoJsonLineString(new Point(0, 0),
- new Point(0, 1), new Point(1, 1)), new GeoJsonLineString(new Point(199, 0), new Point(2, 3))));
+ obj.geoJsonMultiLineString = new GeoJsonMultiLineString(
+ Arrays.asList(new GeoJsonLineString(new Point(0, 0), new Point(0, 1), new Point(1, 1)),
+ new GeoJsonLineString(new Point(199, 0), new Point(2, 3))));
template.save(obj);
@@ -202,7 +205,7 @@ public class GeoJsonTests {
}
@Test // DATAMONGO-1137
- public void shouleSaveAndRetrieveDocumentWithGeoJsonMultiPointTypeCorrectly() {
+ public void shouldSaveAndRetrieveDocumentWithGeoJsonMultiPointTypeCorrectly() {
DocumentWithPropertyUsingGeoJsonType obj = new DocumentWithPropertyUsingGeoJsonType();
obj.id = "geoJsonMultiPoint";
@@ -217,12 +220,12 @@ public class GeoJsonTests {
}
@Test // DATAMONGO-1137
- public void shouleSaveAndRetrieveDocumentWithGeoJsonMultiPolygonTypeCorrectly() {
+ public void shouldSaveAndRetrieveDocumentWithGeoJsonMultiPolygonTypeCorrectly() {
DocumentWithPropertyUsingGeoJsonType obj = new DocumentWithPropertyUsingGeoJsonType();
obj.id = "geoJsonMultiPolygon";
- obj.geoJsonMultiPolygon = new GeoJsonMultiPolygon(Arrays.asList(new GeoJsonPolygon(new Point(0, 0),
- new Point(0, 1), new Point(1, 1), new Point(0, 0))));
+ obj.geoJsonMultiPolygon = new GeoJsonMultiPolygon(
+ Arrays.asList(new GeoJsonPolygon(new Point(0, 0), new Point(0, 1), new Point(1, 1), new Point(0, 0))));
template.save(obj);
@@ -233,13 +236,12 @@ public class GeoJsonTests {
}
@Test // DATAMONGO-1137
- public void shouleSaveAndRetrieveDocumentWithGeoJsonGeometryCollectionTypeCorrectly() {
+ public void shouldSaveAndRetrieveDocumentWithGeoJsonGeometryCollectionTypeCorrectly() {
DocumentWithPropertyUsingGeoJsonType obj = new DocumentWithPropertyUsingGeoJsonType();
obj.id = "geoJsonGeometryCollection";
- obj.geoJsonGeometryCollection = new GeoJsonGeometryCollection(Arrays.<GeoJson<?>> asList(
- new GeoJsonPoint(100, 200), new GeoJsonPolygon(new Point(0, 0), new Point(0, 1), new Point(1, 1), new Point(1,
- 0), new Point(0, 0))));
+ obj.geoJsonGeometryCollection = new GeoJsonGeometryCollection(Arrays.<GeoJson<?>> asList(new GeoJsonPoint(100, 200),
+ new GeoJsonPolygon(new Point(0, 0), new Point(0, 1), new Point(1, 1), new Point(1, 0), new Point(0, 0))));
template.save(obj);
@@ -286,7 +288,8 @@ public class GeoJsonTests {
new CollectionCallback<Object>() {
@Override
- public Object doInCollection(MongoCollection<org.bson.Document> collection) throws MongoException, DataAccessException {
+ public Object doInCollection(MongoCollection<org.bson.Document> collection)
+ throws MongoException, DataAccessException {
org.bson.Document pointRepresentation = new org.bson.Document();
pointRepresentation.put("type", "Point");
@@ -313,7 +316,8 @@ public class GeoJsonTests {
new CollectionCallback<Object>() {
@Override
- public Object doInCollection(MongoCollection<org.bson.Document> collection) throws MongoException, DataAccessException {
+ public Object doInCollection(MongoCollection<org.bson.Document> collection)
+ throws MongoException, DataAccessException {
org.bson.Document lineStringRepresentation = new org.bson.Document();
lineStringRepresentation.put("type", "LineString");
@@ -337,6 +341,27 @@ public class GeoJsonTests {
is(equalTo(new GeoJsonLineString(new Point(0D, 0D), new Point(1, 1)))));
}
@Test // DATAMONGO-1466
public void readGeoJsonBasedOnEmbeddedTypeInformation() {
Point first = new Point(-73.99756, 40.73083);
Point second = new Point(-73.99756, 40.741404);
Point third = new Point(-73.988135, 40.741404);
Point fourth = new Point(-73.988135, 40.73083);
GeoJsonPolygon polygon = new GeoJsonPolygon(first, second, third, fourth, first);
ConcreteGeoJson source = new ConcreteGeoJson();
source.shape = polygon;
source.id = "id-1";
template.save(source);
OpenGeoJson target = template.findOne(query(where("id").is(source.id)), OpenGeoJson.class);
assertThat(target.shape, is(equalTo(source.shape)));
}
private void addVenues() {
template.insert(new Venue2DSphere("Penn Station", -73.99408, 40.75057));
@@ -355,8 +380,8 @@ public class GeoJsonTests {
protected void createIndex() {
dropIndex();
- template.indexOps(Venue2DSphere.class).ensureIndex(
- new GeospatialIndex("location").typed(GeoSpatialIndexType.GEO_2DSPHERE));
+ template.indexOps(Venue2DSphere.class)
+ .ensureIndex(new GeospatialIndex("location").typed(GeoSpatialIndexType.GEO_2DSPHERE));
}
protected void dropIndex() {
@@ -416,4 +441,18 @@ public class GeoJsonTests {
GeoJsonGeometryCollection geoJsonGeometryCollection;
}
@Data
@Document("geo-json-shapes")
static class ConcreteGeoJson {
String id;
GeoJsonPolygon shape;
}
@Data
@Document("geo-json-shapes")
static class OpenGeoJson {
String id;
GeoJson shape;
}
}

View File

@@ -0,0 +1,147 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.index;
import static org.assertj.core.api.Assertions.*;
import static org.mockito.Mockito.*;
import reactor.core.publisher.Mono;
import reactor.test.StepVerifier;
import java.util.Collections;
import java.util.concurrent.TimeUnit;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.ArgumentCaptor;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.MongoExceptionTranslator;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import com.mongodb.MongoException;
import com.mongodb.client.model.IndexOptions;
import com.mongodb.reactivestreams.client.MongoCollection;
import com.mongodb.reactivestreams.client.MongoDatabase;
/**
* Unit tests for {@link ReactiveMongoPersistentEntityIndexCreator}.
*
* @author Mark Paluch
*/
@RunWith(MockitoJUnitRunner.class)
public class ReactiveMongoPersistentEntityIndexCreatorUnitTests {
ReactiveIndexOperations indexOperations;
@Mock ReactiveMongoDatabaseFactory factory;
@Mock MongoDatabase db;
@Mock MongoCollection<org.bson.Document> collection;
ArgumentCaptor<org.bson.Document> keysCaptor;
ArgumentCaptor<IndexOptions> optionsCaptor;
ArgumentCaptor<String> collectionCaptor;
@Before
@SuppressWarnings("unchecked")
public void setUp() {
when(factory.getExceptionTranslator()).thenReturn(new MongoExceptionTranslator());
when(factory.getMongoDatabase()).thenReturn(db);
when(db.getCollection(any(), any(Class.class))).thenReturn(collection);
indexOperations = new ReactiveMongoTemplate(factory).indexOps("foo");
keysCaptor = ArgumentCaptor.forClass(org.bson.Document.class);
optionsCaptor = ArgumentCaptor.forClass(IndexOptions.class);
collectionCaptor = ArgumentCaptor.forClass(String.class);
when(collection.createIndex(keysCaptor.capture(), optionsCaptor.capture())).thenReturn(Mono.just("OK"));
}
@Test // DATAMONGO-1928
public void buildsIndexDefinitionUsingFieldName() {
MongoMappingContext mappingContext = prepareMappingContext(Person.class);
Mono<Void> publisher = checkForIndexes(mappingContext);
verifyZeroInteractions(collection);
publisher.as(StepVerifier::create).verifyComplete();
assertThat(keysCaptor.getValue()).isNotNull().containsKey("fieldname");
assertThat(optionsCaptor.getValue().getName()).isEqualTo("indexName");
assertThat(optionsCaptor.getValue().isBackground()).isFalse();
assertThat(optionsCaptor.getValue().getExpireAfter(TimeUnit.SECONDS)).isNull();
}
@Test // DATAMONGO-1928
public void createIndexShouldUsePersistenceExceptionTranslatorForNonDataIntegrityConcerns() {
when(collection.createIndex(any(org.bson.Document.class), any(IndexOptions.class)))
.thenReturn(Mono.error(new MongoException(6, "HostUnreachable")));
MongoMappingContext mappingContext = prepareMappingContext(Person.class);
Mono<Void> publisher = checkForIndexes(mappingContext);
publisher.as(StepVerifier::create).expectError(DataAccessResourceFailureException.class).verify();
}
@Test // DATAMONGO-1928
public void createIndexShouldNotConvertUnknownExceptionTypes() {
when(collection.createIndex(any(org.bson.Document.class), any(IndexOptions.class)))
.thenReturn(Mono.error(new ClassCastException("o_O")));
MongoMappingContext mappingContext = prepareMappingContext(Person.class);
Mono<Void> publisher = checkForIndexes(mappingContext);
publisher.as(StepVerifier::create).expectError(ClassCastException.class).verify();
}
private static MongoMappingContext prepareMappingContext(Class<?> type) {
MongoMappingContext mappingContext = new MongoMappingContext();
mappingContext.setInitialEntitySet(Collections.singleton(type));
mappingContext.initialize();
return mappingContext;
}
private Mono<Void> checkForIndexes(MongoMappingContext mappingContext) {
return new ReactiveMongoPersistentEntityIndexCreator(mappingContext, it -> indexOperations)
.checkForIndexes(mappingContext.getRequiredPersistentEntity(Person.class));
}
@Document
static class Person {
@Indexed(name = "indexName") //
@Field("fieldname") //
String field;
}
}

View File

@@ -15,14 +15,16 @@
*/
package org.springframework.data.mongodb.core.mapping;
- import static org.hamcrest.CoreMatchers.*;
- import static org.junit.Assert.*;
+ import static org.assertj.core.api.Assertions.*;
import static org.mockito.Mockito.*;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
+ import java.util.Arrays;
+ import java.util.Collections;
+ import java.util.Map;
import org.junit.Test;
import org.junit.runner.RunWith;
@@ -31,6 +33,8 @@ import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.context.ApplicationContext;
import org.springframework.core.annotation.AliasFor;
import org.springframework.data.mapping.MappingException;
+ import org.springframework.data.spel.ExtensionAwareEvaluationContextProvider;
+ import org.springframework.data.spel.spi.EvaluationContextExtension;
import org.springframework.data.util.ClassTypeInformation;
/**
@@ -50,7 +54,7 @@ public class BasicMongoPersistentEntityUnitTests {
BasicMongoPersistentEntity<Person> entity = new BasicMongoPersistentEntity<Person>(
ClassTypeInformation.from(Person.class));
- assertThat(entity.getCollection(), is("contacts"));
+ assertThat(entity.getCollection()).isEqualTo("contacts");
}
@Test
@@ -58,7 +62,7 @@ public class BasicMongoPersistentEntityUnitTests {
MongoPersistentEntity<Company> entity = new BasicMongoPersistentEntity<Company>(
ClassTypeInformation.from(Company.class));
- assertThat(entity.getCollection(), is("35"));
+ assertThat(entity.getCollection()).isEqualTo("35");
}
@Test // DATAMONGO-65, DATAMONGO-1108
@@ -68,16 +72,15 @@ public class BasicMongoPersistentEntityUnitTests {
provider.collectionName = "reference";
when(context.getBean("myBean")).thenReturn(provider);
when(context.containsBean("myBean")).thenReturn(true);
BasicMongoPersistentEntity<DynamicallyMapped> entity = new BasicMongoPersistentEntity<DynamicallyMapped>(
ClassTypeInformation.from(DynamicallyMapped.class));
- entity.setApplicationContext(context);
+ entity.setEvaluationContextProvider(new ExtensionAwareEvaluationContextProvider(context));
- assertThat(entity.getCollection(), is("reference"));
+ assertThat(entity.getCollection()).isEqualTo("reference");
provider.collectionName = "otherReference";
- assertThat(entity.getCollection(), is("otherReference"));
+ assertThat(entity.getCollection()).isEqualTo("otherReference");
}
@Test // DATAMONGO-937
@@ -85,31 +88,31 @@ public class BasicMongoPersistentEntityUnitTests {
BasicMongoPersistentEntity<DocumentWithLanguage> entity = new BasicMongoPersistentEntity<DocumentWithLanguage>(
ClassTypeInformation.from(DocumentWithLanguage.class));
- assertThat(entity.getLanguage(), is("spanish"));
+ assertThat(entity.getLanguage()).isEqualTo("spanish");
}
- @SuppressWarnings({ "unchecked", "rawtypes" })
- @Test(expected = MappingException.class) // DATAMONGO-1053
+ @Test // DATAMONGO-1053
public void verifyShouldThrowExceptionForInvalidTypeOfExplicitLanguageProperty() {
+ doReturn(true).when(propertyMock).isExplicitLanguageProperty();
+ doReturn(Number.class).when(propertyMock).getActualType();
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
- when(propertyMock.isExplicitLanguageProperty()).thenReturn(true);
- when(propertyMock.getActualType()).thenReturn((Class) Number.class);
entity.addPersistentProperty(propertyMock);
- entity.verify();
+ assertThatExceptionOfType(MappingException.class).isThrownBy(() -> entity.verify());
}
- @SuppressWarnings({ "unchecked", "rawtypes" })
@Test // DATAMONGO-1053
public void verifyShouldPassForStringAsExplicitLanguageProperty() {
+ doReturn(true).when(propertyMock).isExplicitLanguageProperty();
+ doReturn(String.class).when(propertyMock).getActualType();
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
- when(propertyMock.isExplicitLanguageProperty()).thenReturn(true);
- when(propertyMock.getActualType()).thenReturn((Class) String.class);
entity.addPersistentProperty(propertyMock);
entity.verify();
@@ -118,7 +121,6 @@ public class BasicMongoPersistentEntityUnitTests {
verify(propertyMock, times(1)).getActualType();
}
- @SuppressWarnings({ "unchecked", "rawtypes" })
@Test // DATAMONGO-1053
public void verifyShouldIgnoreNonExplicitLanguageProperty() {
@@ -133,71 +135,74 @@ public class BasicMongoPersistentEntityUnitTests {
verify(propertyMock, never()).getActualType();
}
@SuppressWarnings({ "unchecked", "rawtypes" })
@Test(expected = MappingException.class) // DATAMONGO-1157
@Test // DATAMONGO-1157
public void verifyShouldThrowErrorForLazyDBRefOnFinalClass() {
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
org.springframework.data.mongodb.core.mapping.DBRef dbRefMock = mock(
org.springframework.data.mongodb.core.mapping.DBRef.class);
when(propertyMock.isDbReference()).thenReturn(true);
when(propertyMock.getDBRef()).thenReturn(dbRefMock);
when(dbRefMock.lazy()).thenReturn(true);
when(propertyMock.getActualType()).thenReturn((Class) Class.class);
entity.addPersistentProperty(propertyMock);
entity.verify();
}
@Test(expected = MappingException.class) // DATAMONGO-1157
public void verifyShouldThrowErrorForLazyDBRefArray() {
doReturn(Class.class).when(propertyMock).getActualType();
doReturn(true).when(propertyMock).isDbReference();
doReturn(dbRefMock).when(propertyMock).getDBRef();
doReturn(true).when(dbRefMock).lazy();
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
org.springframework.data.mongodb.core.mapping.DBRef dbRefMock = mock(
org.springframework.data.mongodb.core.mapping.DBRef.class);
when(propertyMock.isDbReference()).thenReturn(true);
when(propertyMock.getDBRef()).thenReturn(dbRefMock);
when(dbRefMock.lazy()).thenReturn(true);
when(propertyMock.isArray()).thenReturn(true);
entity.addPersistentProperty(propertyMock);
entity.verify();
assertThatExceptionOfType(MappingException.class).isThrownBy(() -> entity.verify());
}
@Test // DATAMONGO-1157
@SuppressWarnings({ "unchecked", "rawtypes" })
public void verifyShouldPassForLazyDBRefOnNonArrayNonFinalClass() {
public void verifyShouldThrowErrorForLazyDBRefArray() {
org.springframework.data.mongodb.core.mapping.DBRef dbRefMock = mock(
org.springframework.data.mongodb.core.mapping.DBRef.class);
doReturn(true).when(propertyMock).isDbReference();
doReturn(true).when(propertyMock).isArray();
doReturn(dbRefMock).when(propertyMock).getDBRef();
doReturn(true).when(dbRefMock).lazy();
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
org.springframework.data.mongodb.core.mapping.DBRef dbRefMock = mock(
org.springframework.data.mongodb.core.mapping.DBRef.class);
when(propertyMock.isDbReference()).thenReturn(true);
when(propertyMock.getDBRef()).thenReturn(dbRefMock);
when(dbRefMock.lazy()).thenReturn(true);
when(propertyMock.getActualType()).thenReturn((Class) Object.class);
entity.addPersistentProperty(propertyMock);
assertThatExceptionOfType(MappingException.class).isThrownBy(() -> entity.verify());
}
@Test // DATAMONGO-1157
public void verifyShouldPassForLazyDBRefOnNonArrayNonFinalClass() {
org.springframework.data.mongodb.core.mapping.DBRef dbRefMock = mock(
org.springframework.data.mongodb.core.mapping.DBRef.class);
doReturn(true).when(propertyMock).isDbReference();
doReturn(Object.class).when(propertyMock).getActualType();
doReturn(dbRefMock).when(propertyMock).getDBRef();
doReturn(true).when(dbRefMock).lazy();
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
entity.addPersistentProperty(propertyMock);
entity.verify();
verify(propertyMock, times(1)).isDbReference();
}
@Test // DATAMONGO-1157
- @SuppressWarnings({ "unchecked", "rawtypes" })
public void verifyShouldPassForNonLazyDBRefOnFinalClass() {
+ org.springframework.data.mongodb.core.mapping.DBRef dbRefMock = mock(
+ org.springframework.data.mongodb.core.mapping.DBRef.class);
+ doReturn(true).when(propertyMock).isDbReference();
+ doReturn(dbRefMock).when(propertyMock).getDBRef();
+ doReturn(false).when(dbRefMock).lazy();
BasicMongoPersistentEntity<AnyDocument> entity = new BasicMongoPersistentEntity<AnyDocument>(
ClassTypeInformation.from(AnyDocument.class));
- org.springframework.data.mongodb.core.mapping.DBRef dbRefMock = mock(
- org.springframework.data.mongodb.core.mapping.DBRef.class);
- when(propertyMock.isDbReference()).thenReturn(true);
- when(propertyMock.getDBRef()).thenReturn(dbRefMock);
- when(dbRefMock.lazy()).thenReturn(false);
entity.addPersistentProperty(propertyMock);
entity.verify();
verify(dbRefMock, times(1)).lazy();
@@ -209,7 +214,7 @@ public class BasicMongoPersistentEntityUnitTests {
BasicMongoPersistentEntity<DocumentWithCustomAnnotation> entity = new BasicMongoPersistentEntity<DocumentWithCustomAnnotation>(
ClassTypeInformation.from(DocumentWithCustomAnnotation.class));
- assertThat(entity.getCollection(), is("collection-1"));
+ assertThat(entity.getCollection()).isEqualTo("collection-1");
}
@Test // DATAMONGO-1373
@@ -218,7 +223,18 @@ public class BasicMongoPersistentEntityUnitTests {
BasicMongoPersistentEntity<DocumentWithComposedAnnotation> entity = new BasicMongoPersistentEntity<DocumentWithComposedAnnotation>(
ClassTypeInformation.from(DocumentWithComposedAnnotation.class));
- assertThat(entity.getCollection(), is("custom-collection"));
+ assertThat(entity.getCollection()).isEqualTo("custom-collection");
}
@Test // DATAMONGO-1874
public void usesEvaluationContextExtensionInDynamicDocumentName() {
BasicMongoPersistentEntity<MappedWithExtension> entity = new BasicMongoPersistentEntity<>(
ClassTypeInformation.from(MappedWithExtension.class));
entity.setEvaluationContextProvider(
new ExtensionAwareEvaluationContextProvider(Arrays.asList(new SampleExtension())));
assertThat(entity.getCollection()).isEqualTo("collectionName");
}
@Document("contacts")
@@ -229,7 +245,7 @@ public class BasicMongoPersistentEntityUnitTests {
@Document("#{35}")
class Company {}
- @Document("#{myBean.collectionName}")
+ @Document("#{@myBean.collectionName}")
class DynamicallyMapped {}
class CollectionProvider {
@@ -265,4 +281,30 @@ public class BasicMongoPersistentEntityUnitTests {
@AliasFor(annotation = Document.class, attribute = "collection")
String name() default "custom-collection";
}
// DATAMONGO-1874
@Document("#{myProperty}")
class MappedWithExtension {}
static class SampleExtension implements EvaluationContextExtension {
/*
* (non-Javadoc)
* @see org.springframework.data.spel.spi.EvaluationContextExtension#getExtensionId()
*/
@Override
public String getExtensionId() {
return "sampleExtension";
}
/*
* (non-Javadoc)
* @see org.springframework.data.spel.spi.EvaluationContextExtension#getProperties()
*/
@Override
public Map<String, Object> getProperties() {
return Collections.singletonMap("myProperty", "collectionName");
}
}
}

View File

@@ -0,0 +1,170 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapreduce;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import lombok.Data;
import reactor.test.StepVerifier;
import java.util.Arrays;
import org.bson.Document;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.Person;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.core.SimpleReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import com.mongodb.reactivestreams.client.MongoCollection;
import com.mongodb.reactivestreams.client.Success;
/**
* @author Christoph Strobl
* @currentRead Beyond the Shadows - Brent Weeks
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:reactive-infrastructure.xml")
public class ReactiveMapReduceTests {
@Autowired SimpleReactiveMongoDatabaseFactory factory;
@Autowired ReactiveMongoTemplate template;
private String mapFunction = "function(){ for ( var i=0; i<this.x.length; i++ ){ emit( this.x[i] , 1 ); } }";
private String reduceFunction = "function(key,values){ var sum=0; for( var i=0; i<values.length; i++ ) sum += values[i]; return sum;}";
@Before
public void setUp() {
StepVerifier
.create(template.dropCollection(ValueObject.class) //
.mergeWith(template.dropCollection("jmr1")) //
.mergeWith(template.dropCollection("jmr1_out"))) //
.verifyComplete();
}
@Test // DATAMONGO-1890
public void mapReduceWithInlineResult() {
createMapReduceData();
StepVerifier
.create(template.mapReduce(new Query(), Person.class, "jmr1", ValueObject.class, mapFunction, reduceFunction,
MapReduceOptions.options()).buffer(4)) //
.consumeNextWith(result -> {
assertThat(result).containsExactlyInAnyOrder(new ValueObject("a", 1), new ValueObject("b", 2),
new ValueObject("c", 2), new ValueObject("d", 1));
}) //
.verifyComplete();
}
@Test // DATAMONGO-1890
public void mapReduceWithInlineAndFilterQuery() {
createMapReduceData();
StepVerifier
.create(template.mapReduce(query(where("x").ne(new String[] { "a", "b" })), ValueObject.class, "jmr1",
ValueObject.class, mapFunction, reduceFunction, MapReduceOptions.options()).buffer(4)) //
.consumeNextWith(result -> {
assertThat(result).containsExactlyInAnyOrder(new ValueObject("b", 1), new ValueObject("c", 2),
new ValueObject("d", 1));
}) //
.verifyComplete();
}
@Test // DATAMONGO-1890
public void mapReduceWithOutputCollection() {
createMapReduceData();
StepVerifier
.create(template.mapReduce(new Query(), ValueObject.class, "jmr1", ValueObject.class, mapFunction,
reduceFunction, MapReduceOptions.options().outputCollection("jmr1_out")))
.expectNextCount(4).verifyComplete();
StepVerifier.create(template.find(new Query(), ValueObject.class, "jmr1_out").buffer(4)) //
.consumeNextWith(result -> {
assertThat(result).containsExactlyInAnyOrder(new ValueObject("a", 1), new ValueObject("b", 2),
new ValueObject("c", 2), new ValueObject("d", 1));
}) //
.verifyComplete();
}
@Test // DATAMONGO-1890
public void mapReduceWithInlineAndMappedFilterQuery() {
createMapReduceData();
StepVerifier
.create(template.mapReduce(query(where("values").ne(new String[] { "a", "b" })), MappedFieldsValueObject.class,
"jmr1", ValueObject.class, mapFunction, reduceFunction, MapReduceOptions.options()).buffer(4)) //
.consumeNextWith(result -> {
assertThat(result).containsExactlyInAnyOrder(new ValueObject("b", 1), new ValueObject("c", 2),
new ValueObject("d", 1));
}) //
.verifyComplete();
}
@Test // DATAMONGO-1890
public void mapReduceWithInlineFilterQueryAndExtractedCollection() {
createMapReduceData();
StepVerifier
.create(template.mapReduce(query(where("values").ne(new String[] { "a", "b" })), MappedFieldsValueObject.class,
ValueObject.class, mapFunction, reduceFunction, MapReduceOptions.options()).buffer(4)) //
.consumeNextWith(result -> {
assertThat(result).containsExactlyInAnyOrder(new ValueObject("b", 1), new ValueObject("c", 2),
new ValueObject("d", 1));
}) //
.verifyComplete();
}
@Test // DATAMONGO-1890
public void throwsExceptionWhenTryingToLoadFunctionsFromDisk() {
assertThatExceptionOfType(IllegalArgumentException.class).isThrownBy(() -> template.mapReduce(new Query(), Person.class,
"foo", ValueObject.class, "classpath:map.js", "classpath:reduce.js", MapReduceOptions.options()))
.withMessageContaining("classpath:map.js");
}
private void createMapReduceData() {
MongoCollection<Document> collection = factory.getMongoDatabase().getCollection("jmr1", Document.class);
StepVerifier
.create(collection.insertMany(Arrays.asList(new Document("x", Arrays.asList("a", "b")),
new Document("x", Arrays.asList("b", "c")), new Document("x", Arrays.asList("c", "d")))))
.expectNext(Success.SUCCESS).verifyComplete();
}
@org.springframework.data.mongodb.core.mapping.Document("jmr1")
@Data
static class MappedFieldsValueObject {
@Field("x") String[] values;
}
}
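The JavaScript map function above emits each element of the `x` array with a count of 1, and the reduce function sums the counts per key; over the three fixture documents `{a,b}`, `{b,c}`, `{c,d}` this yields a:1, b:2, c:2, d:1, which is exactly what the inline-result assertions expect. A plain-Java sketch of the same computation (the helper class and method names are illustrative, not part of the test suite):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Mirrors the JS map/reduce pair used in ReactiveMapReduceTests:
// map emits (element, 1) for each element of x, reduce sums the counts per key.
class MapReduceSketch {

    static Map<String, Integer> wordCounts(List<List<String>> docs) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (List<String> x : docs) {
            for (String key : x) {
                counts.merge(key, 1, Integer::sum); // reduce step: sum per key
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = wordCounts(Arrays.asList(
                Arrays.asList("a", "b"), Arrays.asList("b", "c"), Arrays.asList("c", "d")));
        System.out.println(counts); // {a=1, b=2, c=2, d=1}
    }
}
```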


@@ -1,23 +1,38 @@
/*
* Copyright 2011-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.mapreduce;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
/**
* @author Mark Pollack
* @author Oliver Gierke
* @author Christoph Strobl
*/
@Data
@NoArgsConstructor
@AllArgsConstructor
public class ValueObject {
private String id;
private float value;
}

@@ -1,5 +1,5 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -15,225 +15,151 @@
*/
package org.springframework.data.mongodb.core.query;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import static org.springframework.data.mongodb.core.query.Query.*;
import java.util.Arrays;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.EqualsAndHashCode;
import org.bson.types.Binary;
import org.junit.Before;
import org.junit.Test;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.Base64Utils;
import com.mongodb.MongoClient;
/**
 * Integration tests for {@link Criteria} usage as part of a {@link Query}.
 *
 * @author Christoph Strobl
 * @author Andreas Zink
 */
public class CriteriaTests {
MongoOperations ops;
MongoClient client;
static final DocumentWithBitmask FIFTY_FOUR/*00110110*/ = new DocumentWithBitmask("1", Integer.valueOf(54),
Integer.toBinaryString(54));
static final DocumentWithBitmask TWENTY_INT/*00010100*/ = new DocumentWithBitmask("2", Integer.valueOf(20),
Integer.toBinaryString(20));
static final DocumentWithBitmask TWENTY_FLOAT/*00010100*/ = new DocumentWithBitmask("3", Float.valueOf(20),
Integer.toBinaryString(20));
static final DocumentWithBitmask ONE_HUNDRED_TWO/*01100110*/ = new DocumentWithBitmask("4",
new Binary(Base64Utils.decodeFromString("Zg==")), "01100110");
@Before
public void setUp() {
client = new MongoClient();
ops = new MongoTemplate(client, "criteria-tests");
ops.dropCollection(DocumentWithBitmask.class);
ops.insert(FIFTY_FOUR);
ops.insert(TWENTY_INT);
ops.insert(TWENTY_FLOAT);
ops.insert(ONE_HUNDRED_TWO);
}
@Test // DATAMONGO-1808
public void bitsAllClearWithBitPositions() {
assertThat(ops.find(query(where("value").bits().allClear(Arrays.asList(1, 5))), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(TWENTY_INT, TWENTY_FLOAT);
}
@Test // DATAMONGO-1808
public void bitsAllClearWithNumericBitmask() {
assertThat(ops.find(query(where("value").bits().allClear(35)), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(TWENTY_INT, TWENTY_FLOAT);
}
@Test // DATAMONGO-1808
public void bitsAllClearWithStringBitmask() {
assertThat(ops.find(query(where("value").bits().allClear("ID==")), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(TWENTY_INT, TWENTY_FLOAT);
}
@Test // DATAMONGO-1808
public void bitsAllSetWithBitPositions() {
assertThat(ops.find(query(where("value").bits().allSet(Arrays.asList(1, 5))), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(FIFTY_FOUR, ONE_HUNDRED_TWO);
}
@Test // DATAMONGO-1808
public void bitsAllSetWithNumericBitmask() {
assertThat(ops.find(query(where("value").bits().allSet(50)), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(FIFTY_FOUR);
}
@Test // DATAMONGO-1808
public void bitsAllSetWithStringBitmask() {
assertThat(ops.find(query(where("value").bits().allSet("MC==")), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(FIFTY_FOUR);
}
@Test // DATAMONGO-1808
public void bitsAnyClearWithBitPositions() {
assertThat(ops.find(query(where("value").bits().anyClear(Arrays.asList(1, 5))), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(TWENTY_INT, TWENTY_FLOAT);
}
@Test // DATAMONGO-1808
public void bitsAnyClearWithNumericBitmask() {
assertThat(ops.find(query(where("value").bits().anyClear(35)), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(FIFTY_FOUR, TWENTY_INT, TWENTY_FLOAT, ONE_HUNDRED_TWO);
}
@Test // DATAMONGO-1808
public void bitsAnyClearWithStringBitmask() {
assertThat(ops.find(query(where("value").bits().anyClear("MC==")), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(TWENTY_INT, TWENTY_FLOAT, ONE_HUNDRED_TWO);
}
@Test // DATAMONGO-1808
public void bitsAnySetWithBitPositions() {
assertThat(ops.find(query(where("value").bits().anySet(Arrays.asList(1, 5))), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(FIFTY_FOUR, ONE_HUNDRED_TWO);
}
@Test // DATAMONGO-1808
public void bitsAnySetWithNumericBitmask() {
assertThat(ops.find(query(where("value").bits().anySet(35)), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(FIFTY_FOUR, ONE_HUNDRED_TWO);
}
@Test // DATAMONGO-1808
public void bitsAnySetWithStringBitmask() {
assertThat(ops.find(query(where("value").bits().anySet("MC==")), DocumentWithBitmask.class))
.containsExactlyInAnyOrder(FIFTY_FOUR, TWENTY_INT, TWENTY_FLOAT, ONE_HUNDRED_TWO);
}
@Data
@EqualsAndHashCode(exclude = "value")
@AllArgsConstructor
static class DocumentWithBitmask {
@Id String id;
Object value;
String binaryValue;
}
}
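The fixtures in the integration tests above encode their expectations in plain bit arithmetic: 54 is 00110110 and 20 is 00010100, so a mask of 50 (00110010) is fully set only in 54, while a mask of 35 (00100011) is fully clear only in 20. A minimal sketch of the four predicates the `$bits*` operators evaluate (the helper names here are illustrative, restricted to int-sized values):

```java
// Hypothetical helpers mirroring the semantics of MongoDB's $bitsAllSet,
// $bitsAllClear, $bitsAnySet and $bitsAnyClear for int-sized values.
class BitQuerySketch {

    static boolean allSet(int value, int mask)   { return (value & mask) == mask; }
    static boolean allClear(int value, int mask) { return (value & mask) == 0; }
    static boolean anySet(int value, int mask)   { return (value & mask) != 0; }
    static boolean anyClear(int value, int mask) { return (value & mask) != mask; }

    public static void main(String[] args) {
        int fiftyFour = 0b00110110; // the FIFTY_FOUR fixture
        int twenty = 0b00010100;    // the TWENTY_INT / TWENTY_FLOAT fixtures

        System.out.println(allSet(fiftyFour, 50)); // matched by allSet(50)
        System.out.println(allClear(twenty, 35));  // matched by allClear(35)
        System.out.println(anyClear(fiftyFour, 35)); // 54 lacks part of mask 35
    }
}
```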


@@ -0,0 +1,313 @@
/*
* Copyright 2010-2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import static org.hamcrest.CoreMatchers.*;
import static org.junit.Assert.*;
import static org.springframework.data.mongodb.test.util.IsBsonObject.*;
import java.util.Arrays;
import java.util.Collections;
import org.bson.Document;
import org.junit.Test;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.core.geo.GeoJsonLineString;
import org.springframework.data.mongodb.core.geo.GeoJsonPoint;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
/**
* @author Oliver Gierke
* @author Thomas Darimont
* @author Christoph Strobl
* @author Andreas Zink
*/
public class CriteriaUnitTests {
@Test
public void testSimpleCriteria() {
Criteria c = new Criteria("name").is("Bubba");
assertEquals(Document.parse("{ \"name\" : \"Bubba\"}"), c.getCriteriaObject());
}
@Test
public void testNotEqualCriteria() {
Criteria c = new Criteria("name").ne("Bubba");
assertEquals(Document.parse("{ \"name\" : { \"$ne\" : \"Bubba\"}}"), c.getCriteriaObject());
}
@Test
public void buildsIsNullCriteriaCorrectly() {
Document reference = new Document("name", null);
Criteria criteria = new Criteria("name").is(null);
assertThat(criteria.getCriteriaObject(), is(reference));
}
@Test
public void testChainedCriteria() {
Criteria c = new Criteria("name").is("Bubba").and("age").lt(21);
assertEquals(Document.parse("{ \"name\" : \"Bubba\" , \"age\" : { \"$lt\" : 21}}"), c.getCriteriaObject());
}
@Test(expected = InvalidMongoDbApiUsageException.class)
public void testCriteriaWithMultipleConditionsForSameKey() {
Criteria c = new Criteria("name").gte("M").and("name").ne("A");
c.getCriteriaObject();
}
@Test
public void equalIfCriteriaMatches() {
Criteria left = new Criteria("name").is("Foo").and("lastname").is("Bar");
Criteria right = new Criteria("name").is("Bar").and("lastname").is("Bar");
assertThat(left, is(not(right)));
assertThat(right, is(not(left)));
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-507
public void shouldThrowExceptionWhenTryingToNegateAndOperation() {
new Criteria() //
.not() //
.andOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-507
public void shouldThrowExceptionWhenTryingToNegateOrOperation() {
new Criteria() //
.not() //
.orOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-507
public void shouldThrowExceptionWhenTryingToNegateNorOperation() {
new Criteria() //
.not() //
.norOperator(Criteria.where("delete").is(true).and("_id").is(42)); //
}
@Test // DATAMONGO-507
public void shouldNegateFollowingSimpleExpression() {
Criteria c = Criteria.where("age").not().gt(18).and("status").is("student");
Document co = c.getCriteriaObject();
assertThat(co, is(notNullValue()));
assertThat(co, is(Document.parse("{ \"age\" : { \"$not\" : { \"$gt\" : 18}} , \"status\" : \"student\"}")));
}
@Test // DATAMONGO-1068
public void getCriteriaObjectShouldReturnEmptyDocumentWhenNoCriteriaSpecified() {
Document document = new Criteria().getCriteriaObject();
assertThat(document, equalTo(new Document()));
}
@Test // DATAMONGO-1068
public void getCriteriaObjectShouldUseCritieraValuesWhenNoKeyIsPresent() {
Document document = new Criteria().lt("foo").getCriteriaObject();
assertThat(document, equalTo(new Document().append("$lt", "foo")));
}
@Test // DATAMONGO-1068
public void getCriteriaObjectShouldUseCritieraValuesWhenNoKeyIsPresentButMultipleCriteriasPresent() {
Document document = new Criteria().lt("foo").gt("bar").getCriteriaObject();
assertThat(document, equalTo(new Document().append("$lt", "foo").append("$gt", "bar")));
}
@Test // DATAMONGO-1068
public void getCriteriaObjectShouldRespectNotWhenNoKeyPresent() {
Document document = new Criteria().lt("foo").not().getCriteriaObject();
assertThat(document, equalTo(new Document().append("$not", new Document("$lt", "foo"))));
}
@Test // DATAMONGO-1135
public void geoJsonTypesShouldBeWrappedInGeometry() {
Document document = new Criteria("foo").near(new GeoJsonPoint(100, 200)).getCriteriaObject();
assertThat(document, isBsonObject().containing("foo.$near.$geometry", new GeoJsonPoint(100, 200)));
}
@Test // DATAMONGO-1135
public void legacyCoordinateTypesShouldNotBeWrappedInGeometry() {
Document document = new Criteria("foo").near(new Point(100, 200)).getCriteriaObject();
assertThat(document, isBsonObject().notContaining("foo.$near.$geometry"));
}
@Test // DATAMONGO-1135
public void maxDistanceShouldBeMappedInsideNearWhenUsedAlongWithGeoJsonType() {
Document document = new Criteria("foo").near(new GeoJsonPoint(100, 200)).maxDistance(50D).getCriteriaObject();
assertThat(document, isBsonObject().containing("foo.$near.$maxDistance", 50D));
}
@Test // DATAMONGO-1135
public void maxDistanceShouldBeMappedInsideNearSphereWhenUsedAlongWithGeoJsonType() {
Document document = new Criteria("foo").nearSphere(new GeoJsonPoint(100, 200)).maxDistance(50D).getCriteriaObject();
assertThat(document, isBsonObject().containing("foo.$nearSphere.$maxDistance", 50D));
}
@Test // DATAMONGO-1110
public void minDistanceShouldBeMappedInsideNearWhenUsedAlongWithGeoJsonType() {
Document document = new Criteria("foo").near(new GeoJsonPoint(100, 200)).minDistance(50D).getCriteriaObject();
assertThat(document, isBsonObject().containing("foo.$near.$minDistance", 50D));
}
@Test // DATAMONGO-1110
public void minDistanceShouldBeMappedInsideNearSphereWhenUsedAlongWithGeoJsonType() {
Document document = new Criteria("foo").nearSphere(new GeoJsonPoint(100, 200)).minDistance(50D).getCriteriaObject();
assertThat(document, isBsonObject().containing("foo.$nearSphere.$minDistance", 50D));
}
@Test // DATAMONGO-1110
public void minAndMaxDistanceShouldBeMappedInsideNearSphereWhenUsedAlongWithGeoJsonType() {
Document document = new Criteria("foo").nearSphere(new GeoJsonPoint(100, 200)).minDistance(50D).maxDistance(100D)
.getCriteriaObject();
assertThat(document, isBsonObject().containing("foo.$nearSphere.$minDistance", 50D));
assertThat(document, isBsonObject().containing("foo.$nearSphere.$maxDistance", 100D));
}
@Test(expected = IllegalArgumentException.class) // DATAMONGO-1134
public void intersectsShouldThrowExceptionWhenCalledWihtNullValue() {
new Criteria("foo").intersects(null);
}
@Test // DATAMONGO-1134
public void intersectsShouldWrapGeoJsonTypeInGeometryCorrectly() {
GeoJsonLineString lineString = new GeoJsonLineString(new Point(0, 0), new Point(10, 10));
Document document = new Criteria("foo").intersects(lineString).getCriteriaObject();
assertThat(document, isBsonObject().containing("foo.$geoIntersects.$geometry", lineString));
}
@Test // DATAMONGO-1835
public void extractsJsonSchemaInChainCorrectly() {
MongoJsonSchema schema = MongoJsonSchema.builder().required("name").build();
Criteria criteria = Criteria.where("foo").is("bar").andDocumentStructureMatches(schema);
assertThat(criteria.getCriteriaObject(), is(equalTo(new Document("foo", "bar").append("$jsonSchema",
new Document("type", "object").append("required", Collections.singletonList("name"))))));
}
@Test // DATAMONGO-1835
public void extractsJsonSchemaFromFactoryMethodCorrectly() {
MongoJsonSchema schema = MongoJsonSchema.builder().required("name").build();
Criteria criteria = Criteria.matchingDocumentStructure(schema);
assertThat(criteria.getCriteriaObject(), is(equalTo(new Document("$jsonSchema",
new Document("type", "object").append("required", Collections.singletonList("name"))))));
}
@Test // DATAMONGO-1808
public void shouldAppendBitsAllClearWithIntBitmaskCorrectly() {
Criteria numericBitmaskCriteria = new Criteria("field").bits().allClear(0b101);
assertThat(numericBitmaskCriteria.getCriteriaObject(),
is(equalTo(Document.parse("{ \"field\" : { \"$bitsAllClear\" : 5} }"))));
}
@Test // DATAMONGO-1808
public void shouldAppendBitsAllClearWithPositionListCorrectly() {
Criteria bitPositionsBitmaskCriteria = new Criteria("field").bits().allClear(Arrays.asList(0, 2));
assertThat(bitPositionsBitmaskCriteria.getCriteriaObject(),
is(equalTo(Document.parse("{ \"field\" : { \"$bitsAllClear\" : [ 0, 2 ]} }"))));
}
@Test // DATAMONGO-1808
public void shouldAppendBitsAllSetWithIntBitmaskCorrectly() {
Criteria numericBitmaskCriteria = new Criteria("field").bits().allSet(0b101);
assertThat(numericBitmaskCriteria.getCriteriaObject(),
is(equalTo(Document.parse("{ \"field\" : { \"$bitsAllSet\" : 5} }"))));
}
@Test // DATAMONGO-1808
public void shouldAppendBitsAllSetWithPositionListCorrectly() {
Criteria bitPositionsBitmaskCriteria = new Criteria("field").bits().allSet(Arrays.asList(0, 2));
assertThat(bitPositionsBitmaskCriteria.getCriteriaObject(),
is(equalTo(Document.parse("{ \"field\" : { \"$bitsAllSet\" : [ 0, 2 ]} }"))));
}
@Test // DATAMONGO-1808
public void shouldAppendBitsAnyClearWithIntBitmaskCorrectly() {
Criteria numericBitmaskCriteria = new Criteria("field").bits().anyClear(0b101);
assertThat(numericBitmaskCriteria.getCriteriaObject(),
is(equalTo(Document.parse("{ \"field\" : { \"$bitsAnyClear\" : 5} }"))));
}
@Test // DATAMONGO-1808
public void shouldAppendBitsAnyClearWithPositionListCorrectly() {
Criteria bitPositionsBitmaskCriteria = new Criteria("field").bits().anyClear(Arrays.asList(0, 2));
assertThat(bitPositionsBitmaskCriteria.getCriteriaObject(),
is(equalTo(Document.parse("{ \"field\" : { \"$bitsAnyClear\" : [ 0, 2 ]} }"))));
}
@Test // DATAMONGO-1808
public void shouldAppendBitsAnySetWithIntBitmaskCorrectly() {
Criteria numericBitmaskCriteria = new Criteria("field").bits().anySet(0b101);
assertThat(numericBitmaskCriteria.getCriteriaObject(),
is(equalTo(Document.parse("{ \"field\" : { \"$bitsAnySet\" : 5} }"))));
}
@Test // DATAMONGO-1808
public void shouldAppendBitsAnySetWithPositionListCorrectly() {
Criteria bitPositionsBitmaskCriteria = new Criteria("field").bits().anySet(Arrays.asList(0, 2));
assertThat(bitPositionsBitmaskCriteria.getCriteriaObject(),
is(equalTo(Document.parse("{ \"field\" : { \"$bitsAnySet\" : [ 0, 2 ]} }"))));
}
}
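Each pair of unit tests above exercises the two equivalent forms of a bitmask: an int literal and the list of bit positions that produces it, e.g. positions [0, 2] set the first and third bits, i.e. 0b101 = 5, which is why `allClear(0b101)` and `allClear(Arrays.asList(0, 2))` render the same `$bitsAllClear` document. A sketch of that position-to-mask conversion (the helper is illustrative, not part of the Criteria API):

```java
import java.util.Arrays;
import java.util.List;

// Converts a list of bit positions (as accepted by bits().allSet(List)) into
// the equivalent numeric bitmask (as accepted by bits().allSet(int)).
class BitPositionSketch {

    static int toMask(List<Integer> positions) {
        int mask = 0;
        for (int pos : positions) {
            mask |= 1 << pos; // set the bit at each listed position
        }
        return mask;
    }

    public static void main(String[] args) {
        System.out.println(toMask(Arrays.asList(0, 2))); // 5, i.e. 0b101
    }
}
```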


@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.gridfs;
import static org.assertj.core.api.Assertions.*;
import java.io.FileNotFoundException;
import java.util.Date;
import org.bson.BsonObjectId;
@@ -61,4 +62,33 @@ public class GridFsResourceUnitTests {
assertThatThrownBy(resource::getContentType).isInstanceOf(MongoGridFSException.class);
}
@Test // DATAMONGO-1914
public void gettersThrowExceptionForAbsentResource() {
GridFsResource absent = GridFsResource.absent("foo");
assertThat(absent.exists()).isFalse();
assertThat(absent.getDescription()).contains("GridFs resource [foo]");
assertThatExceptionOfType(IllegalStateException.class).isThrownBy(absent::getContentType);
assertThatExceptionOfType(IllegalStateException.class).isThrownBy(absent::getId);
assertThatExceptionOfType(FileNotFoundException.class).isThrownBy(absent::contentLength);
assertThatExceptionOfType(FileNotFoundException.class).isThrownBy(absent::getInputStream);
assertThatExceptionOfType(FileNotFoundException.class).isThrownBy(absent::lastModified);
assertThatExceptionOfType(FileNotFoundException.class).isThrownBy(absent::getURI);
assertThatExceptionOfType(FileNotFoundException.class).isThrownBy(absent::getURL);
}
@Test // DATAMONGO-1914
public void shouldReturnFilenameForAbsentResource() {
GridFsResource absent = GridFsResource.absent("foo");
assertThat(absent.exists()).isFalse();
assertThat(absent.getDescription()).contains("GridFs resource [foo]");
assertThat(absent.getFilename()).isEqualTo("foo");
}
}
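`GridFsResource.absent` as exercised above is a null-object: instead of returning `null` for a missing file, the template hands back a placeholder that remembers the requested filename, answers `exists() == false`, and fails loudly on content access. A standalone sketch of the pattern (class and method names beyond what the tests show are illustrative, not the actual GridFsResource API):

```java
import java.io.FileNotFoundException;

// Null-object sketch: an "absent" resource keeps the requested filename,
// reports exists() == false, and throws when content is requested.
class ResourceSketch {

    private final String filename;
    private final boolean exists;

    private ResourceSketch(String filename, boolean exists) {
        this.filename = filename;
        this.exists = exists;
    }

    static ResourceSketch absent(String filename) {
        return new ResourceSketch(filename, false);
    }

    boolean exists() { return exists; }

    String getFilename() { return filename; }

    long contentLength() throws FileNotFoundException {
        if (!exists) {
            throw new FileNotFoundException("Resource [" + filename + "] does not exist");
        }
        return 0L; // a real resource would report its stored length here
    }
}
```

The benefit, visible in the polished `getResourceShouldReturnAbsentResourceForNonExistingResource` test, is that callers can compare and inspect the result without a null check.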


@@ -177,9 +177,9 @@ public class GridFsTemplateIntegrationTests {
operations.find(null);
}
@Test // DATAMONGO-813, DATAMONGO-1914
public void getResourceShouldReturnAbsentResourceForNonExistingResource() {
assertThat(operations.getResource("doesnotexist")).isEqualTo(GridFsResource.absent("doesnotexist"));
}
}
@Test // DATAMONGO-809


@@ -0,0 +1,207 @@
/*
* Copyright 2018 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository;
import static org.assertj.core.api.Assertions.*;
import static org.springframework.data.mongodb.test.util.MongoTestUtils.*;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import org.bson.Document;
import org.bson.types.ObjectId;
import org.junit.Before;
import org.junit.ClassRule;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.junit.rules.RuleChain;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.domain.Persistable;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.MongoTransactionManager;
import org.springframework.data.mongodb.config.AbstractMongoConfiguration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;
import org.springframework.data.mongodb.test.util.AfterTransactionAssertion;
import org.springframework.data.mongodb.test.util.MongoTestUtils;
import org.springframework.data.mongodb.test.util.MongoVersionRule;
import org.springframework.data.mongodb.test.util.ReplicaSet;
import org.springframework.data.util.Version;
import org.springframework.lang.Nullable;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.context.transaction.AfterTransaction;
import org.springframework.test.context.transaction.BeforeTransaction;
import org.springframework.transaction.annotation.Transactional;
import com.mongodb.MongoClient;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
/**
* @author Christoph Strobl
* @currentRead Shadow's Edge - Brent Weeks
*/
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
@Transactional(transactionManager = "txManager")
public class PersonRepositoryTransactionalTests {

	public static @ClassRule RuleChain TEST_RULES = RuleChain.outerRule(MongoVersionRule.atLeast(Version.parse("3.7.3")))
			.around(ReplicaSet.required());

	static final String DB_NAME = "repository-tx-tests";

	@Configuration
	@EnableMongoRepositories
	static class Config extends AbstractMongoConfiguration {

		@Bean
		public MongoClient mongoClient() {
			return MongoTestUtils.replSetClient();
		}

		@Override
		protected String getDatabaseName() {
			return DB_NAME;
		}

		@Bean
		MongoTransactionManager txManager(MongoDbFactory dbFactory) {
			return new MongoTransactionManager(dbFactory);
		}
	}

	public @Rule ExpectedException expectedException = ExpectedException.none();

	@Autowired MongoClient client;
	@Autowired PersonRepository repository;
	@Autowired MongoTemplate template;

	Person durzo, kylar, vi;
	List<Person> all;

	List<AfterTransactionAssertion<? extends Persistable<?>>> assertionList;

	@Before
	public void setUp() {
		assertionList = new CopyOnWriteArrayList<>();
	}

	@BeforeTransaction
	public void beforeTransaction() {

		createOrReplaceCollection(DB_NAME, template.getCollectionName(Person.class), client);

		durzo = new Person("Durzo", "Blint", 700);
		kylar = new Person("Kylar", "Stern", 21);
		vi = new Person("Viridiana", "Sovari", 20);

		all = repository.saveAll(Arrays.asList(durzo, kylar, vi));
	}

	@AfterTransaction
	public void verifyDbState() throws InterruptedException {

		Thread.sleep(100);

		MongoCollection<Document> collection = client.getDatabase(DB_NAME) //
				.withWriteConcern(WriteConcern.MAJORITY) //
				.withReadPreference(ReadPreference.primary()) //
				.getCollection(template.getCollectionName(Person.class));

		try {
			assertionList.forEach(it -> {

				boolean isPresent = collection.find(Filters.eq("_id", new ObjectId(it.getId().toString()))).iterator()
						.hasNext();

				assertThat(isPresent) //
						.withFailMessage(String.format("After transaction entity %s should %s.", it.getPersistable(),
								it.shouldBePresent() ? "be present" : "NOT be present"))
						.isEqualTo(it.shouldBePresent());
			});
		} finally {
			assertionList.clear();
		}
	}

	@Rollback(false)
	@Test // DATAMONGO-1920
	public void shouldHonorCommitForDerivedQuery() {

		repository.removePersonByLastnameUsingAnnotatedQuery(durzo.getLastname());

		assertAfterTransaction(durzo).isNotPresent();
	}

	@Rollback(false)
	@Test // DATAMONGO-1920
	public void shouldHonorCommit() {

		Person hu = new Person("Hu", "Gibbet", 43);

		repository.save(hu);

		assertAfterTransaction(hu).isPresent();
	}

	@Test // DATAMONGO-1920
	public void shouldHonorRollback() {

		Person hu = new Person("Hu", "Gibbet", 43);

		repository.save(hu);

		assertAfterTransaction(hu).isNotPresent();
	}

	private AfterTransactionAssertion assertAfterTransaction(Person person) {

		AfterTransactionAssertion assertion = new AfterTransactionAssertion<>(new Persistable<Object>() {

			@Nullable
			@Override
			public Object getId() {
				return person.id;
			}

			@Override
			public boolean isNew() {
				return person.id != null;
			}

			@Override
			public String toString() {
				return getId() + " - " + person.toString();
			}
		});

		assertionList.add(assertion);
		return assertion;
	}
}

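The `@BeforeTransaction`/`@AfterTransaction` plumbing above boils down to a small registry of deferred presence assertions: expectations are collected while the transaction runs and only checked once it has committed or rolled back. A minimal plain-Java sketch of that pattern; the names here are hypothetical, the real test uses `AfterTransactionAssertion` from the Spring Data MongoDB test utilities:

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.Predicate;

// Plain-Java sketch of the deferred-assertion pattern (hypothetical names): expectations
// are registered during the transaction and evaluated afterwards against the datastore.
class AfterTxAssertions<T> {

	private static final class Expectation<T> {

		final T entity;
		final boolean shouldBePresent;

		Expectation(T entity, boolean shouldBePresent) {
			this.entity = entity;
			this.shouldBePresent = shouldBePresent;
		}
	}

	private final List<Expectation<T>> expectations = new CopyOnWriteArrayList<>();

	void expectPresent(T entity) {
		expectations.add(new Expectation<>(entity, true));
	}

	void expectAbsent(T entity) {
		expectations.add(new Expectation<>(entity, false));
	}

	// Called from an @AfterTransaction hook; 'isPresent' performs the datastore lookup.
	void verify(Predicate<T> isPresent) {

		try {
			for (Expectation<T> e : expectations) {
				if (isPresent.test(e.entity) != e.shouldBePresent) {
					throw new AssertionError(String.format("After transaction entity %s should %s.", e.entity,
							e.shouldBePresent ? "be present" : "NOT be present"));
				}
			}
		} finally {
			expectations.clear(); // reset for the next test, mirroring verifyDbState()
		}
	}
}
```

Deferring the check matters because, inside the transaction, reads through the transactional session would see uncommitted writes; only a post-transaction lookup proves what was actually committed.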

@@ -55,7 +55,7 @@ import org.springframework.data.mongodb.repository.Person.Sex;
import org.springframework.data.mongodb.repository.support.ReactiveMongoRepositoryFactory;
import org.springframework.data.mongodb.repository.support.SimpleReactiveMongoRepository;
import org.springframework.data.repository.Repository;
-import org.springframework.data.repository.query.DefaultEvaluationContextProvider;
+import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.util.ClassUtils;
@@ -97,7 +97,7 @@ public class ReactiveMongoRepositoryTests implements BeanClassLoaderAware, BeanF
factory.setRepositoryBaseClass(SimpleReactiveMongoRepository.class);
factory.setBeanClassLoader(classLoader);
factory.setBeanFactory(beanFactory);
-factory.setEvaluationContextProvider(DefaultEvaluationContextProvider.INSTANCE);
+factory.setEvaluationContextProvider(QueryMethodEvaluationContextProvider.DEFAULT);
repository = factory.getRepository(ReactivePersonRepository.class);
cappedRepository = factory.getRepository(ReactiveCappedCollectionRepository.class);
@@ -174,10 +174,9 @@ public class ReactiveMongoRepositoryTests implements BeanClassLoaderAware, BeanF
@Test // DATAMONGO-1444
public void shouldUseTailableCursor() throws Exception {
-StepVerifier
-.create(template.dropCollection(Capped.class) //
-.then(template.createCollection(Capped.class, //
-CollectionOptions.empty().size(1000).maxDocuments(100).capped()))) //
+StepVerifier.create(template.dropCollection(Capped.class) //
+.then(template.createCollection(Capped.class, //
+CollectionOptions.empty().size(1000).maxDocuments(100).capped()))) //
.expectNextCount(1) //
.verifyComplete();
@@ -199,10 +198,9 @@ public class ReactiveMongoRepositoryTests implements BeanClassLoaderAware, BeanF
@Test // DATAMONGO-1444
public void shouldUseTailableCursorWithProjection() throws Exception {
-StepVerifier
-.create(template.dropCollection(Capped.class) //
-.then(template.createCollection(Capped.class, //
-CollectionOptions.empty().size(1000).maxDocuments(100).capped()))) //
+StepVerifier.create(template.dropCollection(Capped.class) //
+.then(template.createCollection(Capped.class, //
+CollectionOptions.empty().size(1000).maxDocuments(100).capped()))) //
.expectNextCount(1) //
.verifyComplete();
@@ -246,9 +244,8 @@ public class ReactiveMongoRepositoryTests implements BeanClassLoaderAware, BeanF
dave.setLocation(point);
StepVerifier.create(repository.save(dave)).expectNextCount(1).verifyComplete();
-StepVerifier
-.create(repository.findByLocationWithin(new Circle(-78.99171, 45.738868, 170), //
-PageRequest.of(0, 10))) //
+StepVerifier.create(repository.findByLocationWithin(new Circle(-78.99171, 45.738868, 170), //
+PageRequest.of(0, 10))) //
.expectNext(dave) //
.verifyComplete();
}
@@ -276,10 +273,9 @@ public class ReactiveMongoRepositoryTests implements BeanClassLoaderAware, BeanF
dave.setLocation(point);
StepVerifier.create(repository.save(dave)).expectNextCount(1).verifyComplete();
-StepVerifier
-.create(repository.findByLocationNear(new Point(-73.99, 40.73), //
-new Distance(2000, Metrics.KILOMETERS), //
-PageRequest.of(0, 10))) //
+StepVerifier.create(repository.findByLocationNear(new Point(-73.99, 40.73), //
+new Distance(2000, Metrics.KILOMETERS), //
+PageRequest.of(0, 10))) //
.consumeNextWith(actual -> {
assertThat(actual.getDistance().getValue(), is(closeTo(1, 1)));
@@ -294,9 +290,8 @@ public class ReactiveMongoRepositoryTests implements BeanClassLoaderAware, BeanF
dave.setLocation(point);
StepVerifier.create(repository.save(dave)).expectNextCount(1).verifyComplete();
-StepVerifier
-.create(repository.findPersonByLocationNear(new Point(-73.99, 40.73), //
-new Distance(2000, Metrics.KILOMETERS))) //
+StepVerifier.create(repository.findPersonByLocationNear(new Point(-73.99, 40.73), //
+new Distance(2000, Metrics.KILOMETERS))) //
.expectNext(dave) //
.verifyComplete();
}


@@ -44,7 +44,7 @@ import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.data.mongodb.repository.support.ReactiveMongoRepositoryFactory;
import org.springframework.data.mongodb.repository.support.SimpleReactiveMongoRepository;
-import org.springframework.data.repository.query.DefaultEvaluationContextProvider;
+import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.util.ClassUtils;
@@ -86,7 +86,7 @@ public class SimpleReactiveMongoRepositoryTests implements BeanClassLoaderAware,
factory.setRepositoryBaseClass(SimpleReactiveMongoRepository.class);
factory.setBeanClassLoader(classLoader);
factory.setBeanFactory(beanFactory);
-factory.setEvaluationContextProvider(DefaultEvaluationContextProvider.INSTANCE);
+factory.setEvaluationContextProvider(QueryMethodEvaluationContextProvider.DEFAULT);
repository = factory.getRepository(ReactivePersonRepostitory.class);


@@ -17,8 +17,8 @@ package org.springframework.data.mongodb.repository.query;
import static org.hamcrest.Matchers.*;
import static org.junit.Assert.*;
+import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.*;
-import static org.mockito.Mockito.any;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
@@ -51,7 +51,7 @@ import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.projection.SpelAwareProxyProjectionFactory;
import org.springframework.data.repository.Repository;
import org.springframework.data.repository.core.support.DefaultRepositoryMetadata;
-import org.springframework.data.repository.query.DefaultEvaluationContextProvider;
+import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.spel.standard.SpelExpressionParser;
/**
@@ -223,7 +223,7 @@ public class ReactiveStringBasedMongoQueryUnitTests {
ReactiveMongoQueryMethod queryMethod = new ReactiveMongoQueryMethod(method,
new DefaultRepositoryMetadata(SampleRepository.class), factory, converter.getMappingContext());
return new ReactiveStringBasedMongoQuery(queryMethod, operations, PARSER,
-DefaultEvaluationContextProvider.INSTANCE);
+QueryMethodEvaluationContextProvider.DEFAULT);
}
private interface SampleRepository extends Repository<Person, Long> {


@@ -53,7 +53,7 @@ import org.springframework.data.projection.ProjectionFactory;
import org.springframework.data.projection.SpelAwareProxyProjectionFactory;
import org.springframework.data.repository.Repository;
import org.springframework.data.repository.core.support.DefaultRepositoryMetadata;
-import org.springframework.data.repository.query.DefaultEvaluationContextProvider;
+import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.spel.standard.SpelExpressionParser;
/**
@@ -565,7 +565,7 @@ public class StringBasedMongoQueryUnitTests {
ProjectionFactory factory = new SpelAwareProxyProjectionFactory();
MongoQueryMethod queryMethod = new MongoQueryMethod(method, new DefaultRepositoryMetadata(SampleRepository.class),
factory, converter.getMappingContext());
-return new StringBasedMongoQuery(queryMethod, operations, PARSER, DefaultEvaluationContextProvider.INSTANCE);
+return new StringBasedMongoQuery(queryMethod, operations, PARSER, QueryMethodEvaluationContextProvider.DEFAULT);
} catch (Exception e) {
throw new IllegalArgumentException(e.getMessage(), e);


@@ -45,6 +45,8 @@ import org.springframework.data.mongodb.repository.QPerson;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
+import com.mongodb.util.JSON;
import com.querydsl.core.types.dsl.BooleanExpression;
import com.querydsl.core.types.dsl.BooleanOperation;
import com.querydsl.core.types.dsl.PathBuilder;
import com.querydsl.core.types.dsl.SimplePath;
@@ -184,6 +186,16 @@ public class SpringDataMongodbSerializerUnitTests {
assertThat(((DBObject) mappedPredicate).get("sex"), is((Object) "f"));
}
+@Test // DATAMONGO-1943
+public void shouldRemarshallListsAndDocuments() {
+BooleanExpression criteria = QPerson.person.firstname.isNotEmpty()
+.and(QPerson.person.firstname.containsIgnoreCase("foo")).not();
+assertThat(this.serializer.handle(criteria), is(equalTo(JSON.parse("{ \"$or\" : [ { \"firstname\" : { \"$ne\" : { "
++ "\"$ne\" : \"\"}}} , { \"firstname\" : { \"$not\" : { \"$regex\" : \".*\\\\Qfoo\\\\E.*\" , \"$options\" : \"i\"}}}]}"))));
+}
class Address {
String id;
String street;

Some files were not shown because too many files have changed in this diff.