Compare commits


83 Commits

Author SHA1 Message Date
Mark Paluch
ec0fe6f994 DATAMONGO-2647 - Release version 3.0.6 (Neumann SR6). 2020-12-09 11:16:20 +01:00
Mark Paluch
d3d690b908 DATAMONGO-2647 - Prepare 3.0.6 (Neumann SR6). 2020-12-09 11:15:54 +01:00
Mark Paluch
b98b5125f1 DATAMONGO-2647 - Updated changelog. 2020-12-09 11:15:51 +01:00
Mark Paluch
d68bd4b44a DATAMONGO-2646 - Updated changelog. 2020-12-09 09:59:11 +01:00
Mark Paluch
a73ae9a1a5 DATAMONGO-2663 - Document Spring Data to MongoDB compatibility.
Original Pull Request: #895
2020-12-07 14:39:41 +01:00
Mark Paluch
ac3c578e93 DATAMONGO-2661 - Polishing.
Add ticket reference.

Original pull request: #894.
2020-11-26 11:48:07 +01:00
Yoann de Martino
9872f8cb07 DATAMONGO-2661 - Handle nullable types for KPropertyPath.
Original pull request: #894.
2020-11-26 11:47:03 +01:00
Mark Paluch
1b1ab2c495 DATAMONGO-2648 - Updated changelog. 2020-11-11 12:34:41 +01:00
Christoph Strobl
6554274dde DATAMONGO-2644 - ProjectOperation no longer errors on inclusion of default _id field.
Original pull request: #890.
2020-11-10 09:41:09 +01:00
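
To illustrate the fixed behavior, a minimal sketch (collection and field names are hypothetical) of a projection stage that explicitly lists the default _id field, which no longer triggers an error:

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;

class ProjectionExample {

    // Explicitly including "_id" in the projection used to raise an error; it is now accepted.
    AggregationResults<Document> projectWithId(MongoTemplate template) {
        Aggregation aggregation = Aggregation.newAggregation(
                Aggregation.project("_id", "firstName", "lastName"));
        return template.aggregate(aggregation, "person", Document.class);
    }
}
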
Mark Paluch
34f91acbfe DATAMONGO-2639 - Updated changelog. 2020-10-28 16:28:04 +01:00
Mark Paluch
8e7d508cfe DATAMONGO-2625 - After release cleanups. 2020-10-28 14:51:05 +01:00
Mark Paluch
04dd78cb36 DATAMONGO-2625 - Prepare next development iteration. 2020-10-28 14:51:02 +01:00
Mark Paluch
52c554cfe1 DATAMONGO-2625 - Release version 3.0.5 (Neumann SR5). 2020-10-28 14:34:40 +01:00
Mark Paluch
c1bb8c4ba5 DATAMONGO-2625 - Prepare 3.0.5 (Neumann SR5). 2020-10-28 14:34:15 +01:00
Mark Paluch
ed9a367bb8 DATAMONGO-2625 - Updated changelog. 2020-10-28 14:34:04 +01:00
Mark Paluch
b88ce46239 DATAMONGO-2624 - Updated changelog. 2020-10-28 12:15:06 +01:00
Mark Paluch
42ec94d321 DATAMONGO-2641 - Updated changelog. 2020-10-28 11:32:32 +01:00
Robin Dupret
53f35e185f DATAMONGO-2638 - Fix list item rendering in reference documentation.
Original Pull Request: #885
2020-10-27 13:33:15 +01:00
LiangYong
e08bfd253c DATAMONGO-2638 - Fix aggregation input parameter syntax in reference documentation.
Original Pull Request: #881
2020-10-27 13:33:11 +01:00
Mark Paluch
caaafa275d DATAMONGO-2643 - Adapt to AssertJ API changes. 2020-10-26 10:59:30 +01:00
Christoph Strobl
1f082abc7f DATAMONGO-2626 - Updated changelog. 2020-10-14 14:51:55 +02:00
Mark Paluch
df77fcc19d DATAMONGO-2616 - Polishing.
Reformat code. Merge if-statements.

Original pull request: #889.
2020-10-07 11:35:57 +02:00
Christoph Strobl
007b965673 DATAMONGO-2616 - Short circuit id value assignment in MongoConverter.
Original pull request: #889.
2020-10-07 11:35:57 +02:00
Christoph Strobl
1b9680cece DATAMONGO-2633 - Fix json parsing of nested arrays in ParameterBindingDocumentCodec.
Original pull request: #888.
2020-10-05 15:35:07 +02:00
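
As an illustration of the construct this fix concerns, a hedged sketch of a String-based repository query whose document nests arrays (the Place domain type and field names are hypothetical):

import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

interface PlaceRepository extends MongoRepository<Place, String> {

    // Place is a hypothetical mapped domain type. The $box operator takes a
    // nested array, which the ParameterBindingDocumentCodec has to parse.
    @Query("{ 'location' : { '$geoWithin' : { '$box' : [ [ ?0, ?1 ], [ ?2, ?3 ] ] } } }")
    List<Place> findWithinBox(double lowerLeftX, double lowerLeftY, double upperRightX, double upperRightY);
}
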
Mark Paluch
13f1d21919 DATAMONGO-2608 - Updated changelog. 2020-09-16 14:12:10 +02:00
Mark Paluch
4c10bf30bc DATAMONGO-2609 - After release cleanups. 2020-09-16 12:15:45 +02:00
Mark Paluch
8f78d0e0d8 DATAMONGO-2609 - Prepare next development iteration. 2020-09-16 12:15:40 +02:00
Mark Paluch
56115a263c DATAMONGO-2609 - Release version 3.0.4 (Neumann SR4). 2020-09-16 11:43:14 +02:00
Mark Paluch
66b809318a DATAMONGO-2609 - Prepare 3.0.4 (Neumann SR4). 2020-09-16 11:42:46 +02:00
Mark Paluch
bfff60d915 DATAMONGO-2609 - Updated changelog. 2020-09-16 11:42:29 +02:00
Mark Paluch
051e973226 DATAMONGO-2593 - Updated changelog. 2020-09-16 11:20:13 +02:00
Mark Paluch
eb61629f10 DATAMONGO-2592 - Updated changelog. 2020-09-16 10:39:02 +02:00
Christoph Strobl
85783e5354 DATAMONGO-2618 - Fix visibility of ReplaceRootDocumentOperation. 2020-09-14 13:44:36 +02:00
Michal Kurcius
433b012b91 DATAMONGO-2613 - Fix single element ArrayJsonSchemaObject to document mapping.
Now toDocument calls toDocument on items correctly.

Original Pull Request: #883
2020-08-20 09:05:58 +02:00
Mark Paluch
8dca0049ca DATAMONGO-2594 - After release cleanups. 2020-08-12 13:19:58 +02:00
Mark Paluch
635f3b82be DATAMONGO-2594 - Prepare next development iteration. 2020-08-12 13:19:55 +02:00
Mark Paluch
e69c7e1134 DATAMONGO-2594 - Release version 3.0.3 (Neumann SR3). 2020-08-12 13:07:46 +02:00
Mark Paluch
1f94e74b75 DATAMONGO-2594 - Prepare 3.0.3 (Neumann SR3). 2020-08-12 13:07:20 +02:00
Mark Paluch
98858e0f5f DATAMONGO-2594 - Updated changelog. 2020-08-12 13:07:08 +02:00
Mark Paluch
90f311de51 DATAMONGO-2579 - Updated changelog. 2020-08-12 12:01:26 +02:00
Mark Paluch
7f7015fd86 DATAMONGO-2601 - Suppress results for suspended query methods returning kotlin.Unit.
We now discard results for suspended query methods if the return type is kotlin.Unit.

Related ticket: DATACMNS-1779
2020-07-31 11:43:59 +02:00
Mark Paluch
ab9c5d73a0 DATAMONGO-2599 - Eagerly consider enum types as simple types.
MongoSimpleTypes now eagerly checks if a type is a simple one to avoid PersistentEntity registration for ChronoUnit.
2020-07-30 16:24:32 +02:00
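
A small sketch of the scenario this addresses (the Schedule class is hypothetical): an entity holding a ChronoUnit value, which is now detected as a simple type up front and stored as its enum name rather than being registered as a persistent entity:

import java.time.temporal.ChronoUnit;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document("schedules")
class Schedule {

    @Id String id;

    // Stored as the enum name ("DAYS"); no PersistentEntity is registered for ChronoUnit.
    ChronoUnit interval = ChronoUnit.DAYS;
}
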
Mark Paluch
13caa162db DATAMONGO-2598 - Polishing.
Original pull request: #872.
2020-07-28 15:22:02 +02:00
Jay Bryant
d5d620d777 DATAMONGO-2598 - Wording changes.
Removed the language of oppression and violence
and replaced it with more neutral language.

Note that problematic words in the code have
to remain in the docs until the code changes.

Original pull request: #872.
2020-07-28 15:22:01 +02:00
Mark Paluch
21d50f2a72 DATAMONGO-2568 - After release cleanups. 2020-07-22 10:37:08 +02:00
Mark Paluch
4912d62be6 DATAMONGO-2568 - Prepare next development iteration. 2020-07-22 10:37:05 +02:00
Mark Paluch
2bf6f226d6 DATAMONGO-2568 - Release version 3.0.2 (Neumann SR2). 2020-07-22 10:21:08 +02:00
Mark Paluch
1a6f7e371a DATAMONGO-2568 - Prepare 3.0.2 (Neumann SR2). 2020-07-22 10:20:44 +02:00
Mark Paluch
2ef5d795ce DATAMONGO-2568 - Updated changelog. 2020-07-22 10:20:31 +02:00
Mark Paluch
e3d2f16202 DATAMONGO-2567 - Updated changelog. 2020-07-22 10:08:49 +02:00
Mark Paluch
c44232ff39 DATAMONGO-2566 - Updated changelog. 2020-07-22 09:44:34 +02:00
Mark Paluch
385e911708 DATAMONGO-2589 - Upgrade to MongoDB Driver 4.0.5. 2020-07-17 10:49:56 +02:00
Mark Paluch
171d8b2b1e DATAMONGO-2536 - Polishing.
Encapsulate skipResults in AggregationOptions. Reformat code. Add override Javadoc.

Original pull request: #876.
2020-07-16 09:42:53 +02:00
Christoph Strobl
7ac7eefad6 DATAMONGO-2536 - Add option to skip reading aggregation result.
Introduce dedicated AggregationPipeline to encapsulate pipeline stages.

Original pull request: #876.
2020-07-16 09:42:53 +02:00
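
A hedged sketch of how the new option might be used with an $out stage. It assumes the flag is exposed on the AggregationOptions builder as skipOutput(); that method name is an assumption to verify against the released API:

import org.bson.Document;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.query.Criteria;

class SkipResultsExample {

    // Writes matching documents to another collection via $out and skips
    // reading the aggregation result back. skipOutput() is an assumed name
    // for the option added by DATAMONGO-2536.
    void archive(MongoTemplate template) {
        Aggregation aggregation = Aggregation.newAggregation(
                Aggregation.match(Criteria.where("archived").is(true)),
                Aggregation.out("archivedOrders"))
                .withOptions(AggregationOptions.builder().skipOutput().build());

        template.aggregate(aggregation, "orders", Document.class);
    }
}
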
Mark Paluch
7a39e94e4b DATAMONGO-2571 - Polishing.
Reduce test method visibility for JUnit 5.

Original pull request: #873.
2020-07-15 15:33:48 +02:00
Christoph Strobl
25733664b3 DATAMONGO-2571 - Fix regular expression parameter binding for String-based queries.
Original pull request: #873.
2020-07-15 15:33:48 +02:00
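
To show the affected query style, a minimal sketch of a String-based query binding a parameter into a $regex expression (the Person domain type is hypothetical):

import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.data.mongodb.repository.Query;

interface PersonRepository extends MongoRepository<Person, String> {

    // Person is a hypothetical mapped domain type; ?0 is bound into the $regex value.
    @Query("{ 'lastname' : { '$regex' : ?0, '$options' : 'i' } }")
    List<Person> findByLastnameRegex(String pattern);
}
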
Mark Paluch
81da10f499 DATAMONGO-2490 - Polishing.
Remove unnecessary code. Reuse session-associated collection when logging to avoid unqualified calls to MongoDbFactory.getMongoDatabase(). Create collection before transaction in test for compatibility with older MongoDB servers.

Original pull request: #875.
2020-07-15 15:14:08 +02:00
Christoph Strobl
bde114ed19 DATAMONGO-2490 - Fix dbref fetching during ongoing transaction.
Original pull request: #875.
2020-07-15 15:14:08 +02:00
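
A short sketch of the scenario the fix targets (Order, Customer, and their fields are hypothetical): resolving a @DBRef while a MongoDB transaction is ongoing:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.transaction.annotation.Transactional;

@Document("customers")
class Customer {
    @Id String id;
    String name;
}

@Document("orders")
class Order {
    @Id String id;
    @DBRef Customer customer; // resolved via DBRef fetching
}

class OrderService {

    private final MongoTemplate template;

    OrderService(MongoTemplate template) {
        this.template = template;
    }

    // Requires a configured MongoTransactionManager; the referenced Customer
    // is now fetched correctly while the transaction is in progress.
    @Transactional
    public String customerNameFor(String orderId) {
        Order order = template.findById(orderId, Order.class);
        return order.customer.name;
    }
}
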
Mark Paluch
a2403f58ec DATAMONGO-2544 - Updated changelog. 2020-06-25 12:00:24 +02:00
Mark Paluch
875b8eda9f DATAMONGO-2570 - Polishing.
Add assertions. Use method references where possible.

Original pull request: #871.
2020-06-22 10:40:04 +02:00
Mehran Behnam
fc0a021937 DATAMONGO-2570 - Change type of count variable to primitive.
Avoiding unintentional unboxing.

Original pull request: #871.
2020-06-22 10:40:04 +02:00
Mark Paluch
a6aa174ff5 DATAMONGO-2572 - Remove usage of Oppressive Language.
Replaced blacklist with denylist and introduced the meta keyword SECONDARY_READS as we no longer use the MongoDB API with the initial replication concept.

Original Pull Request: #870
2020-06-17 13:32:40 +02:00
Mark Paluch
8b36617752 DATAMONGO-2543 - After release cleanups. 2020-06-10 14:29:28 +02:00
Mark Paluch
56e8799c22 DATAMONGO-2543 - Prepare next development iteration. 2020-06-10 14:29:25 +02:00
Mark Paluch
88e60070d6 DATAMONGO-2543 - Release version 3.0.1 (Neumann SR1). 2020-06-10 14:02:26 +02:00
Mark Paluch
6c7039580f DATAMONGO-2543 - Prepare 3.0.1 (Neumann SR1). 2020-06-10 14:01:58 +02:00
Mark Paluch
fba003f215 DATAMONGO-2543 - Updated changelog. 2020-06-10 14:01:47 +02:00
Mark Paluch
1a3239554c DATAMONGO-2533 - Updated changelog. 2020-06-10 12:30:00 +02:00
Mark Paluch
37b541931d DATAMONGO-2532 - Updated changelog. 2020-06-10 11:48:56 +02:00
Mark Paluch
9038280f68 DATAMONGO-2565 - Polishing.
Add unit test to verify behavior. Cleanup code.

Original pull request: #869.
2020-06-10 10:15:12 +02:00
BraveLeeLee
75935a2bdb DATAMONGO-2565 - Evaluate correct expression when obtaining collation from MongoPersistentEntity.
Original pull request: #869.
2020-06-10 10:14:34 +02:00
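
For context, a brief sketch of where that collation comes from (the Person entity is hypothetical): the collation attribute on @Document, whose value is the expression the fix now evaluates correctly when read from the MongoPersistentEntity:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

// The collation declared here is obtained from the entity metadata, e.g. when
// creating the collection or deriving queries.
@Document(collection = "people", collation = "en_US")
class Person {

    @Id String id;
    String lastname;
}
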
Mark Paluch
d9ca3d7eb3 DATAMONGO-2562 - Polishing.
Fix typo in exception message.
2020-06-09 11:18:33 +02:00
Mark Paluch
04e77ad5ab DATAMONGO-2562 - Fix return type detection for suspended Kotlin methods.
See DATACMNS-1738 for further reference.
2020-06-09 11:18:33 +02:00
Mark Paluch
0ab39a17a7 DATAMONGO-2560 - Upgrade MongoDB drivers to 4.0.4. 2020-06-08 15:59:02 +02:00
Mark Paluch
49a6f13797 DATAMONGO-2542 - Polishing.
Fix nullable annotation.

Original pull request: #863.
2020-05-26 10:32:39 +02:00
Christoph Strobl
b0fd6f691b DATAMONGO-2542 - Shortcut PersistentPropertyPath resolution during query mapping.
By shortcutting the path resolution we avoid checking keywords like $in against a potential path expression.

Original pull request: #863.
2020-05-26 10:32:39 +02:00
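
A brief sketch of a query whose mapped document contains the $in keyword (the Account type is hypothetical); with the shortcut, such keywords are no longer checked against a potential property path:

import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

import java.util.List;

import org.springframework.data.mongodb.core.MongoTemplate;

class StatusQueries {

    // Account is a hypothetical mapped domain type; the mapped query document
    // contains { status: { $in: [...] } }.
    List<Account> findActiveOrBlocked(MongoTemplate template) {
        return template.find(query(where("status").in("ACTIVE", "BLOCKED")), Account.class);
    }
}
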
Mark Paluch
b5778772d9 DATAMONGO-2545 - Polishing.
Fix warnings and typos.

Original pull request: #864.
2020-05-26 10:14:27 +02:00
Christoph Strobl
0f55fb305d DATAMONGO-2545 - Fix full Query Document binding resulting from SpEL.
We reenabled annotated queries using a SpEL expression resulting in the actual query document.

Original pull request: #864.
2020-05-26 10:14:27 +02:00
Christoph Strobl
5ae7547465 DATAMONGO-2545 - Fix regression in String query SpEL parameter binding.
We reenabled parameter binding within SpEL using query parameter placeholders ?0, ?1,... instead of their array index [0],[1],...

Original pull request: #864.
2020-05-26 10:14:27 +02:00
Christoph Strobl
cf4e04a30e DATAMONGO-2547 - Use target class ClassLoader instead of default CL when creating proxy instances.
Original pull request: #865.
2020-05-26 08:56:09 +02:00
Mark Paluch
89c1dc77d9 DATAMONGO-2553 - Fix reference documentation links.
Remove links to removed documentations sections.
2020-05-19 11:36:44 +02:00
Mark Paluch
a2c842b59b DATAMONGO-2534 - After release cleanups. 2020-05-12 12:56:12 +02:00
Mark Paluch
0cd0be9478 DATAMONGO-2534 - Prepare next development iteration. 2020-05-12 12:40:53 +02:00
152 changed files with 1408 additions and 5214 deletions

CODE_OF_CONDUCT.adoc (new file, 27 lines)

@@ -0,0 +1,27 @@
= Contributor Code of Conduct
As contributors and maintainers of this project, and in the interest of fostering an open and welcoming community, we pledge to respect all people who contribute through reporting issues, posting feature requests, updating documentation, submitting pull requests or patches, and other activities.
We are committed to making participation in this project a harassment-free experience for everyone, regardless of level of experience, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, or nationality.
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery
* Personal attacks
* Trolling or insulting/derogatory comments
* Public or private harassment
* Publishing others' private information, such as physical or electronic addresses,
without explicit permission
* Other unethical or unprofessional conduct
Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.
By adopting this Code of Conduct, project maintainers commit themselves to fairly and consistently applying these principles to every aspect of managing this project. Project maintainers who do not follow or enforce the Code of Conduct may be permanently removed from the project team.
This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community.
Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting a project maintainer at spring-code-of-conduct@pivotal.io.
All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances.
Maintainers are obligated to maintain confidentiality with regard to the reporter of an incident.
This Code of Conduct is adapted from the https://contributor-covenant.org[Contributor Covenant], version 1.3.0, available at https://contributor-covenant.org/version/1/3/0/[contributor-covenant.org/version/1/3/0/].

Jenkinsfile (12 lines changed)

@@ -3,7 +3,7 @@ pipeline {
triggers {
pollSCM 'H/10 * * * *'
upstream(upstreamProjects: "spring-data-commons/master", threshold: hudson.model.Result.SUCCESS)
upstream(upstreamProjects: "spring-data-commons/2.3.x", threshold: hudson.model.Result.SUCCESS)
}
options {
@@ -68,7 +68,7 @@ pipeline {
stage("test: baseline (jdk8)") {
when {
anyOf {
branch 'master'
branch '3.0.x'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -93,8 +93,8 @@ pipeline {
stage("Test other configurations") {
when {
allOf {
branch 'master'
anyOf {
branch '3.0.x'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -164,7 +164,7 @@ pipeline {
stage('Release to artifactory') {
when {
anyOf {
branch 'master'
branch '3.0.x'
not { triggeredBy 'UpstreamCause' }
}
}
@@ -196,7 +196,7 @@ pipeline {
stage('Publish documentation') {
when {
branch 'master'
branch '3.0.x'
}
agent {
docker {

View File

@@ -10,7 +10,7 @@ Key functional areas of Spring Data MongoDB are a POJO centric model for interac
== Code of Conduct
This project is governed by the https://github.com/spring-projects/.github/blob/e3cc2ff230d8f1dca06535aa6b5a4a23815861d4/CODE_OF_CONDUCT.md[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
This project is governed by the link:CODE_OF_CONDUCT.adoc[Spring Code of Conduct]. By participating, you are expected to uphold this code of conduct. Please report unacceptable behavior to spring-code-of-conduct@pivotal.io.
== Getting Started

pom.xml (12 lines changed)

@@ -5,7 +5,7 @@
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.0-STATIC-METADATA-SNAPSHOT</version>
<version>3.0.6.RELEASE</version>
<packaging>pom</packaging>
<name>Spring Data MongoDB</name>
@@ -15,7 +15,7 @@
<parent>
<groupId>org.springframework.data.build</groupId>
<artifactId>spring-data-parent</artifactId>
<version>2.4.0-SNAPSHOT</version>
<version>2.3.6.RELEASE</version>
</parent>
<modules>
@@ -26,8 +26,8 @@
<properties>
<project.type>multi</project.type>
<dist.id>spring-data-mongodb</dist.id>
<springdata.commons>2.4.0-BUILD-TIME-DOMAIN-TYPE-METADATA-SNAPSHOT</springdata.commons>
<mongo>4.1.0</mongo>
<springdata.commons>2.3.6.RELEASE</springdata.commons>
<mongo>4.0.5</mongo>
<mongo.reactivestreams>${mongo}</mongo.reactivestreams>
<jmh.version>1.19</jmh.version>
</properties>
@@ -134,8 +134,8 @@
<repositories>
<repository>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
<id>spring-libs-release</id>
<url>https://repo.spring.io/libs-release</url>
</repository>
<repository>
<id>sonatype-libs-snapshot</id>

View File

@@ -7,7 +7,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.0-STATIC-METADATA-SNAPSHOT</version>
<version>3.0.6.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -14,7 +14,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.0-STATIC-METADATA-SNAPSHOT</version>
<version>3.0.6.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>

View File

@@ -11,7 +11,7 @@
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>3.1.0-STATIC-METADATA-SNAPSHOT</version>
<version>3.0.6.RELEASE</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -136,13 +136,6 @@
<optional>true</optional>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava3</groupId>
<artifactId>rxjava</artifactId>
<version>${rxjava3}</version>
<optional>true</optional>
</dependency>
<!-- CDI -->
<!-- Dependency order required to build against CDI 1.0 and test with CDI 2.0 -->
<dependency>
@@ -199,14 +192,7 @@
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>5.4.3.Final</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.glassfish</groupId>
<artifactId>javax.el</artifactId>
<version>3.0.1-b11</version>
<version>5.2.4.Final</version>
<scope>test</scope>
</dependency>

View File

@@ -61,8 +61,8 @@ public @interface EnableMongoAuditing {
boolean modifyOnCreate() default true;
/**
* Configures a {@link DateTimeProvider} bean name that allows customizing the timestamp to be used for setting
* creation and modification dates.
* Configures a {@link DateTimeProvider} bean name that allows customizing the {@link org.joda.time.DateTime} to be
* used for setting creation and modification dates.
*
* @return empty {@link String} by default.
*/
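
A minimal configuration sketch showing how such a DateTimeProvider bean can be referenced through dateTimeProviderRef (the bean name is arbitrary):

import java.time.Instant;
import java.util.Optional;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.auditing.DateTimeProvider;
import org.springframework.data.mongodb.config.EnableMongoAuditing;

@Configuration
@EnableMongoAuditing(dateTimeProviderRef = "auditingDateTimeProvider")
class AuditingConfig {

    // Supplies the timestamp used for creation and modification dates.
    @Bean
    DateTimeProvider auditingDateTimeProvider() {
        return () -> Optional.of(Instant.now());
    }
}
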

View File

@@ -1,70 +0,0 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.context.annotation.Import;
import org.springframework.data.auditing.DateTimeProvider;
import org.springframework.data.domain.ReactiveAuditorAware;
/**
* Annotation to enable auditing in MongoDB using reactive infrastructure via annotation configuration.
*
* @author Mark Paluch
* @since 3.1
*/
@Inherited
@Documented
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Import(ReactiveMongoAuditingRegistrar.class)
public @interface EnableReactiveMongoAuditing {
/**
* Configures the {@link ReactiveAuditorAware} bean to be used to lookup the current principal.
*
* @return empty {@link String} by default.
*/
String auditorAwareRef() default "";
/**
* Configures whether the creation and modification dates are set. Defaults to {@literal true}.
*
* @return {@literal true} by default.
*/
boolean setDates() default true;
/**
* Configures whether the entity shall be marked as modified on creation. Defaults to {@literal true}.
*
* @return {@literal true} by default.
*/
boolean modifyOnCreate() default true;
/**
* Configures a {@link DateTimeProvider} bean name that allows customizing the timestamp to be used for setting
* creation and modification dates.
*
* @return empty {@link String} by default.
*/
String dateTimeProviderRef() default "";
}

View File

@@ -17,6 +17,7 @@ package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
@@ -27,8 +28,14 @@ import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.mapping.event.AuditingEntityCallback;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAuditingEntityCallback;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
/**
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableMongoAuditing} annotation.
@@ -39,6 +46,9 @@ import org.springframework.util.Assert;
*/
class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
private static boolean PROJECT_REACTOR_AVAILABLE = ClassUtils.isPresent("reactor.core.publisher.Mono",
MongoAuditingRegistrar.class.getClassLoader());
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
@@ -81,7 +91,7 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(IsNewAwareAuditingHandler.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(PersistentEntitiesFactoryBean.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(MongoMappingContextLookup.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
@@ -106,6 +116,68 @@ class MongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
registerInfrastructureBeanWithId(listenerBeanDefinitionBuilder.getBeanDefinition(),
AuditingEntityCallback.class.getName(), registry);
if (PROJECT_REACTOR_AVAILABLE) {
registerReactiveAuditingEntityCallback(registry, auditingHandlerDefinition.getSource());
}
}
private void registerReactiveAuditingEntityCallback(BeanDefinitionRegistry registry, Object source) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveAuditingEntityCallback.class);
builder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
builder.getRawBeanDefinition().setSource(source);
registerInfrastructureBeanWithId(builder.getBeanDefinition(), ReactiveAuditingEntityCallback.class.getName(),
registry);
}
/**
* Simple helper to be able to wire the {@link MappingContext} from a {@link MappingMongoConverter} bean available in
* the application context.
*
* @author Oliver Gierke
*/
static class MongoMappingContextLookup
implements FactoryBean<MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty>> {
private final MappingMongoConverter converter;
/**
* Creates a new {@link MongoMappingContextLookup} for the given {@link MappingMongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public MongoMappingContextLookup(MappingMongoConverter converter) {
this.converter = converter;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getObject() throws Exception {
return converter.getMappingContext();
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return MappingContext.class;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#isSingleton()
*/
@Override
public boolean isSingleton() {
return true;
}
}
}

View File

@@ -1,61 +0,0 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.FactoryBean;
import org.springframework.data.mapping.context.PersistentEntities;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
/**
* Simple helper to be able to wire the {@link PersistentEntities} from a {@link MappingMongoConverter} bean available
* in the application context.
*
* @author Oliver Gierke
* @author Mark Paluch
* @author Christoph Strobl
* @since 3.1
*/
class PersistentEntitiesFactoryBean implements FactoryBean<PersistentEntities> {
private final MappingMongoConverter converter;
/**
* Creates a new {@link PersistentEntitiesFactoryBean} for the given {@link MappingMongoConverter}.
*
* @param converter must not be {@literal null}.
*/
public PersistentEntitiesFactoryBean(MappingMongoConverter converter) {
this.converter = converter;
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObject()
*/
@Override
public PersistentEntities getObject() {
return PersistentEntities.of(converter.getMappingContext());
}
/*
* (non-Javadoc)
* @see org.springframework.beans.factory.FactoryBean#getObjectType()
*/
@Override
public Class<?> getObjectType() {
return PersistentEntities.class;
}
}

View File

@@ -1,97 +0,0 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.config;
import java.lang.annotation.Annotation;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.context.annotation.ImportBeanDefinitionRegistrar;
import org.springframework.core.type.AnnotationMetadata;
import org.springframework.data.auditing.ReactiveIsNewAwareAuditingHandler;
import org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport;
import org.springframework.data.auditing.config.AuditingConfiguration;
import org.springframework.data.config.ParsingUtils;
import org.springframework.data.mongodb.core.mapping.event.ReactiveAuditingEntityCallback;
import org.springframework.util.Assert;
/**
* {@link ImportBeanDefinitionRegistrar} to enable {@link EnableReactiveMongoAuditing} annotation.
*
* @author Mark Paluch
* @since 3.1
*/
class ReactiveMongoAuditingRegistrar extends AuditingBeanDefinitionRegistrarSupport {
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAnnotation()
*/
@Override
protected Class<? extends Annotation> getAnnotation() {
return EnableReactiveMongoAuditing.class;
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditingHandlerBeanName()
*/
@Override
protected String getAuditingHandlerBeanName() {
return "reactiveMongoAuditingHandler";
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#getAuditHandlerBeanDefinitionBuilder(org.springframework.data.auditing.config.AuditingConfiguration)
*/
@Override
protected BeanDefinitionBuilder getAuditHandlerBeanDefinitionBuilder(AuditingConfiguration configuration) {
Assert.notNull(configuration, "AuditingConfiguration must not be null!");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveIsNewAwareAuditingHandler.class);
BeanDefinitionBuilder definition = BeanDefinitionBuilder.genericBeanDefinition(PersistentEntitiesFactoryBean.class);
definition.setAutowireMode(AbstractBeanDefinition.AUTOWIRE_CONSTRUCTOR);
builder.addConstructorArgValue(definition.getBeanDefinition());
return configureDefaultAuditHandlerAttributes(configuration, builder);
}
/*
* (non-Javadoc)
* @see org.springframework.data.auditing.config.AuditingBeanDefinitionRegistrarSupport#registerAuditListener(org.springframework.beans.factory.config.BeanDefinition, org.springframework.beans.factory.support.BeanDefinitionRegistry)
*/
@Override
protected void registerAuditListenerBeanDefinition(BeanDefinition auditingHandlerDefinition,
BeanDefinitionRegistry registry) {
Assert.notNull(auditingHandlerDefinition, "BeanDefinition must not be null!");
Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(ReactiveAuditingEntityCallback.class);
builder.addConstructorArgValue(ParsingUtils.getObjectFactoryBeanDefinition(getAuditingHandlerBeanName(), registry));
builder.getRawBeanDefinition().setSource(auditingHandlerDefinition.getSource());
registerInfrastructureBeanWithId(builder.getBeanDefinition(), ReactiveAuditingEntityCallback.class.getName(),
registry);
}
}

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AllArgsConstructor;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
@@ -28,7 +30,6 @@ import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.CountOperation;
import org.springframework.data.mongodb.core.aggregation.RelaxedTypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypeBasedAggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.QueryMapper;
@@ -48,18 +49,12 @@ import org.springframework.util.ObjectUtils;
* @author Mark Paluch
* @since 2.1
*/
@AllArgsConstructor
class AggregationUtil {
QueryMapper queryMapper;
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
AggregationUtil(QueryMapper queryMapper,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this.queryMapper = queryMapper;
this.mappingContext = mappingContext;
}
/**
* Prepare the {@link AggregationOperationContext} for a given aggregation by either returning the context itself if it
* is not {@literal null}, creating a {@link TypeBasedAggregationOperationContext} if the aggregation contains type
@@ -76,17 +71,12 @@ class AggregationUtil {
return context;
}
if (!(aggregation instanceof TypedAggregation)) {
return Aggregation.DEFAULT_CONTEXT;
if (aggregation instanceof TypedAggregation) {
return new TypeBasedAggregationOperationContext(((TypedAggregation) aggregation).getInputType(), mappingContext,
queryMapper);
}
Class<?> inputType = ((TypedAggregation) aggregation).getInputType();
if (aggregation.getPipeline().containsUnionWith()) {
return new RelaxedTypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
}
return new TypeBasedAggregationOperationContext(inputType, mappingContext, queryMapper);
return Aggregation.DEFAULT_CONTEXT;
}
/**

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.time.Instant;
import java.util.concurrent.atomic.AtomicReferenceFieldUpdater;
@@ -25,7 +27,6 @@ import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.messaging.Message;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import com.mongodb.client.model.changestream.OperationType;
@@ -38,6 +39,7 @@ import com.mongodb.client.model.changestream.OperationType;
* @author Mark Paluch
* @since 2.1
*/
@EqualsAndHashCode
public class ChangeStreamEvent<T> {
@SuppressWarnings("rawtypes") //
@@ -185,8 +187,8 @@ public class ChangeStreamEvent<T> {
return CONVERTED_UPDATER.compareAndSet(this, null, result) ? result : CONVERTED_UPDATER.get(this);
}
throw new IllegalArgumentException(
String.format("No converter found capable of converting %s to %s", fullDocument.getClass(), targetType));
throw new IllegalArgumentException(String.format("No converter found capable of converting %s to %s",
fullDocument.getClass(), targetType));
}
/*
@@ -197,27 +199,4 @@ public class ChangeStreamEvent<T> {
public String toString() {
return "ChangeStreamEvent {" + "raw=" + raw + ", targetType=" + targetType + '}';
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ChangeStreamEvent<?> that = (ChangeStreamEvent<?>) o;
if (!ObjectUtils.nullSafeEquals(this.raw, that.raw)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.targetType, that.targetType);
}
@Override
public int hashCode() {
int result = raw != null ? raw.hashCode() : 0;
result = 31 * result + ObjectUtils.nullSafeHashCode(targetType);
return result;
}
}

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import lombok.EqualsAndHashCode;
import java.time.Instant;
import java.util.Arrays;
import java.util.Optional;
@@ -23,6 +25,7 @@ import org.bson.BsonDocument;
import org.bson.BsonTimestamp;
import org.bson.BsonValue;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.lang.Nullable;
@@ -42,6 +45,7 @@ import com.mongodb.client.model.changestream.FullDocument;
* @author Mark Paluch
* @since 2.1
*/
@EqualsAndHashCode
public class ChangeStreamOptions {
private @Nullable Object filter;
@@ -152,44 +156,6 @@ public class ChangeStreamOptions {
+ ObjectUtils.nullSafeClassName(timestamp));
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ChangeStreamOptions that = (ChangeStreamOptions) o;
if (!ObjectUtils.nullSafeEquals(this.filter, that.filter)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.resumeToken, that.resumeToken)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.fullDocumentLookup, that.fullDocumentLookup)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.collation, that.collation)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.resumeTimestamp, that.resumeTimestamp)) {
return false;
}
return resume == that.resume;
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(filter);
result = 31 * result + ObjectUtils.nullSafeHashCode(resumeToken);
result = 31 * result + ObjectUtils.nullSafeHashCode(fullDocumentLookup);
result = 31 * result + ObjectUtils.nullSafeHashCode(collation);
result = 31 * result + ObjectUtils.nullSafeHashCode(resumeTimestamp);
result = 31 * result + ObjectUtils.nullSafeHashCode(resume);
return result;
}
/**
* @author Christoph Strobl
* @since 2.2
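
For orientation, a hedged sketch of building ChangeStreamOptions from the fields shown above; the builder method names (fullDocumentLookup, resumeAt) are assumptions to verify against the API:

import java.time.Instant;

import org.springframework.data.mongodb.core.ChangeStreamOptions;

import com.mongodb.client.model.changestream.FullDocument;

class ChangeStreamOptionsExample {

    // Requests full-document lookup on updates and resumes from a timestamp
    // one minute in the past. Method names are assumptions based on the
    // fields of ChangeStreamOptions shown in the diff above.
    ChangeStreamOptions recentWithFullDocuments() {
        return ChangeStreamOptions.builder()
                .fullDocumentLookup(FullDocument.UPDATE_LOOKUP)
                .resumeAt(Instant.now().minusSeconds(60))
                .build();
    }
}
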

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import lombok.RequiredArgsConstructor;
import java.util.Optional;
import org.springframework.data.mongodb.core.query.Collation;
@@ -310,6 +312,7 @@ public class CollectionOptions {
* @author Andreas Zink
* @since 2.1
*/
@RequiredArgsConstructor
public static class ValidationOptions {
private static final ValidationOptions NONE = new ValidationOptions(null, null, null);
@@ -318,13 +321,6 @@ public class CollectionOptions {
private final @Nullable ValidationLevel validationLevel;
private final @Nullable ValidationAction validationAction;
public ValidationOptions(Validator validator, ValidationLevel validationLevel, ValidationAction validationAction) {
this.validator = validator;
this.validationLevel = validationLevel;
this.validationAction = validationAction;
}
/**
* Create an empty {@link ValidationOptions}.
*

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.Value;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
@@ -24,9 +27,8 @@ import java.util.stream.Collectors;
import org.bson.Document;
import org.bson.conversions.Bson;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
import org.springframework.data.mapping.callback.EntityCallbacks;
import org.springframework.data.mongodb.BulkOperationException;
import org.springframework.data.mongodb.core.convert.QueryMapper;
import org.springframework.data.mongodb.core.convert.UpdateMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
@@ -45,9 +47,7 @@ import org.springframework.data.mongodb.core.query.UpdateDefinition.ArrayFilter;
import org.springframework.data.util.Pair;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.MongoBulkWriteException;
import com.mongodb.WriteConcern;
import com.mongodb.bulk.BulkWriteResult;
import com.mongodb.client.MongoCollection;
@@ -64,7 +64,6 @@ import com.mongodb.client.model.*;
* @author Jens Schauder
* @author Michail Nikolaev
* @author Roman Puchkovskiy
* @author Jacob Botuck
* @since 1.9
*/
class DefaultBulkOperations implements BulkOperations {
@@ -74,6 +73,7 @@ class DefaultBulkOperations implements BulkOperations {
private final BulkOperationContext bulkOperationContext;
private final List<SourceAwareWriteModelHolder> models = new ArrayList<>();
private PersistenceExceptionTranslator exceptionTranslator;
private @Nullable WriteConcern defaultWriteConcern;
private BulkWriteOptions bulkOptions;
@@ -97,9 +97,19 @@ class DefaultBulkOperations implements BulkOperations {
this.mongoOperations = mongoOperations;
this.collectionName = collectionName;
this.bulkOperationContext = bulkOperationContext;
this.exceptionTranslator = new MongoExceptionTranslator();
this.bulkOptions = getBulkWriteOptions(bulkOperationContext.getBulkMode());
}
/**
* Configures the {@link PersistenceExceptionTranslator} to be used. Defaults to {@link MongoExceptionTranslator}.
*
* @param exceptionTranslator can be {@literal null}.
*/
public void setExceptionTranslator(@Nullable PersistenceExceptionTranslator exceptionTranslator) {
this.exceptionTranslator = exceptionTranslator == null ? new MongoExceptionTranslator() : exceptionTranslator;
}
/**
* Configures the default {@link WriteConcern} to be used. Defaults to {@literal null}.
*
@@ -306,26 +316,11 @@ class DefaultBulkOperations implements BulkOperations {
collection = collection.withWriteConcern(defaultWriteConcern);
}
try {
return collection.bulkWrite( //
models.stream() //
.map(this::extractAndMapWriteModel) //
.collect(Collectors.toList()), //
bulkOptions);
} catch (RuntimeException ex) {
if (ex instanceof MongoBulkWriteException) {
MongoBulkWriteException mongoBulkWriteException = (MongoBulkWriteException) ex;
if (mongoBulkWriteException.getWriteConcernError() != null) {
throw new DataIntegrityViolationException(ex.getMessage(), ex);
}
throw new BulkOperationException(ex.getMessage(), mongoBulkWriteException);
}
throw ex;
}
return collection.bulkWrite( //
models.stream() //
.map(this::extractAndMapWriteModel) //
.collect(Collectors.toList()), //
bulkOptions);
}
private WriteModel<Document> extractAndMapWriteModel(SourceAwareWriteModelHolder it) {
@@ -552,93 +547,15 @@ class DefaultBulkOperations implements BulkOperations {
* @author Christoph Strobl
* @since 2.0
*/
static final class BulkOperationContext {
@Value
static class BulkOperationContext {
private final BulkMode bulkMode;
private final Optional<? extends MongoPersistentEntity<?>> entity;
private final QueryMapper queryMapper;
private final UpdateMapper updateMapper;
private final ApplicationEventPublisher eventPublisher;
private final EntityCallbacks entityCallbacks;
BulkOperationContext(BulkOperations.BulkMode bulkMode, Optional<? extends MongoPersistentEntity<?>> entity,
QueryMapper queryMapper, UpdateMapper updateMapper, ApplicationEventPublisher eventPublisher,
EntityCallbacks entityCallbacks) {
this.bulkMode = bulkMode;
this.entity = entity;
this.queryMapper = queryMapper;
this.updateMapper = updateMapper;
this.eventPublisher = eventPublisher;
this.entityCallbacks = entityCallbacks;
}
public BulkMode getBulkMode() {
return this.bulkMode;
}
public Optional<? extends MongoPersistentEntity<?>> getEntity() {
return this.entity;
}
public QueryMapper getQueryMapper() {
return this.queryMapper;
}
public UpdateMapper getUpdateMapper() {
return this.updateMapper;
}
public ApplicationEventPublisher getEventPublisher() {
return this.eventPublisher;
}
public EntityCallbacks getEntityCallbacks() {
return this.entityCallbacks;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
BulkOperationContext that = (BulkOperationContext) o;
if (bulkMode != that.bulkMode)
return false;
if (!ObjectUtils.nullSafeEquals(this.entity, that.entity)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.queryMapper, that.queryMapper)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.updateMapper, that.updateMapper)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.eventPublisher, that.eventPublisher)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.entityCallbacks, that.entityCallbacks);
}
@Override
public int hashCode() {
int result = bulkMode != null ? bulkMode.hashCode() : 0;
result = 31 * result + ObjectUtils.nullSafeHashCode(entity);
result = 31 * result + ObjectUtils.nullSafeHashCode(queryMapper);
result = 31 * result + ObjectUtils.nullSafeHashCode(updateMapper);
result = 31 * result + ObjectUtils.nullSafeHashCode(eventPublisher);
result = 31 * result + ObjectUtils.nullSafeHashCode(entityCallbacks);
return result;
}
public String toString() {
return "DefaultBulkOperations.BulkOperationContext(bulkMode=" + this.getBulkMode() + ", entity="
+ this.getEntity() + ", queryMapper=" + this.getQueryMapper() + ", updateMapper=" + this.getUpdateMapper()
+ ", eventPublisher=" + this.getEventPublisher() + ", entityCallbacks=" + this.getEntityCallbacks() + ")";
}
@NonNull BulkMode bulkMode;
@NonNull Optional<? extends MongoPersistentEntity<?>> entity;
@NonNull QueryMapper queryMapper;
@NonNull UpdateMapper updateMapper;
ApplicationEventPublisher eventPublisher;
EntityCallbacks entityCallbacks;
}
/**
@@ -647,50 +564,10 @@ class DefaultBulkOperations implements BulkOperations {
* @since 2.2
* @author Christoph Strobl
*/
private static final class SourceAwareWriteModelHolder {
@Value
private static class SourceAwareWriteModelHolder {
private final Object source;
private final WriteModel<Document> model;
SourceAwareWriteModelHolder(Object source, WriteModel<Document> model) {
this.source = source;
this.model = model;
}
public Object getSource() {
return this.source;
}
public WriteModel<Document> getModel() {
return this.model;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
SourceAwareWriteModelHolder that = (SourceAwareWriteModelHolder) o;
if (!ObjectUtils.nullSafeEquals(this.source, that.source)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.model, that.model);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(model);
result = 31 * result + ObjectUtils.nullSafeHashCode(source);
return result;
}
public String toString() {
return "DefaultBulkOperations.SourceAwareWriteModelHolder(source=" + this.getSource() + ", model="
+ this.getModel() + ")";
}
Object source;
WriteModel<Document> model;
}
}
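
The diff above reshapes DefaultBulkOperations internals (exception translation, Lombok value types); for orientation, a short sketch of the public BulkOperations API those internals back (the Person type is hypothetical):

import org.springframework.data.mongodb.core.BulkOperations;
import org.springframework.data.mongodb.core.BulkOperations.BulkMode;
import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.bulk.BulkWriteResult;

class BulkInsertExample {

    // Collects unordered insert models and executes them in a single bulk write.
    BulkWriteResult insertPeople(MongoTemplate template, Iterable<Person> people) {
        BulkOperations bulkOps = template.bulkOps(BulkMode.UNORDERED, Person.class);
        people.forEach(bulkOps::insert);
        return bulkOps.execute();
    }
}
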

View File

@@ -15,6 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.util.Collection;
import java.util.Map;
import java.util.Optional;
@@ -51,15 +55,12 @@ import org.springframework.util.MultiValueMap;
* @see MongoTemplate
* @see ReactiveMongoTemplate
*/
@RequiredArgsConstructor
class EntityOperations {
private static final String ID_FIELD = "_id";
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context;
EntityOperations(MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
this.context = context;
}
private final @NonNull MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context;
/**
* Creates a new {@link Entity} for the given bean.
@@ -68,7 +69,7 @@ class EntityOperations {
* @return new instance of {@link Entity}.
*/
@SuppressWarnings({ "unchecked", "rawtypes" })
<T> Entity<T> forEntity(T entity) {
public <T> Entity<T> forEntity(T entity) {
Assert.notNull(entity, "Bean must not be null!");
@@ -91,7 +92,7 @@ class EntityOperations {
* @return new instance of {@link AdaptibleEntity}.
*/
@SuppressWarnings({ "unchecked", "rawtypes" })
<T> AdaptibleEntity<T> forEntity(T entity, ConversionService conversionService) {
public <T> AdaptibleEntity<T> forEntity(T entity, ConversionService conversionService) {
Assert.notNull(entity, "Bean must not be null!");
Assert.notNull(conversionService, "ConversionService must not be null!");
@@ -345,14 +346,11 @@ class EntityOperations {
Number getVersion();
}
@RequiredArgsConstructor
private static class UnmappedEntity<T extends Map<String, Object>> implements AdaptibleEntity<T> {
private final T map;
protected UnmappedEntity(T map) {
this.map = map;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.EntityOperations.PersistableSource#getIdPropertyName()
@@ -462,7 +460,7 @@ class EntityOperations {
private static class SimpleMappedEntity<T extends Map<String, Object>> extends UnmappedEntity<T> {
protected SimpleMappedEntity(T map) {
SimpleMappedEntity(T map) {
super(map);
}
@@ -485,19 +483,12 @@ class EntityOperations {
}
}
@RequiredArgsConstructor(access = AccessLevel.PROTECTED)
private static class MappedEntity<T> implements Entity<T> {
private final MongoPersistentEntity<?> entity;
private final IdentifierAccessor idAccessor;
private final PersistentPropertyAccessor<T> propertyAccessor;
protected MappedEntity(MongoPersistentEntity<?> entity, IdentifierAccessor idAccessor,
PersistentPropertyAccessor<T> propertyAccessor) {
this.entity = entity;
this.idAccessor = idAccessor;
this.propertyAccessor = propertyAccessor;
}
private final @NonNull MongoPersistentEntity<?> entity;
private final @NonNull IdentifierAccessor idAccessor;
private final @NonNull PersistentPropertyAccessor<T> propertyAccessor;
private static <T> MappedEntity<T> of(T bean,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> context) {
@@ -768,12 +759,11 @@ class EntityOperations {
* {@link TypedOperations} for generic entities that are not represented with {@link PersistentEntity} (e.g. custom
* conversions).
*/
@RequiredArgsConstructor
enum UntypedOperations implements TypedOperations<Object> {
INSTANCE;
UntypedOperations() {}
@SuppressWarnings({ "unchecked", "rawtypes" })
public static <T> TypedOperations<T> instance() {
return (TypedOperations) INSTANCE;
@@ -808,13 +798,10 @@ class EntityOperations {
*
* @param <T>
*/
@RequiredArgsConstructor
static class TypedEntityOperations<T> implements TypedOperations<T> {
private final MongoPersistentEntity<T> entity;
protected TypedEntityOperations(MongoPersistentEntity<T> entity) {
this.entity = entity;
}
private final @NonNull MongoPersistentEntity<T> entity;
/*
* (non-Javadoc)

View File

@@ -15,10 +15,16 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.util.CloseableIterator;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -29,13 +35,10 @@ import org.springframework.util.StringUtils;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableAggregationOperationSupport implements ExecutableAggregationOperation {
private final MongoTemplate template;
ExecutableAggregationOperationSupport(MongoTemplate template) {
this.template = template;
}
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)
@@ -53,21 +56,15 @@ class ExecutableAggregationOperationSupport implements ExecutableAggregationOper
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ExecutableAggregationSupport<T>
implements AggregationWithAggregation<T>, ExecutableAggregation<T>, TerminatingAggregation<T> {
private final MongoTemplate template;
private final Class<T> domainType;
private final Aggregation aggregation;
private final String collection;
public ExecutableAggregationSupport(MongoTemplate template, Class<T> domainType, Aggregation aggregation,
String collection) {
this.template = template;
this.domainType = domainType;
this.aggregation = aggregation;
this.collection = collection;
}
@NonNull MongoTemplate template;
@NonNull Class<T> domainType;
@Nullable Aggregation aggregation;
@Nullable String collection;
/*
* (non-Javadoc)

View File

@@ -15,6 +15,12 @@
*/
package org.springframework.data.mongodb.core;
import com.mongodb.ReadPreference;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import java.util.List;
import java.util.Optional;
import java.util.stream.Stream;
@@ -31,7 +37,6 @@ import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
import com.mongodb.ReadPreference;
import com.mongodb.client.FindIterable;
/**
@@ -41,15 +46,12 @@ import com.mongodb.client.FindIterable;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableFindOperationSupport implements ExecutableFindOperation {
private static final Query ALL_QUERY = new Query();
private final MongoTemplate template;
ExecutableFindOperationSupport(MongoTemplate template) {
this.template = template;
}
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)
@@ -68,23 +70,16 @@ class ExecutableFindOperationSupport implements ExecutableFindOperation {
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ExecutableFindSupport<T>
implements ExecutableFind<T>, FindWithCollection<T>, FindWithProjection<T>, FindWithQuery<T> {
private final MongoTemplate template;
private final Class<?> domainType;
private final Class<T> returnType;
@Nullable private final String collection;
private final Query query;
ExecutableFindSupport(MongoTemplate template, Class<?> domainType, Class<T> returnType,
String collection, Query query) {
this.template = template;
this.domainType = domainType;
this.returnType = returnType;
this.collection = collection;
this.query = query;
}
@NonNull MongoTemplate template;
@NonNull Class<?> domainType;
Class<T> returnType;
@Nullable String collection;
Query query;
/*
* (non-Javadoc)

View File

@@ -15,6 +15,11 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import java.util.ArrayList;
import java.util.Collection;
@@ -32,13 +37,10 @@ import com.mongodb.bulk.BulkWriteResult;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
private final MongoTemplate template;
ExecutableInsertOperationSupport(MongoTemplate template) {
this.template = template;
}
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)
@@ -56,20 +58,14 @@ class ExecutableInsertOperationSupport implements ExecutableInsertOperation {
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ExecutableInsertSupport<T> implements ExecutableInsert<T> {
private final MongoTemplate template;
private final Class<T> domainType;
@Nullable private final String collection;
@Nullable private final BulkMode bulkMode;
ExecutableInsertSupport(MongoTemplate template, Class<T> domainType, String collection, BulkMode bulkMode) {
this.template = template;
this.domainType = domainType;
this.collection = collection;
this.bulkMode = bulkMode;
}
@NonNull MongoTemplate template;
@NonNull Class<T> domainType;
@Nullable String collection;
@Nullable BulkMode bulkMode;
/*
* (non-Javadoc)

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.util.List;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -29,17 +32,12 @@ import org.springframework.util.StringUtils;
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor
class ExecutableMapReduceOperationSupport implements ExecutableMapReduceOperation {
private static final Query ALL_QUERY = new Query();
private final MongoTemplate template;
ExecutableMapReduceOperationSupport(MongoTemplate template) {
Assert.notNull(template, "Template must not be null!");
this.template = template;
}
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)

View File

@@ -15,6 +15,11 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import java.util.List;
import org.springframework.data.mongodb.core.query.Query;
@@ -31,15 +36,12 @@ import com.mongodb.client.result.DeleteResult;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
private static final Query ALL_QUERY = new Query();
private final MongoTemplate tempate;
public ExecutableRemoveOperationSupport(MongoTemplate tempate) {
this.tempate = tempate;
}
private final @NonNull MongoTemplate tempate;
/*
* (non-Javadoc)
@@ -57,19 +59,14 @@ class ExecutableRemoveOperationSupport implements ExecutableRemoveOperation {
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ExecutableRemoveSupport<T> implements ExecutableRemove<T>, RemoveWithCollection<T> {
private final MongoTemplate template;
private final Class<T> domainType;
private final Query query;
@Nullable private final String collection;
public ExecutableRemoveSupport(MongoTemplate template, Class<T> domainType, Query query, String collection) {
this.template = template;
this.domainType = domainType;
this.query = query;
this.collection = collection;
}
@NonNull MongoTemplate template;
@NonNull Class<T> domainType;
Query query;
@Nullable String collection;
/*
* (non-Javadoc)

View File

@@ -15,6 +15,11 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.lang.Nullable;
@@ -30,15 +35,12 @@ import com.mongodb.client.result.UpdateResult;
* @author Mark Paluch
* @since 2.0
*/
@RequiredArgsConstructor
class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
private static final Query ALL_QUERY = new Query();
private final MongoTemplate template;
ExecutableUpdateOperationSupport(MongoTemplate template) {
this.template = template;
}
private final @NonNull MongoTemplate template;
/*
* (non-Javadoc)
@@ -56,34 +58,21 @@ class ExecutableUpdateOperationSupport implements ExecutableUpdateOperation {
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ExecutableUpdateSupport<T>
implements ExecutableUpdate<T>, UpdateWithCollection<T>, UpdateWithQuery<T>, TerminatingUpdate<T>,
FindAndReplaceWithOptions<T>, TerminatingFindAndReplace<T>, FindAndReplaceWithProjection<T> {
private final MongoTemplate template;
private final Class domainType;
private final Query query;
@Nullable private final UpdateDefinition update;
@Nullable private final String collection;
@Nullable private final FindAndModifyOptions findAndModifyOptions;
@Nullable private final FindAndReplaceOptions findAndReplaceOptions;
@Nullable private final Object replacement;
private final Class<T> targetType;
ExecutableUpdateSupport(MongoTemplate template, Class domainType, Query query, UpdateDefinition update,
String collection, FindAndModifyOptions findAndModifyOptions, FindAndReplaceOptions findAndReplaceOptions,
Object replacement, Class<T> targetType) {
this.template = template;
this.domainType = domainType;
this.query = query;
this.update = update;
this.collection = collection;
this.findAndModifyOptions = findAndModifyOptions;
this.findAndReplaceOptions = findAndReplaceOptions;
this.replacement = replacement;
this.targetType = targetType;
}
@NonNull MongoTemplate template;
@NonNull Class domainType;
Query query;
@Nullable UpdateDefinition update;
@Nullable String collection;
@Nullable FindAndModifyOptions findAndModifyOptions;
@Nullable FindAndReplaceOptions findAndReplaceOptions;
@Nullable Object replacement;
@NonNull Class<T> targetType;
/*
* (non-Javadoc)

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import java.util.Collection;
import java.util.List;
@@ -24,6 +27,8 @@ import org.springframework.data.mongodb.core.query.Update;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.data.util.StreamUtils;
import com.mongodb.client.model.Filters;
/**
* A MongoDB document in its mapped state. I.e. after a source document has been mapped using mapping information of the
* entity the source document was supposed to represent.
@@ -31,20 +36,13 @@ import org.springframework.data.util.StreamUtils;
* @author Oliver Gierke
* @since 2.1
*/
@RequiredArgsConstructor(staticName = "of")
public class MappedDocument {
private static final String ID_FIELD = "_id";
private static final Document ID_ONLY_PROJECTION = new Document(ID_FIELD, 1);
private final Document document;
private MappedDocument(Document document) {
this.document = document;
}
public static MappedDocument of(Document document) {
return new MappedDocument(document);
}
private final @Getter Document document;
public static Document getIdOnlyProjection() {
return ID_ONLY_PROJECTION;
@@ -93,10 +91,6 @@ public class MappedDocument {
return new MappedUpdate(Update.fromDocument(document, ID_FIELD));
}
public Document getDocument() {
return this.document;
}
/**
* An {@link UpdateDefinition} that indicates that the {@link #getUpdateObject() update object} has already been
* mapped to the specific domain type.
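For orientation, a minimal usage sketch of the MappedDocument API shown in this hunk; it assumes org.bson.Document is imported and the field values are made up for illustration:

Document source = new Document("_id", "42").append("name", "luke"); // hypothetical source document
MappedDocument mapped = MappedDocument.of(source);                  // static factory shown above
Document wrapped = mapped.getDocument();                            // the wrapped source document
Document idProjection = MappedDocument.getIdOnlyProjection();       // { "_id" : 1 }, an id-only field projection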

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import lombok.Value;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.support.PersistenceExceptionTranslator;
@@ -22,7 +24,6 @@ import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.SessionAwareMethodInterceptor;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.WriteConcern;
@@ -170,15 +171,11 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
* @author Christoph Strobl
* @since 2.1
*/
static final class ClientSessionBoundMongoDbFactory implements MongoDatabaseFactory {
@Value
static class ClientSessionBoundMongoDbFactory implements MongoDatabaseFactory {
private final ClientSession session;
private final MongoDatabaseFactory delegate;
public ClientSessionBoundMongoDbFactory(ClientSession session, MongoDatabaseFactory delegate) {
this.session = session;
this.delegate = delegate;
}
ClientSession session;
MongoDatabaseFactory delegate;
/*
* (non-Javadoc)
@@ -259,40 +256,5 @@ public abstract class MongoDatabaseFactorySupport<C> implements MongoDatabaseFac
return targetType.cast(factory.getProxy(target.getClass().getClassLoader()));
}
public ClientSession getSession() {
return this.session;
}
public MongoDatabaseFactory getDelegate() {
return this.delegate;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ClientSessionBoundMongoDbFactory that = (ClientSessionBoundMongoDbFactory) o;
if (!ObjectUtils.nullSafeEquals(this.session, that.session)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.delegate, that.delegate);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(this.session);
result = 31 * result + ObjectUtils.nullSafeHashCode(this.delegate);
return result;
}
public String toString() {
return "MongoDatabaseFactorySupport.ClientSessionBoundMongoDbFactory(session=" + this.getSession() + ", delegate="
+ this.getDelegate() + ")";
}
}
}

View File

@@ -1184,29 +1184,6 @@ public interface MongoOperations extends FluentMongoOperations {
*/
long count(Query query, String collectionName);
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
*
* @param entityClass must not be {@literal null}.
* @return the estimated number of documents.
* @since 3.1
*/
default long estimatedCount(Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
return estimatedCount(getCollectionName(entityClass));
}
/**
* Estimate the number of documents in the given collection based on collection statistics.
*
* @param collectionName must not be {@literal null}.
* @return the estimated number of documents.
* @since 3.1
*/
long estimatedCount(String collectionName);
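A hedged usage sketch of the estimatedCount methods documented above, assuming a configured MongoOperations instance named operations; the Person entity and the collection name are hypothetical, not taken from this change:

long byType = operations.estimatedCount(Person.class); // collection name resolved from the entity type
long byName = operations.estimatedCount("person");     // count via collection statistics, not a full scan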
/**
* Returns the number of documents for the given {@link Query} by querying the given collection using the given entity
* class to map the given {@link Query}. <br />

View File

@@ -17,6 +17,11 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.io.IOException;
import java.math.BigDecimal;
import java.math.RoundingMode;
@@ -155,7 +160,6 @@ import com.mongodb.client.result.UpdateResult;
* @author Cimon Lucas
* @author Michael J. Simons
* @author Roman Puchkovskiy
* @author Yadhukrishna S Pai
*/
public class MongoTemplate implements MongoOperations, ApplicationContextAware, IndexOperationsProvider {
@@ -759,6 +763,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
new BulkOperationContext(mode, Optional.ofNullable(getPersistentEntity(entityType)), queryMapper, updateMapper,
eventPublisher, entityCallbacks));
operations.setExceptionTranslator(exceptionTranslator);
operations.setDefaultWriteConcern(writeConcern);
return operations;
@@ -1135,19 +1140,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
collection -> collection.countDocuments(CountQuery.of(filter).toQueryDocument(), options));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#estimatedCount(java.lang.String)
*/
@Override
public long estimatedCount(String collectionName) {
return doEstimatedCount(collectionName, new EstimatedDocumentCountOptions());
}
protected long doEstimatedCount(String collectionName, EstimatedDocumentCountOptions options) {
return execute(collectionName, collection -> collection.estimatedDocumentCount(options));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#insert(java.lang.Object)
@@ -1977,7 +1969,9 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
Assert.notNull(aggregation, "Aggregation pipeline must not be null!");
return aggregate(aggregation, inputCollectionName, outputType, null);
AggregationOperationContext context = new TypeBasedAggregationOperationContext(aggregation.getInputType(),
mappingContext, queryMapper);
return aggregate(aggregation, inputCollectionName, outputType, context);
}
/* (non-Javadoc)
@@ -2147,7 +2141,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
options.getComment().ifPresent(aggregateIterable::comment);
options.getHint().ifPresent(aggregateIterable::hint);
if (options.hasExecutionTimeLimit()) {
aggregateIterable = aggregateIterable.maxTime(options.getMaxTime().toMillis(), TimeUnit.MILLISECONDS);
@@ -2206,7 +2199,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
}
options.getComment().ifPresent(cursor::comment);
options.getHint().ifPresent(cursor::hint);
Class<?> domainType = aggregation instanceof TypedAggregation ? ((TypedAggregation) aggregation).getInputType()
: null;
@@ -2969,17 +2961,12 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
private class ExistsCallback implements CollectionCallback<Boolean> {
private final Document mappedQuery;
private final com.mongodb.client.model.Collation collation;
ExistsCallback(Document mappedQuery, com.mongodb.client.model.Collation collation) {
this.mappedQuery = mappedQuery;
this.collation = collation;
}
@Override
public Boolean doInCollection(MongoCollection<Document> collection) throws MongoException, DataAccessException {
@@ -3001,7 +2988,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private final Document sort;
private final Optional<Collation> collation;
FindAndRemoveCallback(Document query, Document fields, Document sort, @Nullable Collation collation) {
public FindAndRemoveCallback(Document query, Document fields, Document sort, @Nullable Collation collation) {
this.query = query;
this.fields = fields;
@@ -3027,9 +3014,8 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private final List<Document> arrayFilters;
private final FindAndModifyOptions options;
FindAndModifyCallback(Document query, Document fields, Document sort, Object update, List<Document> arrayFilters,
FindAndModifyOptions options) {
public FindAndModifyCallback(Document query, Document fields, Document sort, Object update,
List<Document> arrayFilters, FindAndModifyOptions options) {
this.query = query;
this.fields = fields;
this.sort = sort;
@@ -3138,19 +3124,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @author Christoph Strobl
* @author Roman Puchkovskiy
*/
@RequiredArgsConstructor
private class ReadDocumentCallback<T> implements DocumentCallback<T> {
private final EntityReader<? super T, Bson> reader;
private final Class<T> type;
private final @NonNull EntityReader<? super T, Bson> reader;
private final @NonNull Class<T> type;
private final String collectionName;
ReadDocumentCallback(EntityReader<? super T, Bson> reader, Class<T> type, String collectionName) {
this.reader = reader;
this.type = type;
this.collectionName = collectionName;
}
@Nullable
public T doWith(@Nullable Document document) {
@@ -3178,21 +3158,13 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @param <T>
* @since 2.0
*/
@RequiredArgsConstructor
private class ProjectingReadCallback<S, T> implements DocumentCallback<T> {
private final EntityReader<Object, Bson> reader;
private final Class<S> entityType;
private final Class<T> targetType;
private final String collectionName;
ProjectingReadCallback(EntityReader<Object, Bson> reader, Class<S> entityType, Class<T> targetType,
String collectionName) {
this.reader = reader;
this.entityType = entityType;
this.targetType = targetType;
this.collectionName = collectionName;
}
private final @NonNull EntityReader<Object, Bson> reader;
private final @NonNull Class<S> entityType;
private final @NonNull Class<T> targetType;
private final @NonNull String collectionName;
/*
* (non-Javadoc)
@@ -3228,7 +3200,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
private final Query query;
private final @Nullable Class<?> type;
QueryCursorPreparer(Query query, @Nullable Class<?> type) {
public QueryCursorPreparer(Query query, @Nullable Class<?> type) {
this.query = query;
this.type = type;
@@ -3372,6 +3344,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @author Thomas Darimont
* @since 1.7
*/
@AllArgsConstructor(access = AccessLevel.PACKAGE)
static class CloseableIterableCursorAdapter<T> implements CloseableIterator<T> {
private volatile @Nullable MongoCursor<Document> cursor;
@@ -3385,22 +3358,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware,
* @param exceptionTranslator
* @param objectReadCallback
*/
CloseableIterableCursorAdapter(MongoIterable<Document> cursor, PersistenceExceptionTranslator exceptionTranslator,
DocumentCallback<T> objectReadCallback) {
public CloseableIterableCursorAdapter(MongoIterable<Document> cursor,
PersistenceExceptionTranslator exceptionTranslator, DocumentCallback<T> objectReadCallback) {
this.cursor = cursor.iterator();
this.exceptionTranslator = exceptionTranslator;
this.objectReadCallback = objectReadCallback;
}
CloseableIterableCursorAdapter(MongoCursor<Document> cursor, PersistenceExceptionTranslator exceptionTranslator,
DocumentCallback<T> objectReadCallback) {
this.cursor = cursor;
this.exceptionTranslator = exceptionTranslator;
this.objectReadCallback = objectReadCallback;
}
@Override
public boolean hasNext() {

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import org.bson.Document;
import org.springframework.data.mapping.SimplePropertyHandler;
import org.springframework.data.mapping.context.MappingContext;
@@ -30,14 +33,11 @@ import org.springframework.util.ClassUtils;
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
class PropertyOperations {
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
PropertyOperations(MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this.mappingContext = mappingContext;
}
/**
* For cases where {@code fields} is {@link Document#isEmpty() empty} include only fields that are required for
* creating the projection (target) type if the {@code targetType} is a {@literal DTO projection} or a

View File

@@ -15,6 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import reactor.core.publisher.Flux;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
@@ -58,22 +62,15 @@ class ReactiveAggregationOperationSupport implements ReactiveAggregationOperatio
return new ReactiveAggregationSupport<>(template, domainType, null, null);
}
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ReactiveAggregationSupport<T>
implements AggregationOperationWithAggregation<T>, ReactiveAggregation<T>, TerminatingAggregationOperation<T> {
private final ReactiveMongoTemplate template;
private final Class<T> domainType;
private final Aggregation aggregation;
private final String collection;
ReactiveAggregationSupport(ReactiveMongoTemplate template, Class<T> domainType, Aggregation aggregation,
String collection) {
this.template = template;
this.domainType = domainType;
this.aggregation = aggregation;
this.collection = collection;
}
@NonNull ReactiveMongoTemplate template;
@NonNull Class<T> domainType;
Aggregation aggregation;
String collection;
/*
* (non-Javadoc)

View File

@@ -15,10 +15,15 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.bson.Document;
import org.springframework.dao.IncorrectResultSizeDataAccessException;
import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
@@ -34,15 +39,12 @@ import org.springframework.util.StringUtils;
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
class ReactiveFindOperationSupport implements ReactiveFindOperation {
private static final Query ALL_QUERY = new Query();
private final ReactiveMongoTemplate template;
ReactiveFindOperationSupport(ReactiveMongoTemplate template) {
this.template = template;
}
private final @NonNull ReactiveMongoTemplate template;
/*
* (non-Javadoc)
@@ -62,24 +64,16 @@ class ReactiveFindOperationSupport implements ReactiveFindOperation {
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ReactiveFindSupport<T>
implements ReactiveFind<T>, FindWithCollection<T>, FindWithProjection<T>, FindWithQuery<T> {
private final ReactiveMongoTemplate template;
private final Class<?> domainType;
private final Class<T> returnType;
private final String collection;
private final Query query;
ReactiveFindSupport(ReactiveMongoTemplate template, Class<?> domainType, Class<T> returnType,
String collection, Query query) {
this.template = template;
this.domainType = domainType;
this.returnType = returnType;
this.collection = collection;
this.query = query;
}
@NonNull ReactiveMongoTemplate template;
@NonNull Class<?> domainType;
Class<T> returnType;
String collection;
Query query;
/*
* (non-Javadoc)

View File

@@ -15,6 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
@@ -30,13 +34,10 @@ import org.springframework.util.StringUtils;
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
private final ReactiveMongoTemplate template;
ReactiveInsertOperationSupport(ReactiveMongoTemplate template) {
this.template = template;
}
private final @NonNull ReactiveMongoTemplate template;
/*
* (non-Javadoc)
@@ -50,18 +51,13 @@ class ReactiveInsertOperationSupport implements ReactiveInsertOperation {
return new ReactiveInsertSupport<>(template, domainType, null);
}
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ReactiveInsertSupport<T> implements ReactiveInsert<T> {
private final ReactiveMongoTemplate template;
private final Class<T> domainType;
private final String collection;
ReactiveInsertSupport(ReactiveMongoTemplate template, Class<T> domainType, String collection) {
this.template = template;
this.domainType = domainType;
this.collection = collection;
}
@NonNull ReactiveMongoTemplate template;
@NonNull Class<T> domainType;
String collection;
/*
* (non-Javadoc)

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import reactor.core.publisher.Flux;
import org.springframework.data.mongodb.core.mapreduce.MapReduceOptions;
@@ -29,15 +31,12 @@ import org.springframework.util.StringUtils;
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor
class ReactiveMapReduceOperationSupport implements ReactiveMapReduceOperation {
private static final Query ALL_QUERY = new Query();
private final ReactiveMongoTemplate template;
ReactiveMapReduceOperationSupport(ReactiveMongoTemplate template) {
this.template = template;
}
private final @NonNull ReactiveMongoTemplate template;
/*
 * (non-Javadoc)

View File

@@ -15,15 +15,11 @@
*/
package org.springframework.data.mongodb.core;
import org.reactivestreams.Publisher;
import org.springframework.util.Assert;
import reactor.core.publisher.Mono;
import reactor.util.context.Context;
import java.util.function.Function;
import org.reactivestreams.Publisher;
import org.springframework.util.Assert;
import com.mongodb.reactivestreams.client.ClientSession;
/**
@@ -33,7 +29,7 @@ import com.mongodb.reactivestreams.client.ClientSession;
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.1
* @see Mono#deferContextual(Function)
* @see Mono#subscriberContext()
* @see Context
*/
public class ReactiveMongoContext {
@@ -50,14 +46,8 @@ public class ReactiveMongoContext {
*/
public static Mono<ClientSession> getSession() {
return Mono.deferContextual(ctx -> {
if (ctx.hasKey(SESSION_KEY)) {
return ctx.<Mono<ClientSession>> get(SESSION_KEY);
}
return Mono.empty();
});
return Mono.subscriberContext().filter(ctx -> ctx.hasKey(SESSION_KEY))
.flatMap(ctx -> ctx.<Mono<ClientSession>> get(SESSION_KEY));
}
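A brief sketch of how this hook is typically consumed, assuming the code runs inside a reactive pipeline managed by ReactiveMongoTemplate; this is an illustration, not code from the diff:

// Emits true when a ClientSession is bound to the Reactor subscriber context, false otherwise.
Mono<Boolean> runsInSession = ReactiveMongoContext.getSession().hasElement();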
/**

View File

@@ -980,29 +980,6 @@ public interface ReactiveMongoOperations extends ReactiveFluentMongoOperations {
*/
Mono<Long> count(Query query, @Nullable Class<?> entityClass, String collectionName);
/**
* Estimate the number of documents, in the collection {@link #getCollectionName(Class) identified by the given type},
* based on collection statistics.
*
* @param entityClass must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.
* @since 3.1
*/
default Mono<Long> estimatedCount(Class<?> entityClass) {
Assert.notNull(entityClass, "Entity class must not be null!");
return estimatedCount(getCollectionName(entityClass));
}
/**
* Estimate the number of documents in the given collection based on collection statistics.
*
* @param collectionName must not be {@literal null}.
* @return a {@link Mono} emitting the estimated number of documents.
* @since 3.1
*/
Mono<Long> estimatedCount(String collectionName);
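The reactive counterpart of the estimatedCount sketch above, assuming a ReactiveMongoOperations instance named reactiveOperations; Person and the collection name are again hypothetical:

Mono<Long> byType = reactiveOperations.estimatedCount(Person.class); // collection name resolved from the entity type
Mono<Long> byName = reactiveOperations.estimatedCount("person");     // address the collection directly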
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>

View File

@@ -17,6 +17,9 @@ package org.springframework.data.mongodb.core;
import static org.springframework.data.mongodb.core.query.SerializationUtils.*;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import reactor.util.function.Tuple2;
@@ -117,7 +120,16 @@ import com.mongodb.CursorType;
import com.mongodb.MongoException;
import com.mongodb.ReadPreference;
import com.mongodb.WriteConcern;
import com.mongodb.client.model.*;
import com.mongodb.client.model.CountOptions;
import com.mongodb.client.model.CreateCollectionOptions;
import com.mongodb.client.model.DeleteOptions;
import com.mongodb.client.model.FindOneAndDeleteOptions;
import com.mongodb.client.model.FindOneAndReplaceOptions;
import com.mongodb.client.model.FindOneAndUpdateOptions;
import com.mongodb.client.model.ReplaceOptions;
import com.mongodb.client.model.ReturnDocument;
import com.mongodb.client.model.UpdateOptions;
import com.mongodb.client.model.ValidationOptions;
import com.mongodb.client.model.changestream.FullDocument;
import com.mongodb.client.result.DeleteResult;
import com.mongodb.client.result.InsertOneResult;
@@ -146,7 +158,6 @@ import com.mongodb.reactivestreams.client.MongoDatabase;
* @author Christoph Strobl
* @author Roman Puchkovskiy
* @author Mathieu Ouellet
* @author Yadhukrishna S Pai
* @since 2.0
*/
public class ReactiveMongoTemplate implements ReactiveMongoOperations, ApplicationContextAware {
@@ -573,7 +584,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
ReactiveMongoTemplate.this);
return Flux.from(action.doInSession(operations)) //
.contextWrite(ctx -> ReactiveMongoContext.setSession(ctx, Mono.just(session)));
.subscriberContext(ctx -> ReactiveMongoContext.setSession(ctx, Mono.just(session)));
}
/*
@@ -1024,7 +1035,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
options.getComment().ifPresent(cursor::comment);
options.getHint().ifPresent(cursor::hint);
Optionals.firstNonEmpty(options::getCollation, () -> operations.forType(inputType).getCollation()) //
.map(Collation::toMongoCollation) //
@@ -1248,15 +1258,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#estimatedCount(java.lang.String)
*/
@Override
public Mono<Long> estimatedCount(String collectionName) {
return doEstimatedCount(collectionName, new EstimatedDocumentCountOptions());
}
/**
* Run the actual count operation against the collection with given name.
*
@@ -1271,11 +1272,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
collection -> collection.countDocuments(CountQuery.of(filter).toQueryDocument(), options));
}
protected Mono<Long> doEstimatedCount(String collectionName, EstimatedDocumentCountOptions options) {
return createMono(collectionName, collection -> collection.estimatedDocumentCount(options));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveMongoOperations#insert(reactor.core.publisher.Mono)
@@ -2422,8 +2418,8 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @param query the query document that specifies the criteria used to find a record.
* @param fields the document that specifies the fields to be returned.
* @param entityClass the parameterized type of the returned list.
* @param preparer allows for customization of the {@link com.mongodb.client.FindIterable} used when iterating over
* the result set, (apply limits, skips and so on).
* @param preparer allows for customization of the {@link com.mongodb.client.FindIterable} used when iterating over the result set, (apply
* limits, skips and so on).
* @return the {@link List} of converted objects.
*/
protected <T> Flux<T> doFind(String collectionName, Document query, Document fields, Class<T> entityClass,
@@ -2897,6 +2893,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
*
* @author Mark Paluch
*/
@RequiredArgsConstructor
private static class FindCallback implements ReactiveCollectionQueryCallback<Document> {
private final @Nullable Document query;
@@ -2906,12 +2903,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
this(query, null);
}
FindCallback(Document query, Document fields) {
this.query = query;
this.fields = fields;
}
@Override
public FindPublisher<Document> doInCollection(MongoCollection<Document> collection) {
@@ -2965,6 +2956,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
/**
* @author Mark Paluch
*/
@RequiredArgsConstructor
private static class FindAndModifyCallback implements ReactiveCollectionCallback<Document> {
private final Document query;
@@ -2974,17 +2966,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final List<Document> arrayFilters;
private final FindAndModifyOptions options;
FindAndModifyCallback(Document query, Document fields, Document sort, Object update, List<Document> arrayFilters,
FindAndModifyOptions options) {
this.query = query;
this.fields = fields;
this.sort = sort;
this.update = update;
this.arrayFilters = arrayFilters;
this.options = options;
}
@Override
public Publisher<Document> doInCollection(MongoCollection<Document> collection)
throws MongoException, DataAccessException {
@@ -3040,6 +3021,7 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
private static class FindAndReplaceCallback implements ReactiveCollectionCallback<Document> {
private final Document query;
@@ -3049,17 +3031,6 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
private final @Nullable com.mongodb.client.model.Collation collation;
private final FindAndReplaceOptions options;
FindAndReplaceCallback(Document query, Document fields, Document sort, Document update,
com.mongodb.client.model.Collation collation, FindAndReplaceOptions options) {
this.query = query;
this.fields = fields;
this.sort = sort;
this.update = update;
this.collation = collation;
this.options = options;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.ReactiveCollectionCallback#doInCollection(com.mongodb.reactivestreams.client.MongoCollection)
@@ -3176,20 +3147,13 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
* @author Roman Puchkovskiy
* @since 2.0
*/
@RequiredArgsConstructor
private class ProjectingReadCallback<S, T> implements DocumentCallback<T> {
private final EntityReader<Object, Bson> reader;
private final Class<S> entityType;
private final Class<T> targetType;
private final String collectionName;
ProjectingReadCallback(EntityReader<Object, Bson> reader, Class<S> entityType, Class<T> targetType,
String collectionName) {
this.reader = reader;
this.entityType = entityType;
this.targetType = targetType;
this.collectionName = collectionName;
}
private final @NonNull EntityReader<Object, Bson> reader;
private final @NonNull Class<S> entityType;
private final @NonNull Class<T> targetType;
private final @NonNull String collectionName;
@SuppressWarnings("unchecked")
public Mono<T> doWith(Document document) {
@@ -3410,14 +3374,11 @@ public class ReactiveMongoTemplate implements ReactiveMongoOperations, Applicati
}
}
@RequiredArgsConstructor
class IndexCreatorEventListener implements ApplicationListener<MappingContextEvent<?, ?>> {
final Consumer<Throwable> subscriptionExceptionHandler;
public IndexCreatorEventListener(Consumer<Throwable> subscriptionExceptionHandler) {
this.subscriptionExceptionHandler = subscriptionExceptionHandler;
}
@Override
public void onApplicationEvent(MappingContextEvent<?, ?> event) {

View File

@@ -15,6 +15,10 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
@@ -31,15 +35,12 @@ import com.mongodb.client.result.DeleteResult;
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
private static final Query ALL_QUERY = new Query();
private final ReactiveMongoTemplate tempate;
ReactiveRemoveOperationSupport(ReactiveMongoTemplate tempate) {
this.tempate = tempate;
}
private final @NonNull ReactiveMongoTemplate tempate;
/*
* (non-Javadoc)
@@ -53,20 +54,14 @@ class ReactiveRemoveOperationSupport implements ReactiveRemoveOperation {
return new ReactiveRemoveSupport<>(tempate, domainType, ALL_QUERY, null);
}
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ReactiveRemoveSupport<T> implements ReactiveRemove<T>, RemoveWithCollection<T> {
private final ReactiveMongoTemplate template;
private final Class<T> domainType;
private final Query query;
private final String collection;
ReactiveRemoveSupport(ReactiveMongoTemplate template, Class<T> domainType, Query query, String collection) {
this.template = template;
this.domainType = domainType;
this.query = query;
this.collection = collection;
}
@NonNull ReactiveMongoTemplate template;
@NonNull Class<T> domainType;
Query query;
String collection;
/*
* (non-Javadoc)

View File

@@ -15,10 +15,13 @@
*/
package org.springframework.data.mongodb.core;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import lombok.experimental.FieldDefaults;
import reactor.core.publisher.Mono;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.UpdateDefinition;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
@@ -32,15 +35,12 @@ import com.mongodb.client.result.UpdateResult;
* @author Christoph Strobl
* @since 2.0
*/
@RequiredArgsConstructor
class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
private static final Query ALL_QUERY = new Query();
private final ReactiveMongoTemplate template;
ReactiveUpdateOperationSupport(ReactiveMongoTemplate template) {
this.template = template;
}
private final @NonNull ReactiveMongoTemplate template;
/*
* (non-Javadoc)
@@ -54,34 +54,21 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
return new ReactiveUpdateSupport<>(template, domainType, ALL_QUERY, null, null, null, null, null, domainType);
}
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
static class ReactiveUpdateSupport<T>
implements ReactiveUpdate<T>, UpdateWithCollection<T>, UpdateWithQuery<T>, TerminatingUpdate<T>,
FindAndReplaceWithOptions<T>, FindAndReplaceWithProjection<T>, TerminatingFindAndReplace<T> {
private final ReactiveMongoTemplate template;
private final Class<?> domainType;
private final Query query;
private final org.springframework.data.mongodb.core.query.UpdateDefinition update;
@Nullable private final String collection;
@Nullable private final FindAndModifyOptions findAndModifyOptions;
@Nullable private final FindAndReplaceOptions findAndReplaceOptions;
@Nullable private final Object replacement;
private final Class<T> targetType;
ReactiveUpdateSupport(ReactiveMongoTemplate template, Class<?> domainType, Query query, UpdateDefinition update,
String collection, FindAndModifyOptions findAndModifyOptions, FindAndReplaceOptions findAndReplaceOptions,
Object replacement, Class<T> targetType) {
this.template = template;
this.domainType = domainType;
this.query = query;
this.update = update;
this.collection = collection;
this.findAndModifyOptions = findAndModifyOptions;
this.findAndReplaceOptions = findAndReplaceOptions;
this.replacement = replacement;
this.targetType = targetType;
}
@NonNull ReactiveMongoTemplate template;
@NonNull Class<?> domainType;
Query query;
org.springframework.data.mongodb.core.query.UpdateDefinition update;
@Nullable String collection;
@Nullable FindAndModifyOptions findAndModifyOptions;
@Nullable FindAndReplaceOptions findAndReplaceOptions;
@Nullable Object replacement;
@NonNull Class<T> targetType;
/*
* (non-Javadoc)
@@ -136,9 +123,7 @@ class ReactiveUpdateOperationSupport implements ReactiveUpdateOperation {
String collectionName = getCollectionName();
return template.findAndModify(query, update,
findAndModifyOptions != null ? findAndModifyOptions : FindAndModifyOptions.none(), targetType,
collectionName);
return template.findAndModify(query, update, findAndModifyOptions != null ? findAndModifyOptions : FindAndModifyOptions.none(), targetType, collectionName);
}
/*

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core;
import lombok.Value;
import reactor.core.publisher.Mono;
import org.bson.codecs.configuration.CodecRegistry;
@@ -26,7 +27,6 @@ import org.springframework.data.mongodb.ReactiveMongoDatabaseFactory;
import org.springframework.data.mongodb.SessionAwareMethodInterceptor;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
import com.mongodb.ClientSessionOptions;
import com.mongodb.ConnectionString;
@@ -175,16 +175,11 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
* @author Christoph Strobl
* @since 2.1
*/
static final class ClientSessionBoundMongoDbFactory implements ReactiveMongoDatabaseFactory {
@Value
static class ClientSessionBoundMongoDbFactory implements ReactiveMongoDatabaseFactory {
private final ClientSession session;
private final ReactiveMongoDatabaseFactory delegate;
ClientSessionBoundMongoDbFactory(ClientSession session, ReactiveMongoDatabaseFactory delegate) {
this.session = session;
this.delegate = delegate;
}
ClientSession session;
ReactiveMongoDatabaseFactory delegate;
/*
* (non-Javadoc)
@@ -273,40 +268,5 @@ public class SimpleReactiveMongoDatabaseFactory implements DisposableBean, React
return targetType.cast(factory.getProxy(target.getClass().getClassLoader()));
}
public ClientSession getSession() {
return this.session;
}
public ReactiveMongoDatabaseFactory getDelegate() {
return this.delegate;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
ClientSessionBoundMongoDbFactory that = (ClientSessionBoundMongoDbFactory) o;
if (!ObjectUtils.nullSafeEquals(this.session, that.session)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.delegate, that.delegate);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(this.session);
result = 31 * result + ObjectUtils.nullSafeHashCode(this.delegate);
return result;
}
public String toString() {
return "SimpleReactiveMongoDatabaseFactory.ClientSessionBoundMongoDbFactory(session=" + this.getSession()
+ ", delegate=" + this.getDelegate() + ")";
}
}
}

View File

@@ -29,11 +29,8 @@ import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Support class for {@link AggregationExpression} implementations.
*
* @author Christoph Strobl
* @author Matt Morrissette
* @author Mark Paluch
* @since 1.10
*/
abstract class AbstractAggregationExpression implements AggregationExpression {
@@ -52,6 +49,7 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return toDocument(this.value, context);
}
@SuppressWarnings("unchecked")
public Document toDocument(Object value, AggregationOperationContext context) {
return new Document(getMongoMethod(), unpack(value, context));
}
@@ -103,19 +101,17 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return value;
}
@SuppressWarnings("unchecked")
protected List<Object> append(Object value, Expand expandList) {
if (this.value instanceof List) {
List<Object> clone = new ArrayList<>((List<Object>) this.value);
List<Object> clone = new ArrayList<Object>((List) this.value);
if (value instanceof Collection && Expand.EXPAND_VALUES.equals(expandList)) {
clone.addAll((Collection<?>) value);
} else {
clone.add(value);
}
return clone;
}
@@ -133,72 +129,25 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return append(value, Expand.EXPAND_VALUES);
}
@SuppressWarnings({ "unchecked", "rawtypes" })
protected Map<String, Object> append(String key, Object value) {
@SuppressWarnings("unchecked")
protected java.util.Map<String, Object> append(String key, Object value) {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
Map<String, Object> clone = new LinkedHashMap<>((java.util.Map) this.value);
java.util.Map<String, Object> clone = new LinkedHashMap<>((java.util.Map) this.value);
clone.put(key, value);
return clone;
}
@SuppressWarnings({ "unchecked", "rawtypes" })
protected Map<String, Object> remove(String key) {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
Map<String, Object> clone = new LinkedHashMap<>((java.util.Map) this.value);
clone.remove(key);
return clone;
}
/**
* Append the given key at the position in the underlying {@link LinkedHashMap}.
*
* @param index
* @param key
* @param value
* @return
* @since 3.1
*/
@SuppressWarnings({ "unchecked" })
protected Map<String, Object> appendAt(int index, String key, Object value) {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
Map<String, Object> clone = new LinkedHashMap<>();
int i = 0;
for (Map.Entry<String, Object> entry : ((Map<String, Object>) this.value).entrySet()) {
if (i == index) {
clone.put(key, value);
}
if (!entry.getKey().equals(key)) {
clone.put(entry.getKey(), entry.getValue());
}
i++;
}
if (i <= index) {
clone.put(key, value);
}
return clone;
}
@SuppressWarnings({ "rawtypes" })
protected List<Object> values() {
if (value instanceof List) {
return new ArrayList<Object>((List) value);
}
if (value instanceof java.util.Map) {
return new ArrayList<Object>(((java.util.Map) value).values());
}
return new ArrayList<>(Collections.singletonList(value));
}
@@ -228,7 +177,7 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
return (T) ((Map<String, Object>) this.value).get(key);
return (T) ((java.util.Map<String, Object>) this.value).get(key);
}
/**
@@ -238,11 +187,11 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
* @return
*/
@SuppressWarnings("unchecked")
protected Map<String, Object> argumentMap() {
protected java.util.Map<String, Object> argumentMap() {
Assert.isInstanceOf(Map.class, this.value, "Value must be a type of Map!");
return Collections.unmodifiableMap((java.util.Map<String, Object>) value);
return Collections.unmodifiableMap((java.util.Map) value);
}
/**
@@ -259,7 +208,7 @@ abstract class AbstractAggregationExpression implements AggregationExpression {
return false;
}
return ((Map<String, Object>) this.value).containsKey(key);
return ((java.util.Map<String, Object>) this.value).containsKey(key);
}
protected abstract String getMongoMethod();

View File

@@ -32,7 +32,6 @@ import org.springframework.util.Assert;
* @author Oliver Gierke
* @author Christoph Strobl
* @author Mark Paluch
* @author Yadhukrishna S Pai
* @see Aggregation#withOptions(AggregationOptions)
* @see TypedAggregation#withOptions(AggregationOptions)
* @since 1.6
@@ -46,14 +45,12 @@ public class AggregationOptions {
private static final String COLLATION = "collation";
private static final String COMMENT = "comment";
private static final String MAX_TIME = "maxTimeMS";
private static final String HINT = "hint";
private final boolean allowDiskUse;
private final boolean explain;
private final Optional<Document> cursor;
private final Optional<Collation> collation;
private final Optional<String> comment;
private final Optional<Document> hint;
private Duration maxTime = Duration.ZERO;
private ResultOptions resultOptions = ResultOptions.READ;
@@ -74,13 +71,13 @@ public class AggregationOptions {
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options (such as {@code batchSize}) to the
* aggregation.
* aggregation.
* @param collation collation for string comparison. Can be {@literal null}.
* @since 2.0
*/
public AggregationOptions(boolean allowDiskUse, boolean explain, @Nullable Document cursor,
@Nullable Collation collation) {
this(allowDiskUse, explain, cursor, collation, null, null);
this(allowDiskUse, explain, cursor, collation, null);
}
/**
@@ -89,37 +86,19 @@ public class AggregationOptions {
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options (such as {@code batchSize}) to the
* aggregation.
* aggregation.
* @param collation collation for string comparison. Can be {@literal null}.
* @param comment execution comment. Can be {@literal null}.
* @since 2.2
*/
public AggregationOptions(boolean allowDiskUse, boolean explain, @Nullable Document cursor,
@Nullable Collation collation, @Nullable String comment) {
this(allowDiskUse, explain, cursor, collation, comment, null);
}
/**
* Creates a new {@link AggregationOptions}.
*
* @param allowDiskUse whether to off-load intensive sort-operations to disk.
* @param explain whether to get the execution plan for the aggregation instead of the actual results.
* @param cursor can be {@literal null}, used to pass additional options (such as {@code batchSize}) to the
* aggregation.
* @param collation collation for string comparison. Can be {@literal null}.
* @param comment execution comment. Can be {@literal null}.
* @param hint can be {@literal null}, used to provide an index that would be forcibly used by query optimizer.
* @since 3.1
*/
private AggregationOptions(boolean allowDiskUse, boolean explain, @Nullable Document cursor,
@Nullable Collation collation, @Nullable String comment, @Nullable Document hint) {
this.allowDiskUse = allowDiskUse;
this.explain = explain;
this.cursor = Optional.ofNullable(cursor);
this.collation = Optional.ofNullable(collation);
this.comment = Optional.ofNullable(comment);
this.hint = Optional.ofNullable(hint);
}
/**
@@ -151,9 +130,8 @@ public class AggregationOptions {
Collation collation = document.containsKey(COLLATION) ? Collation.from(document.get(COLLATION, Document.class))
: null;
String comment = document.getString(COMMENT);
Document hint = document.get(HINT, Document.class);
AggregationOptions options = new AggregationOptions(allowDiskUse, explain, cursor, collation, comment, hint);
AggregationOptions options = new AggregationOptions(allowDiskUse, explain, cursor, collation, comment);
if (document.containsKey(MAX_TIME)) {
options.maxTime = Duration.ofMillis(document.getLong(MAX_TIME));
}
@@ -234,16 +212,6 @@ public class AggregationOptions {
return comment;
}
/**
 * Get the hint used to fulfill the aggregation.
*
* @return never {@literal null}.
* @since 3.1
*/
public Optional<Document> getHint() {
return hint;
}
/**
* @return the time limit for processing. {@link Duration#ZERO} is used for the default unbounded behavior.
* @since 3.0
@@ -280,10 +248,6 @@ public class AggregationOptions {
result.put(EXPLAIN, explain);
}
if (result.containsKey(HINT)) {
hint.ifPresent(val -> result.append(HINT, val));
}
if (!result.containsKey(CURSOR)) {
cursor.ifPresent(val -> result.put(CURSOR, val));
}
@@ -313,7 +277,6 @@ public class AggregationOptions {
cursor.ifPresent(val -> document.put(CURSOR, val));
collation.ifPresent(val -> document.append(COLLATION, val.toDocument()));
comment.ifPresent(val -> document.append(COMMENT, val));
hint.ifPresent(val -> document.append(HINT, val));
if (hasExecutionTimeLimit()) {
document.append(MAX_TIME, maxTime.toMillis());
@@ -355,7 +318,6 @@ public class AggregationOptions {
private @Nullable Document cursor;
private @Nullable Collation collation;
private @Nullable String comment;
private @Nullable Document hint;
private @Nullable Duration maxTime;
private @Nullable ResultOptions resultOptions;
@@ -434,24 +396,11 @@ public class AggregationOptions {
return this;
}
/**
 * Define a hint that is used by the query optimizer to fulfill the aggregation.
*
* @param hint can be {@literal null}.
* @return this.
* @since 3.1
*/
public Builder hint(@Nullable Document hint) {
this.hint = hint;
return this;
}
/**
* Set the time limit for processing.
*
* @param maxTime {@link Duration#ZERO} is used for the default unbounded behavior. {@link Duration#isNegative()
* Negative} values will be ignored.
* Negative} values will be ignored.
* @return this.
* @since 3.0
*/
@@ -482,7 +431,7 @@ public class AggregationOptions {
*/
public AggregationOptions build() {
AggregationOptions options = new AggregationOptions(allowDiskUse, explain, cursor, collation, comment, hint);
AggregationOptions options = new AggregationOptions(allowDiskUse, explain, cursor, collation, comment);
if (maxTime != null) {
options.maxTime = maxTime;
}
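For context, a hedged sketch of the builder API that remains on this branch (without the removed hint option); it assumes java.time.Duration, Criteria and the aggregation types are imported, and the comment text and Person entity are hypothetical:

AggregationOptions options = AggregationOptions.builder()
        .allowDiskUse(true)                 // off-load intensive sort operations to disk
        .comment("nightly rollup")          // hypothetical execution comment
        .maxTime(Duration.ofSeconds(5))     // processing time limit
        .build();

TypedAggregation<Person> aggregation = Aggregation.newAggregation(Person.class,
        Aggregation.match(Criteria.where("active").is(true))).withOptions(options);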

View File

@@ -18,7 +18,6 @@ package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.function.Predicate;
import org.bson.Document;
import org.springframework.util.Assert;
@@ -27,7 +26,6 @@ import org.springframework.util.Assert;
* The {@link AggregationPipeline} holds the collection of {@link AggregationOperation aggregation stages}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 3.0.2
*/
public class AggregationPipeline {
@@ -47,8 +45,6 @@ public class AggregationPipeline {
* @param aggregationOperations must not be {@literal null}.
*/
public AggregationPipeline(List<AggregationOperation> aggregationOperations) {
Assert.notNull(aggregationOperations, "AggregationOperations must not be null!");
pipeline = new ArrayList<>(aggregationOperations);
}
@@ -86,77 +82,30 @@ public class AggregationPipeline {
*/
public boolean isOutOrMerge() {
if (isEmpty()) {
if (pipeline.isEmpty()) {
return false;
}
AggregationOperation operation = pipeline.get(pipeline.size() - 1);
return isOut(operation) || isMerge(operation);
String operator = pipeline.get(pipeline.size() - 1).getOperator();
return operator.equals("$out") || operator.equals("$merge");
}
void verify() {
// check $out/$merge is the last operation if it exists
for (AggregationOperation operation : pipeline) {
for (AggregationOperation aggregationOperation : pipeline) {
if (isOut(operation) && !isLast(operation)) {
if (aggregationOperation instanceof OutOperation && !isLast(aggregationOperation)) {
throw new IllegalArgumentException("The $out operator must be the last stage in the pipeline.");
}
if (isMerge(operation) && !isLast(operation)) {
if (aggregationOperation instanceof MergeOperation && !isLast(aggregationOperation)) {
throw new IllegalArgumentException("The $merge operator must be the last stage in the pipeline.");
}
}
}
/**
* Return whether this aggregation pipeline defines a {@code $unionWith} stage that may contribute documents from
* other collections. Checking for presence of union stages is useful when attempting to determine the aggregation
* element type for mapping metadata computation.
*
* @return {@literal true} the aggregation pipeline makes use of {@code $unionWith}.
* @since 3.1
*/
public boolean containsUnionWith() {
return containsOperation(AggregationPipeline::isUnionWith);
}
/**
* @return {@literal true} if the pipeline does not contain any stages.
* @since 3.1
*/
public boolean isEmpty() {
return pipeline.isEmpty();
}
private boolean containsOperation(Predicate<AggregationOperation> predicate) {
if (isEmpty()) {
return false;
}
for (AggregationOperation element : pipeline) {
if (predicate.test(element)) {
return true;
}
}
return false;
}
private boolean isLast(AggregationOperation aggregationOperation) {
return pipeline.indexOf(aggregationOperation) == pipeline.size() - 1;
}
private static boolean isUnionWith(AggregationOperation operator) {
return operator instanceof UnionWithOperation || operator.getOperator().equals("$unionWith");
}
private static boolean isMerge(AggregationOperation operator) {
return operator instanceof MergeOperation || operator.getOperator().equals("$merge");
}
private static boolean isOut(AggregationOperation operator) {
return operator instanceof OutOperation || operator.getOperator().equals("$out");
}
}
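To make the terminal-stage check above concrete, a minimal sketch using the public constructor and isOutOrMerge() shown in this hunk; the stage contents and target collection are made up:

AggregationPipeline pipeline = new AggregationPipeline(Arrays.asList(
        Aggregation.match(Criteria.where("active").is(true)),
        Aggregation.out("archive")));       // $out as the last stage

boolean terminal = pipeline.isOutOrMerge(); // true: the pipeline ends in $out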

View File

@@ -149,12 +149,4 @@ class ExposedFieldsAggregationOperationContext implements AggregationOperationCo
}
return null;
}
/**
* @return obtain the root context used to resolve references.
* @since 3.1
*/
AggregationOperationContext getRootContext() {
return rootContext;
}
}

View File

@@ -264,7 +264,7 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return new Document(getOperator(), fieldObject);
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/
@@ -1450,6 +1450,14 @@ public class ProjectionOperation implements FieldsExposingAggregationOperation {
return field.getTarget();
}
if (field.getTarget().equals(Fields.UNDERSCORE_ID)) {
try {
return context.getReference(field).getReferenceValue();
} catch (java.lang.IllegalArgumentException e) {
return Fields.UNDERSCORE_ID_REF;
}
}
// check whether referenced field exists in the context
return context.getReference(field).getReferenceValue();
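For reference, a projection stage that explicitly includes the default _id field alongside another property, which is the kind of field reference the branch above resolves; the field names are hypothetical:

ProjectionOperation projection = Aggregation.project("_id", "name"); // explicitly includes the default _id field
Aggregation aggregation = Aggregation.newAggregation(
        Aggregation.match(Criteria.where("name").exists(true)),
        projection);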

View File

@@ -1,587 +0,0 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import org.springframework.data.mongodb.core.aggregation.ScriptOperators.Accumulator.AccumulatorBuilder;
import org.springframework.data.mongodb.core.aggregation.ScriptOperators.Accumulator.AccumulatorInitBuilder;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
/**
* Gateway to {@literal $function} and {@literal $accumulator} aggregation operations.
* <p />
* Using {@link ScriptOperators} as part of the {@link Aggregation} requires MongoDB server to have
* <a href="https://docs.mongodb.com/master/core/server-side-javascript/">server-side JavaScript</a> execution
* <a href="https://docs.mongodb.com/master/reference/configuration-options/#security.javascriptEnabled">enabled</a>.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 3.1
*/
public class ScriptOperators {
/**
* Create a custom aggregation
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/function/">$function<a /> in JavaScript.
*
* @param body The function definition. Must not be {@literal null}.
* @return new instance of {@link Function}.
*/
public static Function function(String body) {
return Function.function(body);
}
/**
* Create a custom <a href="https://docs.mongodb.com/master/reference/operator/aggregation/accumulator/">$accumulator
* operator</a> in Javascript.
*
* @return new instance of {@link AccumulatorInitBuilder}.
*/
public static AccumulatorInitBuilder accumulatorBuilder() {
return new AccumulatorBuilder();
}
/**
* {@link Function} defines a custom aggregation
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/function/">$function</a> in JavaScript.
* <p />
* <code class="java">
* {
* $function: {
* body: ...,
* args: ...,
* lang: "js"
* }
* }
* </code>
* <p />
* {@link Function} cannot be used as part of {@link org.springframework.data.mongodb.core.schema.MongoJsonSchema
* schema} validation query expression. <br />
* <b>NOTE:</b> <a href="https://docs.mongodb.com/master/core/server-side-javascript/">Server-Side JavaScript</a>
* execution must be
* <a href="https://docs.mongodb.com/master/reference/configuration-options/#security.javascriptEnabled">enabled</a>
*
* @see <a href="https://docs.mongodb.com/master/reference/operator/aggregation/function/">MongoDB Documentation:
* $function</a>
*/
public static class Function extends AbstractAggregationExpression {
private Function(Map<String, Object> values) {
super(values);
}
/**
* Create a new {@link Function} with the given function definition.
*
* @param body must not be {@literal null}.
* @return new instance of {@link Function}.
*/
public static Function function(String body) {
Assert.notNull(body, "Function body must not be null!");
Map<String, Object> function = new LinkedHashMap<>(2);
function.put(Fields.BODY.toString(), body);
function.put(Fields.ARGS.toString(), Collections.emptyList());
function.put(Fields.LANG.toString(), "js");
return new Function(function);
}
/**
* Set the arguments passed to the function body.
*
* @param args the arguments passed to the function body. Leave empty if the function does not take any arguments.
* @return new instance of {@link Function}.
*/
public Function args(Object... args) {
return args(Arrays.asList(args));
}
/**
* Set the arguments passed to the function body.
*
* @param args the arguments passed to the function body. Leave empty if the function does not take any arguments.
* @return new instance of {@link Function}.
*/
public Function args(List<Object> args) {
Assert.notNull(args, "Args must not be null! Use an empty list instead.");
return new Function(appendAt(1, Fields.ARGS.toString(), args));
}
/**
* The language used in the body.
*
* @param lang must not be {@literal null} nor empty.
* @return new instance of {@link Function}.
*/
public Function lang(String lang) {
Assert.hasText(lang, "Lang must not be null nor empty! The default would be 'js'.");
return new Function(appendAt(2, Fields.LANG.toString(), lang));
}
@Nullable
List<Object> getArgs() {
return get(Fields.ARGS.toString());
}
String getBody() {
return get(Fields.BODY.toString());
}
String getLang() {
return get(Fields.LANG.toString());
}
@Override
protected String getMongoMethod() {
return "$function";
}
enum Fields {
BODY, ARGS, LANG;
@Override
public String toString() {
return name().toLowerCase();
}
}
}
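/*
* Usage sketch (illustrative, not part of the original source). The referenced field "$price" is a hypothetical
* example; args() and lang() are optional and default to an empty argument list and "js".
*
* Function doubled = ScriptOperators.function("function(price) { return price * 2 }").args("$price").lang("js");
*/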
/**
* {@link Accumulator} defines a custom aggregation
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/accumulator/">$accumulator operator</a>,
* written in JavaScript, that maintains its state (e.g. totals, maximums, minimums, and related data) as documents
* progress through the pipeline.
* <p />
* <code class="java">
* {
* $accumulator: {
* init: ...,
* initArgs: ...,
* accumulate: ...,
* accumulateArgs: ...,
* merge: ...,
* finalize: ...,
* lang: "js"
* }
* }
* </code>
* <p />
* {@link Accumulator} can be used as part of {@link GroupOperation $group}, {@link BucketOperation $bucket} and
* {@link BucketAutoOperation $bucketAuto} pipeline stages. <br />
* <b>NOTE:</b> <a href="https://docs.mongodb.com/master/core/server-side-javascript/">Server-Side JavaScript</a>
* execution must be
* <a href="https://docs.mongodb.com/master/reference/configuration-options/#security.javascriptEnabled">enabled</a>.
*
* @see <a href="https://docs.mongodb.com/master/reference/operator/aggregation/accumulator/">MongoDB Documentation:
* $accumulator</a>
*/
public static class Accumulator extends AbstractAggregationExpression {
private Accumulator(Map<String, Object> value) {
super(value);
}
@Override
protected String getMongoMethod() {
return "$accumulator";
}
enum Fields {
ACCUMULATE("accumulate"), //
ACCUMULATE_ARGS("accumulateArgs"), //
FINALIZE("finalize"), //
INIT("init"), //
INIT_ARGS("initArgs"), //
LANG("lang"), //
MERGE("merge"); //
private String field;
Fields(String field) {
this.field = field;
}
@Override
public String toString() {
return field;
}
}
public interface AccumulatorInitBuilder {
/**
* Define the {@code init} {@link Function} for the {@link Accumulator accumulator's} initial state. The function
* receives its arguments from the {@link Function#args(Object...) initArgs} array expression.
* <p />
* <code class="java">
* function(initArg1, initArg2, ...) {
* ...
* return initialState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
default AccumulatorAccumulateBuilder init(Function function) {
return init(function.getBody()).initArgs(function.getArgs());
}
/**
* Define the {@code init} function for the {@link Accumulator accumulator's} initial state. The function receives
* its arguments from the {@link AccumulatorInitArgsBuilder#initArgs(Object...)} array expression.
* <p />
* <code class="java">
* function(initArg1, initArg2, ...) {
* ...
* return initialState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
AccumulatorInitArgsBuilder init(String function);
/**
* The language used in the {@code $accumulator} code.
*
* @param lang must not be {@literal null}. Default is {@literal js}.
* @return this.
*/
AccumulatorInitBuilder lang(String lang);
}
public interface AccumulatorInitArgsBuilder extends AccumulatorAccumulateBuilder {
/**
* Define the optional {@code initArgs} for the {@link AccumulatorInitBuilder#init(String)} function.
*
* @param args must not be {@literal null}.
* @return this.
*/
default AccumulatorAccumulateBuilder initArgs(Object... args) {
return initArgs(Arrays.asList(args));
}
/**
* Define the optional {@code initArgs} for the {@link AccumulatorInitBuilder#init(String)} function.
*
* @param args must not be {@literal null}.
* @return this.
*/
AccumulatorAccumulateBuilder initArgs(List<Object> args);
}
public interface AccumulatorAccumulateBuilder {
/**
* Set the {@code accumulate} {@link Function} that updates the state for each document. The function's first
* argument is the current {@code state}; additional arguments can be defined via {@link Function#args(Object...)
* accumulateArgs}.
* <p />
* <code class="java">
* function(state, accumArg1, accumArg2, ...) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
default AccumulatorMergeBuilder accumulate(Function function) {
return accumulate(function.getBody()).accumulateArgs(function.getArgs());
}
/**
* Set the {@code accumulate} function that updates the state for each document. The function's first argument is
* the current {@code state}; additional arguments can be defined via
* {@link AccumulatorAccumulateArgsBuilder#accumulateArgs(Object...)}.
* <p />
* <code class="java">
* function(state, accumArg1, accumArg2, ...) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
AccumulatorAccumulateArgsBuilder accumulate(String function);
}
public interface AccumulatorAccumulateArgsBuilder extends AccumulatorMergeBuilder {
/**
* Define additional {@code accumulateArgs} for the {@link AccumulatorAccumulateBuilder#accumulate(String)}
* function.
*
* @param args must not be {@literal null}.
* @return this.
*/
default AccumulatorMergeBuilder accumulateArgs(Object... args) {
return accumulateArgs(Arrays.asList(args));
}
/**
* Define additional {@code accumulateArgs} for the {@link AccumulatorAccumulateBuilder#accumulate(String)}
* function.
*
* @param args must not be {@literal null}.
* @return this.
*/
AccumulatorMergeBuilder accumulateArgs(List<Object> args);
}
public interface AccumulatorMergeBuilder {
/**
* Set the {@code merge} function used to merge two internal states. <br />
* This might be required when the operation runs on a sharded cluster or when the operator exceeds its
* memory limit.
* <p />
* <code class="java">
* function(state1, state2) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
AccumulatorFinalizeBuilder merge(String function);
}
public interface AccumulatorFinalizeBuilder {
/**
* Set the {@code finalize} function used to update the result of the accumulation when all documents have been
* processed.
* <p />
* <code class="java">
* function(state) {
* ...
* return finalState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return new instance of {@link Accumulator}.
*/
Accumulator finalize(String function);
/**
* Build the {@link Accumulator} object without specifying a {@link #finalize(String) finalize function}.
*
* @return new instance of {@link Accumulator}.
*/
Accumulator build();
}
static class AccumulatorBuilder
implements AccumulatorInitBuilder, AccumulatorInitArgsBuilder, AccumulatorAccumulateBuilder,
AccumulatorAccumulateArgsBuilder, AccumulatorMergeBuilder, AccumulatorFinalizeBuilder {
private List<Object> initArgs;
private String initFunction;
private List<Object> accumulateArgs;
private String accumulateFunction;
private String mergeFunction;
private String finalizeFunction;
private String lang = "js";
/**
* Define the {@code init} function for the {@link Accumulator accumulator's} initial state. The function receives
* its arguments from the {@link #initArgs(Object...)} array expression.
* <p />
* <code class="java">
* function(initArg1, initArg2, ...) {
* ...
* return initialState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder init(String function) {
this.initFunction = function;
return this;
}
/**
* Define the optional {@code initArgs} for the {@link #init(String)} function.
*
* @param args must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder initArgs(List<Object> args) {
Assert.notNull(args, "Args must not be null");
this.initArgs = new ArrayList<>(args);
return this;
}
/**
* Set the {@code accumulate} function that updates the state for each document. The function's first argument is
* the current {@code state}; additional arguments can be defined via {@link #accumulateArgs(Object...)}.
* <p />
* <code class="java">
* function(state, accumArg1, accumArg2, ...) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder accumulate(String function) {
Assert.notNull(function, "Accumulate function must not be null");
this.accumulateFunction = function;
return this;
}
/**
* Define additional {@code accumulateArgs} for the {@link #accumulate(String)} function.
*
* @param args must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder accumulateArgs(List<Object> args) {
Assert.notNull(args, "Args must not be null");
this.accumulateArgs = new ArrayList<>(args);
return this;
}
/**
* Set the {@code merge} function used to merge two internal states. <br />
* This might be required when the operation runs on a sharded cluster or when the operator exceeds its
* memory limit.
* <p />
* <code class="java">
* function(state1, state2) {
* ...
* return newState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return this.
*/
@Override
public AccumulatorBuilder merge(String function) {
Assert.notNull(function, "Merge function must not be null");
this.mergeFunction = function;
return this;
}
/**
* The language used in the {@code $accumulator} code.
*
* @param lang must not be {@literal null}. Default is {@literal js}.
* @return this.
*/
public AccumulatorBuilder lang(String lang) {
Assert.hasText(lang, "Lang must not be null nor empty! The default would be 'js'.");
this.lang = lang;
return this;
}
/**
* Set the {@code finalize} function used to update the result of the accumulation when all documents have been
* processed.
* <p />
* <code class="java">
* function(state) {
* ...
* return finalState
* }
* </code>
*
* @param function must not be {@literal null}.
* @return new instance of {@link Accumulator}.
*/
@Override
public Accumulator finalize(String function) {
Assert.notNull(function, "Finalize function must not be null");
this.finalizeFunction = function;
Map<String, Object> args = createArgumentMap();
args.put(Fields.FINALIZE.toString(), finalizeFunction);
return new Accumulator(args);
}
@Override
public Accumulator build() {
return new Accumulator(createArgumentMap());
}
private Map<String, Object> createArgumentMap() {
Map<String, Object> args = new LinkedHashMap<>();
args.put(Fields.INIT.toString(), initFunction);
if (!CollectionUtils.isEmpty(initArgs)) {
args.put(Fields.INIT_ARGS.toString(), initArgs);
}
args.put(Fields.ACCUMULATE.toString(), accumulateFunction);
if (!CollectionUtils.isEmpty(accumulateArgs)) {
args.put(Fields.ACCUMULATE_ARGS.toString(), accumulateArgs);
}
args.put(Fields.MERGE.toString(), mergeFunction);
args.put(Fields.LANG.toString(), lang);
return args;
}
}
}
}
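A hedged usage sketch of the builder chain defined above; the field name "$price" and the summing logic are hypothetical examples, and the resulting Accumulator would typically be handed to a $group stage.

Accumulator sum = ScriptOperators.accumulatorBuilder()
    .init("function() { return { total: 0 } }")
    .accumulate("function(state, price) { return { total: state.total + price } }")
    .accumulateArgs("$price")
    .merge("function(s1, s2) { return { total: s1.total + s2.total } }")
    .finalize("function(state) { return state.total }");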

View File

@@ -133,19 +133,6 @@ public class TypeBasedAggregationOperationContext implements AggregationOperatio
*/
@Override
public AggregationOperationContext continueOnMissingFieldReference() {
return continueOnMissingFieldReference(type);
}
/**
* This toggle allows the {@link AggregationOperationContext context} to use any given field name without checking for
* its existence. Typically, the {@link AggregationOperationContext} fails throughout the pipeline when referencing
* unknown fields, i.e. fields that are not present in one of the previous stages or the input source.
*
* @param type The domain type to map fields to.
* @return a more relaxed {@link AggregationOperationContext}.
* @since 3.1
*/
public AggregationOperationContext continueOnMissingFieldReference(Class<?> type) {
return new RelaxedTypeBasedAggregationOperationContext(type, mappingContext, mapper);
}
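A brief, hypothetical sketch of the relaxed lookup described above; Order stands in for any mapped domain type and context for an existing TypeBasedAggregationOperationContext.

// references to fields unknown on Order no longer fail while rendering the pipeline
AggregationOperationContext relaxed = context.continueOnMissingFieldReference(Order.class);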

View File

@@ -1,168 +0,0 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.aggregation;
import java.util.Arrays;
import java.util.List;
import org.bson.Document;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
* The <a href="https://docs.mongodb.com/master/reference/operator/aggregation/unionWith/">$unionWith</a> aggregation
* stage (available since MongoDB 4.4) performs a union of two collections by combining pipeline results, potentially
* containing duplicates, into a single result set that is handed over to the next stage. <br />
* To remove duplicates, it is possible to append a {@link GroupOperation} right after
* {@link UnionWithOperation}.
* <p />
* If the {@link UnionWithOperation} uses a
* <a href="https://docs.mongodb.com/master/reference/operator/aggregation/unionWith/#unionwith-pipeline">pipeline</a>
* to process documents, field names within the pipeline will be treated as is. In order to map domain type property
* names to actual field names (considering potential {@link org.springframework.data.mongodb.core.mapping.Field}
* annotations) make sure the enclosing aggregation is a {@link TypedAggregation} and provide the target type for the
* {@code $unionWith} stage via {@link #mapFieldsTo(Class)}.
*
* @author Christoph Strobl
* @see <a href="https://docs.mongodb.com/master/reference/operator/aggregation/unionWith/">Aggregation Pipeline Stage:
* $unionWith</a>
* @since 3.1
*/
public class UnionWithOperation implements AggregationOperation {
private final String collection;
private final @Nullable AggregationPipeline pipeline;
private final @Nullable Class<?> domainType;
public UnionWithOperation(String collection, @Nullable AggregationPipeline pipeline, @Nullable Class<?> domainType) {
Assert.notNull(collection, "Collection must not be null!");
this.collection = collection;
this.pipeline = pipeline;
this.domainType = domainType;
}
/**
* Set the name of the collection from which pipeline results should be included in the result set.<br />
* The collection name is used to set the {@code coll} parameter of {@code $unionWith}.
*
* @param collection the MongoDB collection name. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public static UnionWithOperation unionWith(String collection) {
return new UnionWithOperation(collection, null, null);
}
/**
* Set the {@link AggregationPipeline} to apply to the specified collection. The pipeline corresponds to the optional
* {@code pipeline} field of the {@code $unionWith} aggregation stage and is used to compute the documents going into
* the result set.
*
* @param pipeline the {@link AggregationPipeline} that computes the documents. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation pipeline(AggregationPipeline pipeline) {
return new UnionWithOperation(collection, pipeline, domainType);
}
/**
* Set the aggregation pipeline stages to apply to the specified collection. The pipeline corresponds to the optional
* {@code pipeline} field of the {@code $unionWith} aggregation stage and is used to compute the documents going into
* the result set.
*
* @param aggregationStages the aggregation pipeline stages that compute the documents. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation pipeline(List<AggregationOperation> aggregationStages) {
return new UnionWithOperation(collection, new AggregationPipeline(aggregationStages), domainType);
}
/**
* Set the aggregation pipeline stages to apply to the specified collection. The pipeline corresponds to the optional
* {@code pipeline} field of the {@code $unionWith} aggregation stage and is used to compute the documents going into
* the result set.
*
* @param aggregationStages the aggregation pipeline stages that compute the documents. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation pipeline(AggregationOperation... aggregationStages) {
return new UnionWithOperation(collection, new AggregationPipeline(Arrays.asList(aggregationStages)), domainType);
}
/**
* Set domain type used for field name mapping of property references used by the {@link AggregationPipeline}.
* Remember to also use a {@link TypedAggregation} in the outer pipeline.<br />
* If not set, field names used within {@link AggregationOperation pipeline operations} are taken as is.
*
* @param domainType the domain type to map field names used in pipeline operations to. Must not be {@literal null}.
* @return new instance of {@link UnionWithOperation}.
* @throws IllegalArgumentException if the required argument is {@literal null}.
*/
public UnionWithOperation mapFieldsTo(Class<?> domainType) {
Assert.notNull(domainType, "DomainType must not be null!");
return new UnionWithOperation(collection, pipeline, domainType);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#toDocument(org.springframework.data.mongodb.core.aggregation.AggregationOperationContext)
*/
@Override
public Document toDocument(AggregationOperationContext context) {
Document $unionWith = new Document("coll", collection);
if (pipeline == null || pipeline.isEmpty()) {
return new Document(getOperator(), $unionWith);
}
$unionWith.append("pipeline", pipeline.toDocuments(computeContext(context)));
return new Document(getOperator(), $unionWith);
}
private AggregationOperationContext computeContext(AggregationOperationContext source) {
if (domainType == null) {
return Aggregation.DEFAULT_CONTEXT;
}
if (source instanceof TypeBasedAggregationOperationContext) {
return ((TypeBasedAggregationOperationContext) source).continueOnMissingFieldReference(domainType);
}
if (source instanceof ExposedFieldsAggregationOperationContext) {
return computeContext(((ExposedFieldsAggregationOperationContext) source).getRootContext());
}
return source;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.aggregation.AggregationOperation#getOperator()
*/
@Override
public String getOperator() {
return "$unionWith";
}
}
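A hedged usage sketch of the stage above; the collection name "newOrders", the projected fields, and the Order domain type are hypothetical examples.

UnionWithOperation unionWith = UnionWithOperation.unionWith("newOrders")
    .pipeline(Aggregation.project("customerId", "total"))
    .mapFieldsTo(Order.class);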

View File

@@ -46,7 +46,6 @@ import java.lang.annotation.Target;
* @author Philipp Schneider
* @author Johno Crawford
* @author Christoph Strobl
* @author Dave Perryman
*/
@Target({ ElementType.TYPE })
@Documented
@@ -96,8 +95,7 @@ public @interface CompoundIndex {
boolean unique() default false;
/**
* If set to true, the index will skip over any document that is missing the indexed field. <br />
* Must not be used with {@link #partialFilter()}.
* If set to true, the index will skip over any document that is missing the indexed field.
*
* @return {@literal false} by default.
* @see <a href=
@@ -172,14 +170,4 @@ public @interface CompoundIndex {
*/
boolean background() default false;
/**
* Only index the documents in a collection that meet a specified {@link IndexFilter filter expression}. <br />
* Must not be used with {@link #sparse() sparse = true}.
*
* @return empty by default.
* @see <a href=
* "https://docs.mongodb.com/manual/core/index-partial/">https://docs.mongodb.com/manual/core/index-partial/</a>
* @since 3.1
*/
String partialFilter() default "";
}
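A minimal, hypothetical sketch of the partialFilter attribute documented above; the domain type and filter expression are assumptions.

@Document
@CompoundIndex(def = "{ 'lastname': 1, 'age': -1 }", partialFilter = "{ 'age': { '$gte': 18 } }")
class Person { }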

View File

@@ -53,8 +53,7 @@ public @interface Indexed {
IndexDirection direction() default IndexDirection.ASCENDING;
/**
* If set to true, the index will skip over any document that is missing the indexed field. <br />
* Must not be used with {@link #partialFilter()}.
* If set to true, the index will skip over any document that is missing the indexed field.
*
* @return {@literal false} by default.
* @see <a href=
@@ -171,15 +170,4 @@ public @interface Indexed {
* @since 2.2
*/
String expireAfter() default "";
/**
* Only index the documents in a collection that meet a specified {@link IndexFilter filter expression}. <br />
* Must not be used with {@link #sparse() sparse = true}.
*
* @return empty by default.
* @see <a href=
* "https://docs.mongodb.com/manual/core/index-partial/">https://docs.mongodb.com/manual/core/index-partial/</a>
* @since 3.1
*/
String partialFilter() default "";
}
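A property-level counterpart of the same attribute, again purely illustrative.

@Indexed(partialFilter = "{ 'quantity': { '$gt': 10 } }")
private Integer quantity;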

View File

@@ -15,6 +15,10 @@
*/
package org.springframework.data.mongodb.core.index;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import java.time.Duration;
import java.util.ArrayList;
import java.util.Arrays;
@@ -29,6 +33,7 @@ import java.util.stream.Collectors;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.domain.Sort;
import org.springframework.data.mapping.Association;
@@ -46,7 +51,6 @@ import org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.util.BsonUtils;
import org.springframework.data.spel.EvaluationContextProvider;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.EvaluationContext;
@@ -70,7 +74,6 @@ import org.springframework.util.StringUtils;
* @author Thomas Darimont
* @author Martin Macko
* @author Mark Paluch
* @author Dave Perryman
* @since 1.5
*/
public class MongoPersistentEntityIndexResolver implements IndexResolver {
@@ -382,10 +385,6 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
indexDefinition.background();
}
if (StringUtils.hasText(index.partialFilter())) {
indexDefinition.partial(evaluatePartialFilter(index.partialFilter(), entity));
}
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
@@ -475,25 +474,9 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
}
}
if (StringUtils.hasText(index.partialFilter())) {
indexDefinition.partial(evaluatePartialFilter(index.partialFilter(), persistentProperty.getOwner()));
}
return new IndexDefinitionHolder(dotPath, indexDefinition, collection);
}
private PartialIndexFilter evaluatePartialFilter(String filterExpression, PersistentEntity<?,?> entity) {
Object result = evaluate(filterExpression, getEvaluationContextForProperty(entity));
if (result instanceof org.bson.Document) {
return PartialIndexFilter.of((org.bson.Document) result);
}
return PartialIndexFilter.of(BsonUtils.parse(filterExpression, null));
}
/**
* Creates {@link HashedIndex} wrapped in {@link IndexDefinitionHolder} out of {@link HashIndexed} for a given
* {@link MongoPersistentProperty}.
@@ -750,6 +733,8 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
* @author Christoph Strobl
* @author Mark Paluch
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
@EqualsAndHashCode
static class Path {
private static final Path EMPTY = new Path(Collections.emptyList(), false);
@@ -757,11 +742,6 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
private final List<PersistentProperty<?>> elements;
private final boolean cycle;
private Path(List<PersistentProperty<?>> elements, boolean cycle) {
this.elements = elements;
this.cycle = cycle;
}
/**
* @return an empty {@link Path}.
* @since 1.10.8
@@ -863,28 +843,6 @@ public class MongoPersistentEntityIndexResolver implements IndexResolver {
return builder.toString();
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
Path that = (Path) o;
if (this.cycle != that.cycle) {
return false;
}
return ObjectUtils.nullSafeEquals(this.elements, that.elements);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(elements);
result = 31 * result + (cycle ? 1 : 0);
return result;
}
}
}

View File

@@ -16,8 +16,14 @@
package org.springframework.data.mongodb.core.index;
import org.bson.Document;
import lombok.AccessLevel;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* {@link IndexFilter} implementation for usage with plain {@link Document} as well as {@link CriteriaDefinition} filter
@@ -26,16 +32,10 @@ import org.springframework.util.Assert;
* @author Christoph Strobl
* @since 1.10
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
public class PartialIndexFilter implements IndexFilter {
private final Object filterExpression;
private PartialIndexFilter(Object filterExpression) {
Assert.notNull(filterExpression, "FilterExpression must not be null!");
this.filterExpression = filterExpression;
}
private final @NonNull Object filterExpression;
/**
* Create new {@link PartialIndexFilter} for given {@link Document filter expression}.

View File

@@ -15,12 +15,13 @@
*/
package org.springframework.data.mongodb.core.mapping.event;
import org.reactivestreams.Publisher;
import reactor.core.publisher.Mono;
import org.reactivestreams.Publisher;
import org.springframework.beans.factory.ObjectFactory;
import org.springframework.core.Ordered;
import org.springframework.data.auditing.AuditingHandler;
import org.springframework.data.auditing.ReactiveIsNewAwareAuditingHandler;
import org.springframework.data.auditing.IsNewAwareAuditingHandler;
import org.springframework.data.mapping.callback.EntityCallback;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.util.Assert;
@@ -33,7 +34,7 @@ import org.springframework.util.Assert;
*/
public class ReactiveAuditingEntityCallback implements ReactiveBeforeConvertCallback<Object>, Ordered {
private final ObjectFactory<ReactiveIsNewAwareAuditingHandler> auditingHandlerFactory;
private final ObjectFactory<IsNewAwareAuditingHandler> auditingHandlerFactory;
/**
* Creates a new {@link ReactiveAuditingEntityCallback} using the given {@link MappingContext} and
@@ -41,19 +42,19 @@ public class ReactiveAuditingEntityCallback implements ReactiveBeforeConvertCall
*
* @param auditingHandlerFactory must not be {@literal null}.
*/
public ReactiveAuditingEntityCallback(ObjectFactory<ReactiveIsNewAwareAuditingHandler> auditingHandlerFactory) {
public ReactiveAuditingEntityCallback(ObjectFactory<IsNewAwareAuditingHandler> auditingHandlerFactory) {
Assert.notNull(auditingHandlerFactory, "IsNewAwareAuditingHandler must not be null!");
this.auditingHandlerFactory = auditingHandlerFactory;
}
/*
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.mapping.event.ReactiveBeforeConvertCallback#onBeforeConvert(java.lang.Object, java.lang.String)
*/
@Override
public Publisher<Object> onBeforeConvert(Object entity, String collection) {
return auditingHandlerFactory.getObject().markAudited(entity);
return Mono.just(auditingHandlerFactory.getObject().markAudited(entity));
}
/*

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.messaging;
import lombok.AllArgsConstructor;
import java.time.Instant;
import java.util.Arrays;
import java.util.Collections;
@@ -215,17 +217,12 @@ class ChangeStreamTask extends CursorReadingTask<ChangeStreamDocument<Document>,
*
* @since 2.1
*/
@AllArgsConstructor
static class ChangeStreamEventMessage<T> implements Message<ChangeStreamDocument<Document>, T> {
private final ChangeStreamEvent<T> delegate;
private final MessageProperties messageProperties;
ChangeStreamEventMessage(ChangeStreamEvent<T> delegate, MessageProperties messageProperties) {
this.delegate = delegate;
this.messageProperties = messageProperties;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.messaging.Message#getRaw()

View File

@@ -15,6 +15,10 @@
*/
package org.springframework.data.mongodb.core.messaging;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import java.time.Duration;
import java.util.LinkedHashMap;
import java.util.Map;
@@ -30,7 +34,6 @@ import org.springframework.data.mongodb.core.messaging.SubscriptionRequest.Reque
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ErrorHandler;
import org.springframework.util.ObjectUtils;
/**
* Simple {@link Executor} based {@link MessageListenerContainer} implementation for running {@link Task tasks} like
@@ -256,6 +259,7 @@ public class DefaultMessageListenerContainer implements MessageListenerContainer
* @author Christoph Strobl
* @since 2.1
*/
@EqualsAndHashCode
static class TaskSubscription implements Subscription {
private final Task task;
@@ -282,39 +286,19 @@ public class DefaultMessageListenerContainer implements MessageListenerContainer
public void cancel() throws DataAccessResourceFailureException {
task.cancel();
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
TaskSubscription that = (TaskSubscription) o;
return ObjectUtils.nullSafeEquals(this.task, that.task);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(task);
}
}
/**
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
private static class DecoratingLoggingErrorHandler implements ErrorHandler {
private final Log logger = LogFactory.getLog(DecoratingLoggingErrorHandler.class);
private final ErrorHandler delegate;
DecoratingLoggingErrorHandler(ErrorHandler delegate) {
this.delegate = delegate;
}
@Override
public void handleError(Throwable t) {

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.messaging;
import lombok.ToString;
import org.bson.Document;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.util.ClassUtils;
@@ -24,6 +26,7 @@ import org.springframework.util.ClassUtils;
* @author Mark Paluch
* @since 2.1
*/
@ToString(of = { "delegate", "targetType" })
class LazyMappingDelegatingMessage<S, T> implements Message<S, T> {
private final Message<S, ?> delegate;
@@ -79,8 +82,4 @@ class LazyMappingDelegatingMessage<S, T> implements Message<S, T> {
public MessageProperties getProperties() {
return delegate.getProperties();
}
public String toString() {
return "LazyMappingDelegatingMessage(delegate=" + this.delegate + ", targetType=" + this.targetType + ")";
}
}

View File

@@ -15,9 +15,11 @@
*/
package org.springframework.data.mongodb.core.messaging;
import lombok.EqualsAndHashCode;
import lombok.ToString;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* General message abstraction for any type of Event / Message published by MongoDB server to the client. This might be
@@ -63,6 +65,8 @@ public interface Message<S, T> {
* @author Christoph Strobl
* @since 2.1
*/
@ToString
@EqualsAndHashCode
class MessageProperties {
private static final MessageProperties EMPTY = new MessageProperties();
@@ -107,34 +111,6 @@ public interface Message<S, T> {
return new MessagePropertiesBuilder();
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
MessageProperties that = (MessageProperties) o;
if (!ObjectUtils.nullSafeEquals(this.databaseName, that.databaseName)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.collectionName, that.collectionName);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(databaseName);
result = 31 * result + ObjectUtils.nullSafeHashCode(collectionName);
return result;
}
public String toString() {
return "Message.MessageProperties(databaseName=" + this.getDatabaseName() + ", collectionName="
+ this.getCollectionName() + ")";
}
/**
* Builder for {@link MessageProperties}.
*

View File

@@ -15,9 +15,11 @@
*/
package org.springframework.data.mongodb.core.messaging;
import lombok.EqualsAndHashCode;
import lombok.ToString;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Trivial {@link Message} implementation.
@@ -25,6 +27,8 @@ import org.springframework.util.ObjectUtils;
* @author Christoph Strobl
* @since 2.1
*/
@EqualsAndHashCode
@ToString
class SimpleMessage<S, T> implements Message<S, T> {
private @Nullable final S raw;
@@ -71,35 +75,4 @@ class SimpleMessage<S, T> implements Message<S, T> {
public MessageProperties getProperties() {
return properties;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
SimpleMessage<?, ?> that = (SimpleMessage<?, ?>) o;
if (!ObjectUtils.nullSafeEquals(this.raw, that.raw)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(this.body, that.body)) {
return false;
}
return ObjectUtils.nullSafeEquals(this.properties, that.properties);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(raw);
result = 31 * result + ObjectUtils.nullSafeHashCode(body);
result = 31 * result + ObjectUtils.nullSafeHashCode(properties);
return result;
}
public String toString() {
return "SimpleMessage(raw=" + this.getRaw() + ", body=" + this.getBody() + ", properties=" + this.getProperties()
+ ")";
}
}

View File

@@ -67,7 +67,6 @@ public class BasicUpdate extends Update {
}
@Override
@Deprecated
public Update pushAll(String key, Object[] values) {
Document keyValue = new Document();
keyValue.put(key, values);

View File

@@ -15,6 +15,11 @@
*/
package org.springframework.data.mongodb.core.query;
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import java.util.Locale;
import java.util.Optional;
@@ -510,6 +515,8 @@ public class Collation {
*
* @since 2.0
*/
@AllArgsConstructor(access = AccessLevel.PACKAGE)
@Getter
static class ICUComparisonLevel implements ComparisonLevel {
private final int level;
@@ -519,24 +526,6 @@ public class Collation {
ICUComparisonLevel(int level) {
this(level, Optional.empty(), Optional.empty());
}
ICUComparisonLevel(int level, Optional<CaseFirst> caseFirst, Optional<Boolean> caseLevel) {
this.level = level;
this.caseFirst = caseFirst;
this.caseLevel = caseLevel;
}
public int getLevel() {
return this.level;
}
public Optional<CaseFirst> getCaseFirst() {
return this.caseFirst;
}
public Optional<Boolean> getCaseLevel() {
return this.caseLevel;
}
}
/**
@@ -661,6 +650,7 @@ public class Collation {
/**
* @since 2.0
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
public static class CaseFirst {
private static final CaseFirst UPPER = new CaseFirst("upper");
@@ -669,10 +659,6 @@ public class Collation {
private final String state;
private CaseFirst(String state) {
this.state = state;
}
/**
* Sort uppercase before lowercase.
*
@@ -704,6 +690,7 @@ public class Collation {
/**
* @since 2.0
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
public static class Alternate {
private static final Alternate NON_IGNORABLE = new Alternate("non-ignorable", Optional.empty());
@@ -711,11 +698,6 @@ public class Collation {
final String alternate;
final Optional<String> maxVariable;
Alternate(String alternate, Optional<String> maxVariable) {
this.alternate = alternate;
this.maxVariable = maxVariable;
}
/**
* Consider Whitespace and punctuation as base characters.
*
@@ -779,17 +761,12 @@ public class Collation {
* @see <a href="http://site.icu-project.org">ICU - International Components for Unicode</a>
* @since 2.0
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
public static class CollationLocale {
private final String language;
private final Optional<String> variant;
private CollationLocale(String language, Optional<String> variant) {
this.language = language;
this.variant = variant;
}
/**
* Create new {@link CollationLocale} for given language.
*

View File

@@ -15,134 +15,54 @@
*/
package org.springframework.data.mongodb.core.query;
import lombok.EqualsAndHashCode;
import java.util.HashMap;
import java.util.Map;
import java.util.Map.Entry;
import org.bson.Document;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Field projection.
*
* @author Thomas Risberg
* @author Oliver Gierke
* @author Patryk Wasik
* @author Christoph Strobl
* @author Mark Paluch
* @author Owen Q
*/
@EqualsAndHashCode
public class Field {
private final Map<String, Integer> criteria = new HashMap<>();
private final Map<String, Object> slices = new HashMap<>();
private final Map<String, Criteria> elemMatchs = new HashMap<>();
private final Map<String, Integer> criteria = new HashMap<String, Integer>();
private final Map<String, Object> slices = new HashMap<String, Object>();
private final Map<String, Criteria> elemMatchs = new HashMap<String, Criteria>();
private @Nullable String positionKey;
private int positionValue;
/**
* Include a single {@code field} to be returned by the query operation.
*
* @param field the document field name to be included.
* @return {@code this} field projection instance.
*/
public Field include(String field) {
Assert.notNull(field, "Key must not be null!");
criteria.put(field, 1);
public Field include(String key) {
criteria.put(key, Integer.valueOf(1));
return this;
}
/**
* Include one or more {@code fields} to be returned by the query operation.
*
* @param fields the document field names to be included.
* @return {@code this} field projection instance.
* @since 3.1
*/
public Field include(String... fields) {
Assert.notNull(fields, "Keys must not be null!");
for (String key : fields) {
criteria.put(key, 1);
}
public Field exclude(String key) {
criteria.put(key, Integer.valueOf(0));
return this;
}
/**
* Exclude a single {@code field} from being returned by the query operation.
*
* @param field the document field name to be excluded.
* @return {@code this} field projection instance.
*/
public Field exclude(String field) {
Assert.notNull(field, "Key must not be null!");
criteria.put(field, 0);
public Field slice(String key, int size) {
slices.put(key, Integer.valueOf(size));
return this;
}
/**
* Exclude one or more {@code fields} from being returned by the query operation.
*
* @param fields the document field names to be excluded.
* @return {@code this} field projection instance.
* @since 3.1
*/
public Field exclude(String... fields) {
Assert.notNull(fields, "Keys must not be null!");
for (String key : fields) {
criteria.put(key, 0);
}
public Field slice(String key, int offset, int size) {
slices.put(key, new Integer[] { Integer.valueOf(offset), Integer.valueOf(size) });
return this;
}
/**
* Project a {@code $slice} of the array {@code field} using the first {@code size} elements.
*
* @param field the document field name to project, must be an array field.
* @param size the number of elements to include.
* @return {@code this} field projection instance.
*/
public Field slice(String field, int size) {
Assert.notNull(field, "Key must not be null!");
slices.put(field, size);
return this;
}
/**
* Project a {@code $slice} of the array {@code field} using the first {@code size} elements starting at
* {@code offset}.
*
* @param field the document field name to project, must be an array field.
* @param offset the offset to start at.
* @param size the number of elements to include.
* @return {@code this} field projection instance.
*/
public Field slice(String field, int offset, int size) {
slices.put(field, new Integer[] { offset, size });
return this;
}
public Field elemMatch(String field, Criteria elemMatchCriteria) {
elemMatchs.put(field, elemMatchCriteria);
public Field elemMatch(String key, Criteria elemMatchCriteria) {
elemMatchs.put(key, elemMatchCriteria);
return this;
}
@@ -152,7 +72,7 @@ public class Field {
*
* @param field query array field, must not be {@literal null} or empty.
* @param value
* @return {@code this} field projection instance.
* @return
*/
public Field position(String field, int value) {
@@ -183,40 +103,4 @@ public class Field {
return document;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
Field field = (Field) o;
if (positionValue != field.positionValue) {
return false;
}
if (!ObjectUtils.nullSafeEquals(criteria, field.criteria)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(slices, field.slices)) {
return false;
}
if (!ObjectUtils.nullSafeEquals(elemMatchs, field.elemMatchs)) {
return false;
}
return ObjectUtils.nullSafeEquals(positionKey, field.positionKey);
}
@Override
public int hashCode() {
int result = ObjectUtils.nullSafeHashCode(criteria);
result = 31 * result + ObjectUtils.nullSafeHashCode(slices);
result = 31 * result + ObjectUtils.nullSafeHashCode(elemMatchs);
result = 31 * result + ObjectUtils.nullSafeHashCode(positionKey);
result = 31 * result + positionValue;
return result;
}
}
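A hedged sketch of the projection API shown above, sticking to the single-field methods present on both sides of the diff; the query and field names are hypothetical.

Query query = new Query(Criteria.where("status").is("ACTIVE"));
query.fields()
    .include("firstname")
    .exclude("_id")
    .slice("comments", 5);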

View File

@@ -15,10 +15,14 @@
*/
package org.springframework.data.mongodb.core.query;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.util.Set;
import org.springframework.data.domain.ExampleMatcher;
import org.springframework.util.ObjectUtils;
/**
* {@link ExampleMatcher} implementation for query by example (QBE). Unlike plain {@link ExampleMatcher} this untyped
@@ -29,13 +33,11 @@ import org.springframework.util.ObjectUtils;
* @author Mark Paluch
* @since 2.0
*/
@EqualsAndHashCode
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
public class UntypedExampleMatcher implements ExampleMatcher {
private final ExampleMatcher delegate;
private UntypedExampleMatcher(ExampleMatcher delegate) {
this.delegate = delegate;
}
private final @NonNull ExampleMatcher delegate;
/*
* (non-Javadoc)
@@ -222,21 +224,4 @@ public class UntypedExampleMatcher implements ExampleMatcher {
return delegate.getMatchMode();
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
UntypedExampleMatcher that = (UntypedExampleMatcher) o;
return ObjectUtils.nullSafeEquals(delegate, that.delegate);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(delegate);
}
}

View File

@@ -15,8 +15,10 @@
*/
package org.springframework.data.mongodb.core.schema;
import lombok.AllArgsConstructor;
import lombok.NonNull;
import org.bson.Document;
import org.springframework.util.Assert;
/**
* Value object representing a MongoDB-specific JSON schema which is the default {@link MongoJsonSchema} implementation.
@@ -25,15 +27,10 @@ import org.springframework.util.Assert;
* @author Mark Paluch
* @since 2.1
*/
@AllArgsConstructor
class DefaultMongoJsonSchema implements MongoJsonSchema {
private final JsonSchemaObject root;
DefaultMongoJsonSchema(JsonSchemaObject root) {
Assert.notNull(root, "Root must not be null!");
this.root = root;
}
private final @NonNull JsonSchemaObject root;
/*
* (non-Javadoc)

View File

@@ -15,8 +15,10 @@
*/
package org.springframework.data.mongodb.core.schema;
import lombok.AllArgsConstructor;
import lombok.NonNull;
import org.bson.Document;
import org.springframework.util.Assert;
/**
* JSON schema backed by a {@link org.bson.Document} object.
@@ -24,15 +26,10 @@ import org.springframework.util.Assert;
* @author Mark Paluch
* @since 2.1
*/
@AllArgsConstructor
class DocumentJsonSchema implements MongoJsonSchema {
private final Document document;
DocumentJsonSchema(Document document) {
Assert.notNull(document, "Document must not be null!");
this.document = document;
}
private final @NonNull Document document;
/*
* (non-Javadoc)

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core.schema;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import java.math.BigDecimal;
import java.util.Arrays;
import java.util.Collection;
@@ -41,7 +44,6 @@ import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject.String
import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject.TimestampJsonSchemaObject;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.ObjectUtils;
/**
* Interface that can be implemented by objects that know how to serialize themselves to JSON schema using
@@ -498,14 +500,12 @@ public interface JsonSchemaObject {
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor
@EqualsAndHashCode
class JsonType implements Type {
private final String name;
public JsonType(String name) {
this.name = name;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type#representation()
@@ -523,37 +523,18 @@ public interface JsonSchemaObject {
public String value() {
return name;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
JsonType jsonType = (JsonType) o;
return ObjectUtils.nullSafeEquals(name, jsonType.name);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(name);
}
}
/**
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor
@EqualsAndHashCode
class BsonType implements Type {
private final String name;
BsonType(String name) {
this.name = name;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type#representation()
@@ -571,24 +552,6 @@ public interface JsonSchemaObject {
public String value() {
return name;
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
BsonType bsonType = (BsonType) o;
return ObjectUtils.nullSafeEquals(name, bsonType.name);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(name);
}
}
}
}

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core.schema;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject.NumericJsonSchemaObject;
import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject.ObjectJsonSchemaObject;
import org.springframework.data.mongodb.core.schema.IdentifiableJsonSchemaProperty.*;
@@ -236,14 +239,11 @@ public interface JsonSchemaProperty extends JsonSchemaObject {
/**
* Builder for {@link IdentifiableJsonSchemaProperty}.
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
class JsonSchemaPropertyBuilder {
private final String identifier;
JsonSchemaPropertyBuilder(String identifier) {
this.identifier = identifier;
}
/**
* Configure a {@link Type} for the property.
*

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core.schema;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
@@ -179,6 +182,7 @@ public class UntypedJsonSchemaObject implements JsonSchemaObject {
* @author Christoph Strobl
* @since 2.1
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
static class Restrictions {
private final Collection<? extends Object> possibleValues;
@@ -187,16 +191,6 @@ public class UntypedJsonSchemaObject implements JsonSchemaObject {
private final Collection<JsonSchemaObject> oneOf;
private final @Nullable JsonSchemaObject notMatch;
Restrictions(Collection<? extends Object> possibleValues, Collection<JsonSchemaObject> allOf,
Collection<JsonSchemaObject> anyOf, Collection<JsonSchemaObject> oneOf, JsonSchemaObject notMatch) {
this.possibleValues = possibleValues;
this.allOf = allOf;
this.anyOf = anyOf;
this.oneOf = oneOf;
this.notMatch = notMatch;
}
/**
* @return new empty {@link Restrictions}.
*/

View File

@@ -15,12 +15,15 @@
*/
package org.springframework.data.mongodb.core.validation;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import org.bson.Document;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.CriteriaDefinition;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link Validator} implementation based on {@link CriteriaDefinition query expressions}.
@@ -31,14 +34,12 @@ import org.springframework.util.ObjectUtils;
* @see Criteria
* @see <a href="https://docs.mongodb.com/manual/core/schema-validation/#query-expressions">Schema Validation</a>
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
@EqualsAndHashCode
class CriteriaValidator implements Validator {
private final CriteriaDefinition criteria;
private CriteriaValidator(CriteriaDefinition criteria) {
this.criteria = criteria;
}
/**
* Creates a new {@link Validator} object, which is basically setup of query operators, based on a
* {@link CriteriaDefinition} instance.
@@ -71,21 +72,4 @@ class CriteriaValidator implements Validator {
public String toString() {
return SerializationUtils.serializeToJsonSafely(toDocument());
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
CriteriaValidator that = (CriteriaValidator) o;
return ObjectUtils.nullSafeEquals(criteria, that.criteria);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(criteria);
}
}

View File

@@ -15,10 +15,13 @@
*/
package org.springframework.data.mongodb.core.validation;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import org.bson.Document;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* Most trivial {@link Validator} implementation using plain {@link Document} to describe the desired document structure
@@ -29,14 +32,12 @@ import org.springframework.util.ObjectUtils;
* @since 2.1
* @see <a href="https://docs.mongodb.com/manual/core/schema-validation/">Schema Validation</a>
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
@EqualsAndHashCode
class DocumentValidator implements Validator {
private final Document validatorObject;
private DocumentValidator(Document validatorObject) {
this.validatorObject = validatorObject;
}
/**
* Create new {@link DocumentValidator} defining validation rules via a plain {@link Document}.
*
@@ -67,22 +68,4 @@ class DocumentValidator implements Validator {
public String toString() {
return SerializationUtils.serializeToJsonSafely(validatorObject);
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
DocumentValidator that = (DocumentValidator) o;
return ObjectUtils.nullSafeEquals(validatorObject, that.validatorObject);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(validatorObject);
}
}

View File

@@ -15,11 +15,14 @@
*/
package org.springframework.data.mongodb.core.validation;
import lombok.AccessLevel;
import lombok.EqualsAndHashCode;
import lombok.RequiredArgsConstructor;
import org.bson.Document;
import org.springframework.data.mongodb.core.query.SerializationUtils;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.util.Assert;
import org.springframework.util.ObjectUtils;
/**
* {@link Validator} implementation based on {@link MongoJsonSchema JSON Schema}.
@@ -29,14 +32,12 @@ import org.springframework.util.ObjectUtils;
* @since 2.1
* @see <a href="https://docs.mongodb.com/manual/core/schema-validation/#json-schema">Schema Validation</a>
*/
@RequiredArgsConstructor(access = AccessLevel.PRIVATE)
@EqualsAndHashCode
class JsonSchemaValidator implements Validator {
private final MongoJsonSchema schema;
private JsonSchemaValidator(MongoJsonSchema schema) {
this.schema = schema;
}
/**
* Create new {@link JsonSchemaValidator} defining validation rules via {@link MongoJsonSchema}.
*
@@ -67,22 +68,4 @@ class JsonSchemaValidator implements Validator {
public String toString() {
return SerializationUtils.serializeToJsonSafely(toDocument());
}
@Override
public boolean equals(Object o) {
if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
JsonSchemaValidator that = (JsonSchemaValidator) o;
return ObjectUtils.nullSafeEquals(schema, that.schema);
}
@Override
public int hashCode() {
return ObjectUtils.nullSafeHashCode(schema);
}
}

View File

@@ -129,7 +129,7 @@ public interface GridFsOperations extends ResourcePatternResolver {
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}. If not empty, may override content type within {@literal metadata}.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just created.
*/
@@ -140,12 +140,12 @@ public interface GridFsOperations extends ResourcePatternResolver {
if (StringUtils.hasText(filename)) {
uploadBuilder.filename(filename);
}
if (!ObjectUtils.isEmpty(metadata)) {
uploadBuilder.metadata(metadata);
}
if (StringUtils.hasText(contentType)) {
uploadBuilder.contentType(contentType);
}
if (!ObjectUtils.isEmpty(metadata)) {
uploadBuilder.metadata(metadata);
}
return store(uploadBuilder.build());
}
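A hedged sketch of the store variant documented above, assuming an InputStream-based overload and hypothetical filename and metadata values.

Document metadata = new Document("source", "import");
ObjectId id = gridFsOperations.store(inputStream, "customer.txt", "text/plain", metadata);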

View File

@@ -135,7 +135,7 @@ public interface ReactiveGridFsOperations {
*
* @param content must not be {@literal null}.
* @param filename must not be {@literal null} or empty.
* @param contentType can be {@literal null}. If not empty, may override content type within {@literal metadata}.
* @param contentType can be {@literal null}.
* @param metadata can be {@literal null}.
* @return a {@link Mono} emitting the {@link ObjectId} of the {@link com.mongodb.client.gridfs.model.GridFSFile} just
* created.
@@ -148,12 +148,12 @@ public interface ReactiveGridFsOperations {
if (StringUtils.hasText(filename)) {
uploadBuilder.filename(filename);
}
if (!ObjectUtils.isEmpty(metadata)) {
uploadBuilder.metadata(metadata);
}
if (StringUtils.hasText(contentType)) {
uploadBuilder.contentType(contentType);
}
if (!ObjectUtils.isEmpty(metadata)) {
uploadBuilder.metadata(metadata);
}
return store(uploadBuilder.build());
}
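The reactive counterpart is analogous, assuming a Publisher<DataBuffer> content source and the same hypothetical values.

Mono<ObjectId> id = reactiveGridFsOperations.store(dataBufferFlux, "customer.txt", "text/plain", metadata);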

View File

@@ -16,8 +16,6 @@
package org.springframework.data.mongodb.repository.query;
import org.bson.Document;
import org.bson.codecs.configuration.CodecRegistry;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.ExecutableFindOperation.ExecutableFind;
import org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery;
import org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind;
@@ -32,14 +30,10 @@ import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.spel.ExpressionDependencies;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.client.MongoDatabase;
/**
* Base class for {@link RepositoryQuery} implementations for Mongo.
*
@@ -53,7 +47,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
private final MongoQueryMethod method;
private final MongoOperations operations;
private final ExecutableFind<?> executableFind;
private final ExpressionParser expressionParser;
private final SpelExpressionParser expressionParser;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
/**
@@ -64,7 +58,7 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
* @param expressionParser must not be {@literal null}.
* @param evaluationContextProvider must not be {@literal null}.
*/
public AbstractMongoQuery(MongoQueryMethod method, MongoOperations operations, ExpressionParser expressionParser,
public AbstractMongoQuery(MongoQueryMethod method, MongoOperations operations, SpelExpressionParser expressionParser,
QueryMethodEvaluationContextProvider evaluationContextProvider) {
Assert.notNull(operations, "MongoOperations must not be null!");
@@ -214,29 +208,6 @@ public abstract class AbstractMongoQuery implements RepositoryQuery {
return applyQueryMetaAttributesWhenPresent(createQuery(accessor));
}
/**
* Obtain the {@link EvaluationContext} suitable to evaluate expressions backed by the given dependencies.

*
* @param dependencies must not be {@literal null}.
* @param accessor must not be {@literal null}.
* @return the {@link SpELExpressionEvaluator}.
* @since 2.4
*/
protected SpELExpressionEvaluator getSpELExpressionEvaluatorFor(ExpressionDependencies dependencies,
ConvertingParameterAccessor accessor) {
return new DefaultSpELExpressionEvaluator(expressionParser, evaluationContextProvider
.getEvaluationContext(getQueryMethod().getParameters(), accessor.getValues(), dependencies));
}
/**
* @return the {@link CodecRegistry} used.
* @since 2.4
*/
protected CodecRegistry getCodecRegistry() {
return operations.execute(MongoDatabase::getCodecRegistry);
}
/**
* Creates a {@link Query} instance using the given {@link ParameterAccessor}
*

View File

@@ -15,15 +15,13 @@
*/
package org.springframework.data.mongodb.repository.query;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.bson.Document;
import org.bson.codecs.configuration.CodecRegistry;
import org.reactivestreams.Publisher;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mapping.model.EntityInstantiators;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithProjection;
import org.springframework.data.mongodb.core.ReactiveFindOperation.FindWithQuery;
@@ -35,17 +33,14 @@ import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecu
import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecution.ResultProcessingConverter;
import org.springframework.data.mongodb.repository.query.ReactiveMongoQueryExecution.ResultProcessingExecution;
import org.springframework.data.repository.query.ParameterAccessor;
import org.springframework.data.repository.query.ReactiveQueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.spel.ExpressionDependencies;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import com.mongodb.MongoClientSettings;
/**
* Base class for reactive {@link RepositoryQuery} implementations for MongoDB.
*
@@ -59,8 +54,8 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
private final ReactiveMongoOperations operations;
private final EntityInstantiators instantiators;
private final FindWithProjection<?> findOperationWithProjection;
private final ExpressionParser expressionParser;
private final ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider;
private final SpelExpressionParser expressionParser;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
/**
* Creates a new {@link AbstractReactiveMongoQuery} from the given {@link MongoQueryMethod} and
@@ -72,12 +67,12 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* @param evaluationContextProvider must not be {@literal null}.
*/
public AbstractReactiveMongoQuery(ReactiveMongoQueryMethod method, ReactiveMongoOperations operations,
ExpressionParser expressionParser, ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider) {
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
Assert.notNull(method, "MongoQueryMethod must not be null!");
Assert.notNull(operations, "ReactiveMongoOperations must not be null!");
Assert.notNull(expressionParser, "SpelExpressionParser must not be null!");
Assert.notNull(evaluationContextProvider, "ReactiveEvaluationContextExtension must not be null!");
Assert.notNull(evaluationContextProvider, "QueryMethodEvaluationContextProvider must not be null!");
this.method = method;
this.operations = operations;
@@ -103,21 +98,25 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* (non-Javadoc)
* @see org.springframework.data.repository.query.RepositoryQuery#execute(java.lang.Object[])
*/
public Publisher<Object> execute(Object[] parameters) {
public Object execute(Object[] parameters) {
return method.hasReactiveWrapperParameter() ? executeDeferred(parameters)
: execute(new MongoParametersParameterAccessor(method, parameters));
}
@SuppressWarnings("unchecked")
private Publisher<Object> executeDeferred(Object[] parameters) {
private Object executeDeferred(Object[] parameters) {
ReactiveMongoParameterAccessor parameterAccessor = new ReactiveMongoParameterAccessor(method, parameters);
return execute(parameterAccessor);
if (getQueryMethod().isCollectionQuery()) {
return Flux.defer(() -> (Publisher<Object>) execute(parameterAccessor));
}
return Mono.defer(() -> (Mono<Object>) execute(parameterAccessor));
}
private Publisher<Object> execute(MongoParameterAccessor parameterAccessor) {
private Object execute(MongoParameterAccessor parameterAccessor) {
ConvertingParameterAccessor accessor = new ConvertingParameterAccessor(operations.getConverter(),
parameterAccessor);
@@ -142,26 +141,24 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* @param accessor for providing invocation arguments. Never {@literal null}.
* @param typeToRead the desired component target type. Can be {@literal null}.
*/
protected Publisher<Object> doExecute(ReactiveMongoQueryMethod method, ResultProcessor processor,
protected Object doExecute(ReactiveMongoQueryMethod method, ResultProcessor processor,
ConvertingParameterAccessor accessor, @Nullable Class<?> typeToRead) {
return createQuery(accessor).flatMapMany(it -> {
Query query = createQuery(accessor);
Query query = it;
applyQueryMetaAttributesWhenPresent(query);
query = applyAnnotatedDefaultSortIfPresent(query);
query = applyAnnotatedCollationIfPresent(query, accessor);
applyQueryMetaAttributesWhenPresent(query);
query = applyAnnotatedDefaultSortIfPresent(query);
query = applyAnnotatedCollationIfPresent(query, accessor);
FindWithQuery<?> find = typeToRead == null //
? findOperationWithProjection //
: findOperationWithProjection.as(typeToRead);
FindWithQuery<?> find = typeToRead == null //
? findOperationWithProjection //
: findOperationWithProjection.as(typeToRead);
String collection = method.getEntityInformation().getCollectionName();
String collection = method.getEntityInformation().getCollectionName();
ReactiveMongoQueryExecution execution = getExecution(accessor,
new ResultProcessingConverter(processor, operations, instantiators), find);
return execution.execute(query, processor.getReturnedType().getDomainType(), collection);
});
ReactiveMongoQueryExecution execution = getExecution(accessor,
new ResultProcessingConverter(processor, operations, instantiators), find);
return execution.execute(query, processor.getReturnedType().getDomainType(), collection);
}
/**
@@ -257,37 +254,8 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* @param accessor must not be {@literal null}.
* @return
*/
protected Mono<Query> createCountQuery(ConvertingParameterAccessor accessor) {
return createQuery(accessor).map(this::applyQueryMetaAttributesWhenPresent);
}
/**
* Obtain a {@link Mono publisher} emitting the {@link SpELExpressionEvaluator} suitable to evaluate expressions
* backed by the given dependencies.
*
* @param dependencies must not be {@literal null}.
* @param accessor must not be {@literal null}.
* @return a {@link Mono} emitting the {@link SpELExpressionEvaluator} when ready.
* @since 2.4
*/
protected Mono<SpELExpressionEvaluator> getSpelEvaluatorFor(ExpressionDependencies dependencies,
ConvertingParameterAccessor accessor) {
return evaluationContextProvider
.getEvaluationContextLater(getQueryMethod().getParameters(), accessor.getValues(), dependencies)
.map(evaluationContext -> (SpELExpressionEvaluator) new DefaultSpELExpressionEvaluator(expressionParser,
evaluationContext))
.defaultIfEmpty(DefaultSpELExpressionEvaluator.unsupported());
}
/**
* @return a {@link Mono} emitting the {@link CodecRegistry} when ready.
* @since 2.4
*/
protected Mono<CodecRegistry> getCodecRegistry() {
return Mono.from(operations.execute(db -> Mono.just(db.getCodecRegistry())))
.defaultIfEmpty(MongoClientSettings.getDefaultCodecRegistry());
protected Query createCountQuery(ConvertingParameterAccessor accessor) {
return applyQueryMetaAttributesWhenPresent(createQuery(accessor));
}
/**
@@ -296,7 +264,7 @@ public abstract class AbstractReactiveMongoQuery implements RepositoryQuery {
* @param accessor must not be {@literal null}.
* @return
*/
protected abstract Mono<Query> createQuery(ConvertingParameterAccessor accessor);
protected abstract Query createQuery(ConvertingParameterAccessor accessor);
/**
* Returns whether the query should get a count projection applied.

View File

@@ -15,7 +15,10 @@
*/
package org.springframework.data.mongodb.repository.query;
import lombok.experimental.UtilityClass;
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
@@ -24,13 +27,16 @@ import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort.Order;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Meta;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
@@ -43,9 +49,10 @@ import org.springframework.util.StringUtils;
* @author Mark Paluch
* @since 2.2
*/
abstract class AggregationUtils {
@UtilityClass
class AggregationUtils {
private AggregationUtils() {}
private static final ParameterBindingDocumentCodec CODEC = new ParameterBindingDocumentCodec();
/**
* Apply a collation extracted from the given {@literal collationExpression} to the given
@@ -57,12 +64,12 @@ abstract class AggregationUtils {
* @param accessor must not be {@literal null}.
* @return the {@link Query} having proper {@link Collation}.
* @see AggregationOptions#getCollation()
* @see CollationUtils#computeCollation(String, ConvertingParameterAccessor, MongoParameters, ExpressionParser,
* @see CollationUtils#computeCollation(String, ConvertingParameterAccessor, MongoParameters, SpelExpressionParser,
* QueryMethodEvaluationContextProvider)
*/
static AggregationOptions.Builder applyCollation(AggregationOptions.Builder builder,
@Nullable String collationExpression, ConvertingParameterAccessor accessor, MongoParameters parameters,
ExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
Collation collation = CollationUtils.computeCollation(collationExpression, accessor, parameters, expressionParser,
evaluationContextProvider);
@@ -98,6 +105,32 @@ abstract class AggregationUtils {
return builder;
}
/**
* Compute the {@link AggregationOperation aggregation} pipeline for the given {@link MongoQueryMethod}. The raw
* {@link org.springframework.data.mongodb.repository.Aggregation#pipeline()} is parsed with a
* {@link ParameterBindingDocumentCodec} to obtain the MongoDB-native {@link Document} representation returned by
* {@link AggregationOperation#toDocument(AggregationOperationContext)}, which is then mapped against the domain type
* properties.
*
* @param method
* @param accessor
* @param expressionParser
* @param evaluationContextProvider
* @return
*/
static List<AggregationOperation> computePipeline(MongoQueryMethod method, ConvertingParameterAccessor accessor,
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
ParameterBindingContext bindingContext = new ParameterBindingContext((accessor::getBindableValue), expressionParser,
() -> evaluationContextProvider.getEvaluationContext(method.getParameters(), accessor.getValues()));
List<AggregationOperation> target = new ArrayList<>(method.getAnnotatedAggregation().length);
for (String source : method.getAnnotatedAggregation()) {
target.add(ctx -> ctx.getMappedObject(CODEC.decode(source, bindingContext), method.getDomainClass()));
}
return target;
}
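To illustrate what this helper consumes (a sketch, not part of the diff; the Person domain type and repository are hypothetical), an @Aggregation-annotated repository method whose pipeline stages carry ?0-style placeholders that the ParameterBindingDocumentCodec binds at invocation time:

import java.util.List;

import org.springframework.data.mongodb.repository.Aggregation;
import org.springframework.data.repository.Repository;

interface PersonRepository extends Repository<Person, String> {

    // Each stage string is decoded with the ParameterBindingDocumentCodec;
    // ?0 is replaced with the bound 'lastname' argument before mapping against Person.
    @Aggregation(pipeline = {
            "{ $match : { lastname : ?0 } }",
            "{ $project : { firstname : 1 } }"
    })
    List<Person> findByLastname(String lastname);
}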
/**
* Append {@code $sort} aggregation stage if {@link ConvertingParameterAccessor#getSort()} is present.
*

View File

@@ -15,17 +15,16 @@
*/
package org.springframework.data.mongodb.repository.query;
import lombok.experimental.UtilityClass;
/**
* Utility class containing methods to interact with boolean values.
*
* @author Mark Paluch
* @since 2.0.9
*/
final class BooleanUtil {
private BooleanUtil() {
throw new UnsupportedOperationException("This is a utility class and cannot be instantiated");
}
@UtilityClass
class BooleanUtil {
/**
* Count the number of {@literal true} values.

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.repository.query;
import lombok.experimental.UtilityClass;
import java.util.Locale;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
@@ -24,7 +26,6 @@ import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.NumberUtils;
@@ -38,14 +39,12 @@ import org.springframework.util.StringUtils;
* @author Christoph Strobl
* @since 2.2
*/
abstract class CollationUtils {
@UtilityClass
class CollationUtils {
private static final ParameterBindingDocumentCodec CODEC = new ParameterBindingDocumentCodec();
private static final Pattern PARAMETER_BINDING_PATTERN = Pattern.compile("\\?(\\d+)");
private CollationUtils() {
}
/**
* Compute the {@link Collation} by inspecting the {@link ConvertingParameterAccessor#getCollation() parameter
* accessor} or, if a {@literal collationExpression} is given, by parsing it.
@@ -60,7 +59,7 @@ abstract class CollationUtils {
*/
@Nullable
static Collation computeCollation(@Nullable String collationExpression, ConvertingParameterAccessor accessor,
MongoParameters parameters, ExpressionParser expressionParser,
MongoParameters parameters, SpelExpressionParser expressionParser,
QueryMethodEvaluationContextProvider evaluationContextProvider) {
if (accessor.getCollation() != null) {
@@ -73,9 +72,8 @@ abstract class CollationUtils {
if (StringUtils.trimLeadingWhitespace(collationExpression).startsWith("{")) {
ParameterBindingContext bindingContext = ParameterBindingContext.forExpressions(accessor::getBindableValue,
expressionParser, dependencies -> evaluationContextProvider.getEvaluationContext(parameters,
accessor.getValues(), dependencies));
ParameterBindingContext bindingContext = new ParameterBindingContext((accessor::getBindableValue),
expressionParser, () -> evaluationContextProvider.getEvaluationContext(parameters, accessor.getValues()));
return Collation.from(CODEC.decode(collationExpression, bindingContext));
}
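For reference, a sketch (not part of the diff; the repository and Person type are hypothetical) of a query method whose collation attribute is the document-style collationExpression this method parses, with a placeholder bound from the method arguments:

import java.util.List;

import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.Repository;

interface PersonRepository extends Repository<Person, String> {

    // The collation attribute starts with '{', so computeCollation(..) decodes it with the
    // ParameterBindingDocumentCodec and binds '?1' to the 'locale' argument.
    @Query(value = "{ 'lastname' : ?0 }", collation = "{ 'locale' : '?1' }")
    List<Person> findByLastname(String lastname, String locale);
}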

View File

@@ -1,73 +0,0 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.query;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.ExpressionParser;
/**
* Simple {@link SpELExpressionEvaluator} implementation using {@link ExpressionParser} and {@link EvaluationContext}.
*
* @author Mark Paluch
* @since 3.1
*/
class DefaultSpELExpressionEvaluator implements SpELExpressionEvaluator {
private final ExpressionParser parser;
private final EvaluationContext context;
DefaultSpELExpressionEvaluator(ExpressionParser parser, EvaluationContext context) {
this.parser = parser;
this.context = context;
}
/**
* Return a {@link SpELExpressionEvaluator} that does not support expression evaluation.
*
* @return a {@link SpELExpressionEvaluator} that does not support expression evaluation.
* @since 3.1
*/
public static SpELExpressionEvaluator unsupported() {
return NoOpExpressionEvaluator.INSTANCE;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.model.SpELExpressionEvaluator#evaluate(java.lang.String)
*/
@Override
@SuppressWarnings("unchecked")
public <T> T evaluate(String expression) {
return (T) parser.parseExpression(expression).getValue(context, Object.class);
}
/**
* {@link SpELExpressionEvaluator} that does not support SpEL evaluation.
*
* @author Mark Paluch
* @since 3.1
*/
enum NoOpExpressionEvaluator implements SpELExpressionEvaluator {
INSTANCE;
@Override
public <T> T evaluate(String expression) {
throw new UnsupportedOperationException("Expression evaluation not supported");
}
}
}
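A minimal usage sketch (not part of the diff) of how this now-deleted evaluator pairs a SpEL parser with an evaluation context; the variable name is made up, and the example assumes it lives in the same package since the class is package-private:

import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;

class SpelEvaluatorSketch {

    String greet() {

        StandardEvaluationContext context = new StandardEvaluationContext();
        context.setVariable("name", "MongoDB");

        SpELExpressionEvaluator evaluator =
                new DefaultSpELExpressionEvaluator(new SpelExpressionParser(), context);

        // Parses the expression and evaluates it against the configured context.
        return evaluator.evaluate("'Hello ' + #name");
    }
}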

View File

@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.repository.query;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import java.util.List;
import org.springframework.data.domain.Page;
@@ -27,7 +30,6 @@ import org.springframework.data.geo.GeoPage;
import org.springframework.data.geo.GeoResult;
import org.springframework.data.geo.GeoResults;
import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.ExecutableFindOperation;
import org.springframework.data.mongodb.core.ExecutableFindOperation.FindWithQuery;
import org.springframework.data.mongodb.core.ExecutableFindOperation.TerminatingFind;
import org.springframework.data.mongodb.core.MongoOperations;
@@ -35,7 +37,6 @@ import org.springframework.data.mongodb.core.query.NearQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.repository.support.PageableExecutionUtils;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import com.mongodb.client.result.DeleteResult;
@@ -61,19 +62,11 @@ interface MongoQueryExecution {
* @author Christoph Strobl
* @since 1.5
*/
@RequiredArgsConstructor
final class SlicedExecution implements MongoQueryExecution {
private final FindWithQuery<?> find;
private final Pageable pageable;
public SlicedExecution(ExecutableFindOperation.FindWithQuery<?> find, Pageable pageable) {
Assert.notNull(find, "Find must not be null!");
Assert.notNull(pageable, "Pageable must not be null!");
this.find = find;
this.pageable = pageable;
}
private final @NonNull FindWithQuery<?> find;
private final @NonNull Pageable pageable;
/*
* (non-Javadoc)
@@ -102,19 +95,11 @@ interface MongoQueryExecution {
* @author Mark Paluch
* @author Christoph Strobl
*/
@RequiredArgsConstructor
final class PagedExecution implements MongoQueryExecution {
private final FindWithQuery<?> operation;
private final Pageable pageable;
public PagedExecution(ExecutableFindOperation.FindWithQuery<?> operation, Pageable pageable) {
Assert.notNull(operation, "Operation must not be null!");
Assert.notNull(pageable, "Pageable must not be null!");
this.operation = operation;
this.pageable = pageable;
}
private final @NonNull FindWithQuery<?> operation;
private final @NonNull Pageable pageable;
/*
* (non-Javadoc)
@@ -148,23 +133,12 @@ interface MongoQueryExecution {
*
* @author Oliver Gierke
*/
@RequiredArgsConstructor
class GeoNearExecution implements MongoQueryExecution {
private final FindWithQuery<?> operation;
private final MongoQueryMethod method;
private final MongoParameterAccessor accessor;
public GeoNearExecution(ExecutableFindOperation.FindWithQuery<?> operation, MongoQueryMethod method,
MongoParameterAccessor accessor) {
Assert.notNull(operation, "Operation must not be null!");
Assert.notNull(method, "Method must not be null!");
Assert.notNull(accessor, "Accessor must not be null!");
this.operation = operation;
this.method = method;
this.accessor = accessor;
}
private final @NonNull FindWithQuery<?> operation;
private final @NonNull MongoQueryMethod method;
private final @NonNull MongoParameterAccessor accessor;
/*
* (non-Javadoc)
@@ -262,19 +236,11 @@ interface MongoQueryExecution {
* @author Christoph Strobl
* @since 1.5
*/
@RequiredArgsConstructor
final class DeleteExecution implements MongoQueryExecution {
private final MongoOperations operations;
private final MongoQueryMethod method;
public DeleteExecution(MongoOperations operations, MongoQueryMethod method) {
Assert.notNull(operations, "Operations must not be null!");
Assert.notNull(method, "Method must not be null!");
this.operations = operations;
this.method = method;
}
private final @NonNull MongoOperations operations;
private final @NonNull MongoQueryMethod method;
/*
* (non-Javadoc)
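For readers unfamiliar with the Lombok annotations introduced above, a rough sketch (not part of the diff, and only approximately what Lombok generates) of the hand-written constructor that @RequiredArgsConstructor with @NonNull fields replaces for DeleteExecution:

import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.repository.query.MongoQueryMethod;

final class DeleteExecutionEquivalentSketch {

    private final MongoOperations operations;
    private final MongoQueryMethod method;

    DeleteExecutionEquivalentSketch(MongoOperations operations, MongoQueryMethod method) {

        // @NonNull makes Lombok emit null checks; the exact exception message may differ by version.
        if (operations == null) {
            throw new NullPointerException("operations is marked non-null but is null");
        }
        if (method == null) {
            throw new NullPointerException("method is marked non-null but is null");
        }

        this.operations = operations;
        this.method = method;
    }
}

Note that the explicit constructors being removed used Assert.notNull(..), which throws IllegalArgumentException, whereas Lombok's @NonNull check throws NullPointerException by default.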

View File

@@ -32,7 +32,7 @@ import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.repository.query.ReturnedType;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.StringUtils;
/**
@@ -59,7 +59,7 @@ public class PartTreeMongoQuery extends AbstractMongoQuery {
* @param evaluationContextProvider must not be {@literal null}.
*/
public PartTreeMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations,
ExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
super(method, mongoOperations, expressionParser, evaluationContextProvider);

View File

@@ -21,7 +21,6 @@ import org.springframework.aop.framework.ProxyFactory;
import org.springframework.data.mongodb.core.query.Collation;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
@@ -77,7 +76,7 @@ class QueryUtils {
* @since 2.2
*/
static Query applyCollation(Query query, @Nullable String collationExpression, ConvertingParameterAccessor accessor,
MongoParameters parameters, ExpressionParser expressionParser,
MongoParameters parameters, SpelExpressionParser expressionParser,
QueryMethodEvaluationContextProvider evaluationContextProvider) {
Collation collation = CollationUtils.computeCollation(collationExpression, accessor, parameters, expressionParser,

View File

@@ -15,10 +15,13 @@
*/
package org.springframework.data.mongodb.repository.query;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import org.reactivestreams.Publisher;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Range;
@@ -35,7 +38,6 @@ import org.springframework.data.repository.util.ReactiveWrappers;
import org.springframework.data.util.ReflectionUtils;
import org.springframework.data.util.TypeInformation;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
/**
@@ -49,33 +51,26 @@ import org.springframework.util.ClassUtils;
*/
interface ReactiveMongoQueryExecution {
Publisher<? extends Object> execute(Query query, Class<?> type, String collection);
Object execute(Query query, Class<?> type, String collection);
/**
* {@link MongoQueryExecution} to execute geo-near queries.
*
* @author Mark Paluch
*/
@RequiredArgsConstructor
class GeoNearExecution implements ReactiveMongoQueryExecution {
private final ReactiveMongoOperations operations;
private final MongoParameterAccessor accessor;
private final TypeInformation<?> returnType;
public GeoNearExecution(ReactiveMongoOperations operations, MongoParameterAccessor accessor,
TypeInformation<?> returnType) {
this.operations = operations;
this.accessor = accessor;
this.returnType = returnType;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery.Execution#execute(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/
@Override
public Publisher<? extends Object> execute(Query query, Class<?> type, String collection) {
public Object execute(Query query, Class<?> type, String collection) {
Flux<GeoResult<Object>> results = doExecuteQuery(query, type, collection);
return isStreamOfGeoResult() ? results : results.map(GeoResult::getContent);
@@ -118,22 +113,18 @@ interface ReactiveMongoQueryExecution {
* @author Mark Paluch
* @author Artyom Gabeev
*/
@RequiredArgsConstructor
final class DeleteExecution implements ReactiveMongoQueryExecution {
private final ReactiveMongoOperations operations;
private final MongoQueryMethod method;
public DeleteExecution(ReactiveMongoOperations operations, MongoQueryMethod method) {
this.operations = operations;
this.method = method;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery.Execution#execute(org.springframework.data.mongodb.core.query.Query, java.lang.Class, java.lang.String)
*/
@Override
public Publisher<? extends Object> execute(Query query, Class<?> type, String collection) {
public Object execute(Query query, Class<?> type, String collection) {
if (method.isCollectionQuery()) {
return operations.findAllAndRemove(query, type, collection);
@@ -152,23 +143,15 @@ interface ReactiveMongoQueryExecution {
* An {@link ReactiveMongoQueryExecution} that wraps the results of the given delegate with the given result
* processing.
*/
@RequiredArgsConstructor
final class ResultProcessingExecution implements ReactiveMongoQueryExecution {
private final ReactiveMongoQueryExecution delegate;
private final Converter<Object, Object> converter;
public ResultProcessingExecution(ReactiveMongoQueryExecution delegate, Converter<Object, Object> converter) {
Assert.notNull(delegate, "Delegate must not be null!");
Assert.notNull(converter, "Converter must not be null!");
this.delegate = delegate;
this.converter = converter;
}
private final @NonNull ReactiveMongoQueryExecution delegate;
private final @NonNull Converter<Object, Object> converter;
@Override
public Publisher<? extends Object> execute(Query query, Class<?> type, String collection) {
return (Publisher) converter.convert(delegate.execute(query, type, collection));
public Object execute(Query query, Class<?> type, String collection) {
return converter.convert(delegate.execute(query, type, collection));
}
}
@@ -177,23 +160,12 @@ interface ReactiveMongoQueryExecution {
*
* @author Mark Paluch
*/
@RequiredArgsConstructor
final class ResultProcessingConverter implements Converter<Object, Object> {
private final ResultProcessor processor;
private final ReactiveMongoOperations operations;
private final EntityInstantiators instantiators;
public ResultProcessingConverter(ResultProcessor processor, ReactiveMongoOperations operations,
EntityInstantiators instantiators) {
Assert.notNull(processor, "Processor must not be null!");
Assert.notNull(operations, "Operations must not be null!");
Assert.notNull(instantiators, "Instantiators must not be null!");
this.processor = processor;
this.operations = operations;
this.instantiators = instantiators;
}
private final @NonNull ResultProcessor processor;
private final @NonNull ReactiveMongoOperations operations;
private final @NonNull EntityInstantiators instantiators;
/*
* (non-Javadoc)

View File

@@ -94,8 +94,8 @@ public class ReactiveMongoQueryMethod extends MongoQueryMethod {
}
this.method = method;
this.isCollectionQuery = Lazy.of(() -> (!(isPageQuery() || isSliceQuery())
&& ReactiveWrappers.isMultiValueType(metadata.getReturnType(method).getType()) || super.isCollectionQuery()));
this.isCollectionQuery = Lazy.of(() -> !(isPageQuery() || isSliceQuery())
&& ReactiveWrappers.isMultiValueType(metadata.getReturnType(method).getType()));
}
/*

View File

@@ -15,10 +15,9 @@
*/
package org.springframework.data.mongodb.repository.query;
import reactor.core.publisher.Mono;
import org.bson.Document;
import org.bson.json.JsonParseException;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
@@ -27,12 +26,12 @@ import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.TextCriteria;
import org.springframework.data.repository.query.QueryMethod;
import org.springframework.data.repository.query.ReactiveQueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.repository.query.ReturnedType;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.StringUtils;
/**
@@ -58,7 +57,7 @@ public class ReactivePartTreeMongoQuery extends AbstractReactiveMongoQuery {
* @param evaluationContextProvider must not be {@literal null}.
*/
public ReactivePartTreeMongoQuery(ReactiveMongoQueryMethod method, ReactiveMongoOperations mongoOperations,
ExpressionParser expressionParser, ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider) {
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
super(method, mongoOperations, expressionParser, evaluationContextProvider);
@@ -82,28 +81,11 @@ public class ReactivePartTreeMongoQuery extends AbstractReactiveMongoQuery {
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#createQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor, boolean)
*/
@Override
protected Mono<Query> createQuery(ConvertingParameterAccessor accessor) {
return Mono.fromSupplier(() -> createQueryInternal(accessor, false));
}
protected Query createQuery(ConvertingParameterAccessor accessor) {
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#createCountQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
*/
@Override
protected Mono<Query> createCountQuery(ConvertingParameterAccessor accessor) {
return Mono.fromSupplier(() -> createQueryInternal(accessor, true));
}
private Query createQueryInternal(ConvertingParameterAccessor accessor, boolean isCountQuery) {
MongoQueryCreator creator = new MongoQueryCreator(tree, accessor, context, isCountQuery ? false : isGeoNearQuery);
MongoQueryCreator creator = new MongoQueryCreator(tree, accessor, context, isGeoNearQuery);
Query query = creator.createQuery();
if (isCountQuery) {
return query;
}
if (tree.isLimiting()) {
query.limit(tree.getMaxResults());
}
@@ -132,12 +114,22 @@ public class ReactivePartTreeMongoQuery extends AbstractReactiveMongoQuery {
result.setSortObject(query.getSortObject());
return result;
} catch (JsonParseException o_O) {
throw new IllegalStateException(String.format("Invalid query or field specification in %s!", getQueryMethod()),
o_O);
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#createCountQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
*/
@Override
protected Query createCountQuery(ConvertingParameterAccessor accessor) {
return new MongoQueryCreator(tree, accessor, context, false).createQuery();
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#isCountQuery()

View File

@@ -16,13 +16,11 @@
package org.springframework.data.mongodb.repository.query;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import java.util.ArrayList;
import java.util.List;
import org.bson.Document;
import org.reactivestreams.Publisher;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
@@ -31,12 +29,9 @@ import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.repository.query.ReactiveQueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.spel.ExpressionDependencies;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.ClassUtils;
/**
@@ -49,8 +44,8 @@ import org.springframework.util.ClassUtils;
*/
public class ReactiveStringBasedAggregation extends AbstractReactiveMongoQuery {
private final ExpressionParser expressionParser;
private final ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider;
private final SpelExpressionParser expressionParser;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
private final ReactiveMongoOperations reactiveMongoOperations;
private final MongoConverter mongoConverter;
@@ -61,8 +56,8 @@ public class ReactiveStringBasedAggregation extends AbstractReactiveMongoQuery {
* @param evaluationContextProvider must not be {@literal null}.
*/
public ReactiveStringBasedAggregation(ReactiveMongoQueryMethod method,
ReactiveMongoOperations reactiveMongoOperations, ExpressionParser expressionParser,
ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider) {
ReactiveMongoOperations reactiveMongoOperations, SpelExpressionParser expressionParser,
QueryMethodEvaluationContextProvider evaluationContextProvider) {
super(method, reactiveMongoOperations, expressionParser, evaluationContextProvider);
@@ -77,75 +72,52 @@ public class ReactiveStringBasedAggregation extends AbstractReactiveMongoQuery {
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#doExecute(org.springframework.data.mongodb.repository.query.ReactiveMongoQueryMethod, org.springframework.data.repository.query.ResultProcessor, org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor, java.lang.Class)
*/
@Override
protected Publisher<Object> doExecute(ReactiveMongoQueryMethod method, ResultProcessor processor,
protected Object doExecute(ReactiveMongoQueryMethod method, ResultProcessor processor,
ConvertingParameterAccessor accessor, Class<?> typeToRead) {
return computePipeline(accessor).flatMapMany(it -> {
Class<?> sourceType = method.getDomainClass();
Class<?> targetType = typeToRead;
Class<?> sourceType = method.getDomainClass();
Class<?> targetType = typeToRead;
List<AggregationOperation> pipeline = computePipeline(accessor);
AggregationUtils.appendSortIfPresent(pipeline, accessor, typeToRead);
AggregationUtils.appendLimitAndOffsetIfPresent(pipeline, accessor);
List<AggregationOperation> pipeline = it;
boolean isSimpleReturnType = isSimpleReturnType(typeToRead);
boolean isRawReturnType = ClassUtils.isAssignable(org.bson.Document.class, typeToRead);
AggregationUtils.appendSortIfPresent(pipeline, accessor, typeToRead);
AggregationUtils.appendLimitAndOffsetIfPresent(pipeline, accessor);
if (isSimpleReturnType || isRawReturnType) {
targetType = Document.class;
}
boolean isSimpleReturnType = isSimpleReturnType(typeToRead);
boolean isRawReturnType = ClassUtils.isAssignable(org.bson.Document.class, typeToRead);
AggregationOptions options = computeOptions(method, accessor);
TypedAggregation<?> aggregation = new TypedAggregation<>(sourceType, pipeline, options);
if (isSimpleReturnType || isRawReturnType) {
targetType = Document.class;
}
Flux<?> flux = reactiveMongoOperations.aggregate(aggregation, targetType);
AggregationOptions options = computeOptions(method, accessor);
TypedAggregation<?> aggregation = new TypedAggregation<>(sourceType, pipeline, options);
if (isSimpleReturnType && !isRawReturnType) {
flux = flux.handle((it, sink) -> {
Flux<?> flux = reactiveMongoOperations.aggregate(aggregation, targetType);
Object result = AggregationUtils.extractSimpleTypeResult((Document) it, typeToRead, mongoConverter);
if (isSimpleReturnType && !isRawReturnType) {
flux = flux.handle((item, sink) -> {
if (result != null) {
sink.next(result);
}
});
}
Object result = AggregationUtils.extractSimpleTypeResult((Document) item, typeToRead, mongoConverter);
if (result != null) {
sink.next(result);
}
});
}
return method.isCollectionQuery() ? flux : flux.next();
});
if (method.isCollectionQuery()) {
return flux;
} else {
return flux.next();
}
}
private boolean isSimpleReturnType(Class<?> targetType) {
return MongoSimpleTypes.HOLDER.isSimpleType(targetType);
}
private Mono<List<AggregationOperation>> computePipeline(ConvertingParameterAccessor accessor) {
return getCodecRegistry().map(ParameterBindingDocumentCodec::new).flatMap(codec -> {
String[] sourcePipeline = getQueryMethod().getAnnotatedAggregation();
List<Mono<AggregationOperation>> stages = new ArrayList<>(sourcePipeline.length);
for (String source : sourcePipeline) {
stages.add(computePipelineStage(source, accessor, codec));
}
return Flux.concat(stages).collectList();
});
}
private Mono<AggregationOperation> computePipelineStage(String source, ConvertingParameterAccessor accessor,
ParameterBindingDocumentCodec codec) {
ExpressionDependencies dependencies = codec.captureExpressionDependencies(source, accessor::getBindableValue,
expressionParser);
return getSpelEvaluatorFor(dependencies, accessor).map(it -> {
ParameterBindingContext bindingContext = new ParameterBindingContext(accessor::getBindableValue, it);
return ctx -> ctx.getMappedObject(codec.decode(source, bindingContext), getQueryMethod().getDomainClass());
});
List<AggregationOperation> computePipeline(ConvertingParameterAccessor accessor) {
return AggregationUtils.computePipeline(getQueryMethod(), accessor, expressionParser, evaluationContextProvider);
}
private AggregationOptions computeOptions(MongoQueryMethod method, ConvertingParameterAccessor accessor) {
@@ -164,7 +136,7 @@ public class ReactiveStringBasedAggregation extends AbstractReactiveMongoQuery {
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#createQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
*/
@Override
protected Mono<Query> createQuery(ConvertingParameterAccessor accessor) {
protected Query createQuery(ConvertingParameterAccessor accessor) {
throw new UnsupportedOperationException("No query support for aggregation");
}
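To show the shapes this execution path serves (a sketch, not part of the diff; the Person type and repository are hypothetical), a reactive repository mixing a collection-typed result with a simple-typed one, the latter being unwrapped via extractSimpleTypeResult as in the code above:

import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

import org.springframework.data.mongodb.repository.Aggregation;
import org.springframework.data.repository.reactive.ReactiveCrudRepository;

interface ReactivePersonRepository extends ReactiveCrudRepository<Person, String> {

    // Collection query: isCollectionQuery() is true, so the whole Flux is returned.
    @Aggregation("{ $match : { lastname : ?0 } }")
    Flux<Person> findByLastname(String lastname);

    // Simple return type: the raw Document result is reduced to a single value
    // and only the first emission is used (flux.next()).
    @Aggregation("{ $group : { _id : null, total : { $sum : 1 } } }")
    Mono<Long> countPeople();
}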

View File

@@ -15,21 +15,17 @@
*/
package org.springframework.data.mongodb.repository.query;
import reactor.core.publisher.Mono;
import org.bson.Document;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.repository.query.ReactiveExtensionAwareQueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.ReactiveQueryMethodEvaluationContextProvider;
import org.springframework.data.spel.ExpressionDependencies;
import org.springframework.expression.ExpressionParser;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert;
@@ -44,12 +40,13 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
private static final String COUNT_EXISTS_AND_DELETE = "Manually defined query for %s cannot be a count and exists or delete query at the same time!";
private static final Logger LOG = LoggerFactory.getLogger(ReactiveStringBasedMongoQuery.class);
private static final ParameterBindingDocumentCodec CODEC = new ParameterBindingDocumentCodec();
private final String query;
private final String fieldSpec;
private final ExpressionParser expressionParser;
private final ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider;
private final SpelExpressionParser expressionParser;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
private final boolean isCountQuery;
private final boolean isExistsQuery;
@@ -65,14 +62,13 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
* @param evaluationContextProvider must not be {@literal null}.
*/
public ReactiveStringBasedMongoQuery(ReactiveMongoQueryMethod method, ReactiveMongoOperations mongoOperations,
ExpressionParser expressionParser, ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider) {
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
this(method.getAnnotatedQuery(), method, mongoOperations, expressionParser, evaluationContextProvider);
}
/**
* Creates a new {@link ReactiveStringBasedMongoQuery} for the given {@link String}, {@link MongoQueryMethod},
* {@link MongoOperations}, {@link SpelExpressionParser} and
* {@link ReactiveExtensionAwareQueryMethodEvaluationContextProvider}.
* {@link MongoOperations}, {@link SpelExpressionParser} and {@link QueryMethodEvaluationContextProvider}.
*
* @param query must not be {@literal null}.
* @param method must not be {@literal null}.
@@ -80,8 +76,8 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
* @param expressionParser must not be {@literal null}.
*/
public ReactiveStringBasedMongoQuery(String query, ReactiveMongoQueryMethod method,
ReactiveMongoOperations mongoOperations, ExpressionParser expressionParser,
ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider) {
ReactiveMongoOperations mongoOperations, SpelExpressionParser expressionParser,
QueryMethodEvaluationContextProvider evaluationContextProvider) {
super(method, mongoOperations, expressionParser, evaluationContextProvider);
@@ -118,36 +114,21 @@ public class ReactiveStringBasedMongoQuery extends AbstractReactiveMongoQuery {
* @see org.springframework.data.mongodb.repository.query.AbstractReactiveMongoQuery#createQuery(org.springframework.data.mongodb.repository.query.ConvertingParameterAccessor)
*/
@Override
protected Mono<Query> createQuery(ConvertingParameterAccessor accessor) {
protected Query createQuery(ConvertingParameterAccessor accessor) {
return getCodecRegistry().map(ParameterBindingDocumentCodec::new).flatMap(codec -> {
ParameterBindingContext bindingContext = new ParameterBindingContext((accessor::getBindableValue), expressionParser,
() -> evaluationContextProvider.getEvaluationContext(getQueryMethod().getParameters(), accessor.getValues()));
Mono<Document> queryObject = getBindingContext(query, accessor, codec)
.map(context -> codec.decode(query, context));
Mono<Document> fieldsObject = getBindingContext(fieldSpec, accessor, codec)
.map(context -> codec.decode(fieldSpec, context));
Document queryObject = CODEC.decode(this.query, bindingContext);
Document fieldsObject = CODEC.decode(this.fieldSpec, bindingContext);
return queryObject.zipWith(fieldsObject).map(tuple -> {
Query query = new BasicQuery(queryObject, fieldsObject).with(accessor.getSort());
Query query = new BasicQuery(tuple.getT1(), tuple.getT2()).with(accessor.getSort());
if (LOG.isDebugEnabled()) {
LOG.debug(String.format("Created query %s for %s fields.", query.getQueryObject(), query.getFieldsObject()));
}
if (LOG.isDebugEnabled()) {
LOG.debug(String.format("Created query %s for %s fields.", query.getQueryObject(), query.getFieldsObject()));
}
return query;
});
});
}
private Mono<ParameterBindingContext> getBindingContext(String json, ConvertingParameterAccessor accessor,
ParameterBindingDocumentCodec codec) {
ExpressionDependencies dependencies = codec.captureExpressionDependencies(json, accessor::getBindableValue,
expressionParser);
return getSpelEvaluatorFor(dependencies, accessor)
.map(it -> new ParameterBindingContext(accessor::getBindableValue, it));
return query;
}
/*
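For orientation, a sketch (not part of the diff; the Person type and repository are hypothetical) of the kind of string-based reactive query this class executes, combining a positional ?0 placeholder with a ?#{...} SpEL expression that the binding context evaluates:

import reactor.core.publisher.Flux;

import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.reactive.ReactiveCrudRepository;

interface ReactivePersonQueries extends ReactiveCrudRepository<Person, String> {

    // ?0 is bound directly from the arguments; ?#{[1]} is evaluated as a SpEL expression
    // against the same arguments before the query document is decoded.
    @Query("{ 'firstname' : ?0, 'lastname' : ?#{[1]} }")
    Flux<Person> findByFirstnameAndLastname(String firstname, String lastname);
}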

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.repository.query;
import lombok.Getter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.util.Assert;
@@ -26,7 +28,7 @@ import org.springframework.util.Assert;
class SimpleMongoEntityMetadata<T> implements MongoEntityMetadata<T> {
private final Class<T> type;
private final MongoPersistentEntity<?> collectionEntity;
private final @Getter MongoPersistentEntity<?> collectionEntity;
/**
* Creates a new {@link SimpleMongoEntityMetadata} using the given type and {@link MongoPersistentEntity} to use for
@@ -59,8 +61,4 @@ class SimpleMongoEntityMetadata<T> implements MongoEntityMetadata<T> {
public String getCollectionName() {
return collectionEntity.getCollection();
}
public MongoPersistentEntity<?> getCollectionEntity() {
return this.collectionEntity;
}
}

View File

@@ -15,12 +15,11 @@
*/
package org.springframework.data.mongodb.repository.query;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import org.bson.Document;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
@@ -31,12 +30,9 @@ import org.springframework.data.mongodb.core.aggregation.TypedAggregation;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoSimpleTypes;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.ResultProcessor;
import org.springframework.data.spel.ExpressionDependencies;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.ClassUtils;
/**
@@ -47,7 +43,7 @@ public class StringBasedAggregation extends AbstractMongoQuery {
private final MongoOperations mongoOperations;
private final MongoConverter mongoConverter;
private final ExpressionParser expressionParser;
private final SpelExpressionParser expressionParser;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
/**
@@ -59,7 +55,7 @@ public class StringBasedAggregation extends AbstractMongoQuery {
* @param evaluationContextProvider
*/
public StringBasedAggregation(MongoQueryMethod method, MongoOperations mongoOperations,
ExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
super(method, mongoOperations, expressionParser, evaluationContextProvider);
this.mongoOperations = mongoOperations;
@@ -77,9 +73,7 @@ public class StringBasedAggregation extends AbstractMongoQuery {
ConvertingParameterAccessor accessor, Class<?> typeToRead) {
if (method.isPageQuery() || method.isSliceQuery()) {
throw new InvalidMongoDbApiUsageException(String.format(
"Repository aggregation method '%s' does not support '%s' return type. Please use e.g. 'List' instead.",
method.getName(), method.getReturnType().getType().getSimpleName()));
throw new InvalidMongoDbApiUsageException(String.format("Repository aggregation method '%s' does not support '%s' return type. Please use e.g. 'List' instead.", method.getName(), method.getReturnType().getType().getSimpleName()));
}
Class<?> sourceType = method.getDomainClass();
@@ -131,26 +125,7 @@ public class StringBasedAggregation extends AbstractMongoQuery {
}
List<AggregationOperation> computePipeline(MongoQueryMethod method, ConvertingParameterAccessor accessor) {
ParameterBindingDocumentCodec codec = new ParameterBindingDocumentCodec(getCodecRegistry());
String[] sourcePipeline = method.getAnnotatedAggregation();
List<AggregationOperation> stages = new ArrayList<>(sourcePipeline.length);
for (String source : sourcePipeline) {
stages.add(computePipelineStage(source, accessor, codec));
}
return stages;
}
private AggregationOperation computePipelineStage(String source, ConvertingParameterAccessor accessor,
ParameterBindingDocumentCodec codec) {
ExpressionDependencies dependencies = codec.captureExpressionDependencies(source, accessor::getBindableValue,
expressionParser);
SpELExpressionEvaluator evaluator = getSpELExpressionEvaluatorFor(dependencies, accessor);
ParameterBindingContext bindingContext = new ParameterBindingContext(accessor::getBindableValue, evaluator);
return ctx -> ctx.getMappedObject(codec.decode(source, bindingContext), getQueryMethod().getDomainClass());
return AggregationUtils.computePipeline(method, accessor, expressionParser, evaluationContextProvider);
}
private AggregationOptions computeOptions(MongoQueryMethod method, ConvertingParameterAccessor accessor) {

View File

@@ -16,20 +16,21 @@
package org.springframework.data.mongodb.repository.query;
import org.bson.Document;
import org.bson.codecs.configuration.CodecRegistry;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.BasicQuery;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.spel.ExpressionDependencies;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.util.Assert;
import com.mongodb.MongoClientSettings;
import com.mongodb.client.MongoDatabase;
/**
* Query to use a plain JSON String to create the {@link Query} to actually execute.
*
@@ -46,7 +47,8 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
private final String query;
private final String fieldSpec;
private final ExpressionParser expressionParser;
private final ParameterBindingDocumentCodec codec;
private final SpelExpressionParser expressionParser;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
private final boolean isCountQuery;
@@ -63,7 +65,7 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
* @param evaluationContextProvider must not be {@literal null}.
*/
public StringBasedMongoQuery(MongoQueryMethod method, MongoOperations mongoOperations,
ExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
this(method.getAnnotatedQuery(), method, mongoOperations, expressionParser, evaluationContextProvider);
}
@@ -77,7 +79,7 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
* @param expressionParser must not be {@literal null}.
*/
public StringBasedMongoQuery(String query, MongoQueryMethod method, MongoOperations mongoOperations,
ExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
SpelExpressionParser expressionParser, QueryMethodEvaluationContextProvider evaluationContextProvider) {
super(method, mongoOperations, expressionParser, evaluationContextProvider);
@@ -107,6 +109,10 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
this.isExistsQuery = false;
this.isDeleteQuery = false;
}
CodecRegistry codecRegistry = mongoOperations.execute(MongoDatabase::getCodecRegistry);
this.codec = new ParameterBindingDocumentCodec(
codecRegistry != null ? codecRegistry : MongoClientSettings.getDefaultCodecRegistry());
}
/*
@@ -116,10 +122,11 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
@Override
protected Query createQuery(ConvertingParameterAccessor accessor) {
ParameterBindingDocumentCodec codec = getParameterBindingCodec();
ParameterBindingContext bindingContext = new ParameterBindingContext((accessor::getBindableValue), expressionParser,
() -> evaluationContextProvider.getEvaluationContext(getQueryMethod().getParameters(), accessor.getValues()));
Document queryObject = codec.decode(this.query, getBindingContext(this.query, accessor, codec));
Document fieldsObject = codec.decode(this.fieldSpec, getBindingContext(this.fieldSpec, accessor, codec));
Document queryObject = codec.decode(this.query, bindingContext);
Document fieldsObject = codec.decode(this.fieldSpec, bindingContext);
Query query = new BasicQuery(queryObject, fieldsObject).with(accessor.getSort());
@@ -130,16 +137,6 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
return query;
}
private ParameterBindingContext getBindingContext(String json, ConvertingParameterAccessor accessor,
ParameterBindingDocumentCodec codec) {
ExpressionDependencies dependencies = codec.captureExpressionDependencies(json, accessor::getBindableValue,
expressionParser);
SpELExpressionEvaluator evaluator = getSpELExpressionEvaluatorFor(dependencies, accessor);
return new ParameterBindingContext(accessor::getBindableValue, evaluator);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.repository.query.AbstractMongoQuery#isCountQuery()
@@ -180,8 +177,4 @@ public class StringBasedMongoQuery extends AbstractMongoQuery {
boolean isDeleteQuery) {
return BooleanUtil.countBooleanTrueValues(isCountQuery, isExistsQuery, isDeleteQuery) > 1;
}
private ParameterBindingDocumentCodec getParameterBindingCodec() {
return new ParameterBindingDocumentCodec(getCodecRegistry());
}
}

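For orientation: StringBasedMongoQuery backs repository methods that declare their MongoDB query as a JSON string via @Query, binding positional ?0 placeholders and SpEL ?#{...} expressions through the codec and expression parser wired up in this hunk. A hypothetical repository for illustration (Person is a made-up domain type; the annotation and placeholder syntax are the standard Spring Data MongoDB ones):

import java.util.List;
import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.CrudRepository;

public interface PersonRepository extends CrudRepository<Person, String> {

    // positional parameter ?0 is bound by the ParameterBindingDocumentCodec
    @Query("{ 'lastname' : ?0 }")
    List<Person> findByLastname(String lastname);

    // SpEL placeholder ?#{[0]} is evaluated through the configured expression parser
    @Query("{ 'lastname' : ?#{[0]} }")
    List<Person> findByLastnameViaSpEL(String lastname);
}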
View File

@@ -1,59 +0,0 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository.support;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.expression.Expression;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.ParseException;
import org.springframework.expression.ParserContext;
/**
* Caching variant of {@link ExpressionParser}. This implementation does not support
* {@link #parseExpression(String, ParserContext) parsing with ParseContext}.
*
* @author Mark Paluch
* @since 3.1
*/
class CachingExpressionParser implements ExpressionParser {
private final ExpressionParser delegate;
private final Map<String, Expression> cache = new ConcurrentHashMap<>();
CachingExpressionParser(ExpressionParser delegate) {
this.delegate = delegate;
}
/*
* (non-Javadoc)
* @see org.springframework.expression.ExpressionParser#parseExpression(java.lang.String)
*/
@Override
public Expression parseExpression(String expressionString) throws ParseException {
return cache.computeIfAbsent(expressionString, delegate::parseExpression);
}
/*
* (non-Javadoc)
* @see org.springframework.expression.ExpressionParser#parseExpression(java.lang.String, org.springframework.expression.ParserContext)
*/
@Override
public Expression parseExpression(String expressionString, ParserContext context) throws ParseException {
throw new UnsupportedOperationException("Parsing using ParserContext is not supported");
}
}

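The CachingExpressionParser in this hunk memoizes parsed SpEL expressions in a ConcurrentHashMap keyed by the expression string. A rough sketch of the behaviour it adds, assuming only the standard SpelExpressionParser API (class name and expression below are made up for illustration):

import org.springframework.expression.Expression;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;

class CachingBehaviourSketch {

    public static void main(String[] args) {
        ExpressionParser parser = new SpelExpressionParser();

        // A plain SpelExpressionParser re-parses the string on every call and
        // hands back a fresh Expression instance each time.
        Expression first = parser.parseExpression("1 + 1");
        Expression second = parser.parseExpression("1 + 1");
        System.out.println(first == second); // false - two separate parse runs

        // The caching wrapper above avoids the repeated parse by keying a
        // ConcurrentHashMap on the expression string (computeIfAbsent), so
        // repeated lookups of "1 + 1" would return the same cached Expression.
    }
}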
View File

@@ -44,7 +44,6 @@ import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
@@ -163,8 +162,7 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
private final MongoOperations operations;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final ExpressionParser expressionParser = new CachingExpressionParser(EXPRESSION_PARSER);
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
public MongoQueryLookupStrategy(MongoOperations operations,
QueryMethodEvaluationContextProvider evaluationContextProvider,
@@ -188,14 +186,14 @@ public class MongoRepositoryFactory extends RepositoryFactorySupport {
if (namedQueries.hasQuery(namedQueryName)) {
String namedQuery = namedQueries.getQuery(namedQueryName);
return new StringBasedMongoQuery(namedQuery, queryMethod, operations, expressionParser,
return new StringBasedMongoQuery(namedQuery, queryMethod, operations, EXPRESSION_PARSER,
evaluationContextProvider);
} else if (queryMethod.hasAnnotatedAggregation()) {
return new StringBasedAggregation(queryMethod, operations, expressionParser, evaluationContextProvider);
return new StringBasedAggregation(queryMethod, operations, EXPRESSION_PARSER, evaluationContextProvider);
} else if (queryMethod.hasAnnotatedQuery()) {
return new StringBasedMongoQuery(queryMethod, operations, expressionParser, evaluationContextProvider);
return new StringBasedMongoQuery(queryMethod, operations, EXPRESSION_PARSER, evaluationContextProvider);
} else {
return new PartTreeMongoQuery(queryMethod, operations, expressionParser, evaluationContextProvider);
return new PartTreeMongoQuery(queryMethod, operations, EXPRESSION_PARSER, evaluationContextProvider);
}
}
}

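The lookup strategy in this hunk dispatches each repository method to a query implementation: named or @Query-annotated methods become a StringBasedMongoQuery, @Aggregation-annotated methods a StringBasedAggregation, and everything else falls back to a derived PartTreeMongoQuery. A hypothetical repository illustrating the three cases (Person and the method names are made up; the annotations are standard Spring Data MongoDB):

import java.util.List;
import org.springframework.data.mongodb.repository.Aggregation;
import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.Repository;

interface LookupStrategyExamples extends Repository<Person, String> {

    @Query("{ 'lastname' : ?0 }")                       // resolved to StringBasedMongoQuery
    List<Person> findByLastnameAnnotated(String lastname);

    @Aggregation("{ '$match' : { 'lastname' : ?0 } }")  // resolved to StringBasedAggregation
    List<Person> findByLastnameViaAggregation(String lastname);

    List<Person> findByFirstname(String firstname);     // derived query, resolved to PartTreeMongoQuery
}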
View File

@@ -17,6 +17,9 @@ package org.springframework.data.mongodb.repository.support;
import static org.springframework.data.querydsl.QuerydslUtils.*;
import lombok.AccessLevel;
import lombok.RequiredArgsConstructor;
import java.io.Serializable;
import java.lang.reflect.Method;
import java.util.Optional;
@@ -42,9 +45,7 @@ import org.springframework.data.repository.core.support.RepositoryFragment;
import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.ReactiveQueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
@@ -75,7 +76,6 @@ public class ReactiveMongoRepositoryFactory extends ReactiveRepositoryFactorySup
this.operations = mongoOperations;
this.mappingContext = mongoOperations.getConverter().getMappingContext();
setEvaluationContextProvider(ReactiveQueryMethodEvaluationContextProvider.DEFAULT);
}
/*
@@ -130,8 +130,7 @@ public class ReactiveMongoRepositoryFactory extends ReactiveRepositoryFactorySup
@Override
protected Optional<QueryLookupStrategy> getQueryLookupStrategy(@Nullable Key key,
QueryMethodEvaluationContextProvider evaluationContextProvider) {
return Optional.of(new MongoQueryLookupStrategy(operations,
(ReactiveQueryMethodEvaluationContextProvider) evaluationContextProvider, mappingContext));
return Optional.of(new MongoQueryLookupStrategy(operations, evaluationContextProvider, mappingContext));
}
/*
@@ -158,21 +157,12 @@ public class ReactiveMongoRepositoryFactory extends ReactiveRepositoryFactorySup
* @author Mark Paluch
* @author Christoph Strobl
*/
@RequiredArgsConstructor(access = AccessLevel.PACKAGE)
private static class MongoQueryLookupStrategy implements QueryLookupStrategy {
private final ReactiveMongoOperations operations;
private final ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider;
private final QueryMethodEvaluationContextProvider evaluationContextProvider;
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
private final ExpressionParser expressionParser = new CachingExpressionParser(EXPRESSION_PARSER);
MongoQueryLookupStrategy(ReactiveMongoOperations operations,
ReactiveQueryMethodEvaluationContextProvider evaluationContextProvider,
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
this.operations = operations;
this.evaluationContextProvider = evaluationContextProvider;
this.mappingContext = mappingContext;
}
/*
* (non-Javadoc)
@@ -187,15 +177,15 @@ public class ReactiveMongoRepositoryFactory extends ReactiveRepositoryFactorySup
if (namedQueries.hasQuery(namedQueryName)) {
String namedQuery = namedQueries.getQuery(namedQueryName);
return new ReactiveStringBasedMongoQuery(namedQuery, queryMethod, operations, expressionParser,
return new ReactiveStringBasedMongoQuery(namedQuery, queryMethod, operations, EXPRESSION_PARSER,
evaluationContextProvider);
} else if (queryMethod.hasAnnotatedAggregation()) {
return new ReactiveStringBasedAggregation(queryMethod, operations, expressionParser,
return new ReactiveStringBasedAggregation(queryMethod, operations, EXPRESSION_PARSER,
evaluationContextProvider);
} else if (queryMethod.hasAnnotatedQuery()) {
return new ReactiveStringBasedMongoQuery(queryMethod, operations, expressionParser, evaluationContextProvider);
return new ReactiveStringBasedMongoQuery(queryMethod, operations, EXPRESSION_PARSER, evaluationContextProvider);
} else {
return new ReactivePartTreeMongoQuery(queryMethod, operations, expressionParser, evaluationContextProvider);
return new ReactivePartTreeMongoQuery(queryMethod, operations, EXPRESSION_PARSER, evaluationContextProvider);
}
}
}

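The reactive factory applies the same dispatch, only with the reactive query implementations (ReactiveStringBasedMongoQuery, ReactiveStringBasedAggregation, ReactivePartTreeMongoQuery). A hypothetical reactive repository for illustration, assuming the made-up Person type from above:

import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.mongodb.repository.ReactiveMongoRepository;
import reactor.core.publisher.Flux;

interface ReactivePersonRepository extends ReactiveMongoRepository<Person, String> {

    @Query("{ 'lastname' : ?0 }")                   // resolved to ReactiveStringBasedMongoQuery
    Flux<Person> findByLastnameAnnotated(String lastname);

    Flux<Person> findByFirstname(String firstname); // derived, resolved to ReactivePartTreeMongoQuery
}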
View File

@@ -16,17 +16,13 @@
package org.springframework.data.mongodb.repository.support;
import java.io.Serializable;
import java.util.Optional;
import org.springframework.beans.factory.ListableBeanFactory;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.ReactiveMongoOperations;
import org.springframework.data.mongodb.core.index.IndexOperationsAdapter;
import org.springframework.data.repository.Repository;
import org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport;
import org.springframework.data.repository.core.support.RepositoryFactorySupport;
import org.springframework.data.repository.query.QueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.ReactiveExtensionAwareQueryMethodEvaluationContextProvider;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
@@ -39,7 +35,6 @@ import org.springframework.util.Assert;
* @since 2.0
* @see org.springframework.data.repository.reactive.ReactiveSortingRepository
* @see org.springframework.data.repository.reactive.RxJava2SortingRepository
* @see org.springframework.data.repository.reactive.RxJava3SortingRepository
*/
public class ReactiveMongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID extends Serializable>
extends RepositoryFactoryBeanSupport<T, S, ID> {
@@ -88,7 +83,10 @@ public class ReactiveMongoRepositoryFactoryBean<T extends Repository<S, ID>, S,
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport#createRepositoryFactory()
*
* @see
* org.springframework.data.repository.support.RepositoryFactoryBeanSupport
* #createRepositoryFactory()
*/
@Override
protected RepositoryFactorySupport createRepositoryFactory() {
@@ -103,16 +101,6 @@ public class ReactiveMongoRepositoryFactoryBean<T extends Repository<S, ID>, S,
return factory;
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport#createDefaultQueryMethodEvaluationContextProvider(ListableBeanFactory)
*/
@Override
protected Optional<QueryMethodEvaluationContextProvider> createDefaultQueryMethodEvaluationContextProvider(
ListableBeanFactory beanFactory) {
return Optional.of(new ReactiveExtensionAwareQueryMethodEvaluationContextProvider(beanFactory));
}
/**
* Creates and initializes a {@link RepositoryFactorySupport} instance.
*
@@ -125,7 +113,10 @@ public class ReactiveMongoRepositoryFactoryBean<T extends Repository<S, ID>, S,
/*
* (non-Javadoc)
* @see org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport#afterPropertiesSet()
*
* @see
* org.springframework.data.repository.support.RepositoryFactoryBeanSupport
* #afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {

View File

@@ -17,6 +17,8 @@ package org.springframework.data.mongodb.repository.support;
import static org.springframework.data.mongodb.core.query.Criteria.*;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
@@ -49,20 +51,11 @@ import com.mongodb.client.result.DeleteResult;
* @author Ruben J Garcia
* @since 2.0
*/
@RequiredArgsConstructor
public class SimpleReactiveMongoRepository<T, ID extends Serializable> implements ReactiveMongoRepository<T, ID> {
private final MongoEntityInformation<T, ID> entityInformation;
private final ReactiveMongoOperations mongoOperations;
public SimpleReactiveMongoRepository(MongoEntityInformation<T, ID> entityInformation,
ReactiveMongoOperations mongoOperations) {
Assert.notNull(entityInformation, "EntityInformation must not be null!");
Assert.notNull(mongoOperations, "MongoOperations must not be null!");
this.entityInformation = entityInformation;
this.mongoOperations = mongoOperations;
}
private final @NonNull MongoEntityInformation<T, ID> entityInformation;
private final @NonNull ReactiveMongoOperations mongoOperations;
/*
* (non-Javadoc)

View File

@@ -15,15 +15,11 @@
*/
package org.springframework.data.mongodb.util.json;
import java.util.function.Function;
import java.util.function.Supplier;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.spel.ExpressionDependencies;
import org.springframework.data.util.Lazy;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.Expression;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
@@ -32,21 +28,25 @@ import org.springframework.lang.Nullable;
* To be used along with {@link ParameterBindingDocumentCodec#decode(String, ParameterBindingContext)}.
*
* @author Christoph Strobl
* @author Mark Paluch
* @since 2.2
*/
public class ParameterBindingContext {
private final ValueProvider valueProvider;
private final SpELExpressionEvaluator expressionEvaluator;
private final SpelExpressionParser expressionParser;
private final Lazy<EvaluationContext> evaluationContext;
/**
* @param valueProvider
* @param expressionParser
* @param evaluationContext
* @deprecated since 2.2.3 - Please use
* {@link #ParameterBindingContext(ValueProvider, SpelExpressionParser, Supplier)} instead.
*/
@Deprecated
public ParameterBindingContext(ValueProvider valueProvider, SpelExpressionParser expressionParser,
EvaluationContext evaluationContext) {
this(valueProvider, expressionParser, () -> evaluationContext);
}
@@ -56,51 +56,12 @@ public class ParameterBindingContext {
* @param evaluationContext a {@link Supplier} for {@link Lazy} context retrieval.
* @since 2.2.3
*/
public ParameterBindingContext(ValueProvider valueProvider, ExpressionParser expressionParser,
public ParameterBindingContext(ValueProvider valueProvider, SpelExpressionParser expressionParser,
Supplier<EvaluationContext> evaluationContext) {
this(valueProvider, new SpELExpressionEvaluator() {
@Override
public <T> T evaluate(String expressionString) {
return (T) expressionParser.parseExpression(expressionString).getValue(evaluationContext.get(), Object.class);
}
});
}
/**
* @param valueProvider
* @param expressionEvaluator
* @since 3.1
*/
public ParameterBindingContext(ValueProvider valueProvider, SpELExpressionEvaluator expressionEvaluator) {
this.valueProvider = valueProvider;
this.expressionEvaluator = expressionEvaluator;
}
/**
* Create a new {@link ParameterBindingContext} that is capable of expression parsing and can provide a
* {@link EvaluationContext} based on {@link ExpressionDependencies}.
*
* @param valueProvider
* @param expressionParser
* @param contextFunction
* @return
* @since 3.1
*/
public static ParameterBindingContext forExpressions(ValueProvider valueProvider,
ExpressionParser expressionParser, Function<ExpressionDependencies, EvaluationContext> contextFunction) {
return new ParameterBindingContext(valueProvider, new SpELExpressionEvaluator() {
@Override
public <T> T evaluate(String expressionString) {
Expression expression = expressionParser.parseExpression(expressionString);
ExpressionDependencies dependencies = ExpressionDependencies.discover(expression);
EvaluationContext evaluationContext = contextFunction.apply(dependencies);
return (T) expression.getValue(evaluationContext, Object.class);
}
});
this.expressionParser = expressionParser;
this.evaluationContext = evaluationContext instanceof Lazy ? (Lazy) evaluationContext : Lazy.of(evaluationContext);
}
@Nullable
@@ -110,7 +71,17 @@ public class ParameterBindingContext {
@Nullable
public Object evaluateExpression(String expressionString) {
return expressionEvaluator.evaluate(expressionString);
Expression expression = expressionParser.parseExpression(expressionString);
return expression.getValue(getEvaluationContext(), Object.class);
}
public EvaluationContext getEvaluationContext() {
return this.evaluationContext.get();
}
public SpelExpressionParser getExpressionParser() {
return expressionParser;
}
public ValueProvider getValueProvider() {

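ParameterBindingContext ties together a ValueProvider for positional parameters, an expression parser, and a lazily supplied EvaluationContext. A minimal construction sketch against the constructor signatures visible in this hunk; the argument array and the expression are made up, and the exact evaluation result depends on the EvaluationContextProvider in use:

import org.springframework.data.mongodb.util.json.ParameterBindingContext;
import org.springframework.data.spel.EvaluationContextProvider;
import org.springframework.expression.spel.standard.SpelExpressionParser;

class ParameterBindingContextSketch {

    public static void main(String[] args) {
        Object[] values = new Object[] { "luke" };

        // ValueProvider lambda, parser and lazily supplied EvaluationContext,
        // mirroring what the codec's decode(String, Object[]) method passes in.
        ParameterBindingContext context = new ParameterBindingContext(
                index -> values[index],
                new SpelExpressionParser(),
                () -> EvaluationContextProvider.DEFAULT.getEvaluationContext(values));

        // Evaluates the SpEL expression against the parameter array as root object;
        // with the context above this should yield "luke".
        Object result = context.evaluateExpression("[0]");
        System.out.println(result);
    }
}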
View File

@@ -24,7 +24,6 @@ import java.util.Collection;
import java.util.Date;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.UUID;
import org.bson.AbstractBsonReader.State;
@@ -40,10 +39,7 @@ import org.bson.Transformer;
import org.bson.codecs.*;
import org.bson.codecs.configuration.CodecRegistry;
import org.bson.json.JsonParseException;
import org.springframework.data.mapping.model.SpELExpressionEvaluator;
import org.springframework.data.spel.EvaluationContextProvider;
import org.springframework.data.spel.ExpressionDependencies;
import org.springframework.expression.ExpressionParser;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.lang.Nullable;
import org.springframework.util.NumberUtils;
@@ -167,7 +163,7 @@ public class ParameterBindingDocumentCodec implements CollectibleCodec<Document>
public Document decode(@Nullable String json, Object[] values) {
return decode(json, new ParameterBindingContext((index) -> values[index], new SpelExpressionParser(),
EvaluationContextProvider.DEFAULT.getEvaluationContext(values)));
() -> EvaluationContextProvider.DEFAULT.getEvaluationContext(values)));
}
public Document decode(@Nullable String json, ParameterBindingContext bindingContext) {
@@ -180,31 +176,6 @@ public class ParameterBindingDocumentCodec implements CollectibleCodec<Document>
return this.decode(reader, DecoderContext.builder().build());
}
/**
* Determine {@link ExpressionDependencies} from Expressions that are nested in the {@code json} content. Returns
* {@link Optional#empty()} if {@code json} is empty or of it does not contain any SpEL expressions.
*
* @param json
* @param expressionParser
* @return merged {@link ExpressionDependencies} object if expressions were found, otherwise
* {@link ExpressionDependencies#none()}.
* @since 3.1
*/
public ExpressionDependencies captureExpressionDependencies(@Nullable String json, ValueProvider valueProvider,
ExpressionParser expressionParser) {
if (StringUtils.isEmpty(json)) {
return ExpressionDependencies.none();
}
DependencyCapturingExpressionEvaluator expressionEvaluator = new DependencyCapturingExpressionEvaluator(
expressionParser);
this.decode(new ParameterBindingJsonReader(json, new ParameterBindingContext(valueProvider, expressionEvaluator)),
DecoderContext.builder().build());
return expressionEvaluator.getCapturedDependencies();
}
@SuppressWarnings({ "rawtypes", "unchecked" })
@Override
public Document decode(final BsonReader reader, final DecoderContext decoderContext) {
@@ -370,33 +341,4 @@ public class ParameterBindingDocumentCodec implements CollectibleCodec<Document>
reader.readEndArray();
return list;
}
/**
* @author Christoph Strobl
* @since 3.1
*/
static class DependencyCapturingExpressionEvaluator implements SpELExpressionEvaluator {
private static final Object PLACEHOLDER = new Object();
private final ExpressionParser expressionParser;
private final List<ExpressionDependencies> dependencies = new ArrayList<>();
DependencyCapturingExpressionEvaluator(ExpressionParser expressionParser) {
this.expressionParser = expressionParser;
}
@Nullable
@Override
public <T> T evaluate(String expression) {
dependencies.add(ExpressionDependencies.discover(expressionParser.parseExpression(expression)));
return (T) PLACEHOLDER;
}
ExpressionDependencies getCapturedDependencies() {
return ExpressionDependencies.merged(dependencies);
}
}
}

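The convenience decode(String, Object[]) shown above parses a parameterised JSON query into a Document, substituting ?0-style placeholders from the argument array. A small illustrative usage (query string and values are made up):

import org.bson.Document;
import org.springframework.data.mongodb.util.json.ParameterBindingDocumentCodec;

public class ParameterBindingDecodeExample {

    public static void main(String[] args) {
        ParameterBindingDocumentCodec codec = new ParameterBindingDocumentCodec();

        // ?0 and ?1 are replaced by the corresponding elements of the values array
        Document query = codec.decode("{ 'lastname' : ?0, 'age' : { '$gt' : ?1 } }",
                new Object[] { "Skywalker", 18 });

        // prints roughly: {"lastname": "Skywalker", "age": {"$gt": 18}}
        System.out.println(query.toJson());
    }
}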
View File

@@ -17,6 +17,8 @@ package org.springframework.data.mongodb.util.json;
import static java.lang.String.*;
import lombok.Data;
import java.text.DateFormat;
import java.text.ParsePosition;
import java.text.SimpleDateFormat;
@@ -36,7 +38,6 @@ import org.bson.types.Decimal128;
import org.bson.types.MaxKey;
import org.bson.types.MinKey;
import org.bson.types.ObjectId;
import org.springframework.data.spel.EvaluationContextProvider;
import org.springframework.expression.EvaluationContext;
import org.springframework.expression.spel.standard.SpelExpressionParser;
@@ -1657,37 +1658,12 @@ public class ParameterBindingJsonReader extends AbstractBsonReader {
// Spring Data Customization START
@Data
static class BindableValue {
private BsonType type;
private Object value;
private int index;
BindableValue() {}
BsonType getType() {
return type;
}
void setType(BsonType type) {
this.type = type;
}
Object getValue() {
return value;
}
void setValue(Object value) {
this.value = value;
}
int getIndex() {
return index;
}
void setIndex(int index) {
this.index = index;
}
BsonType type;
Object value;
int index;
}
// Spring Data Customization END

View File

@@ -20,14 +20,12 @@ import kotlin.reflect.KProperty1
/**
* Abstraction of a property path consisting of [KProperty].
*
* @author Tjeu Kayim
* @author Mark Paluch
* @author Yoann de Martino
* @since 2.2
*/
class KPropertyPath<T, U>(
internal val parent: KProperty<U>,
internal val parent: KProperty<U?>,
internal val child: KProperty1<U, T>
) : KProperty<T> by child
@@ -47,12 +45,13 @@ internal fun asString(property: KProperty<*>): String {
* Builds [KPropertyPath] from Property References.
* Refer to a field in an embedded/nested document.
*
* For example, referring to the field "author.name":
* For example, referring to the field "book.author":
* ```
* Book::author / Author::name isEqualTo "Herman Melville"
* ```
* @author Tjeu Kayim
* @author Yoann de Martino
* @since 2.2
*/
operator fun <T, U> KProperty<T>.div(other: KProperty1<T, U>) =
operator fun <T, U> KProperty<T?>.div(other: KProperty1<T, U>) =
KPropertyPath(this, other)

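The Kotlin div operator above merely renders nested property references as a dotted field path such as "author.name". The plain Java equivalent of the documented example, written directly against the Criteria API (Book and Author are made-up types from the Kotlin doc):

import static org.springframework.data.mongodb.core.query.Criteria.where;

import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class PropertyPathEquivalent {

    public static void main(String[] args) {
        // Book::author / Author::name isEqualTo "Herman Melville" boils down to
        // a criterion on the dotted path "author.name".
        Criteria criteria = where("author.name").is("Herman Melville");
        Query query = Query.query(criteria);
        System.out.println(query);
    }
}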
View File

@@ -1,26 +0,0 @@
/*
* Copyright 2020 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query
import kotlin.reflect.KProperty
/**
* Extension for [KProperty] providing an `toPath` function to render a [KProperty] as property path.
*
* @author Mark Paluch
* @since 3.1
*/
fun KProperty<*>.toPath(): String = asString(this)

View File

@@ -318,8 +318,8 @@ The Mongo driver options
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<!-- MLP
</xsd:attribute>
<!-- MLP
<xsd:attributeGroup ref="writeConcern" />
-->
<xsd:attribute name="id" type="xsd:ID" use="optional">
@@ -348,14 +348,14 @@ The host to connect to a MongoDB server. Default is localhost
The comma delimited list of host:port entries to use for replica set/pairs.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attribute>
</xsd:complexType>
<xsd:complexType name="optionsType">
<xsd:attribute name="connections-per-host" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The number of connections allowed per host. Will block if run out. Default is 10. System property MONGO.POOLSIZE can override
The number of connections allowed per host. Will block if run out. Default is 10. System property MONGO.POOLSIZE can override
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
@@ -363,22 +363,22 @@ The number of connections allowed per host. Will block if run out. Default is
<xsd:annotation>
<xsd:documentation><![CDATA[
The multiplier for connectionsPerHost for # of threads that can block. Default is 5.
If connectionsPerHost is 10, and threadsAllowedToBlockForConnectionMultiplier is 5,
then 50 threads can block more than that and an exception will be thrown.
If connectionsPerHost is 10, and threadsAllowedToBlockForConnectionMultiplier is 5,
then 50 threads can block more than that and an exception will be thrown.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-wait-time" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The max wait time of a blocking thread for a connection. Default is 12000 ms (2 minutes)
The max wait time of a blocking thread for a connection. Default is 12000 ms (2 minutes)
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="connect-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The connect timeout in milliseconds. 0 is default and infinite.
The connect timeout in milliseconds. 0 is default and infinite.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
@@ -395,18 +395,18 @@ The socket timeout. 0 is default and infinite.
The keep alive flag, controls whether or not to have socket keep alive timeout. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attribute>
<xsd:attribute name="auto-connect-retry" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls whether or not on a connect, the system retries automatically. Default is false.
This controls whether or not on a connect, the system retries automatically. Default is false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-auto-connect-retry-time" type="xsd:long">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum amount of time in millisecons to spend retrying to open connection to the same server. Default is 0, which means to use the default 15s if autoConnectRetry is on.
The maximum amount of time in millisecons to spend retrying to open connection to the same server. Default is 0, which means to use the default 15s if autoConnectRetry is on.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
@@ -430,14 +430,14 @@ This controls timeout for write operations in milliseconds. The 'wtimeout' opti
This controls whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attribute>
<xsd:attribute name="slave-ok" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls if the driver is allowed to read from secondaries or replicas. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attribute>
</xsd:complexType>
<xsd:group name="beanElementGroup">
@@ -466,4 +466,4 @@ This controls if the driver is allowed to read from secondaries or replicas. De
</xsd:attribute>
</xsd:complexType>
</xsd:schema>
</xsd:schema>

View File

@@ -316,8 +316,8 @@ The Mongo driver options
<xsd:simpleType>
<xsd:union memberTypes="writeConcernEnumeration xsd:string"/>
</xsd:simpleType>
</xsd:attribute>
<!-- MLP
</xsd:attribute>
<!-- MLP
<xsd:attributeGroup ref="writeConcern" />
-->
<xsd:attribute name="id" type="xsd:ID" use="optional">
@@ -346,14 +346,14 @@ The host to connect to a MongoDB server. Default is localhost
The comma delimited list of host:port entries to use for replica set/pairs.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attribute>
</xsd:complexType>
<xsd:complexType name="optionsType">
<xsd:attribute name="connections-per-host" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The number of connections allowed per host. Will block if run out. Default is 10. System property MONGO.POOLSIZE can override
The number of connections allowed per host. Will block if run out. Default is 10. System property MONGO.POOLSIZE can override
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
@@ -361,22 +361,22 @@ The number of connections allowed per host. Will block if run out. Default is
<xsd:annotation>
<xsd:documentation><![CDATA[
The multiplier for connectionsPerHost for # of threads that can block. Default is 5.
If connectionsPerHost is 10, and threadsAllowedToBlockForConnectionMultiplier is 5,
then 50 threads can block more than that and an exception will be thrown.
If connectionsPerHost is 10, and threadsAllowedToBlockForConnectionMultiplier is 5,
then 50 threads can block more than that and an exception will be thrown.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-wait-time" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The max wait time of a blocking thread for a connection. Default is 12000 ms (2 minutes)
The max wait time of a blocking thread for a connection. Default is 12000 ms (2 minutes)
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="connect-timeout" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
The connect timeout in milliseconds. 0 is default and infinite.
The connect timeout in milliseconds. 0 is default and infinite.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
@@ -393,18 +393,18 @@ The socket timeout. 0 is default and infinite.
The keep alive flag, controls whether or not to have socket keep alive timeout. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attribute>
<xsd:attribute name="auto-connect-retry" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls whether or not on a connect, the system retries automatically. Default is false.
This controls whether or not on a connect, the system retries automatically. Default is false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
<xsd:attribute name="max-auto-connect-retry-time" type="xsd:long">
<xsd:annotation>
<xsd:documentation><![CDATA[
The maximum amount of time in millisecons to spend retrying to open connection to the same server. Default is 0, which means to use the default 15s if autoConnectRetry is on.
The maximum amount of time in millisecons to spend retrying to open connection to the same server. Default is 0, which means to use the default 15s if autoConnectRetry is on.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
@@ -428,14 +428,14 @@ This controls timeout for write operations in milliseconds. The 'wtimeout' opti
This controls whether or not to fsync. The 'fsync' option to the getlasterror command. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attribute>
<xsd:attribute name="slave-ok" type="xsd:string">
<xsd:annotation>
<xsd:documentation><![CDATA[
This controls if the driver is allowed to read from secondaries or replicas. Defaults to false.
]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:attribute>
</xsd:complexType>
<xsd:group name="beanElementGroup">
@@ -464,4 +464,4 @@ This controls if the driver is allowed to read from secondaries or replicas. De
</xsd:attribute>
</xsd:complexType>
</xsd:schema>
</xsd:schema>

Some files were not shown because too many files have changed in this diff.