Compare commits


55 Commits

Author SHA1 Message Date
Thomas Risberg
613b183b70 Preparing for 1.0.0.M5 MongoDB release
* Fixing spring-data-commons dependency
2011-11-04 09:20:18 -04:00
Thomas Risberg
5d1320a82a preparing for snapshot builds 2011-10-24 17:46:12 -04:00
Thomas Risberg
8ccaf61d12 preparing for 1.0.0.M5 MongoDB release 2011-10-24 17:39:28 -04:00
Thomas Risberg
2c46cfd8fe Updating documentation with project name changes 2011-10-24 17:32:41 -04:00
Thomas Risberg
45f2900d15 Updating documentation with package name changes and Criteria changes 2011-10-24 17:04:20 -04:00
Oliver Gierke
79934538b6 DATAMONGO-303 - Updated to Querydsl 2.2.4. 2011-10-24 14:52:01 -05:00
Oliver Gierke
0e4e0094a5 DATAMONGO-302, DATACMNS-91 - Added null-checks for CRUD methods where necessary.
CRUD methods in SimpleMongoRepository now consistently throw IllegalArgumentExceptions for null parameters handed to them.
2011-10-24 14:21:12 -05:00
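The null-handling described above can be sketched as a simple guard (class and method names here are illustrative, not the actual SimpleMongoRepository source):

```java
// Illustrative guard pattern: reject null arguments up front with an
// IllegalArgumentException instead of failing later with an NPE.
class GuardedRepository<T> {
    T save(T entity) {
        if (entity == null) {
            throw new IllegalArgumentException("Entity must not be null!");
        }
        return entity; // persistence logic elided
    }
}
```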
Thomas Risberg
5df61563f4 DATADOC-300 Changing to use InvalidMongoDbApiUsageException 2011-10-20 15:44:23 -04:00
Thomas Risberg
04ebec993d DATADOC-300 Removing old 'or' method, adding JavaDoc 2011-10-20 15:34:51 -04:00
Thomas Risberg
caa245dd08 DATADOC-283 DATADOC-300 Refactoring the QueryCriteria implementations to better support $and, $or and $nor queries 2011-10-20 15:22:37 -04:00
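A hedged sketch of combining criteria as described above, assuming the Criteria.andOperator(…)/orOperator(…)/norOperator(…) methods and the post-repackaging package names (field names and values are illustrative):

```java
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

class CriteriaSketch {
    // Intended to build {"$and": [{"age": {"$gte": 18}}, {"status": "active"}]}
    Query activeAdults() {
        return new Query(new Criteria().andOperator(
                Criteria.where("age").gte(18),
                Criteria.where("status").is("active")));
    }
}
```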
Oliver Gierke
717ff38319 DATADOC-230 - Added method to remove objects from specific collection.
Added MongoOperations.remove(Object, String) and according MongoTemplate implementation to be able to explicitly define the collection an object should be removed from. This aligns to the method signatures we provide for all other methods as well.

Consolidated MongoTemplate.getIdPropertName(…) and ….getIdValue(…) into ….getIdQueryFor(…) as they were only used inside a method building a by-id query for an object.
2011-10-20 10:02:36 +02:00
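Usage of the new overload might look like the following (template wiring, entity, and collection name are illustrative):

```java
import org.springframework.data.mongodb.core.MongoOperations;

class RemoveSketch {
    // Remove the given object from an explicitly named collection rather
    // than the collection derived from its type.
    void deleteFromArchive(MongoOperations operations, Object person) {
        operations.remove(person, "archived_people");
    }
}
```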
Oliver Gierke
33155ba0f4 DATADOC-295 - Added ability to setup a SimpleMongoDbFactory using a MongoURI.
Extended SimpleMongoDbFactory with a constructor taking a MongoURI instance. Exposed a uri attribute on the db-factory namespace element.
2011-10-14 13:32:46 +02:00
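A sketch of the new constructor in use (connection string and exception handling are illustrative; the URI carries host, port, and database):

```java
import com.mongodb.MongoURI;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

class FactorySketch {
    // Build a MongoDbFactory from a single MongoDB connection URI.
    MongoDbFactory fromUri() throws Exception {
        return new SimpleMongoDbFactory(new MongoURI("mongodb://localhost/mydb"));
    }
}
```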
Oliver Gierke
9f62efa47c Fixed typo in reference documentation. 2011-10-14 13:32:03 +02:00
Oliver Gierke
620fc876f4 Removed unused imports. 2011-10-14 13:32:03 +02:00
Oliver Gierke
cfcf839232 DATADOC-297 - Pruned irrelevant sub modules.
Removed CouchDB module as well as the generic document one. Renamed document-parent into mongodb-parent. Adapted poms accordingly.
2011-10-13 20:27:03 +02:00
Oliver Gierke
7ce1e5fbd3 DATADOC-294 - Overhaul of collection and type information handling.
Streamlined handling of when and how to write type information into DBObjects being created. Added handling for converting Collections on the top level.
2011-10-13 18:01:17 +02:00
Christoph Leiter
2d12ba38f8 DATADOC-289 - Filter AfterLoadEvent for specific domain type.
AfterLoadEvent can now be typed to a domain type again and will only be invoked if documents are loaded that shall be mapped onto the declared type.
2011-10-12 15:25:09 +02:00
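A sketch of a listener bound to a domain type (Person is an illustrative class; the callback signature is assumed from the API of that era). With this change, onAfterLoad only fires for documents that will be mapped onto Person:

```java
import com.mongodb.DBObject;
import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;

class Person {}

// Invoked only for documents loaded for the Person domain type.
class PersonLoadListener extends AbstractMongoEventListener<Person> {
    @Override
    public void onAfterLoad(DBObject dbo) {
        // inspect or tweak the raw document before it is mapped
    }
}
```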
Oliver Gierke
1554d489ca DATACMNS-73 - Fixed package imports in DocumentBacking aspect. 2011-10-12 15:05:02 +02:00
Oliver Gierke
f030d304f4 DATADOC-271, DATACMNS-73 - Adapted changes of SD Commons. 2011-10-12 14:40:42 +02:00
Oliver Gierke
105e1da82b DATADOC-271 - Added license headers to cross-store files. 2011-10-12 14:30:21 +02:00
Oliver Gierke
dbc7601d7b DATADOC-293 - Activated Polygon related tests.
The CI system has been updated to Mongo 2.0, so we can activate the tests invoking its features.
2011-10-12 14:27:22 +02:00
Thomas Risberg
ccf981b8fb DATADOC-271 Re-packaging Mongo cross-store support, removing 'document' in package name 2011-10-12 08:16:43 -04:00
Oliver Gierke
8a43b4bbc0 DATADOC-65 - Allow usage of SpEL in @Document.
@Document can now use SpEL expressions to calculate the collection an entity shall be stored in on the fly.
2011-10-12 13:38:14 +02:00
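Conceptually, this allows something like the following sketch. The expression, the referenced tenantProvider bean, and the available SpEL context are illustrative assumptions; the commit message does not spell them out:

```java
import org.springframework.data.mongodb.core.mapping.Document;

// Collection name computed from a SpEL expression instead of a fixed literal.
// "@tenantProvider" is a hypothetical bean used only for illustration.
@Document(collection = "#{@tenantProvider.tenant + '_persons'}")
class Person {
}
```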
Thomas Risberg
1c2c592b96 DATADOC-271 Re-packaging Mongo cross-store support, removing 'document' in package name 2011-10-12 07:08:34 -04:00
Thomas Risberg
cd4b409bf2 DATADOC-283 - Adding query support for $and operator. 2011-10-12 11:42:36 +02:00
Oliver Gierke
f71477f17d DATADOC-289 - Fixed invocation of AfterLoadEvent in AbstractMongoEventListener.
Invoke events that are not bound to the domain type before the domain type check. Removed unnecessary generification of AfterLoadEvent.
2011-10-11 17:37:32 +02:00
Oliver Gierke
454df1e7f1 DATADOC-273 - Persisting type information for raw types as well.
We now trigger writing type information for subtypes of raw collections and maps as well.
2011-10-11 13:21:24 +02:00
Oliver Gierke
80641a0943 DATADOC-293 - Added Polygon abstraction to Criteria.
Introduced Polygon value object to capture a list of Points. Polished implementation of Circle (equals(…) and hashCode()) and API of Criteria. Added some additional unit tests. Introduced Shape interface to allow streamlining the implementation of building $within criteria.

Ignoring the tests for polygons right now until we have updated the Mongo instance on the CI server to 2.0.
2011-10-11 12:31:26 +02:00
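The value-object design described above (a Polygon capturing a list of Points, with proper equals(…)/hashCode()) can be sketched in plain Java as a simplified stand-in for the actual Spring Data classes:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

// Simplified stand-ins for the Point/Polygon value objects, illustrating
// the equals/hashCode contract the commit polishes.
final class Point {
    final double x, y;
    Point(double x, double y) { this.x = x; this.y = y; }
    @Override public boolean equals(Object o) {
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return x == p.x && y == p.y;
    }
    @Override public int hashCode() { return 31 * Double.hashCode(x) + Double.hashCode(y); }
}

final class Polygon {
    private final List<Point> points;
    Polygon(Point... points) { this.points = Arrays.asList(points); }
    List<Point> getPoints() { return Collections.unmodifiableList(points); }
    @Override public boolean equals(Object o) {
        return o instanceof Polygon && points.equals(((Polygon) o).points);
    }
    @Override public int hashCode() { return points.hashCode(); }
}
```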
Oliver Gierke
e405bf574c DATADOC-291 - MongoQueryCreator now considers mapping information for query building.
Instead of using the pure PropertyPath of the PartTree we ask the MappingContext for a PersistentPropertyPath and create a field name based path expression from it.
2011-10-07 12:00:31 +02:00
Oliver Gierke
9f8e406aff DATADOC-275 - Fixed id handling for DBRefs.
Trying to convert target ids into ObjectId or String before using the actual type.
2011-09-30 11:50:44 +02:00
Oliver Gierke
7d26366352 Added test case to show usage of custom converters and querying. 2011-09-30 09:35:10 +02:00
Oliver Gierke
864ae831dd DATADOC-285 - Added test case for issue, doesn't fail right now. 2011-09-27 14:36:50 +02:00
Oliver Gierke
bfb13d99e3 DATADOC-280 - Added maxAutoConnectRetryTime config option.
Extended MongoOperationsFactoryBean to carry maxAutoConnectRetryTime property and exposed that through the XML namespace.
2011-09-27 13:59:50 +02:00
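The namespace option maps onto the 2.x Java driver's MongoOptions, whose retry settings were plain public fields; in driver terms the setting looks roughly like this (values illustrative):

```java
import com.mongodb.MongoOptions;

class OptionsSketch {
    MongoOptions retryOptions() {
        MongoOptions options = new MongoOptions();
        // Keep retrying the initial socket connect for up to 15 seconds
        // (the field is expressed in milliseconds).
        options.autoConnectRetry = true;
        options.maxAutoConnectRetryTime = 15000;
        return options;
    }
}
```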
Oliver Gierke
f39de4c28e DATADOC-286 - Added support for GreaterThanEqual and LessThanEqual.
Using those keywords provided by Spring Data Commons will cause $gte and $lte criteria to be created.
2011-09-27 11:48:20 +02:00
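An illustrative repository using the new keywords (entity and field names are assumptions; the mapping to $gte/$lte is from the commit message):

```java
import java.util.List;
import org.springframework.data.mongodb.repository.MongoRepository;

class Person {}

// Derived queries: the new keywords translate to $gte/$lte criteria.
interface PersonRepository extends MongoRepository<Person, String> {
    List<Person> findByAgeGreaterThanEqual(int age); // {"age": {"$gte": age}}
    List<Person> findByAgeLessThanEqual(int age);    // {"age": {"$lte": age}}
}
```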
Oliver Gierke
3bdeb68617 DATADOC-278 - QueryMapper now converts ids for $ne correctly. 2011-09-27 10:43:02 +02:00
Oliver Gierke
6b40a27c92 DATACMNS-76 - Adapt changes of Spring Data Commons. 2011-09-26 20:21:58 +02:00
Oliver Gierke
237cbec945 DATADOC-284 - Added custom SpringDataMongodbSerializer.
The custom MongodbSerializer considers mapping information when building keys for the Querydsl queries to be executed.
2011-09-26 20:18:47 +02:00
Oliver Gierke
39bc6771b7 DATADOC-183 - Added count(…) methods to MongoOperations.
MongoTemplate is now able to count documents using either an entity class or collection name.
2011-09-15 12:00:03 +02:00
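A sketch of the new count operations; the exact signatures are inferred from the commit message (entity class and collection name are illustrative):

```java
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Query;

class Person {}

class CountSketch {
    // Count documents either via the entity class or via an explicit
    // collection name.
    long counts(MongoOperations operations) {
        long byType = operations.count(new Query(), Person.class);
        long byName = operations.count(new Query(), "people");
        return byType + byName;
    }
}
```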
Oliver Gierke
73b2d5a99c DATADOC-270 - Removed critical Sonar warnings from codebase. 2011-09-14 15:34:29 +02:00
Oliver Gierke
5dab0c721e DATADOC-277 - Upgraded to Querydsl 2.2.2. 2011-09-14 11:49:36 +02:00
Oliver Gierke
bf7c9663cf DATADOC-274, DATADOC-276 - Split up repository package to be consistent with Spring Data JPA.
Introduced dedicated config, query and support packages. Updated Sonargraph architecture description introducing subsystems for repository layer.
2011-09-14 11:47:55 +02:00
Oliver Gierke
135742b7e4 Fixed potential NullPointerException at debug log level in MongoQueryCreator. 2011-09-14 11:38:58 +02:00
Mark Pollack
dee0307055 DATADOC-269 - XML configuration for replica sets is not working 2011-09-13 12:17:40 -04:00
Oliver Gierke
06fb4144e0 DATADOC-261 - Added id to geoNear section of reference Documentation. 2011-09-07 13:46:11 +02:00
Oliver Gierke
5fdc600570 DATADOC-272 - Moved XSD file into correct package. 2011-09-07 13:29:06 +02:00
Oliver Gierke
f54c69b6ef DATADOC-258 - Updated dependencies to the latest versions. 2011-09-07 11:53:54 +02:00
Oliver Gierke
decdcff79f Added fallback references to general repository documentation. 2011-09-07 11:51:28 +02:00
Oliver Gierke
be8daa5268 Removed obsolete MongoBeanWrapper and MongoPropertyDescriptors. 2011-09-07 11:50:04 +02:00
Oliver Gierke
c4cd074d4d DATADOC-259 - Fixed potential NullPointerException in MappingMongoConverter.writeMapInternal(…).
MappingMongoConverter.writeInternal(…) invoked ….writeMapInternal(…) handing in null for the TypeInformation which violated the implicit contract for the method. Made contract explicit in Javadoc and hand in plain Map TypeInformation.
2011-09-07 11:24:50 +02:00
Oliver Gierke
8b7521a93b DATADOC-268 - CustomConversions considers types simple only for registered write converters.
In cases where only a reading converter is registered (e.g. to manually instantiate the object instance), the type the reading converter is registered for must not be regarded as simple, as it would otherwise be written to the DBObject as-is.
2011-09-07 10:25:44 +02:00
Oliver Gierke
ba508f497c DATACMNS-72 - Adapt changes introduced in Spring Data Commons.
Removed references to MappingContextAware(BeanPostProcessor).
2011-09-07 08:14:48 +02:00
Oliver Gierke
091246a9aa Polished map-reduce tests and formatted documentation with XMLEditor. 2011-09-04 14:00:09 +02:00
Oliver Gierke
745e1f313d DATADOC-259 - MappingMongoConverter now converts Maps nested in Collections as well.
Fixed nested type handling in MappingMongoConverter.writeMapInternal(…). Forced usage of the ConversionService for simple values if the read value doesn't match the target type.
2011-09-04 13:57:43 +02:00
Mark Pollack
234e04f4a3 minor doc fixes 2011-09-02 19:27:36 -04:00
Thomas Risberg
15ab529596 preparing for snapshot builds 2011-09-01 20:37:40 -04:00
173 changed files with 3846 additions and 4451 deletions

pom.xml

@@ -3,16 +3,15 @@
   xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
   <modelVersion>4.0.0</modelVersion>
   <groupId>org.springframework.data</groupId>
-  <artifactId>spring-data-document-dist</artifactId>
-  <name>Spring Data Document Distribution</name>
-  <version>1.0.0.M4</version>
+  <artifactId>spring-data-mongo-dist</artifactId>
+  <name>Spring Data MongoDB Distribution</name>
+  <version>1.0.0.M5</version>
   <packaging>pom</packaging>
   <modules>
-    <module>spring-data-document-parent</module>
     <module>spring-data-mongodb</module>
     <module>spring-data-mongodb-cross-store</module>
     <module>spring-data-mongodb-log4j</module>
-    <!-- <module>spring-data-couchdb</module> -->
+    <module>spring-data-mongodb-parent</module>
   </modules>
   <developers>
@@ -90,7 +89,7 @@
   <properties>
     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
     <!-- dist.* properties are used by the antrun tasks below -->
-    <dist.id>spring-data-document</dist.id>
+    <dist.id>spring-data-mongo</dist.id>
     <dist.name>Spring Data Mongo</dist.name>
     <dist.key>SDMONGO</dist.key>
     <dist.version>${project.version}</dist.version>
@@ -175,7 +174,7 @@
       </fileset>
     </copy>
     <move file="${project.basedir}/target/site/reference/pdf/index.pdf"
-          tofile="${project.basedir}/target/site/reference/pdf/spring-data-document-reference.pdf"
+          tofile="${project.basedir}/target/site/reference/pdf/spring-data-mongo-reference.pdf"
          failonerror="false"/>
   </postProcess>
 </configuration>
@@ -258,8 +257,8 @@
     </dependencies>
   </plugin>
 </plugins>
-<!-- the name of this project is 'spring-data-document-dist';
-     make sure the zip file is just 'spring-data-document'. -->
+<!-- the name of this project is 'spring-data-mongo-dist';
+     make sure the zip file is just 'spring-data-mongo'. -->
 <finalName>${dist.finalName}</finalName>
 </build>


@@ -1,156 +0,0 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.BUILD-SNAPSHOT</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-couchdb</artifactId>
<packaging>jar</packaging>
<name>Spring Data CouchDB Support</name>
<licenses>
<license>
<name>The Apache Software License, Version 2.0</name>
<url>http://www.apache.org/licenses/LICENSE-2.0.txt</url>
<distribution>repo</distribution>
</license>
</licenses>
<developers>
<developer>
<id>tareq.abedrabbo</id>
<name>Tareq Abedrabbo</name>
<email>tareq.abedrabbo@opencredo.com</email>
<organization>OpenCredo</organization>
<organizationUrl>http://www.opencredo.org</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>+0</timezone>
</developer>
<developer>
<id>tomas.lukosius</id>
<name>Tomas Lukosius</name>
<email>tomas.lukosius@opencredo.com</email>
<organization>OpenCredo</organization>
<organizationUrl>http://www.opencredo.org</organizationUrl>
<roles>
<role>Project Admin</role>
<role>Developer</role>
</roles>
<timezone>+0</timezone>
</developer>
</developers>
<dependencies>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<scope>test</scope>
</dependency>
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-commons-core</artifactId>
</dependency>
<!-- Dependencies for web analytics functionality - to me moved into spring framework -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${org.springframework.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
<version>${org.springframework.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<version>${org.springframework.version}</version>
</dependency>
<!-- Jackson JSON -->
<dependency>
<groupId>org.codehaus.jackson</groupId>
<artifactId>jackson-core-asl</artifactId>
<version>1.6.1</version>
</dependency>
<dependency>
<groupId>org.codehaus.jackson</groupId>
<artifactId>jackson-mapper-asl</artifactId>
<version>1.6.1</version>
</dependency>
<dependency>
<groupId>javax.annotation</groupId>
<artifactId>jsr250-api</artifactId>
<optional>true</optional>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-all</artifactId>
<version>1.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
</dependency>
<!-- Couch DB -->
<dependency>
<groupId>com.google.code.jcouchdb</groupId>
<artifactId>jcouchdb</artifactId>
<version>0.11.0-1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>com.springsource.bundlor</groupId>
<artifactId>com.springsource.bundlor.maven</artifactId>
</plugin>
</plugins>
</build>
</project>


@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.InvalidDataAccessResourceUsageException;
import org.springframework.web.client.HttpServerErrorException;
public class CouchServerResourceUsageException extends InvalidDataAccessResourceUsageException {
/**
* Create a new CouchServerResourceUsageException,
* wrapping an arbitrary HttpServerErrorException.
*
* @param cause the HttpServerErrorException thrown
*/
public CouchServerResourceUsageException(HttpServerErrorException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}


@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.web.client.HttpClientErrorException;
public class CouchUsageException extends InvalidDataAccessApiUsageException {
/**
* Create a new CouchUsageException,
* wrapping an arbitrary HttpServerErrorException.
*
* @param cause the HttpServerErrorException thrown
*/
public CouchUsageException(HttpClientErrorException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}


@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.web.client.HttpStatusCodeException;
public class DocumentExistsException extends DataIntegrityViolationException {
/**
* Create a new DocumentExistsException,
* wrapping an arbitrary HttpServerErrorException.
*
* @param cause the HttpServerErrorException thrown
*/
public DocumentExistsException(String documentId, HttpStatusCodeException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}


@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.UncategorizedDataAccessException;
import org.springframework.web.client.RestClientException;
public class UncategorizedCouchDataAccessException extends UncategorizedDataAccessException {
/**
* Create a new HibernateSystemException,
* wrapping an arbitrary HibernateException.
*
* @param cause the HibernateException thrown
*/
public UncategorizedCouchDataAccessException(RestClientException cause) {
super(cause != null ? cause.getMessage() : null, cause);
}
}


@@ -1,64 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.springframework.data.document.couchdb.support.CouchUtils;
import org.springframework.util.StringUtils;
import org.springframework.web.client.RestOperations;
import org.springframework.web.client.RestTemplate;
public class CouchAdmin implements CouchAdminOperations {
private String databaseUrl;
private RestOperations restOperations = new RestTemplate();
public CouchAdmin(String databaseUrl) {
if (!databaseUrl.trim().endsWith("/")) {
this.databaseUrl = databaseUrl.trim() + "/";
} else {
this.databaseUrl = databaseUrl.trim();
}
}
public List<String> listDatabases() {
String dbs = restOperations.getForObject(databaseUrl + "_all_dbs", String.class);
return Arrays.asList(StringUtils.commaDelimitedListToStringArray(dbs));
}
public void createDatabase(String dbName) {
org.springframework.util.Assert.hasText(dbName);
restOperations.put(databaseUrl + dbName, null);
}
public void deleteDatabase(String dbName) {
org.springframework.util.Assert.hasText(dbName);
restOperations.delete(CouchUtils.ensureTrailingSlash(databaseUrl + dbName));
}
public DbInfo getDatabaseInfo(String dbName) {
String url = CouchUtils.ensureTrailingSlash(databaseUrl + dbName);
Map dbInfoMap = (Map) restOperations.getForObject(url, Map.class);
return new DbInfo(dbInfoMap);
}
}


@@ -1,34 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.List;
public interface CouchAdminOperations {
// functionality for /_special - replication, logs, UUIDs
List<String> listDatabases();
void createDatabase(String name);
void deleteDatabase(String name);
DbInfo getDatabaseInfo(String name);
}


@@ -1,72 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.Collections;
import java.util.Map;
public class DbInfo {
private Map dbInfoMap;
public DbInfo(Map dbInfoMap) {
super();
this.dbInfoMap = dbInfoMap;
}
public boolean isCompactRunning() {
return (Boolean) this.dbInfoMap.get("compact_running");
}
public String getDbName() {
return (String) this.dbInfoMap.get("db_name");
}
public long getDiskFormatVersion() {
return (Long) this.dbInfoMap.get("disk_format_version");
}
public long getDiskSize() {
return (Long) this.dbInfoMap.get("disk_size");
}
public long getDocCount() {
return (Long) this.dbInfoMap.get("doc_count");
}
public long getDocDeleteCount() {
return (Long) this.dbInfoMap.get("doc_del_count");
}
public long getInstanceStartTime() {
return (Long) this.dbInfoMap.get("instance_start_time");
}
public long getPurgeSequence() {
return (Long) this.dbInfoMap.get("purge_seq");
}
public long getUpdateSequence() {
return (Long) this.dbInfoMap.get("update_seq");
}
public Map getDbInfoMap() {
return Collections.unmodifiableMap(dbInfoMap);
}
}


@@ -1,70 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.parsing.BeanComponentDefinition;
import org.springframework.beans.factory.parsing.CompositeComponentDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.xml.BeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.data.document.couchdb.monitor.ServerInfo;
import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
public class CouchJmxParser implements BeanDefinitionParser {
public BeanDefinition parse(Element element, ParserContext parserContext) {
String databaseUrl = element.getAttribute("database-url");
if (!StringUtils.hasText(databaseUrl)) {
databaseUrl = "http://localhost:5984";
}
registerJmxComponents(databaseUrl, element, parserContext);
return null;
}
protected void registerJmxComponents(String databaseUrl, Element element, ParserContext parserContext) {
Object eleSource = parserContext.extractSource(element);
CompositeComponentDefinition compositeDef = new CompositeComponentDefinition(element.getTagName(), eleSource);
/*
createBeanDefEntry(AssertMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BackgroundFlushingMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(BtreeIndexCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(ConnectionMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(GlobalLockMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(MemoryMetrics.class, compositeDef, mongoRefName, eleSource, parserContext);
createBeanDefEntry(OperationCounters.class, compositeDef, mongoRefName, eleSource, parserContext);
*/
createBeanDefEntry(ServerInfo.class, compositeDef, databaseUrl, eleSource, parserContext);
//createBeanDefEntry(MongoAdmin.class, compositeDef, mongoRefName, eleSource, parserContext);
parserContext.registerComponent(compositeDef);
}
protected void createBeanDefEntry(Class clazz, CompositeComponentDefinition compositeDef, String databaseUrl, Object eleSource, ParserContext parserContext) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(clazz);
builder.getRawBeanDefinition().setSource(eleSource);
builder.addConstructorArg(databaseUrl);
BeanDefinition assertDef = builder.getBeanDefinition();
String assertName = parserContext.getReaderContext().registerWithGeneratedName(assertDef);
compositeDef.addNestedComponent(new BeanComponentDefinition(assertDef, assertName));
}
}


@@ -1,63 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import java.net.URI;
public interface CouchOperations {
/**
* Reads a document from the database and maps it a Java object.
* </p>
* This method is intended to work when a default database
* is set on the CouchDbDocumentOperations instance.
*
* @param id the id of the CouchDB document to read
* @param targetClass the target type to map to
* @return the mapped object
*/
<T> T findOne(String id, Class<T> targetClass);
/**
* Reads a document from the database and maps it a Java object.
*
* @param uri the full URI of the document to read
* @param targetClass the target type to map to
* @return the mapped object
*/
<T> T findOne(URI uri, Class<T> targetClass);
/**
* Maps a Java object to JSON and writes it to the database
* </p>
* This method is intended to work when a default database
* is set on the CouchDbDocumentOperations instance.
*
* @param id the id of the document to write
* @param document the object to write
*/
void save(String id, Object document);
/**
* Maps a Java object to JSON and writes it to the database
*
* @param uri the full URI of the document to write
* @param document the object to write
*/
void save(URI uri, Object document);
}


@@ -1,143 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import java.net.URI;
import java.util.Map;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.document.couchdb.CouchServerResourceUsageException;
import org.springframework.data.document.couchdb.CouchUsageException;
import org.springframework.data.document.couchdb.DocumentRetrievalFailureException;
import org.springframework.data.document.couchdb.UncategorizedCouchDataAccessException;
import org.springframework.data.document.couchdb.support.CouchUtils;
import org.springframework.http.*;
import org.springframework.util.Assert;
import org.springframework.web.client.*;
public class CouchTemplate implements CouchOperations {
protected final Log logger = LogFactory.getLog(this.getClass());
private String defaultDocumentUrl;
private RestOperations restOperations = new RestTemplate();
/**
* Constructs an instance of CouchDbDocumentTemplate with a default database
*
* @param defaultDatabaseUrl the default database to connect to
*/
public CouchTemplate(String defaultDatabaseUrl) {
Assert.hasText(defaultDatabaseUrl, "defaultDatabaseUrl must not be empty");
defaultDocumentUrl = CouchUtils.addId(defaultDatabaseUrl);
}
/**
* Constructs an instance of CouchTemplate with a default database and a custom RestOperations.
*
* @param defaultDatabaseUrl the default database to connect to
* @param restOperations the RestOperations used to execute HTTP requests
*/
public CouchTemplate(String defaultDatabaseUrl, RestOperations restOperations) {
this(defaultDatabaseUrl);
Assert.notNull(restOperations, "restOperations must not be null");
this.restOperations = restOperations;
}
public <T> T findOne(String id, Class<T> targetClass) {
Assert.state(defaultDocumentUrl != null, "defaultDatabaseUrl must be set to use this method");
try {
return restOperations.getForObject(defaultDocumentUrl, targetClass, id);
//TODO check this exception translation and centralize.
} catch (HttpClientErrorException clientError) {
if (clientError.getStatusCode() == HttpStatus.NOT_FOUND) {
throw new DocumentRetrievalFailureException(defaultDocumentUrl + "/" + id);
}
throw new CouchUsageException(clientError);
} catch (HttpServerErrorException serverError) {
throw new CouchServerResourceUsageException(serverError);
} catch (RestClientException otherError) {
throw new UncategorizedCouchDataAccessException(otherError);
}
}
public <T> T findOne(URI uri, Class<T> targetClass) {
Assert.state(uri != null, "uri must be set to use this method");
try {
return restOperations.getForObject(uri, targetClass);
//TODO check this exception translation and centralize.
} catch (HttpClientErrorException clientError) {
if (clientError.getStatusCode() == HttpStatus.NOT_FOUND) {
throw new DocumentRetrievalFailureException(uri.getPath());
}
throw new CouchUsageException(clientError);
} catch (HttpServerErrorException serverError) {
throw new CouchServerResourceUsageException(serverError);
} catch (RestClientException otherError) {
throw new UncategorizedCouchDataAccessException(otherError);
}
}
public void save(String id, Object document) {
Assert.notNull(document, "document must not be null for save");
HttpEntity<?> httpEntity = createHttpEntity(document);
try {
ResponseEntity<Map> response = restOperations.exchange(defaultDocumentUrl, HttpMethod.PUT, httpEntity, Map.class, id);
//TODO update the document revision id on the object from the returned value
//TODO better exception translation
} catch (RestClientException e) {
throw new UncategorizedCouchDataAccessException(e);
}
}
public void save(URI uri, Object document) {
Assert.notNull(document, "document must not be null for save");
Assert.notNull(uri, "URI must not be null for save");
HttpEntity<?> httpEntity = createHttpEntity(document);
try {
ResponseEntity<Map> response = restOperations.exchange(uri, HttpMethod.PUT, httpEntity, Map.class);
//TODO update the document revision id on the object from the returned value
//TODO better exception translation
} catch (RestClientException e) {
throw new UncategorizedCouchDataAccessException(e);
}
}
private HttpEntity<?> createHttpEntity(Object document) {
if (document instanceof HttpEntity) {
HttpEntity httpEntity = (HttpEntity) document;
Assert.isTrue(httpEntity.getHeaders().getContentType().equals(MediaType.APPLICATION_JSON),
"HttpEntity payload with non-application/json content type found.");
return httpEntity;
}
HttpHeaders httpHeaders = new HttpHeaders();
httpHeaders.setContentType(MediaType.APPLICATION_JSON);
HttpEntity<Object> httpEntity = new HttpEntity<Object>(document, httpHeaders);
return httpEntity;
}
}
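
The findOne methods above translate REST client errors into the data-access exception hierarchy by catching HttpClientErrorException (with a special case for 404), HttpServerErrorException, and any remaining RestClientException. A standalone sketch of that policy (hypothetical `classify` helper; exception names are returned as strings so the sketch runs without Spring on the classpath):

```java
// Sketch of CouchTemplate's exception-translation policy (names only, no Spring types).
public class CouchErrorTranslationSketch {

    // Maps an HTTP status code to the exception CouchTemplate would throw.
    static String classify(int status) {
        if (status == 404) {
            return "DocumentRetrievalFailureException"; // document not found
        }
        if (status >= 400 && status < 500) {
            return "CouchUsageException";               // other client errors
        }
        if (status >= 500 && status < 600) {
            return "CouchServerResourceUsageException"; // server-side errors
        }
        return "UncategorizedCouchDataAccessException"; // anything else
    }

    public static void main(String[] args) {
        System.out.println(classify(404)); // DocumentRetrievalFailureException
        System.out.println(classify(409)); // CouchUsageException
        System.out.println(classify(503)); // CouchServerResourceUsageException
    }
}
```

In the real template the distinction is made by exception type rather than by inspecting status codes directly, but the resulting mapping is the same.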


@@ -1,315 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core.support;
import java.io.IOException;
import java.nio.charset.Charset;
import java.util.*;
import org.codehaus.jackson.*;
import org.codehaus.jackson.map.JsonMappingException;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.type.TypeFactory;
import org.codehaus.jackson.type.JavaType;
import org.springframework.http.HttpInputMessage;
import org.springframework.http.HttpOutputMessage;
import org.springframework.http.MediaType;
import org.springframework.http.converter.AbstractHttpMessageConverter;
import org.springframework.http.converter.HttpMessageNotReadableException;
import org.springframework.http.converter.HttpMessageNotWritableException;
import org.springframework.util.Assert;
public class CouchDbMappingJacksonHttpMessageConverter extends
AbstractHttpMessageConverter<Object> {
public static final Charset DEFAULT_CHARSET = Charset.forName("UTF-8");
private static final String ROWS_FIELD_NAME = "rows";
private static final String VALUE_FIELD_NAME = "value";
private static final String INCLUDED_DOC_FIELD_NAME = "doc";
private static final String TOTAL_ROWS_FIELD_NAME = "total_rows";
private ObjectMapper objectMapper = new ObjectMapper();
private boolean prefixJson = false;
/**
* Construct a new {@code CouchDbMappingJacksonHttpMessageConverter}.
*/
public CouchDbMappingJacksonHttpMessageConverter() {
super(new MediaType("application", "json", DEFAULT_CHARSET));
}
/**
* Sets the {@code ObjectMapper} for this converter. If not set, a default
* {@link ObjectMapper#ObjectMapper() ObjectMapper} is used.
* <p/>
* Setting a custom-configured {@code ObjectMapper} is one way to take
* further control of the JSON serialization process. For example, an
* extended {@link org.codehaus.jackson.map.SerializerFactory} can be
* configured that provides custom serializers for specific types. The other
* option for refining the serialization process is to use Jackson's
* provided annotations on the types to be serialized, in which case a
* custom-configured ObjectMapper is unnecessary.
*/
public void setObjectMapper(ObjectMapper objectMapper) {
Assert.notNull(objectMapper, "'objectMapper' must not be null");
this.objectMapper = objectMapper;
}
/**
* Indicates whether the JSON output by this converter should be prefixed with
* "{} &&". Default is false.
* <p/>
* Prefixing the JSON string in this manner is used to help prevent JSON
* Hijacking. The prefix renders the string syntactically invalid as a
* script so that it cannot be hijacked. This prefix does not affect the
* evaluation of JSON, but if JSON validation is performed on the string,
* the prefix would need to be ignored.
*/
public void setPrefixJson(boolean prefixJson) {
this.prefixJson = prefixJson;
}
@Override
public boolean canRead(Class<?> clazz, MediaType mediaType) {
JavaType javaType = getJavaType(clazz);
return this.objectMapper.canDeserialize(javaType) && canRead(mediaType);
}
/**
* Returns the Jackson {@link JavaType} for the given class.
* <p/>
* The default implementation returns
* {@link TypeFactory#type(java.lang.reflect.Type)}, but this can be
* overridden in subclasses, to allow for custom generic collection
* handling. For instance:
* <p/>
* <pre class="code">
* protected JavaType getJavaType(Class&lt;?&gt; clazz) {
* if (List.class.isAssignableFrom(clazz)) {
* return TypeFactory.collectionType(ArrayList.class, MyBean.class);
* } else {
* return super.getJavaType(clazz);
* }
* }
* </pre>
*
* @param clazz the class to return the java type for
* @return the java type
*/
protected JavaType getJavaType(Class<?> clazz) {
return TypeFactory.type(clazz);
}
@Override
public boolean canWrite(Class<?> clazz, MediaType mediaType) {
return this.objectMapper.canSerialize(clazz) && canWrite(mediaType);
}
@Override
protected boolean supports(Class<?> clazz) {
// should not be called, since we override canRead/Write instead
throw new UnsupportedOperationException();
}
@Override
protected Object readInternal(Class<?> clazz, HttpInputMessage inputMessage)
throws IOException, HttpMessageNotReadableException {
JavaType javaType = getJavaType(clazz);
try {
return success(clazz, inputMessage);
// return this.objectMapper.readValue(inputMessage.getBody(),
// javaType);
} catch (Exception ex) {
throw new HttpMessageNotReadableException("Could not read JSON: "
+ ex.getMessage(), ex);
}
}
private Object success(Class<?> clazz, HttpInputMessage inputMessage)
throws JsonParseException, IOException {
//Note, parsing code used from ektorp project
JsonParser jp = objectMapper.getJsonFactory().createJsonParser(
inputMessage.getBody());
if (jp.nextToken() != JsonToken.START_OBJECT) {
throw new RuntimeException("Expected data to start with an Object");
}
Map<String, Integer> fields = readHeaderFields(jp);
List<Object> result;
if (fields.containsKey(TOTAL_ROWS_FIELD_NAME)) {
int totalRows = fields.get(TOTAL_ROWS_FIELD_NAME);
if (totalRows == 0) {
return Collections.emptyList();
}
result = new ArrayList<Object>(totalRows);
} else {
result = new ArrayList<Object>();
}
ParseState state = new ParseState();
Object first = parseFirstRow(jp, state, clazz);
if (first == null) {
return Collections.emptyList();
} else {
result.add(first);
}
while (jp.getCurrentToken() != null) {
skipToField(jp, state.docFieldName, state);
if (atEndOfRows(jp)) {
return result;
}
result.add(jp.readValueAs(clazz));
endRow(jp, state);
}
return result;
}
private Object parseFirstRow(JsonParser jp, ParseState state, Class clazz)
throws JsonParseException, IOException, JsonProcessingException,
JsonMappingException {
skipToField(jp, VALUE_FIELD_NAME, state);
JsonNode value = null;
if (atObjectStart(jp)) {
value = jp.readValueAsTree();
jp.nextToken();
if (isEndOfRow(jp)) {
state.docFieldName = VALUE_FIELD_NAME;
Object doc = objectMapper.readValue(value, clazz);
endRow(jp, state);
return doc;
}
}
skipToField(jp, INCLUDED_DOC_FIELD_NAME, state);
if (atObjectStart(jp)) {
state.docFieldName = INCLUDED_DOC_FIELD_NAME;
Object doc = jp.readValueAs(clazz);
endRow(jp, state);
return doc;
}
return null;
}
private boolean isEndOfRow(JsonParser jp) {
return jp.getCurrentToken() == JsonToken.END_OBJECT;
}
private void endRow(JsonParser jp, ParseState state) throws IOException, JsonParseException {
state.inRow = false;
jp.nextToken();
}
private boolean atObjectStart(JsonParser jp) {
return jp.getCurrentToken() == JsonToken.START_OBJECT;
}
private boolean atEndOfRows(JsonParser jp) {
return jp.getCurrentToken() != JsonToken.START_OBJECT;
}
private void skipToField(JsonParser jp, String fieldName, ParseState state) throws JsonParseException, IOException {
String lastFieldName = null;
while (jp.getCurrentToken() != null) {
switch (jp.getCurrentToken()) {
case FIELD_NAME:
lastFieldName = jp.getCurrentName();
jp.nextToken();
break;
case START_OBJECT:
if (!state.inRow) {
state.inRow = true;
jp.nextToken();
} else {
if (isInField(fieldName, lastFieldName)) {
return;
} else {
jp.skipChildren();
}
}
break;
default:
if (isInField(fieldName, lastFieldName)) {
jp.nextToken();
return;
}
jp.nextToken();
break;
}
}
}
private boolean isInField(String fieldName, String lastFieldName) {
return lastFieldName != null && lastFieldName.equals(fieldName);
}
private Map<String, Integer> readHeaderFields(JsonParser jp)
throws JsonParseException, IOException {
Map<String, Integer> map = new HashMap<String, Integer>();
jp.nextToken();
String nextFieldName = jp.getCurrentName();
while (!nextFieldName.equals(ROWS_FIELD_NAME)) {
jp.nextToken();
map.put(nextFieldName, Integer.valueOf(jp.getIntValue()));
jp.nextToken();
nextFieldName = jp.getCurrentName();
}
return map;
}
@Override
protected void writeInternal(Object o, HttpOutputMessage outputMessage)
throws IOException, HttpMessageNotWritableException {
JsonEncoding encoding = getEncoding(outputMessage.getHeaders()
.getContentType());
JsonGenerator jsonGenerator = this.objectMapper.getJsonFactory()
.createJsonGenerator(outputMessage.getBody(), encoding);
try {
if (this.prefixJson) {
jsonGenerator.writeRaw("{} && ");
}
this.objectMapper.writeValue(jsonGenerator, o);
} catch (JsonGenerationException ex) {
throw new HttpMessageNotWritableException("Could not write JSON: "
+ ex.getMessage(), ex);
}
}
private JsonEncoding getEncoding(MediaType contentType) {
if (contentType != null && contentType.getCharSet() != null) {
Charset charset = contentType.getCharSet();
for (JsonEncoding encoding : JsonEncoding.values()) {
if (charset.name().equals(encoding.getJavaName())) {
return encoding;
}
}
}
return JsonEncoding.UTF8;
}
private static class ParseState {
boolean inRow;
String docFieldName = "";
}
}
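
For reference, the structure this converter unwraps is the standard CouchDB view response; the field names match the constants above (`total_rows`, `rows`, `value`, `doc`). A minimal example (document bodies are hypothetical):

```json
{
  "total_rows": 2,
  "offset": 0,
  "rows": [
    { "id": "doc1", "key": "doc1", "value": { "message": "hello" } },
    { "id": "doc2", "key": "doc2", "value": { "message": "world" } }
  ]
}
```

When the view is queried with `include_docs=true`, each row additionally carries a `doc` field with the full document, which `parseFirstRow` detects and prefers via `INCLUDED_DOC_FIELD_NAME`.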


@@ -1,41 +0,0 @@
/*
* Copyright 2002-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.monitor;
import org.springframework.web.client.RestTemplate;
/**
* Base class to encapsulate common configuration settings when connecting to a CouchDB database
*
* @author Mark Pollack
*/
public abstract class AbstractMonitor {
protected RestTemplate restTemplate;
protected String databaseUrl;
/**
* Gets the databaseUrl used to connect to CouchDB
*
* @return the database URL used to connect to CouchDB
*/
public String getDatabaseUrl() {
return this.databaseUrl;
}
}


@@ -1,62 +0,0 @@
/*
* Copyright 2002-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.monitor;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.Map;
import org.springframework.jmx.export.annotation.ManagedOperation;
import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.web.client.RestTemplate;
/**
* Expose basic server information via JMX
*
* @author Mark Pollack
*/
@ManagedResource(description = "Server Information")
public class ServerInfo extends AbstractMonitor {
public ServerInfo(String databaseUrl) {
this.databaseUrl = databaseUrl;
this.restTemplate = new RestTemplate();
}
@ManagedOperation(description = "Server host name")
public String getHostName() throws UnknownHostException {
return InetAddress.getLocalHost().getHostName();
}
@ManagedOperation(description = "CouchDB Server Version")
public String getVersion() {
return (String) getRoot().get("version");
}
@ManagedOperation(description = "Message of the day")
public String getMotd() {
return (String) getRoot().get("greeting");
}
public Map getRoot() {
Map map = restTemplate.getForObject(getDatabaseUrl(), Map.class);
return map;
}
}


@@ -1,4 +0,0 @@
/**
* CouchDB specific JMX monitoring support.
*/
package org.springframework.data.document.couchdb.monitor;


@@ -1,81 +0,0 @@
/*
* Copyright 2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.support;
import org.springframework.dao.DataAccessException;
/**
* Helper class with utility methods for internal CouchDB classes.
* <p>
* Mainly intended for internal use within the framework.
*
* @author Thomas Risberg
* @author Tareq Abedrabbo
* @since 1.0
*/
public abstract class CouchUtils {
/**
* Convert the given runtime exception to an appropriate exception from the
* <code>org.springframework.dao</code> hierarchy.
* Return null if no translation is appropriate: any other exception may
* have resulted from user code, and should not be translated.
*
* @param ex runtime exception that occurred
* @return the corresponding DataAccessException instance,
* or <code>null</code> if the exception should not be translated
*/
public static DataAccessException translateCouchExceptionIfPossible(RuntimeException ex) {
return null;
}
/**
* Adds an id variable to a URL
*
* @param url the URL to modify
* @return the modified URL
*/
public static String addId(String url) {
return ensureTrailingSlash(url) + "{id}";
}
/**
* Adds a 'changes since' variable to a URL.
*
* @param url the URL to modify
* @return the modified URL
*/
public static String addChangesSince(String url) {
return ensureTrailingSlash(url) + "_changes?since={seq}";
}
/**
* Ensures that a URL ends with a slash.
*
* @param url the URL to modify
* @return the modified URL
*/
public static String ensureTrailingSlash(String url) {
if (!url.endsWith("/")) {
url += "/";
}
return url;
}
}
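
The behavior of the URL helpers above, restated as a self-contained sketch (logic copied from CouchUtils so it runs without the Spring Data classes on the classpath):

```java
// Standalone restatement of the CouchUtils URL helpers.
public class CouchUrlSketch {

    // Ensures that a URL ends with a slash.
    static String ensureTrailingSlash(String url) {
        return url.endsWith("/") ? url : url + "/";
    }

    // Appends the {id} URI-template variable used by findOne/save.
    static String addId(String url) {
        return ensureTrailingSlash(url) + "{id}";
    }

    // Appends the _changes feed path with a {seq} URI-template variable.
    static String addChangesSince(String url) {
        return ensureTrailingSlash(url) + "_changes?since={seq}";
    }

    public static void main(String[] args) {
        System.out.println(addId("http://127.0.0.1:5984/db"));
        // http://127.0.0.1:5984/db/{id}
        System.out.println(addChangesSince("http://127.0.0.1:5984/db/"));
        // http://127.0.0.1:5984/db/_changes?since={seq}
    }
}
```

The `{id}` placeholder is later expanded by RestTemplate, which is why CouchTemplate can pass the document id as a URI variable to `getForObject` and `exchange`.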


@@ -1 +0,0 @@
http\://www.springframework.org/schema/data/couch=org.springframework.data.document.couchdb.config.CouchNamespaceHandler


@@ -1,2 +0,0 @@
http\://www.springframework.org/schema/data/couch/spring-couch-1.0.xsd=org/springframework/data/document/couchdb/config/spring-couch-1.0.xsd
http\://www.springframework.org/schema/data/couch/spring-couch.xsd=org/springframework/data/document/couchdb/config/spring-couch-1.0.xsd


@@ -1,4 +0,0 @@
# Tooling related information for the Couch DB namespace
http\://www.springframework.org/schema/data/couch@name=Couch Namespace
http\://www.springframework.org/schema/data/couch@prefix=couch
http\://www.springframework.org/schema/data/couch@icon=org/springframework/jdbc/config/spring-jdbc.gif


@@ -1,33 +0,0 @@
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns="http://www.springframework.org/schema/data/couch"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:tool="http://www.springframework.org/schema/tool"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:repository="http://www.springframework.org/schema/data/repository"
targetNamespace="http://www.springframework.org/schema/data/couch"
elementFormDefault="qualified" attributeFormDefault="unqualified">
<xsd:import namespace="http://www.springframework.org/schema/tool"/>
<xsd:import namespace="http://www.springframework.org/schema/context"
schemaLocation="http://www.springframework.org/schema/context/spring-context.xsd"/>
<xsd:import namespace="http://www.springframework.org/schema/data/repository"
schemaLocation="http://www.springframework.org/schema/data/repository/spring-repository.xsd"/>
<xsd:element name="jmx">
<xsd:annotation>
<xsd:documentation><![CDATA[
Defines JMX model MBeans for monitoring a CouchDB server.
]]></xsd:documentation>
</xsd:annotation>
<xsd:complexType>
<xsd:attribute name="database-url" type="xsd:string" use="optional">
<xsd:annotation>
<xsd:documentation><![CDATA[
The database URL of the CouchDB]]></xsd:documentation>
</xsd:annotation>
</xsd:attribute>
</xsd:complexType>
</xsd:element>
</xsd:schema>


@@ -1,77 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import java.util.Date;
import org.codehaus.jackson.annotate.JsonIgnoreProperties;
/**
* @author Tareq Abedrabbo (tareq.abedrabbo@opencredo.com)
* @since 13/01/2011
*/
@JsonIgnoreProperties(ignoreUnknown = true)
public class DummyDocument {
private String message;
private String timestamp = new Date().toString();
public DummyDocument() {
}
public DummyDocument(String message) {
this.message = message;
}
public String getMessage() {
return message;
}
public void setMessage(String message) {
this.message = message;
}
public String getTimestamp() {
return timestamp;
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
DummyDocument document = (DummyDocument) o;
if (message != null ? !message.equals(document.message) : document.message != null) return false;
return true;
}
@Override
public int hashCode() {
return message != null ? message.hashCode() : 0;
}
@Override
public String toString() {
return "DummyDocument{" +
"message='" + message + '\'' +
", timestamp=" + timestamp +
'}';
}
}


@@ -1,52 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.hamcrest.Description;
import org.hamcrest.Factory;
import org.hamcrest.Matcher;
import org.hamcrest.TypeSafeMatcher;
import org.springframework.http.HttpEntity;
/**
* Matches the content of the body of an HttpEntity.
*
* @author Tareq Abedrabbo
* @since 31/01/2011
*/
public class IsBodyEqual extends TypeSafeMatcher<HttpEntity> {
private Object object;
public IsBodyEqual(Object object) {
this.object = object;
}
@Override
public boolean matchesSafely(HttpEntity httpEntity) {
return httpEntity.getBody().equals(object);
}
public void describeTo(Description description) {
description.appendText("body equals ").appendValue(object);
}
@Factory
public static Matcher<HttpEntity> bodyEqual(Object object) {
return new IsBodyEqual(object);
}
}


@@ -1,38 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.admin;
import java.util.List;
import junit.framework.Assert;
import org.junit.Ignore;
import org.junit.Test;
import org.springframework.data.document.couchdb.core.CouchConstants;
public class CouchAdminIntegrationTests {
@Test
@Ignore("until CI has couch server running")
public void dbLifecycle() {
CouchAdmin admin = new CouchAdmin(CouchConstants.COUCHDB_URL);
admin.deleteDatabase("foo");
List<String> dbs = admin.listDatabases();
admin.createDatabase("foo");
List<String> newDbs = admin.listDatabases();
Assert.assertEquals(dbs.size() + 1, newDbs.size());
}
}


@@ -1,118 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import static org.junit.Assume.assumeNoException;
import static org.junit.Assume.assumeTrue;
import static org.springframework.http.HttpStatus.OK;
import java.io.IOException;
import java.util.UUID;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.junit.Before;
import org.junit.BeforeClass;
import org.springframework.http.*;
import org.springframework.http.client.ClientHttpResponse;
import org.springframework.web.client.DefaultResponseErrorHandler;
import org.springframework.web.client.RestClientException;
import org.springframework.web.client.RestTemplate;
/**
* Base class for CouchDB integration tests. Checks whether CouchDB is available before running the tests;
* if CouchDB is not available, the tests are ignored.
*
* @author Tareq Abedrabbo (tareq.abedrabbo@opencredo.com)
* @since 13/01/2011
*/
public abstract class AbstractCouchTemplateIntegrationTests {
protected static final Log log = LogFactory.getLog(AbstractCouchTemplateIntegrationTests.class);
protected static final RestTemplate restTemplate = new RestTemplate();
/**
* This method ensures that the database is running; otherwise, the tests are ignored.
*/
@BeforeClass
public static void assumeDatabaseIsUpAndRunning() {
try {
ResponseEntity<String> responseEntity = restTemplate.getForEntity(CouchConstants.COUCHDB_URL, String.class);
assumeTrue(responseEntity.getStatusCode().equals(OK));
log.debug("CouchDB is running on " + CouchConstants.COUCHDB_URL +
" with status " + responseEntity.getStatusCode());
} catch (RestClientException e) {
log.debug("CouchDB is not running on " + CouchConstants.COUCHDB_URL);
assumeNoException(e);
}
}
@Before
public void setUpTestDatabase() throws Exception {
RestTemplate template = new RestTemplate();
template.setErrorHandler(new DefaultResponseErrorHandler() {
@Override
public void handleError(ClientHttpResponse response) throws IOException {
// do nothing, error status will be handled in the switch statement
}
});
ResponseEntity<String> response = template.getForEntity(CouchConstants.TEST_DATABASE_URL, String.class);
HttpStatus statusCode = response.getStatusCode();
switch (statusCode) {
case NOT_FOUND:
createNewTestDatabase();
break;
case OK:
deleteExistingTestDatabase();
createNewTestDatabase();
break;
default:
throw new IllegalStateException("Unsupported http status [" + statusCode + "]");
}
}
private void deleteExistingTestDatabase() {
restTemplate.delete(CouchConstants.TEST_DATABASE_URL);
}
private void createNewTestDatabase() {
restTemplate.put(CouchConstants.TEST_DATABASE_URL, null);
}
/**
* Reads a CouchDB document and converts it to the expected type.
*/
protected <T> T getDocument(String id, Class<T> expectedType) {
String url = CouchConstants.TEST_DATABASE_URL + "{id}";
return restTemplate.getForObject(url, expectedType, id);
}
/**
* Writes a CouchDB document
*/
protected String putDocument(Object document) {
HttpHeaders headers = new HttpHeaders();
headers.setContentType(MediaType.APPLICATION_JSON);
HttpEntity request = new HttpEntity(document, headers);
String id = UUID.randomUUID().toString();
restTemplate.put(CouchConstants.TEST_DATABASE_URL + "{id}", request, id);
return id;
}
}


@@ -1,27 +0,0 @@
/*
* Copyright 2011 the original author or authors
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
public abstract class CouchConstants {
public static final String COUCHDB_URL = "http://127.0.0.1:5984/";
public static final String TEST_DATABASE_URL = COUCHDB_URL + "si_couchdb_test/";
private CouchConstants() {
// constants holder; not meant to be instantiated
}
}


@@ -1,39 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import java.util.UUID;
import junit.framework.Assert;
import org.junit.Ignore;
import org.junit.Test;
import org.springframework.data.document.couchdb.DummyDocument;
public class CouchTemplateIntegrationTests extends AbstractCouchTemplateIntegrationTests {
@Test
@Ignore("until CI has couch server running")
public void saveAndFindTest() {
CouchTemplate template = new CouchTemplate(CouchConstants.TEST_DATABASE_URL);
DummyDocument document = new DummyDocument("hello");
String id = UUID.randomUUID().toString();
template.save(id, document);
DummyDocument foundDocument = template.findOne(id, DummyDocument.class);
Assert.assertEquals(document.getMessage(), foundDocument.getMessage());
}
}


@@ -1,36 +0,0 @@
/*
* Copyright 2002-2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.monitor;
import org.springframework.context.support.ClassPathXmlApplicationContext;
/**
* Server application to test JMX functionality.
*
* @author Mark Pollack
*/
public class JmxServer {
public static void main(String[] args) {
new JmxServer().run();
}
public void run() {
new ClassPathXmlApplicationContext(new String[]{"server-jmx.xml"});
}
}


@@ -1,13 +0,0 @@
log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.category.org.apache.activemq=ERROR
log4j.category.org.springframework.batch=DEBUG
log4j.category.org.springframework.transaction=INFO
log4j.category.org.hibernate.SQL=DEBUG
# for debugging datasource initialization
# log4j.category.test.jdbc=DEBUG


@@ -1,24 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:p="http://www.springframework.org/schema/p"
xmlns:couch="http://www.springframework.org/schema/data/couch"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/data/couch http://www.springframework.org/schema/data/couch/spring-couch-1.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">
<couch:jmx/>
<context:mbean-export/>
<bean id="registry" class="org.springframework.remoting.rmi.RmiRegistryFactoryBean"
p:port="1099"/>
<!-- Expose JMX over RMI -->
<bean id="serverConnector" class="org.springframework.jmx.support.ConnectorServerFactoryBean" depends-on="registry"
p:objectName="connector:name=rmi"
p:serviceUrl="service:jmx:rmi://localhost/jndi/rmi://localhost:1099/myconnector"/>
</beans>


@@ -1,24 +0,0 @@
Bundle-SymbolicName: org.springframework.data.couchdb
Bundle-Name: Spring Data Couch DB Support
Bundle-Vendor: SpringSource
Bundle-ManifestVersion: 2
Import-Package:
sun.reflect;version="0";resolution:=optional
Import-Template:
org.springframework.beans.*;version="[3.0.0, 4.0.0)",
org.springframework.core.*;version="[3.0.0, 4.0.0)",
org.springframework.dao.*;version="[3.0.0, 4.0.0)",
org.springframework.http.*;version="[3.0.0, 4.0.0)",
org.springframework.web.*;version="[3.0.0, 4.0.0)",
org.springframework.util.*;version="[3.0.0, 4.0.0)",
org.springframework.context.*;version="[3.0.0, 4.0.0)",
org.springframework.jmx.*;version="[3.0.0, 4.0.0)",
org.springframework.remoting.*;version="[3.0.0, 4.0.0)",
org.springframework.data.core.*;version="[1.0.0, 2.0.0)",
org.springframework.data.document.*;version="[1.0.0, 2.0.0)",
org.codehaus.jackson.*;version="[1.0.0, 2.0.0)";resolution:=optional,
org.aopalliance.*;version="[1.0.0, 2.0.0)";resolution:=optional,
org.apache.commons.logging.*;version="[1.1.1, 2.0.0)",
org.w3c.dom.*;version="0"


@@ -3,12 +3,11 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.M4</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB Cross-store Persistence Support</name>
<dependencies>


@@ -1,30 +1,22 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb.core;
import org.junit.Test;
/**
* Unit tests for CouchTemplate with mocks
*/
public class CouchTemplateTests {
@Test
public void foo() {
}
}
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import org.springframework.data.crossstore.ChangeSetBacked;
public interface DocumentBacked extends ChangeSetBacked {
}


@@ -0,0 +1,185 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManagerFactory;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetBacked;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.util.ClassUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Log log = LogFactory.getLog(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass,
Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final DBObject dbk = new BasicDBObject();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for " + dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: " + key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unable to convert property " + key
+ ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: " + key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
Object o = entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
return o;
}
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: " + cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
if (mongoTemplate.getCollection(collName) == null) {
mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final DBObject dbQuery = new BasicDBObject();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
DBObject dbId = mongoTemplate.execute(collName,
new CollectionCallback<DBObject>() {
public DBObject doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
return collection.findOne(dbQuery);
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: " + dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.remove(dbQuery);
return null;
}
});
}
else {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: " + dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.save(dbDoc);
return null;
}
});
}
}
}
return 0L;
}
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return ClassUtils.getQualifiedName(entityClass);
}
}

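Since the new MongoChangeSetPersister is configured purely through its two setters, it can be wired up as an ordinary Spring bean. The following is an illustrative sketch only: the bean ids and the `mongoTemplate`/`entityManagerFactory` references are assumptions, not taken from this commit.

```xml
<!-- Hypothetical wiring for MongoChangeSetPersister; bean ids are illustrative -->
<bean id="mongoChangeSetPersister"
      class="org.springframework.data.mongodb.crossstore.MongoChangeSetPersister">
  <property name="mongoTemplate" ref="mongoTemplate"/>
  <property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
```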

@@ -1,4 +1,19 @@
package org.springframework.data.persistence.document.mongodb;
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import java.lang.reflect.Field;
@@ -12,14 +27,14 @@ import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.reflect.FieldSignature;
import org.springframework.dao.DataAccessException;
import org.springframework.data.persistence.document.RelatedDocument;
import org.springframework.data.persistence.document.DocumentBacked;
import org.springframework.data.persistence.document.DocumentBackedTransactionSynchronization;
import org.springframework.data.persistence.ChangeSet;
import org.springframework.data.persistence.ChangeSetPersister;
import org.springframework.data.persistence.ChangeSetPersister.NotFoundException;
import org.springframework.data.persistence.HashMapChangeSet;
import org.springframework.data.mongodb.crossstore.RelatedDocument;
import org.springframework.data.mongodb.crossstore.DocumentBacked;
import org.springframework.data.crossstore.ChangeSetBackedTransactionSynchronization;
import org.springframework.data.crossstore.ChangeSet;
import org.springframework.data.crossstore.ChangeSetPersister;
import org.springframework.data.crossstore.ChangeSetPersister.NotFoundException;
import org.springframework.data.crossstore.HashMapChangeSet;
import org.springframework.transaction.support.TransactionSynchronizationManager;
/**
@@ -130,7 +145,7 @@ public aspect MongoDocumentBacking {
entity.setChangeSet(new HashMapChangeSet());
entity.itdChangeSetPersister = changeSetPersister;
entity.itdTransactionSynchronization =
new DocumentBackedTransactionSynchronization(changeSetPersister, entity);
new ChangeSetBackedTransactionSynchronization(changeSetPersister, entity);
//registerTransactionSynchronization(entity);
}
@@ -163,7 +178,7 @@ public aspect MongoDocumentBacking {
@Transient private ChangeSetPersister<?> DocumentBacked.itdChangeSetPersister;
@Transient private DocumentBackedTransactionSynchronization DocumentBacked.itdTransactionSynchronization;
@Transient private ChangeSetBackedTransactionSynchronization DocumentBacked.itdTransactionSynchronization;
public void DocumentBacked.setChangeSet(ChangeSet cs) {
this.changeSet = cs;


@@ -1,11 +1,11 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
@@ -13,8 +13,7 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.persistence.document;
package org.springframework.data.mongodb.crossstore;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;


@@ -1,7 +0,0 @@
package org.springframework.data.persistence.document;
import org.springframework.data.persistence.ChangeSetBacked;
public interface DocumentBacked extends ChangeSetBacked {
}


@@ -1,63 +0,0 @@
package org.springframework.data.persistence.document;
//public class DocumentBackedTransactionSynchronization {
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.persistence.ChangeSetBacked;
import org.springframework.data.persistence.ChangeSetPersister;
import org.springframework.transaction.support.TransactionSynchronization;
public class DocumentBackedTransactionSynchronization implements TransactionSynchronization {
protected final Log log = LogFactory.getLog(getClass());
private ChangeSetPersister<Object> changeSetPersister;
private ChangeSetBacked entity;
private int changeSetTxStatus = -1;
public DocumentBackedTransactionSynchronization(ChangeSetPersister<Object> changeSetPersister, ChangeSetBacked entity) {
this.changeSetPersister = changeSetPersister;
this.entity = entity;
}
public void afterCommit() {
log.debug("After Commit called for " + entity);
changeSetPersister.persistState(entity, entity.getChangeSet());
changeSetTxStatus = 0;
}
public void afterCompletion(int status) {
log.debug("After Completion called with status = " + status);
if (changeSetTxStatus == 0) {
if (status == STATUS_COMMITTED) {
// this is good
log.debug("ChangedSetBackedTransactionSynchronization completed successfully for " + this.entity);
}
else {
// this could be bad - TODO: compensate
log.error("ChangedSetBackedTransactionSynchronization failed for " + this.entity);
}
}
}
public void beforeCommit(boolean readOnly) {
}
public void beforeCompletion() {
}
public void flush() {
}
public void resume() {
throw new IllegalStateException("ChangedSetBackedTransactionSynchronization does not support transaction suspension currently.");
}
public void suspend() {
throw new IllegalStateException("ChangedSetBackedTransactionSynchronization does not support transaction suspension currently.");
}
}


@@ -1,169 +0,0 @@
package org.springframework.data.persistence.document.mongodb;
import javax.persistence.EntityManagerFactory;
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataAccessResourceFailureException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.data.mongodb.core.CollectionCallback;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.persistence.ChangeSet;
import org.springframework.data.persistence.ChangeSetBacked;
import org.springframework.data.persistence.ChangeSetPersister;
import org.springframework.util.ClassUtils;
public class MongoChangeSetPersister implements ChangeSetPersister<Object> {
private static final String ENTITY_CLASS = "_entity_class";
private static final String ENTITY_ID = "_entity_id";
private static final String ENTITY_FIELD_NAME = "_entity_field_name";
private static final String ENTITY_FIELD_CLASS = "_entity_field_class";
protected final Log log = LogFactory.getLog(getClass());
private MongoTemplate mongoTemplate;
private EntityManagerFactory entityManagerFactory;
public void setMongoTemplate(MongoTemplate mongoTemplate) {
this.mongoTemplate = mongoTemplate;
}
public void setEntityManagerFactory(EntityManagerFactory entityManagerFactory) {
this.entityManagerFactory = entityManagerFactory;
}
public void getPersistentState(Class<? extends ChangeSetBacked> entityClass,
Object id, final ChangeSet changeSet)
throws DataAccessException, NotFoundException {
if (id == null) {
log.debug("Unable to load MongoDB data for null id");
return;
}
String collName = getCollectionNameForEntity(entityClass);
final DBObject dbk = new BasicDBObject();
dbk.put(ENTITY_ID, id);
dbk.put(ENTITY_CLASS, entityClass.getName());
if (log.isDebugEnabled()) {
log.debug("Loading MongoDB data for " + dbk);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
for (DBObject dbo : collection.find(dbk)) {
String key = (String) dbo.get(ENTITY_FIELD_NAME);
if (log.isDebugEnabled()) {
log.debug("Processing key: " + key);
}
if (!changeSet.getValues().containsKey(key)) {
String className = (String) dbo.get(ENTITY_FIELD_CLASS);
if (className == null) {
throw new DataIntegrityViolationException(
"Unble to convert property " + key
+ ": Invalid metadata, " + ENTITY_FIELD_CLASS + " not available");
}
Class<?> clazz = ClassUtils.resolveClassName(className, ClassUtils.getDefaultClassLoader());
Object value = mongoTemplate.getConverter().read(clazz, dbo);
if (log.isDebugEnabled()) {
log.debug("Adding to ChangeSet: " + key);
}
changeSet.set(key, value);
}
}
return null;
}
});
}
public Object getPersistentId(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
log.debug("getPersistentId called on " + entity);
if (entityManagerFactory == null) {
throw new DataAccessResourceFailureException("EntityManagerFactory cannot be null");
}
Object o = entityManagerFactory.getPersistenceUnitUtil().getIdentifier(entity);
return o;
}
public Object persistState(ChangeSetBacked entity, ChangeSet cs) throws DataAccessException {
if (cs == null) {
log.debug("Flush: changeset was null, nothing to flush.");
return 0L;
}
if (log.isDebugEnabled()) {
log.debug("Flush: changeset: " + cs.getValues());
}
String collName = getCollectionNameForEntity(entity.getClass());
DBCollection dbc = mongoTemplate.getCollection(collName);
if (dbc == null) {
dbc = mongoTemplate.createCollection(collName);
}
for (String key : cs.getValues().keySet()) {
if (key != null && !key.startsWith("_") && !key.equals(ChangeSetPersister.ID_KEY)) {
Object value = cs.getValues().get(key);
final DBObject dbQuery = new BasicDBObject();
dbQuery.put(ENTITY_ID, getPersistentId(entity, cs));
dbQuery.put(ENTITY_CLASS, entity.getClass().getName());
dbQuery.put(ENTITY_FIELD_NAME, key);
DBObject dbId = mongoTemplate.execute(collName,
new CollectionCallback<DBObject>() {
public DBObject doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
return collection.findOne(dbQuery);
}
});
if (value == null) {
if (log.isDebugEnabled()) {
log.debug("Flush: removing: " + dbQuery);
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.remove(dbQuery);
return null;
}
});
}
else {
final DBObject dbDoc = new BasicDBObject();
dbDoc.putAll(dbQuery);
if (log.isDebugEnabled()) {
log.debug("Flush: saving: " + dbQuery);
}
mongoTemplate.getConverter().write(value, dbDoc);
dbDoc.put(ENTITY_FIELD_CLASS, value.getClass().getName());
if (dbId != null) {
dbDoc.put("_id", dbId.get("_id"));
}
mongoTemplate.execute(collName, new CollectionCallback<Object>() {
public Object doInCollection(DBCollection collection)
throws MongoException, DataAccessException {
collection.save(dbDoc);
return null;
}
});
}
}
}
return 0L;
}
private String getCollectionNameForEntity(Class<? extends ChangeSetBacked> entityClass) {
return ClassUtils.getQualifiedName(entityClass);
}
}


@@ -1,154 +0,0 @@
package org.springframework.data.document.persistence;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.document.persistence.test.Address;
import org.springframework.data.document.persistence.test.Person;
import org.springframework.data.document.persistence.test.Resume;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@Autowired
private MongoTemplate mongoTemplate;
private EntityManager entityManager;
@Autowired
private PlatformTransactionManager transactionManager;
@PersistenceContext
public void setEntityManager(EntityManager entityManager) {
this.entityManager = entityManager;
}
private void clearData(String collectionName) {
DBCollection col = this.mongoTemplate.getCollection(collectionName);
if (col != null) {
this.mongoTemplate.dropCollection(collectionName);
}
}
@Test
@Transactional
@Rollback(false)
public void testCreateJpaToMongoEntityRelationship() {
clearData(Person.class.getName());
Person p = new Person("Thomas", 20);
Address a = new Address(12, "MAin St.", "Boston", "MA", "02101");
p.setAddress(a);
Resume r = new Resume();
r.addEducation("Skanstulls High School, 1975");
r.addEducation("Univ. of Stockholm, 1980");
r.addJob("DiMark, DBA, 1990-2000");
r.addJob("VMware, Developer, 2007-");
p.setResume(r);
p.setId(1L);
entityManager.persist(p);
}
@Test
@Transactional
@Rollback(false)
public void testReadJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-", found.getResume().getJobs());
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
found.setAge(44);
}
@Test
@Transactional
@Rollback(false)
public void testUpdatedJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-" + "; "
+ "SpringDeveloper.com, Consultant, 2005-2006", found.getResume().getJobs());
}
@Test
public void testMergeJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
final Person detached = entityManager.find(Person.class, 1L);
detached.getResume().addJob("TargetRx, Developer, 2000-2005");
Person merged = txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
return entityManager.merge(detached);
}
});
Assert.assertTrue(detached.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
Assert.assertTrue(merged.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
final Person updated = entityManager.find(Person.class, 1L);
Assert.assertTrue(updated.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
}
@Test
public void testRemoveJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
Person p2 = new Person("Thomas", 20);
Resume r2 = new Resume();
r2.addEducation("Skanstulls High School, 1975");
r2.addJob("DiMark, DBA, 1990-2000");
p2.setResume(r2);
p2.setId(2L);
entityManager.persist(p2);
Person p3 = new Person("Thomas", 20);
Resume r3 = new Resume();
r3.addEducation("Univ. of Stockholm, 1980");
r3.addJob("VMware, Developer, 2007-");
p3.setResume(r3);
p3.setId(3L);
entityManager.persist(p3);
return null;
}
});
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
final Person found2 = entityManager.find(Person.class, 2L);
entityManager.remove(found2);
return null;
}
});
boolean weFound3 = false;
for (DBObject dbo : this.mongoTemplate.getCollection(Person.class.getName()).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true;
}
}
Assert.assertTrue(weFound3);
}
}


@@ -1,87 +0,0 @@
package org.springframework.data.document.persistence.test;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.springframework.data.persistence.document.RelatedDocument;
@Entity
public class Person {
@Id
Long id;
private String name;
private int age;
private java.util.Date birthDate;
@RelatedDocument
private Address address;
@RelatedDocument
private Resume resume;
public Person() {
}
public Person(String name, int age) {
this.name = name;
this.age = age;
this.birthDate = new java.util.Date();
}
public void birthday() {
++age;
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getAge() {
return age;
}
public void setAge(int age) {
this.age = age;
}
public java.util.Date getBirthDate() {
return birthDate;
}
public void setBirthDate(java.util.Date birthDate) {
this.birthDate = birthDate;
}
public Resume getResume() {
return resume;
}
public void setResume(Resume resume) {
this.resume = resume;
}
public Address getAddress() {
return address;
}
public void setAddress(Address address) {
this.address = address;
}
}


@@ -1,48 +0,0 @@
package org.springframework.data.document.persistence.test;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
@Document
public class Resume {
private static final Log LOGGER = LogFactory.getLog(Resume.class);
@Id
private ObjectId id;
private String education = "";
private String jobs = "";
public String getId() {
return id.toString();
}
public String getEducation() {
return education;
}
public void addEducation(String education) {
LOGGER.debug("Adding education " + education);
this.education = this.education + (this.education.length() > 0 ? "; " : "") + education;
}
public String getJobs() {
return jobs;
}
public void addJob(String job) {
LOGGER.debug("Adding job " + job);
this.jobs = this.jobs + (this.jobs.length() > 0 ? "; " : "") + job;
}
@Override
public String toString() {
return "Resume [education=" + education + ", jobs=" + jobs + "]";
}
}
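The `addEducation`/`addJob` methods above accumulate entries into a single `"; "`-separated string. A standalone sketch of that accumulation pattern (the `JoinSketch` class and its `append` helper are hypothetical, for illustration only):

```java
// Standalone sketch of the "; "-separated accumulation used by Resume;
// not part of the repository.
public class JoinSketch {
    static String append(String existing, String entry) {
        // Insert the separator only once the accumulator is non-empty.
        return existing + (existing.length() > 0 ? "; " : "") + entry;
    }

    public static void main(String[] args) {
        String jobs = append("", "DiMark, DBA, 1990-2000");
        jobs = append(jobs, "VMware, Developer, 2007-");
        System.out.println(jobs); // DiMark, DBA, 1990-2000; VMware, Developer, 2007-
    }
}
```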


@@ -0,0 +1,169 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.crossstore.test.Address;
import org.springframework.data.mongodb.crossstore.test.Person;
import org.springframework.data.mongodb.crossstore.test.Resume;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionCallback;
import org.springframework.transaction.support.TransactionTemplate;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:/META-INF/spring/applicationContext.xml")
public class CrossStoreMongoTests {
@Autowired
private MongoTemplate mongoTemplate;
private EntityManager entityManager;
@Autowired
private PlatformTransactionManager transactionManager;
@PersistenceContext
public void setEntityManager(EntityManager entityManager) {
this.entityManager = entityManager;
}
private void clearData(String collectionName) {
DBCollection col = this.mongoTemplate.getCollection(collectionName);
if (col != null) {
this.mongoTemplate.dropCollection(collectionName);
}
}
@Test
@Transactional
@Rollback(false)
public void testCreateJpaToMongoEntityRelationship() {
clearData(Person.class.getName());
Person p = new Person("Thomas", 20);
Address a = new Address(12, "Main St.", "Boston", "MA", "02101");
p.setAddress(a);
Resume r = new Resume();
r.addEducation("Skanstulls High School, 1975");
r.addEducation("Univ. of Stockholm, 1980");
r.addJob("DiMark, DBA, 1990-2000");
r.addJob("VMware, Developer, 2007-");
p.setResume(r);
p.setId(1L);
entityManager.persist(p);
}
@Test
@Transactional
@Rollback(false)
public void testReadJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-", found.getResume().getJobs());
found.getResume().addJob("SpringDeveloper.com, Consultant, 2005-2006");
found.setAge(44);
}
@Test
@Transactional
@Rollback(false)
public void testUpdatedJpaToMongoEntityRelationship() {
Person found = entityManager.find(Person.class, 1L);
Assert.assertNotNull(found);
Assert.assertEquals(Long.valueOf(1), found.getId());
Assert.assertNotNull(found.getResume());
Assert.assertEquals("DiMark, DBA, 1990-2000" + "; "
+ "VMware, Developer, 2007-" + "; "
+ "SpringDeveloper.com, Consultant, 2005-2006", found.getResume().getJobs());
}
@Test
public void testMergeJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
final Person detached = entityManager.find(Person.class, 1L);
detached.getResume().addJob("TargetRx, Developer, 2000-2005");
Person merged = txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
return entityManager.merge(detached);
}
});
Assert.assertTrue(detached.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
Assert.assertTrue(merged.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
final Person updated = entityManager.find(Person.class, 1L);
Assert.assertTrue(updated.getResume().getJobs().contains("TargetRx, Developer, 2000-2005"));
}
@Test
public void testRemoveJpaEntityWithMongoDocument() {
TransactionTemplate txTemplate = new TransactionTemplate(transactionManager);
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
Person p2 = new Person("Thomas", 20);
Resume r2 = new Resume();
r2.addEducation("Skanstulls High School, 1975");
r2.addJob("DiMark, DBA, 1990-2000");
p2.setResume(r2);
p2.setId(2L);
entityManager.persist(p2);
Person p3 = new Person("Thomas", 20);
Resume r3 = new Resume();
r3.addEducation("Univ. of Stockholm, 1980");
r3.addJob("VMware, Developer, 2007-");
p3.setResume(r3);
p3.setId(3L);
entityManager.persist(p3);
return null;
}
});
txTemplate.execute(new TransactionCallback<Person>() {
public Person doInTransaction(TransactionStatus status) {
final Person found2 = entityManager.find(Person.class, 2L);
entityManager.remove(found2);
return null;
}
});
boolean weFound3 = false;
for (DBObject dbo : this.mongoTemplate.getCollection(Person.class.getName()).find()) {
Assert.assertTrue(!dbo.get("_entity_id").equals(2L));
if (dbo.get("_entity_id").equals(3L)) {
weFound3 = true;
}
}
Assert.assertTrue(weFound3);
}
}


@@ -1,13 +1,28 @@
package org.springframework.data.document.persistence.test;
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore.test;
public class Address {
private Integer streetNumber;
private String streetName;
private String city;
private String state;
private String zip;
public Address(Integer streetNumber, String streetName, String city,
String state, String zip) {
super();
@@ -17,7 +32,7 @@ public class Address {
this.state = state;
this.zip = zip;
}
public Integer getStreetNumber() {
return streetNumber;
}
@@ -48,7 +63,7 @@ public class Address {
public void setZip(String zip) {
this.zip = zip;
}
}


@@ -0,0 +1,102 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore.test;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.springframework.data.mongodb.crossstore.RelatedDocument;
@Entity
public class Person {
@Id
Long id;
private String name;
private int age;
private java.util.Date birthDate;
@RelatedDocument
private Address address;
@RelatedDocument
private Resume resume;
public Person() {
}
public Person(String name, int age) {
this.name = name;
this.age = age;
this.birthDate = new java.util.Date();
}
public void birthday() {
++age;
}
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getAge() {
return age;
}
public void setAge(int age) {
this.age = age;
}
public java.util.Date getBirthDate() {
return birthDate;
}
public void setBirthDate(java.util.Date birthDate) {
this.birthDate = birthDate;
}
public Resume getResume() {
return resume;
}
public void setResume(Resume resume) {
this.resume = resume;
}
public Address getAddress() {
return address;
}
public void setAddress(Address address) {
this.address = address;
}
}


@@ -0,0 +1,63 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.crossstore.test;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
@Document
public class Resume {
private static final Log LOGGER = LogFactory.getLog(Resume.class);
@Id
private ObjectId id;
private String education = "";
private String jobs = "";
public String getId() {
return id.toString();
}
public String getEducation() {
return education;
}
public void addEducation(String education) {
LOGGER.debug("Adding education " + education);
this.education = this.education + (this.education.length() > 0 ? "; " : "") + education;
}
public String getJobs() {
return jobs;
}
public void addJob(String job) {
LOGGER.debug("Adding job " + job);
this.jobs = this.jobs + (this.jobs.length() > 0 ? "; " : "") + job;
}
@Override
public String toString() {
return "Resume [education=" + education + ", jobs=" + jobs + "]";
}
}
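The `Resume` assertions in the tests above depend on how `addJob`/`addEducation` accumulate entries. A minimal, dependency-free sketch of that "; "-joining logic (the `JoinSketch` class name is mine, not from the source):

```java
// Sketch of the separator logic used by Resume.addJob/addEducation:
// prepend "; " only once at least one entry already exists.
public class JoinSketch {
    public static void main(String[] args) {
        String jobs = "";
        for (String job : new String[] { "DiMark, DBA, 1990-2000", "VMware, Developer, 2007-" }) {
            jobs = jobs + (jobs.length() > 0 ? "; " : "") + job;
        }
        // Matches the string asserted in testReadJpaToMongoEntityRelationship.
        System.out.println(jobs);
    }
}
```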


@@ -4,7 +4,7 @@
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="test" transaction-type="RESOURCE_LOCAL">
<provider>org.hibernate.ejb.HibernatePersistence</provider>
<class>org.springframework.data.document.persistence.test.Person</class>
<class>org.springframework.data.mongodb.crossstore.test.Person</class>
<properties>
<property name="hibernate.dialect" value="org.hibernate.dialect.HSQLDialect"/>
<!--value='create' to build a new database on each run; value='update' to modify an existing database; value='create-drop' means the same as 'create' but also drops tables when Hibernate closes; value='validate' makes no changes to the database-->


@@ -13,7 +13,7 @@
<context:spring-configured/>
<context:component-scan base-package="org.springframework.persistence.test">
<context:component-scan base-package="org.springframework.persistence.mongodb.test">
<context:exclude-filter expression="org.springframework.stereotype.Controller" type="annotation"/>
</context:component-scan>
@@ -38,12 +38,12 @@
<bean class="org.springframework.data.mongodb.core.MongoExceptionTranslator"/>
<!-- Mongo aspect config -->
<bean class="org.springframework.data.persistence.document.mongodb.MongoDocumentBacking"
<bean class="org.springframework.data.mongodb.crossstore.MongoDocumentBacking"
factory-method="aspectOf">
<property name="changeSetPersister" ref="mongoChangeSetPersister"/>
</bean>
<bean id="mongoChangeSetPersister"
class="org.springframework.data.persistence.document.mongodb.MongoChangeSetPersister">
class="org.springframework.data.mongodb.crossstore.MongoChangeSetPersister">
<property name="mongoTemplate" ref="mongoTemplate"/>
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>


@@ -2,7 +2,7 @@ log4j.rootCategory=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - <%m>%n
log4j.appender.stdout.layout.ConversionPattern=%d{ABSOLUTE} %5p %40.40c:%4L - %m%n
log4j.category.org.springframework=INFO
log4j.category.org.springframework.data=TRACE


@@ -4,12 +4,11 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.M4</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb-log4j</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB Log4J Appender</name>
<properties>


@@ -3,10 +3,10 @@
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<name>Spring Data Document Parent</name>
<url>http://www.springsource.org/spring-data/data-document</url>
<version>1.0.0.M4</version>
<artifactId>spring-data-mongodb-parent</artifactId>
<name>Spring Data MongoDB Parent</name>
<url>http://www.springsource.org/spring-data/mongodb</url>
<version>1.0.0.M5</version>
<packaging>pom</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
@@ -17,7 +17,7 @@
<org.slf4j.version>1.5.10</org.slf4j.version>
<org.codehaus.jackson.version>1.6.1</org.codehaus.jackson.version>
<org.springframework.version>3.0.6.RELEASE</org.springframework.version>
<data.commons.version>1.2.0.M1</data.commons.version>
<data.commons.version>1.2.0.M2</data.commons.version>
<aspectj.version>1.6.11.RELEASE</aspectj.version>
</properties>
<profiles>
@@ -39,15 +39,15 @@
<distributionManagement>
<site>
<id>spring-site-staging</id>
<url>file:///${java.io.tmpdir}/spring-data/data-document/docs</url>
<url>file:///${java.io.tmpdir}/spring-data/mongodb/docs</url>
</site>
<repository>
<id>spring-milestone-staging</id>
<url>file:///${java.io.tmpdir}/spring-data/data-document/milestone</url>
<url>file:///${java.io.tmpdir}/spring-data/mongodb/milestone</url>
</repository>
<snapshotRepository>
<id>spring-snapshot-staging</id>
<url>file:///${java.io.tmpdir}/spring-data/data-document/snapshot</url>
<url>file:///${java.io.tmpdir}/spring-data/mongodb/snapshot</url>
</snapshotRepository>
</distributionManagement>
</profile>
@@ -63,7 +63,7 @@
<site>
<id>static.springframework.org</id>
<url>
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/data-document/docs/${project.version}
scp://static.springframework.org/var/www/domains/springframework.org/static/htdocs/spring-data/mongodb/docs/${project.version}
</url>
</site>
<repository>


@@ -7,7 +7,7 @@
<architecture>
<element name="Config" type="Layer">
<element name="Assignment" type="TypeFilter">
<element name="**.config.**" type="IncludeTypePattern"/>
<element name="**.config.**" type="WeakTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Monitoring"/>
@@ -17,6 +17,30 @@
<element name="Assignment" type="TypeFilter">
<element name="**.repository.**" type="IncludeTypePattern"/>
</element>
<element name="API" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.repository.*" type="IncludeTypePattern"/>
</element>
</element>
<element name="Query" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.query.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
</element>
<element name="Implementation" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.support.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|API"/>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Query"/>
</element>
<element name="Config" type="Subsystem">
<element name="Assignment" type="TypeFilter">
<element name="**.config.**" type="IncludeTypePattern"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Repositories::Subsystem|Implementation"/>
</element>
<dependency type="AllowedDependency" toName="Project|spring-data-mongodb::Layer|Core"/>
</element>
<element name="Monitoring" type="Layer">


@@ -4,17 +4,16 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-document-parent</artifactId>
<version>1.0.0.M4</version>
<relativePath>../spring-data-document-parent/pom.xml</relativePath>
<artifactId>spring-data-mongodb-parent</artifactId>
<version>1.0.0.M5</version>
<relativePath>../spring-data-mongodb-parent/pom.xml</relativePath>
</parent>
<artifactId>spring-data-mongodb</artifactId>
<packaging>jar</packaging>
<name>Spring Data MongoDB Support</name>
<name>Spring Data MongoDB</name>
<properties>
<mongo.version>2.6.5</mongo.version>
<querydsl.version>2.2.0</querydsl.version>
<querydsl.version>2.2.4</querydsl.version>
</properties>
<dependencies>
@@ -148,7 +147,7 @@
</goals>
<configuration>
<outputDirectory>target/generated-test-sources</outputDirectory>
<processor>org.springframework.data.mongodb.repository.MongoAnnotationProcessor</processor>
<processor>org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor</processor>
</configuration>
</execution>
</executions>


@@ -35,10 +35,10 @@ public class CannotGetMongoDbConnectionException extends DataAccessResourceFailu
super(msg);
}
public CannotGetMongoDbConnectionException(String msg, String database, String username, char[] password2) {
public CannotGetMongoDbConnectionException(String msg, String database, String username, char[] password) {
super(msg);
this.username = username;
this.password = password2;
this.password = password == null ? null : password.clone();
this.database = database;
}
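The change above swaps a direct field assignment for `password.clone()`. The point of the defensive copy is that the caller's array can be mutated (or zeroed out, as is common for passwords) after construction without affecting the stored value. A plain-Java sketch (the `CloneSketch` class name is mine):

```java
import java.util.Arrays;

// Why the constructor clones the incoming char[]: the caller keeps its own
// array, and mutating it afterwards must not leak into the stored copy.
public class CloneSketch {
    public static void main(String[] args) {
        char[] password = { 's', 'e', 'c', 'r', 'e', 't' };
        char[] stored = password == null ? null : password.clone();
        password[0] = 'X'; // caller mutates (or wipes) its array afterwards
        // stored is unchanged; the two arrays no longer compare equal
        System.out.println(new String(stored) + " " + Arrays.equals(password, stored));
    }
}
```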


@@ -26,7 +26,6 @@ import org.springframework.context.annotation.Configuration;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.mapping.context.MappingContextAwareBeanPostProcessor;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;
@@ -100,12 +99,4 @@ public abstract class AbstractMongoConfiguration {
*/
protected void afterMappingMongoConverterCreation(MappingMongoConverter converter) {
}
@Bean
public MappingContextAwareBeanPostProcessor mappingContextAwareBeanPostProcessor() {
MappingContextAwareBeanPostProcessor bpp = new MappingContextAwareBeanPostProcessor();
bpp.setMappingContextBeanName("mongoMappingContext");
return bpp;
}
}


@@ -25,6 +25,4 @@ public abstract class BeanNames {
static final String INDEX_HELPER = "indexCreationHelper";
static final String MONGO = "mongo";
static final String DB_FACTORY = "mongoDbFactory";
static final String POST_PROCESSOR = "mappingContextAwareBeanPostProcessor";
}


@@ -38,7 +38,6 @@ import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.context.annotation.ClassPathScanningCandidateComponentProvider;
import org.springframework.core.type.filter.AnnotationTypeFilter;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.mapping.context.MappingContextAwareBeanPostProcessor;
import org.springframework.data.mongodb.core.convert.CustomConversions;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.index.MongoPersistentEntityIndexCreator;
@@ -70,15 +69,6 @@ public class MappingMongoConverterParser extends AbstractBeanDefinitionParser {
BeanDefinition conversionsDefinition = getCustomConversions(element, parserContext);
String ctxRef = potentiallyCreateMappingContext(element, parserContext, conversionsDefinition);
try {
registry.getBeanDefinition(POST_PROCESSOR);
} catch (NoSuchBeanDefinitionException ignored) {
BeanDefinitionBuilder postProcBuilder = BeanDefinitionBuilder
.genericBeanDefinition(MappingContextAwareBeanPostProcessor.class);
postProcBuilder.addPropertyValue("mappingContextBeanName", ctxRef);
registry.registerBeanDefinition(POST_PROCESSOR, postProcBuilder.getBeanDefinition());
}
// Need a reference to a Mongo instance
String dbFactoryRef = element.getAttribute("db-factory-ref");
if (!StringUtils.hasText(dbFactoryRef)) {


@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
package org.springframework.data.mongodb.config;
import static org.springframework.data.mongodb.config.BeanNames.*;
import static org.springframework.data.mongodb.config.ParsingUtils.*;
import org.springframework.beans.factory.BeanDefinitionStoreException;
import org.springframework.beans.factory.config.BeanDefinition;
@@ -33,6 +34,7 @@ import org.springframework.util.StringUtils;
import org.w3c.dom.Element;
import com.mongodb.Mongo;
import com.mongodb.MongoURI;
/**
* {@link BeanDefinitionParser} to parse {@code db-factory} elements into {@link BeanDefinition}s.
@@ -54,31 +56,37 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
@Override
protected AbstractBeanDefinition parseInternal(Element element, ParserContext parserContext) {
String uri = element.getAttribute("uri");
String mongoRef = element.getAttribute("mongo-ref");
if (!StringUtils.hasText(mongoRef)) {
mongoRef = registerMongoBeanDefinition(element, parserContext);
}
// Database name
String dbname = element.getAttribute("dbname");
if (!StringUtils.hasText(dbname)) {
dbname = "db";
}
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element, parserContext);
// Common setup
BeanDefinitionBuilder dbFactoryBuilder = BeanDefinitionBuilder.genericBeanDefinition(SimpleMongoDbFactory.class);
ParsingUtils.setPropertyValue(element, dbFactoryBuilder, "write-concern", "writeConcern");
if (StringUtils.hasText(uri)) {
if(StringUtils.hasText(mongoRef) || StringUtils.hasText(dbname) || userCredentials != null) {
parserContext.getReaderContext().error("Configure either Mongo URI or details individually!", parserContext.extractSource(element));
}
dbFactoryBuilder.addConstructorArgValue(getMongoUri(uri));
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
// Defaulting
mongoRef = StringUtils.hasText(mongoRef) ? mongoRef : registerMongoBeanDefinition(element, parserContext);
dbname = StringUtils.hasText(dbname) ? dbname : "db";
dbFactoryBuilder.addConstructorArgValue(new RuntimeBeanReference(mongoRef));
dbFactoryBuilder.addConstructorArgValue(dbname);
BeanDefinition userCredentials = getUserCredentialsBeanDefinition(element);
if (userCredentials != null) {
dbFactoryBuilder.addConstructorArgValue(userCredentials);
}
ParsingUtils.setPropertyValue(element, dbFactoryBuilder, "write-concern", "writeConcern");
return dbFactoryBuilder.getBeanDefinition();
return getSourceBeanDefinition(dbFactoryBuilder, parserContext, element);
}
/**
@@ -105,7 +113,7 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
* @param element
* @return the {@link BeanDefinition} or {@literal null} if neither username nor password given.
*/
private BeanDefinition getUserCredentialsBeanDefinition(Element element) {
private BeanDefinition getUserCredentialsBeanDefinition(Element element, ParserContext context) {
String username = element.getAttribute("username");
String password = element.getAttribute("password");
@@ -118,6 +126,20 @@ public class MongoDbFactoryParser extends AbstractBeanDefinitionParser {
userCredentialsBuilder.addConstructorArgValue(StringUtils.hasText(username) ? username : null);
userCredentialsBuilder.addConstructorArgValue(StringUtils.hasText(password) ? password : null);
return userCredentialsBuilder.getBeanDefinition();
return getSourceBeanDefinition(userCredentialsBuilder, context, element);
}
/**
* Creates a {@link BeanDefinition} for a {@link MongoURI}.
*
* @param uri
* @return
*/
private BeanDefinition getMongoUri(String uri) {
BeanDefinitionBuilder builder = BeanDefinitionBuilder.genericBeanDefinition(MongoURI.class);
builder.addConstructorArgValue(uri);
return builder.getBeanDefinition();
}
}
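Per the commit message for DATADOC-295, the `uri` attribute is now exposed on the `db-factory` namespace element, and the parser above rejects mixing it with individually configured details. A hedged configuration sketch (element and attribute names taken from the diff and commit message; the `mongo` namespace prefix is an assumption):

```xml
<!-- Either configure the factory from a single Mongo URI... -->
<mongo:db-factory uri="mongodb://localhost/test"/>

<!-- ...or from individual details; combining uri with mongo-ref, dbname,
     or credentials triggers the parser error shown above. -->
<mongo:db-factory dbname="test" mongo-ref="mongo"/>
```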


@@ -16,13 +16,14 @@
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.xml.NamespaceHandlerSupport;
import org.springframework.data.mongodb.repository.config.MongoRepositoryConfigParser;
/**
* {@link org.springframework.beans.factory.xml.NamespaceHandler} for Mongo DB based repositories.
*
* @author Oliver Gierke
*/
public class MongoRepositoryNamespaceHandler extends NamespaceHandlerSupport {
public class MongoNamespaceHandler extends NamespaceHandlerSupport {
/*
* (non-Javadoc)


@@ -16,6 +16,8 @@
package org.springframework.data.mongodb.config;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedList;
import org.springframework.beans.factory.xml.ParserContext;
@@ -79,6 +81,7 @@ abstract class ParsingUtils {
setPropertyValue(optionsElement, optionsDefBuilder, "socket-timeout", "socketTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "socket-keep-alive", "socketKeepAlive");
setPropertyValue(optionsElement, optionsDefBuilder, "auto-connect-retry", "autoConnectRetry");
setPropertyValue(optionsElement, optionsDefBuilder, "max-auto-connect-retry-time", "maxAutoConnectRetryTime");
setPropertyValue(optionsElement, optionsDefBuilder, "write-number", "writeNumber");
setPropertyValue(optionsElement, optionsDefBuilder, "write-timeout", "writeTimeout");
setPropertyValue(optionsElement, optionsDefBuilder, "write-fsync", "writeFsync");
@@ -109,4 +112,19 @@ abstract class ParsingUtils {
builder.addPropertyValue(attrName, attr);
}
}
/**
* Returns the {@link BeanDefinition} built by the given {@link BeanDefinitionBuilder} enriched with source
* information derived from the given {@link Element}.
*
* @param builder must not be {@literal null}.
* @param context must not be {@literal null}.
* @param element must not be {@literal null}.
* @return
*/
static AbstractBeanDefinition getSourceBeanDefinition(BeanDefinitionBuilder builder, ParserContext context, Element element) {
AbstractBeanDefinition definition = builder.getBeanDefinition();
definition.setSource(context.extractSource(element));
return definition;
}
}


@@ -90,31 +90,23 @@ public class MongoFactoryBean implements FactoryBean<Mongo>, PersistenceExceptio
Mongo mongo;
if (host == null) {
ServerAddress defaultOptions = new ServerAddress();
logger.debug("Property host not specified. Using default configuration");
mongo = new Mongo();
if (mongoOptions == null) {
mongoOptions = new MongoOptions();
}
if (replicaPair != null) {
if (replicaPair.size() < 2) {
throw new CannotGetMongoDbConnectionException("A replica pair must have two server entries");
}
mongo = new Mongo(replicaPair.get(0), replicaPair.get(1), mongoOptions);
} else if (replicaSetSeeds != null) {
mongo = new Mongo(replicaSetSeeds, mongoOptions);
} else {
ServerAddress defaultOptions = new ServerAddress();
if (mongoOptions == null) {
mongoOptions = new MongoOptions();
}
if (replicaPair != null) {
if (replicaPair.size() < 2) {
throw new CannotGetMongoDbConnectionException("A replica pair must have two server entries");
}
mongo = new Mongo(replicaPair.get(0), replicaPair.get(1), mongoOptions);
} else if (replicaSetSeeds != null) {
mongo = new Mongo(replicaSetSeeds, mongoOptions);
} else {
String mongoHost = host != null ? host : defaultOptions.getHost();
mongo = port != null ? new Mongo(new ServerAddress(mongoHost, port), mongoOptions) : new Mongo(mongoHost,
mongoOptions);
}
String mongoHost = host != null ? host : defaultOptions.getHost();
mongo = port != null ? new Mongo(new ServerAddress(mongoHost, port), mongoOptions) : new Mongo(mongoHost,
mongoOptions);
}
if (writeConcern != null) {


@@ -79,22 +79,22 @@ public interface MongoOperations {
CommandResult executeCommand(DBObject command, int options);
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler.
*
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param collectionName name of the collection to retrieve the objects from
* @param dch the handler that will extract results, one document at a time
* @param collectionName name of the collection to retrieve the objects from
* @param dch the handler that will extract results, one document at a time
*/
void executeQuery(Query query, String collectionName, DocumentCallbackHandler dch);
/**
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler using the
* Execute a MongoDB query and iterate over the query results on a per-document basis with a DocumentCallbackHandler using the
* provided CursorPreparer.
* @param query the query class that specifies the criteria used to find a record and also an optional fields
* specification
* @param collectionName name of the collection to retrieve the objects from
* @param dch the handler that will extract results, one document at a time
* @param collectionName name of the collection to retrieve the objects from
* @param dch the handler that will extract results, one document at a time
* @param preparer allows for customization of the DBCursor used when iterating over the result set, (apply limits,
* skips and so on).
*/
@@ -504,6 +504,24 @@ public interface MongoOperations {
*/
<T> T findAndRemove(Query query, Class<T> entityClass, String collectionName);
/**
* Returns the number of documents for the given {@link Query} by querying the collection of the given entity class.
*
* @param query
* @param entityClass must not be {@literal null}.
* @return
*/
long count(Query query, Class<?> entityClass);
/**
* Returns the number of documents for the given {@link Query} querying the given collection.
*
* @param query
* @param collectionName must not be {@literal null} or empty.
* @return
*/
long count(Query query, String collectionName);
/**
* Insert the object into the collection for the entity type of the object to save.
* <p/>
@@ -639,11 +657,19 @@ public interface MongoOperations {
WriteResult updateMulti(Query query, Update update, String collectionName);
/**
* Remove the given object from the collection by Id
* Remove the given object from the collection by id.
*
* @param object
*/
void remove(Object object);
/**
* Removes the given object from the given collection.
*
* @param object
* @param collection must not be {@literal null} or empty.
*/
void remove(Object object, String collection);
/**
* Remove all documents that match the provided query document criteria from the the collection used to store the


@@ -68,6 +68,8 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
*/
private boolean autoConnectRetry = MONGO_OPTIONS.autoConnectRetry;
private long maxAutoConnectRetryTime = MONGO_OPTIONS.maxAutoConnectRetryTime;
/**
* This specifies the number of servers to wait for on the write operation, and exception raising behavior.
*
@@ -183,6 +185,16 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
this.autoConnectRetry = autoConnectRetry;
}
/**
* The maximum amount of time in milliseconds to spend retrying to open a connection to the same server. Default is 0,
* which means to use the default 15s if autoConnectRetry is on.
*
* @param maxAutoConnectRetryTime the maxAutoConnectRetryTime to set
*/
public void setMaxAutoConnectRetryTime(long maxAutoConnectRetryTime) {
this.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
}
/**
* Specifies if the driver is allowed to read from secondaries or slaves. Defaults to false.
*
@@ -200,6 +212,7 @@ public class MongoOptionsFactoryBean implements FactoryBean<MongoOptions>, Initi
MONGO_OPTIONS.socketTimeout = socketTimeout;
MONGO_OPTIONS.socketKeepAlive = socketKeepAlive;
MONGO_OPTIONS.autoConnectRetry = autoConnectRetry;
MONGO_OPTIONS.maxAutoConnectRetryTime = maxAutoConnectRetryTime;
MONGO_OPTIONS.slaveOk = slaveOk;
MONGO_OPTIONS.w = writeNumber;
MONGO_OPTIONS.wtimeout = writeTimeout;


@@ -1,7 +1,9 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
@@ -58,13 +60,13 @@ import org.springframework.dao.DataAccessException;
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.data.authentication.UserCredentials;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mapping.model.BeanWrapper;
import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.convert.MongoConverter;
import org.springframework.data.mongodb.core.convert.MongoReader;
import org.springframework.data.mongodb.core.convert.MongoWriter;
import org.springframework.data.mongodb.core.geo.Distance;
import org.springframework.data.mongodb.core.geo.GeoResult;
@@ -512,6 +514,28 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
entityClass);
}
public long count(Query query, Class<?> entityClass) {
Assert.notNull(entityClass);
return count(query, entityClass, determineCollectionName(entityClass));
}
public long count(final Query query, String collectionName) {
return count(query, null, collectionName);
}
private long count(Query query, Class<?> entityClass, String collectionName) {
Assert.hasText(collectionName);
final DBObject dbObject = query == null ? null : mapper.getMappedObject(query.getQueryObject(),
entityClass == null ? null : mappingContext.getPersistentEntity(entityClass));
return execute(collectionName, new CollectionCallback<Long>() {
public Long doInCollection(DBCollection collection) throws MongoException, DataAccessException {
return collection.count(dbObject);
}
});
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.MongoOperations#insert(java.lang.Object)
@@ -785,7 +809,50 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return;
}
remove(new Query(where(getIdPropertyName(object)).is(getIdValue(object))), object.getClass());
remove(getIdQueryFor(object), object.getClass());
}
public void remove(Object object, String collection) {
Assert.hasText(collection);
if (object == null) {
return;
}
remove(getIdQueryFor(object), collection);
}
/**
* Returns a {@link Query} for the given entity by its id.
*
* @param object must not be {@literal null}.
* @return
*/
private Query getIdQueryFor(Object object) {
Assert.notNull(object);
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(object.getClass());
MongoPersistentProperty idProp = entity.getIdProperty();
if (idProp == null) {
throw new MappingException("No id property found for object of type " + entity.getType().getName());
}
ConversionService service = mongoConverter.getConversionService();
Object idProperty = null;
try {
idProperty = BeanWrapper.create(object, service).getProperty(idProp, Object.class, true);
return new Query(where(idProp.getFieldName()).is(idProperty));
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
}
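The consolidation of `getIdPropertyName(…)` and `getIdValue(…)` into `getIdQueryFor(…)` above boils down to: read the entity's id property reflectively and wrap the value in a by-id query. A minimal standalone sketch of that idea, using plain reflection and a `Map` in place of the mapping metadata and `Query` API (the `Person` class and field names here are illustrative, not from the commit):

```java
import java.lang.reflect.Field;
import java.util.Collections;
import java.util.Map;

public class IdQuerySketch {

	// Stand-in for Query: a {"_id": value} document represented as a Map.
	// The real code resolves the id property through the MappingContext and
	// throws a MappingException when no id property exists; here we simply
	// look for a field named "id".
	static Map<String, Object> getIdQueryFor(Object object) throws Exception {
		if (object == null) {
			throw new IllegalArgumentException("Object must not be null!");
		}
		Field idField = object.getClass().getDeclaredField("id");
		idField.setAccessible(true);
		Object idValue = idField.get(object);
		return Collections.singletonMap("_id", idValue);
	}

	static class Person {
		final String id;
		Person(String id) { this.id = id; }
	}

	public static void main(String[] args) throws Exception {
		System.out.println(getIdQueryFor(new Person("4711")));
	}
}
```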
public <T> void remove(Query query, Class<T> entityClass) {
@@ -832,8 +899,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
entityClass), collectionName);
}
public <T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction, Class<T> entityClass) {
return mapReduce(null, inputCollectionName, mapFunction, reduceFunction, new MapReduceOptions().outputTypeInline(), entityClass);
public <T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction,
Class<T> entityClass) {
return mapReduce(null, inputCollectionName, mapFunction, reduceFunction, new MapReduceOptions().outputTypeInline(),
entityClass);
}
public <T> MapReduceResults<T> mapReduce(String inputCollectionName, String mapFunction, String reduceFunction,
@@ -841,12 +910,14 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
return mapReduce(null, inputCollectionName, mapFunction, reduceFunction, mapReduceOptions, entityClass);
}
public <T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction, Class<T> entityClass) {
return mapReduce(query, inputCollectionName, mapFunction, reduceFunction, new MapReduceOptions().outputTypeInline(), entityClass);
public <T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction,
String reduceFunction, Class<T> entityClass) {
return mapReduce(query, inputCollectionName, mapFunction, reduceFunction,
new MapReduceOptions().outputTypeInline(), entityClass);
}
public <T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction, String reduceFunction,
MapReduceOptions mapReduceOptions, Class<T> entityClass) {
public <T> MapReduceResults<T> mapReduce(Query query, String inputCollectionName, String mapFunction,
String reduceFunction, MapReduceOptions mapReduceOptions, Class<T> entityClass) {
String mapFunc = replaceWithResourceIfNecessary(mapFunction);
String reduceFunc = replaceWithResourceIfNecessary(reduceFunction);
DBCollection inputCollection = getCollection(inputCollectionName);
@@ -866,15 +937,16 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
} else {
commandResult = executeCommand(commandObject);
}
commandResult.throwOnError();
commandResult.throwOnError();
} catch (RuntimeException ex) {
this.potentiallyConvertRuntimeException(ex);
}
String error = commandResult.getErrorMessage();
if (error != null) {
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = " + commandObject);
throw new InvalidDataAccessApiUsageException("Command execution failed: Error [" + error + "], Command = "
+ commandObject);
}
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("MapReduce command result = [" + commandResult + "]");
}
@@ -1002,7 +1074,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
* @return the List of converted objects.
*/
protected <T> T doFindOne(String collectionName, DBObject query, DBObject fields, Class<T> entityClass) {
MongoReader<? super T> readerToUse = this.mongoConverter;
EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
DBObject mappedQuery = mapper.getMappedObject(query, entity);
@@ -1061,7 +1133,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
LOGGER.debug("find using query: " + query + " fields: " + fields + " for class: " + entityClass
+ " in collection: " + collectionName);
}
MongoReader<? super T> readerToUse = this.mongoConverter;
EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
return executeFindMultiInternal(new FindCallback(mapper.getMappedObject(query, entity), fields), null,
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
@@ -1096,7 +1168,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
protected <T> T doFindAndRemove(String collectionName, DBObject query, DBObject fields, DBObject sort,
Class<T> entityClass) {
MongoReader<? super T> readerToUse = this.mongoConverter;
EntityReader<? super T, DBObject> readerToUse = this.mongoConverter;
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("findAndRemove using query: " + query + " fields: " + fields + " sort: " + sort + " for class: "
+ entityClass + " in collection: " + collectionName);
@@ -1106,35 +1178,6 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
new ReadDbObjectCallback<T>(readerToUse, entityClass), collectionName);
}
protected Object getIdValue(Object object) {
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(object.getClass());
MongoPersistentProperty idProp = entity.getIdProperty();
if (idProp == null) {
throw new MappingException("No id property found for object of type " + entity.getType().getName());
}
ConversionService service = mongoConverter.getConversionService();
try {
return BeanWrapper.create(object, service).getProperty(idProp, Object.class, true);
} catch (IllegalAccessException e) {
throw new MappingException(e.getMessage(), e);
} catch (InvocationTargetException e) {
throw new MappingException(e.getMessage(), e);
}
}
protected String getIdPropertyName(Object object) {
Assert.notNull(object);
MongoPersistentEntity<?> persistentEntity = mappingContext.getPersistentEntity(object.getClass());
MongoPersistentProperty idProperty = persistentEntity.getIdProperty();
return idProperty == null ? ID : idProperty.getName();
}
/**
* Populates the id property of the saved object, if it's not set already.
*
@@ -1277,7 +1320,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
if (entityClass == null) {
throw new InvalidDataAccessApiUsageException(
"No class parameter provided, entity collection can't be determined for " + entityClass);
"No class parameter provided, entity collection can't be determined!");
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(entityClass);
@@ -1445,10 +1488,10 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
*/
private class ReadDbObjectCallback<T> implements DbObjectCallback<T> {
private final MongoReader<? super T> reader;
private final EntityReader<? super T, DBObject> reader;
private final Class<T> type;
public ReadDbObjectCallback(MongoReader<? super T> reader, Class<T> type) {
public ReadDbObjectCallback(EntityReader<? super T, DBObject> reader, Class<T> type) {
Assert.notNull(reader);
Assert.notNull(type);
this.reader = reader;
@@ -1457,7 +1500,7 @@ public class MongoTemplate implements MongoOperations, ApplicationContextAware {
public T doWith(DBObject object) {
if (null != object) {
maybeEmitEvent(new AfterLoadEvent<DBObject>(object));
maybeEmitEvent(new AfterLoadEvent<T>(object, type));
}
T source = reader.read(type, object);
if (null != source) {


@@ -60,7 +60,9 @@ public class QueryMapper {
* @return
*/
public DBObject getMappedObject(DBObject query, MongoPersistentEntity<?> entity) {
String idKey = null;
if (null != entity && entity.getIdProperty() != null) {
idKey = entity.getIdProperty().getName();
} else if (query.containsField("id")) {
@@ -87,7 +89,6 @@ public class QueryMapper {
value = getMappedObject((DBObject) value, entity);
}
} else {
value = convertId(value);
}
newKey = "_id";
@@ -100,11 +101,13 @@ public class QueryMapper {
newConditions.add(getMappedObject((DBObject) iter.next(), entity));
}
value = newConditions;
} else {
// TODO: Implement other forms of conversion (like @Alias and whatnot)
} else if (key.equals("$ne")) {
value = convertId(value);
}
newDbo.put(newKey, value);
}
return newDbo;
}
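The `QueryMapper` change above centers on translating the Java-side `id` key into Mongo's `_id` key, now converting the value inside `$ne` conditions as well. The key-rewriting part in isolation, sketched with plain `Map`s instead of `DBObject` (illustrative only — the real mapper also converts values, e.g. to `ObjectId`, and recurses into nested documents):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class IdKeySketch {

	// Rewrites an "id" key to "_id", leaving all other keys untouched.
	static Map<String, Object> mapIdKey(Map<String, Object> query) {
		Map<String, Object> mapped = new LinkedHashMap<String, Object>();
		for (Map.Entry<String, Object> entry : query.entrySet()) {
			String key = "id".equals(entry.getKey()) ? "_id" : entry.getKey();
			mapped.put(key, entry.getValue());
		}
		return mapped;
	}
}
```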


@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core;
import java.net.UnknownHostException;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.beans.factory.DisposableBean;
@@ -25,6 +27,8 @@ import org.springframework.util.Assert;
import com.mongodb.DB;
import com.mongodb.Mongo;
import com.mongodb.MongoException;
import com.mongodb.MongoURI;
import com.mongodb.WriteConcern;
/**
@@ -69,6 +73,19 @@ public class SimpleMongoDbFactory implements DisposableBean, MongoDbFactory {
this.username = userCredentials.getUsername();
this.password = userCredentials.getPassword();
}
/**
* Creates a new {@link SimpleMongoDbFactory} instance from the given {@link MongoURI}.
*
* @param uri must not be {@literal null}.
* @throws MongoException
* @throws UnknownHostException
* @see MongoURI
*/
public SimpleMongoDbFactory(MongoURI uri) throws MongoException, UnknownHostException {
this(new Mongo(uri), uri.getDatabase(), new UserCredentials(uri.getUsername(), String.valueOf(uri.getPassword())));
}
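Since the same commit also exposes a `uri` attribute on the `db-factory` namespace element, the new `SimpleMongoDbFactory(MongoURI)` constructor can be driven from XML config. A hedged sketch (the connection string is a placeholder):

```xml
<!-- Placeholder URI; the uri attribute on mongo:db-factory is what the
     commit adds on top of the SimpleMongoDbFactory(MongoURI) constructor. -->
<mongo:db-factory id="mongoDbFactory"
                  uri="mongodb://joe:secret@localhost:27017/database"/>
```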
/**
* Configures the {@link WriteConcern} to be used on the {@link DB} instance being created.


@@ -1,98 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.HashMap;
import java.util.Map;
import java.util.Map.Entry;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
/**
* {@link TypeMapper} allowing to configure a {@link Map} containing {@link Class} to {@link String} mappings that will
* be used to map the values found under the configured type key (see {@link DefaultTypeMapper#setTypeKey(String)}. This
* allows declarative type mapping in a Spring config file for example.
*
* @author Oliver Gierke
*/
public class ConfigurableTypeMapper extends DefaultTypeMapper {
private final Map<TypeInformation<?>, String> typeMap;
private boolean handleUnmappedClasses = false;
/**
* Creates a new {@link ConfigurableTypeMapper} for the given type map.
*
* @param sourceTypeMap must not be {@literal null}.
*/
public ConfigurableTypeMapper(Map<? extends Class<?>, String> sourceTypeMap) {
Assert.notNull(sourceTypeMap);
this.typeMap = new HashMap<TypeInformation<?>, String>(sourceTypeMap.size());
for (Entry<? extends Class<?>, String> entry : sourceTypeMap.entrySet()) {
TypeInformation<?> key = ClassTypeInformation.from(entry.getKey());
String value = entry.getValue();
if (typeMap.containsValue(value)) {
throw new IllegalArgumentException(String.format(
"Detected mapping ambiguity! String %s cannot be mapped to more than one type!", value));
}
this.typeMap.put(key, value);
}
}
/**
* Configures whether to try to handle unmapped classes by simply writing the class' name or loading the class as
* specified in the superclass. Defaults to {@literal false}.
*
* @param handleUnmappedClasses the handleUnmappedClasses to set
*/
public void setHandleUnmappedClasses(boolean handleUnmappedClasses) {
this.handleUnmappedClasses = handleUnmappedClasses;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DefaultTypeMapper#getTypeInformation(java.lang.String)
*/
@Override
protected TypeInformation<?> getTypeInformation(String value) {
for (Entry<TypeInformation<?>, String> entry : typeMap.entrySet()) {
if (entry.getValue().equals(value)) {
return entry.getKey();
}
}
return handleUnmappedClasses ? super.getTypeInformation(value) : null;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.DefaultTypeMapper#getTypeString(org.springframework.data.util.TypeInformation)
*/
@Override
protected String getTypeString(TypeInformation<?> typeInformation) {
String key = typeMap.get(typeInformation);
return key != null ? key : handleUnmappedClasses ? super.getTypeString(typeInformation) : null;
}
}
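The deleted `ConfigurableTypeMapper`'s core was a bidirectional Class-to-alias lookup with an ambiguity check at construction time: no alias may resolve to more than one type when reading. That invariant in isolation, with a plain `Map` standing in for `TypeInformation` (class and method names here are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class AliasMapSketch {

	private final Map<Class<?>, String> typeMap = new HashMap<Class<?>, String>();

	// Rejects two classes mapping to the same alias, since the alias must
	// resolve back to exactly one type on the read side.
	public AliasMapSketch(Map<Class<?>, String> source) {
		for (Map.Entry<Class<?>, String> entry : source.entrySet()) {
			if (typeMap.containsValue(entry.getValue())) {
				throw new IllegalArgumentException(String.format(
						"Detected mapping ambiguity! String %s cannot be mapped to more than one type!",
						entry.getValue()));
			}
			typeMap.put(entry.getKey(), entry.getValue());
		}
	}

	// Reverse lookup: alias back to the mapped class, or null if unmapped.
	public Class<?> resolve(String alias) {
		for (Map.Entry<Class<?>, String> entry : typeMap.entrySet()) {
			if (entry.getValue().equals(alias)) {
				return entry.getKey();
			}
		}
		return null;
	}
}
```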


@@ -104,8 +104,10 @@ public class CustomConversions {
}
/**
* Returns whether the given type is considered to be simple.
* Returns whether the given type is considered to be simple. That means it's either a general simple type or we have
* a writing {@link Converter} registered for a particular type.
*
* @see SimpleTypeHolder#isSimpleType(Class)
* @param type
* @return
*/
@@ -176,7 +178,6 @@ public class CustomConversions {
if (isMongoBasicType(pair.getSourceType())) {
readingPairs.add(pair);
customSimpleTypes.add(pair.getTargetType());
}
if (isMongoBasicType(pair.getTargetType())) {


@@ -0,0 +1,122 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.springframework.data.convert.SimpleTypeInformationMapper;
import org.springframework.data.convert.DefaultTypeMapper;
import org.springframework.data.convert.TypeAliasAccessor;
import org.springframework.data.convert.TypeInformationMapper;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import com.mongodb.BasicDBList;
import com.mongodb.DBObject;
/**
* Default implementation of {@link MongoTypeMapper} allowing configuration of the key to lookup and store type
* information in {@link DBObject}. The key defaults to {@link #DEFAULT_TYPE_KEY}. Actual type-to-{@link String}
* conversion and back is done in {@link #getTypeString(TypeInformation)} or {@link #getTypeInformation(String)}
* respectively.
*
* @author Oliver Gierke
*/
public class DefaultMongoTypeMapper extends DefaultTypeMapper<DBObject> implements MongoTypeMapper {
public static final String DEFAULT_TYPE_KEY = "_class";
@SuppressWarnings("rawtypes")
private static final TypeInformation<List> LIST_TYPE_INFO = ClassTypeInformation.from(List.class);
@SuppressWarnings("rawtypes")
private static final TypeInformation<Map> MAP_TYPE_INFO = ClassTypeInformation.from(Map.class);
private String typeKey = DEFAULT_TYPE_KEY;
public DefaultMongoTypeMapper() {
this(DEFAULT_TYPE_KEY, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
}
public DefaultMongoTypeMapper(String typeKey) {
super(new DBObjectTypeAliasAccessor(typeKey));
this.typeKey = typeKey;
}
public DefaultMongoTypeMapper(String typeKey, MappingContext<? extends PersistentEntity<?,?>, ?> mappingContext) {
super(new DBObjectTypeAliasAccessor(typeKey), mappingContext, Arrays.asList(SimpleTypeInformationMapper.INSTANCE));
this.typeKey = typeKey;
}
public DefaultMongoTypeMapper(String typeKey, List<? extends TypeInformationMapper> mappers) {
super(new DBObjectTypeAliasAccessor(typeKey), mappers);
this.typeKey = typeKey;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoTypeMapper#isTypeKey(java.lang.String)
*/
public boolean isTypeKey(String key) {
return typeKey == null ? false : typeKey.equals(key);
}
/* (non-Javadoc)
* @see org.springframework.data.convert.DefaultTypeMapper#getFallbackTypeFor(java.lang.Object)
*/
@Override
protected TypeInformation<?> getFallbackTypeFor(DBObject source) {
return source instanceof BasicDBList ? LIST_TYPE_INFO : MAP_TYPE_INFO;
}
/**
*
* @author Oliver Gierke
*/
public static final class DBObjectTypeAliasAccessor implements TypeAliasAccessor<DBObject> {
private final String typeKey;
public DBObjectTypeAliasAccessor(String typeKey) {
this.typeKey = typeKey;
}
/*
* (non-Javadoc)
* @see org.springframework.data.convert.TypeAliasAccessor#readAliasFrom(java.lang.Object)
*/
public Object readAliasFrom(DBObject source) {
if (source instanceof BasicDBList) {
return null;
}
return source.get(typeKey);
}
/*
* (non-Javadoc)
* @see org.springframework.data.convert.TypeAliasAccessor#writeTypeTo(java.lang.Object, java.lang.Object)
*/
public void writeTypeTo(DBObject sink, Object alias) {
if (typeKey != null) {
sink.put(typeKey, alias);
}
}
}
}
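`DBObjectTypeAliasAccessor`'s contract is small: read the alias stored under the configured type key, except for `BasicDBList`s which carry no alias, and write the alias back only when a key is configured. A standalone approximation with a `Map` standing in for `DBObject` (names illustrative):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TypeAliasSketch {

	private final String typeKey; // e.g. "_class"; null disables writing

	public TypeAliasSketch(String typeKey) {
		this.typeKey = typeKey;
	}

	// Lists carry no type alias of their own, mirroring the BasicDBList check.
	public Object readAliasFrom(Object source) {
		if (source instanceof List) {
			return null;
		}
		return ((Map<?, ?>) source).get(typeKey);
	}

	public void writeTypeTo(Map<String, Object> sink, Object alias) {
		if (typeKey != null) {
			sink.put(typeKey, alias);
		}
	}
}
```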


@@ -1,147 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.util.List;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBList;
import com.mongodb.DBObject;
/**
* Default implementation of {@link TypeMapper} allowing configuration of the key to lookup and store type information
* in {@link DBObject}. The key defaults to {@link #DEFAULT_TYPE_KEY}. Actual type-to-{@link String} conversion and back
* is done in {@link #getTypeString(TypeInformation)} or {@link #getTypeInformation(String)} respectively.
*
* @author Oliver Gierke
*/
public class DefaultTypeMapper implements TypeMapper {
public static final String DEFAULT_TYPE_KEY = "_class";
@SuppressWarnings("rawtypes")
private static final TypeInformation<List> LIST_TYPE_INFORMATION = ClassTypeInformation.from(List.class);
private String typeKey = DEFAULT_TYPE_KEY;
/**
* Sets the key to store the type information under. If set to {@literal null} no type information will be stored in
* the document.
*
* @param typeKey the typeKey to set
*/
public void setTypeKey(String typeKey) {
this.typeKey = typeKey;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.TypeMapper#isTypeKey(java.lang.String)
*/
public boolean isTypeKey(String key) {
return typeKey == null ? false : typeKey.equals(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.TypeMapper#readType(com.mongodb.DBObject)
*/
public TypeInformation<?> readType(DBObject dbObject) {
if (dbObject instanceof BasicDBList) {
return LIST_TYPE_INFORMATION;
}
if (typeKey == null) {
return null;
}
Object classToBeUsed = dbObject.get(typeKey);
if (classToBeUsed == null) {
return null;
}
return getTypeInformation(classToBeUsed.toString());
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.TypeMapper#writeType(java.lang.Class, com.mongodb.DBObject)
*/
public void writeType(Class<?> type, DBObject dbObject) {
writeType(ClassTypeInformation.from(type), dbObject);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.TypeMapper#writeType(java.lang.Class, com.mongodb.DBObject)
*/
public void writeType(TypeInformation<?> info, DBObject dbObject) {
Assert.notNull(info);
if (typeKey == null) {
return;
}
String string = getTypeString(info);
if (string != null) {
dbObject.put(typeKey, getTypeString(info));
}
}
/**
* Turn the given type information into the String representation that shall be stored inside the {@link DBObject}. If
* the returned String is {@literal null} no type information will be stored. Default implementation simply returns
* the fully-qualified class name.
*
* @param typeInformation must not be {@literal null}.
* @return the String representation to be stored or {@literal null} if no type information shall be stored.
*/
protected String getTypeString(TypeInformation<?> typeInformation) {
return typeInformation.getType().getName();
}
/**
* Returns the {@link TypeInformation} that shall be used when the given {@link String} value is found as type hint.
* The default implementation will simply interpret the given value as fully-qualified class name and try to load the
* class. Will return {@literal null} in case the given {@link String} is empty. Will not be called in case no
* {@link String} was found for the configured type key at all.
*
* @param value the type to load, must not be {@literal null}.
* @return the type to be used for the given {@link String} representation or {@literal null} if nothing found or the
* class cannot be loaded.
*/
protected TypeInformation<?> getTypeInformation(String value) {
if (!StringUtils.hasText(value)) {
return null;
}
try {
return ClassTypeInformation.from(ClassUtils.forName(value, null));
} catch (ClassNotFoundException e) {
return null;
}
}
}
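The default alias resolution removed with this class is simply: treat the stored String as a fully-qualified class name, try to load it, and return null on empty input or load failure. The same fallback in isolation (a sketch, not the Spring `ClassUtils`-based original):

```java
public class ClassAliasSketch {

	// Interprets the alias as a fully-qualified class name; returns null for
	// empty input or when the class cannot be loaded, mirroring
	// DefaultTypeMapper.getTypeInformation(String).
	static Class<?> loadTypeOrNull(String value) {
		if (value == null || value.trim().length() == 0) {
			return null;
		}
		try {
			return Class.forName(value);
		} catch (ClassNotFoundException e) {
			return null;
		}
	}
}
```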


@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,7 +13,6 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.lang.reflect.Array;
@@ -29,11 +28,6 @@ import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.bson.types.ObjectId;
@@ -44,6 +38,7 @@ import org.springframework.context.expression.BeanFactoryResolver;
import org.springframework.core.CollectionFactory;
import org.springframework.core.convert.ConversionException;
import org.springframework.core.convert.support.ConversionServiceFactory;
import org.springframework.data.convert.TypeMapper;
import org.springframework.data.mapping.Association;
import org.springframework.data.mapping.AssociationHandler;
import org.springframework.data.mapping.PreferredConstructor;
@@ -54,6 +49,7 @@ import org.springframework.data.mapping.model.MappingException;
import org.springframework.data.mapping.model.ParameterValueProvider;
import org.springframework.data.mapping.model.SpELAwareParameterValueProvider;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.QueryMapper;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.ClassTypeInformation;
@@ -65,18 +61,27 @@ import org.springframework.util.Assert;
import org.springframework.util.CollectionUtils;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBObject;
import com.mongodb.DBRef;
/**
* {@link MongoConverter} that uses a {@link MappingContext} to do sophisticated mapping of domain objects to
* {@link DBObject}.
*
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Oliver Gierke
* @author Jon Brisbin
*/
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware,
TypeMapperProvider {
public class MappingMongoConverter extends AbstractMongoConverter implements ApplicationContextAware, TypeKeyAware {
@SuppressWarnings("rawtypes")
private static final TypeInformation<Map> MAP_TYPE_INFORMATION = ClassTypeInformation.from(Map.class);
@SuppressWarnings("rawtypes")
private static final TypeInformation<Collection> COLLECTION_TYPE_INFORMATION = ClassTypeInformation
.from(Collection.class);
private static final List<Class<?>> VALID_ID_TYPES = Arrays.asList(new Class<?>[] { ObjectId.class, String.class,
BigInteger.class, byte[].class });
@@ -85,9 +90,10 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
protected final SpelExpressionParser spelExpressionParser = new SpelExpressionParser();
protected final MongoDbFactory mongoDbFactory;
protected final QueryMapper idMapper;
protected ApplicationContext applicationContext;
protected boolean useFieldAccessOnly = true;
protected TypeMapper typeMapper = new DefaultTypeMapper();
protected MongoTypeMapper typeMapper;
/**
* Creates a new {@link MappingMongoConverter} given the new {@link MongoDbFactory} and {@link MappingContext}.
@@ -105,30 +111,34 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
this.mongoDbFactory = mongoDbFactory;
this.mappingContext = mappingContext;
this.typeMapper = new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY, mappingContext);
this.idMapper = new QueryMapper(conversionService);
}
/**
* Configures the {@link TypeMapper} to be used to add type information to {@link DBObject}s created by the converter
* and how to lookup type information from {@link DBObject}s when reading them. Uses a {@link DefaultTypeMapper} by
* default. Setting this to {@literal null} will reset the {@link TypeMapper} to the default one.
* Configures the {@link MongoTypeMapper} to be used to add type information to {@link DBObject}s created by the
* converter and how to lookup type information from {@link DBObject}s when reading them. Uses a
* {@link DefaultMongoTypeMapper} by default. Setting this to {@literal null} will reset the {@link TypeMapper} to the
* default one.
*
* @param typeMapper the typeMapper to set
*/
public void setTypeMapper(TypeMapper typeMapper) {
this.typeMapper = typeMapper == null ? new DefaultTypeMapper() : typeMapper;
public void setTypeMapper(MongoTypeMapper typeMapper) {
this.typeMapper = typeMapper == null ? new DefaultMongoTypeMapper(DefaultMongoTypeMapper.DEFAULT_TYPE_KEY,
mappingContext) : typeMapper;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.convert.MongoConverter#getTypeMapper()
* @see org.springframework.data.mongodb.core.convert.TypeKeyAware#isTypeKey(java.lang.String)
*/
public TypeMapper getTypeMapper() {
return this.typeMapper;
public boolean isTypeKey(String key) {
return typeMapper.isTypeKey(key);
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.core.convert.MongoConverter#getMappingContext()
* @see org.springframework.data.convert.EntityConverter#getMappingContext()
*/
public MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getMappingContext() {
return mappingContext;
@@ -168,7 +178,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return null;
}
TypeInformation<? extends S> typeToUse = getMoreConcreteTargetType(dbo, type);
TypeInformation<? extends S> typeToUse = typeMapper.readType(dbo, type);
Class<? extends S> rawType = typeToUse.getType();
if (conversions.hasCustomReadTarget(dbo.getClass(), rawType)) {
@@ -200,7 +210,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
spelCtx.setBeanResolver(new BeanFactoryResolver(applicationContext));
}
if (!(dbo instanceof BasicDBList)) {
String[] keySet = dbo.keySet().toArray(new String[] {});
String[] keySet = dbo.keySet().toArray(new String[dbo.keySet().size()]);
for (String key : keySet) {
spelCtx.setVariable(key, dbo.get(key));
}
@@ -297,12 +307,13 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
boolean handledByCustomConverter = conversions.getCustomWriteTarget(obj.getClass(), DBObject.class) != null;
if (!handledByCustomConverter) {
typeMapper.writeType(ClassTypeInformation.from(obj.getClass()), dbo);
TypeInformation<? extends Object> type = ClassTypeInformation.from(obj.getClass());
if (!handledByCustomConverter && !(dbo instanceof BasicDBList)) {
typeMapper.writeType(type, dbo);
}
writeInternal(obj, dbo);
writeInternal(obj, dbo, type);
}
/**
@@ -312,7 +323,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
* @param dbo
*/
@SuppressWarnings("unchecked")
protected void writeInternal(final Object obj, final DBObject dbo) {
protected void writeInternal(final Object obj, final DBObject dbo, final TypeInformation<?> typeHint) {
if (null == obj) {
return;
@@ -327,12 +338,18 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
if (Map.class.isAssignableFrom(obj.getClass())) {
writeMapInternal((Map<Object, Object>) obj, dbo, null);
writeMapInternal((Map<Object, Object>) obj, dbo, MAP_TYPE_INFORMATION);
return;
}
if (Collection.class.isAssignableFrom(obj.getClass())) {
writeCollectionInternal((Collection<?>) obj, COLLECTION_TYPE_INFORMATION, (BasicDBList) dbo);
return;
}
MongoPersistentEntity<?> entity = mappingContext.getPersistentEntity(obj.getClass());
writeInternal(obj, dbo, entity);
addCustomTypeKeyIfNecessary(typeHint, obj, dbo);
}
protected void writeInternal(Object obj, final DBObject dbo, MongoPersistentEntity<?> entity) {
@@ -501,7 +518,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
protected DBObject createCollection(Collection<?> collection, MongoPersistentProperty property) {
if (!property.isDbReference()) {
return createCollectionDBObject(collection, property.getTypeInformation());
return writeCollectionInternal(collection, property.getTypeInformation(), new BasicDBList());
}
BasicDBList dbList = new BasicDBList();
@@ -520,15 +537,15 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
}
/**
* Creates a new {@link BasicDBList} from the given {@link Collection}.
* Populates the given {@link BasicDBList} with values from the given {@link Collection}.
*
* @param source the collection to create a {@link BasicDBList} for, must not be {@literal null}.
* @param type the {@link TypeInformation} to consider or {@literal null} if unknown.
* @param sink the {@link BasicDBList} to write to.
* @return
*/
private BasicDBList createCollectionDBObject(Collection<?> source, TypeInformation<?> type) {
private BasicDBList writeCollectionInternal(Collection<?> source, TypeInformation<?> type, BasicDBList sink) {
BasicDBList dbList = new BasicDBList();
TypeInformation<?> componentType = type == null ? null : type.getComponentType();
for (Object element : source) {
@@ -540,22 +557,29 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Class<?> elementType = element.getClass();
if (conversions.isSimpleType(elementType)) {
dbList.add(getPotentiallyConvertedSimpleWrite(element));
sink.add(getPotentiallyConvertedSimpleWrite(element));
} else if (element instanceof Collection || elementType.isArray()) {
dbList.add(createCollectionDBObject(asCollection(element), componentType));
sink.add(writeCollectionInternal(asCollection(element), componentType, new BasicDBList()));
} else {
BasicDBObject propDbObj = new BasicDBObject();
writeInternal(element, propDbObj,
mappingContext.getPersistentEntity(ClassTypeInformation.from(element.getClass())));
addCustomTypeKeyIfNecessary(componentType, element, propDbObj);
dbList.add(propDbObj);
writeInternal(element, propDbObj, componentType);
sink.add(propDbObj);
}
}
return dbList;
return sink;
}
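The collection-writing logic above writes simple elements directly and recurses into nested collections and arrays. A minimal stand-alone sketch of that recursion, using plain Java lists instead of the Spring Data and MongoDB types (so `BasicDBList` becomes `ArrayList`, and the simple-type conversion hooks are omitted):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;

public class CollectionWriteSketch {

	/** Recursively converts a collection into a nested list, mimicking writeCollectionInternal. */
	static List<Object> writeCollection(Collection<?> source) {
		List<Object> sink = new ArrayList<Object>();
		for (Object element : source) {
			if (element instanceof Collection) {
				// nested collections become nested lists, like nested BasicDBLists
				sink.add(writeCollection((Collection<?>) element));
			} else if (element != null && element.getClass().isArray()) {
				// object arrays only in this sketch; primitive arrays would need extra handling
				sink.add(writeCollection(Arrays.asList((Object[]) element)));
			} else {
				// simple values are written as-is in this sketch
				sink.add(element);
			}
		}
		return sink;
	}

	public static void main(String[] args) {
		List<Object> result = writeCollection(Arrays.asList(1, Arrays.asList(2, 3)));
		if (!result.get(0).equals(1)) throw new AssertionError();
		if (!result.get(1).equals(Arrays.asList(2, 3))) throw new AssertionError();
	}
}
```

The key point of the refactoring is visible here as well: the sink is passed in (or created per recursion step) rather than always allocated inside, which lets the caller hand in the target `BasicDBList`.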
protected void writeMapInternal(Map<Object, Object> obj, DBObject dbo, TypeInformation<?> propertyType) {
/**
* Writes the given {@link Map} to the given {@link DBObject} considering the given {@link TypeInformation}.
*
* @param obj must not be {@literal null}.
* @param dbo must not be {@literal null}.
* @param propertyType must not be {@literal null}.
* @return
*/
protected DBObject writeMapInternal(Map<Object, Object> obj, DBObject dbo, TypeInformation<?> propertyType) {
for (Map.Entry<Object, Object> entry : obj.entrySet()) {
Object key = entry.getKey();
Object val = entry.getValue();
@@ -566,17 +590,19 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (val == null || conversions.isSimpleType(val.getClass())) {
writeSimpleInternal(val, dbo, simpleKey);
} else if (val instanceof Collection) {
dbo.put(simpleKey, createCollectionDBObject((Collection<?>) val, propertyType.getMapValueType()));
dbo.put(simpleKey,
writeCollectionInternal((Collection<?>) val, propertyType.getMapValueType(), new BasicDBList()));
} else {
DBObject newDbo = new BasicDBObject();
writeInternal(val, newDbo);
addCustomTypeKeyIfNecessary(propertyType, val, newDbo);
writeInternal(val, newDbo, propertyType);
dbo.put(simpleKey, newDbo);
}
} else {
throw new MappingException("Cannot use a complex object as a key value.");
}
}
return dbo;
}
/**
@@ -589,11 +615,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
*/
protected void addCustomTypeKeyIfNecessary(TypeInformation<?> type, Object value, DBObject dbObject) {
if (type == null) {
return;
}
Class<?> reference = type.getActualType().getType();
TypeInformation<?> actualType = type != null ? type.getActualType() : type;
Class<?> reference = actualType == null ? Object.class : actualType.getType();
boolean notTheSameClass = !value.getClass().equals(reference);
if (notTheSameClass) {
@@ -645,10 +668,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
@SuppressWarnings({ "rawtypes", "unchecked" })
private Object getPotentiallyConvertedSimpleRead(Object value, Class<?> target) {
Assert.notNull(target);
if (value == null) {
return null;
if (value == null || target == null) {
return value;
}
if (conversions.hasCustomReadTarget(value.getClass(), target)) {
@@ -659,10 +680,11 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
return Enum.valueOf((Class<Enum>) target, value.toString());
}
return value;
return target.isAssignableFrom(value.getClass()) ? value : conversionService.convert(value, target);
}
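The new fallthrough in `getPotentiallyConvertedSimpleRead(…)` only invokes the `ConversionService` when the value is not already assignable to the target. A simplified, dependency-free sketch of that flow — the custom-read-target check is omitted, and `String.valueOf` stands in as a placeholder for the real `conversionService.convert(…)` call:

```java
public class SimpleReadSketch {

	enum Status { ACTIVE, INACTIVE }

	/** Simplified version of getPotentiallyConvertedSimpleRead without the custom-conversion hooks. */
	@SuppressWarnings({ "rawtypes", "unchecked" })
	static Object convertRead(Object value, Class<?> target) {
		if (value == null || target == null) {
			return value; // nothing to convert to, hand the raw value back
		}
		if (Enum.class.isAssignableFrom(target)) {
			// enums are stored as their name and revived via Enum.valueOf
			return Enum.valueOf((Class<Enum>) target, value.toString());
		}
		// only fall back to conversion when the value is not already assignable;
		// String.valueOf is a stand-in for conversionService.convert(value, target)
		return target.isAssignableFrom(value.getClass()) ? value : String.valueOf(value);
	}

	public static void main(String[] args) {
		if (convertRead("ACTIVE", Status.class) != Status.ACTIVE) throw new AssertionError();
		if (!"42".equals(convertRead(42, String.class))) throw new AssertionError();
		if (convertRead(null, String.class) != null) throw new AssertionError();
	}
}
```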
protected DBRef createDBRef(Object target, org.springframework.data.mongodb.core.mapping.DBRef dbref) {
MongoPersistentEntity<?> targetEntity = mappingContext.getPersistentEntity(target.getClass());
if (null == targetEntity || null == targetEntity.getIdProperty()) {
return null;
@@ -671,6 +693,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
MongoPersistentProperty idProperty = targetEntity.getIdProperty();
Object id = null;
BeanWrapper<MongoPersistentEntity<Object>, Object> wrapper = BeanWrapper.create(target, conversionService);
try {
id = wrapper.getProperty(idProperty, Object.class, useFieldAccessOnly);
if (null == id) {
@@ -689,7 +712,8 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
String dbname = dbref.db();
DB db = StringUtils.hasText(dbname) ? mongoDbFactory.getDb(dbname) : mongoDbFactory.getDb();
return new DBRef(db, collection, id);
return new DBRef(db, collection, idMapper.convertId(id));
}
@SuppressWarnings("unchecked")
@@ -729,7 +753,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
(BasicDBList) sourceValue);
}
TypeInformation<?> toType = findTypeToBeUsed((DBObject) sourceValue);
TypeInformation<?> toType = typeMapper.readType((DBObject) sourceValue);
// It's a complex object, have to read it in
if (toType != null) {
@@ -787,7 +811,7 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
Assert.notNull(dbObject);
Class<?> mapType = getMoreConcreteTargetType(dbObject, type).getType();
Class<?> mapType = typeMapper.readType(dbObject, type).getType();
Map<Object, Object> map = CollectionFactory.createMap(mapType, dbObject.keySet().size());
Map<String, Object> sourceMap = dbObject.toMap();
@@ -810,57 +834,14 @@ public class MappingMongoConverter extends AbstractMongoConverter implements App
if (value instanceof DBObject) {
map.put(key, read(valueType, (DBObject) value));
} else {
valueType = valueType == null ? MAP_TYPE_INFORMATION : valueType;
map.put(key, getPotentiallyConvertedSimpleRead(value, valueType.getType()));
Class<?> valueClass = valueType == null ? null : valueType.getType();
map.put(key, getPotentiallyConvertedSimpleRead(value, valueClass));
}
}
return map;
}
/**
* Returns the type to be used to convert the DBObject given to. Will return {@literal null} if there's no type hint
* found in the {@link DBObject} or the type hint found can't be converted into a {@link Class} as the type might not
* be available.
*
* @param dbObject
* @return the type to be used for converting the given {@link DBObject} into or {@literal null} if there's no type
* found.
*/
protected TypeInformation<?> findTypeToBeUsed(DBObject dbObject) {
return typeMapper.readType(dbObject);
}
private Class<?> getDefaultedTypeToBeUsed(DBObject dbObject) {
TypeInformation<?> result = findTypeToBeUsed(dbObject);
if (result != null) {
return result.getType();
}
return dbObject instanceof BasicDBList ? List.class : Map.class;
}
/**
* Inspects a custom class definition stored inside the given {@link DBObject} and returns that in case it's a
* subtype of the given basic one.
*
* @param dbObject
* @param basicType
* @return
*/
@SuppressWarnings("unchecked")
private <S> TypeInformation<? extends S> getMoreConcreteTargetType(DBObject dbObject, TypeInformation<S> basicType) {
Class<?> documentsTargetType = getDefaultedTypeToBeUsed(dbObject);
Class<S> rawType = basicType == null ? null : basicType.getType();
boolean isMoreConcreteCustomType = rawType == null ? true : rawType.isAssignableFrom(documentsTargetType)
&& !rawType.equals(documentsTargetType);
return isMoreConcreteCustomType ? (TypeInformation<? extends S>) ClassTypeInformation.from(documentsTargetType)
: basicType;
}
protected <T> List<?> unwrapList(BasicDBList dbList, TypeInformation<T> targetType) {
List<Object> rootList = new ArrayList<Object>();
for (int i = 0; i < dbList.size(); i++) {


@@ -1,93 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import static org.springframework.beans.PropertyAccessorFactory.forBeanPropertyAccess;
import static org.springframework.beans.PropertyAccessorFactory.forDirectFieldAccess;
import org.springframework.beans.BeanWrapper;
import org.springframework.beans.ConfigurablePropertyAccessor;
import org.springframework.beans.NotWritablePropertyException;
import org.springframework.core.convert.ConversionService;
import org.springframework.data.mongodb.core.convert.MongoPropertyDescriptors.MongoPropertyDescriptor;
import org.springframework.util.Assert;
/**
* Custom Mongo specific {@link BeanWrapper} to allow access to bean properties via {@link MongoPropertyDescriptor}s.
*
* @author Oliver Gierke
*/
class MongoBeanWrapper {
private final ConfigurablePropertyAccessor accessor;
private final MongoPropertyDescriptors descriptors;
private final boolean fieldAccess;
/**
* Creates a new {@link MongoBeanWrapper} for the given target object and {@link ConversionService}.
*
* @param target
* @param conversionService
* @param fieldAccess
*/
public MongoBeanWrapper(Object target, ConversionService conversionService, boolean fieldAccess) {
Assert.notNull(target);
Assert.notNull(conversionService);
this.fieldAccess = fieldAccess;
this.accessor = fieldAccess ? forDirectFieldAccess(target) : forBeanPropertyAccess(target);
this.accessor.setConversionService(conversionService);
this.descriptors = new MongoPropertyDescriptors(target.getClass());
}
/**
* Returns all {@link MongoPropertyDescriptors.MongoPropertyDescriptor}s for the underlying target object.
*
* @return
*/
public MongoPropertyDescriptors getDescriptors() {
return this.descriptors;
}
/**
* Returns the value of the underlying object for the given property.
*
* @param descriptor
* @return
*/
public Object getValue(MongoPropertyDescriptors.MongoPropertyDescriptor descriptor) {
Assert.notNull(descriptor);
return accessor.getPropertyValue(descriptor.getName());
}
/**
* Sets the property of the underlying object to the given value.
*
* @param descriptor
* @param value
*/
public void setValue(MongoPropertyDescriptors.MongoPropertyDescriptor descriptor, Object value) {
Assert.notNull(descriptor);
try {
accessor.setPropertyValue(descriptor.getName(), value);
} catch (NotWritablePropertyException e) {
if (!fieldAccess) {
throw e;
}
}
}
}


@@ -15,29 +15,20 @@
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.core.convert.ConversionService;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.convert.EntityConverter;
import org.springframework.data.convert.EntityReader;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import com.mongodb.DBObject;
/**
* Central Mongo specific converter interface which combines {@link MongoWriter} and {@link MongoReader}.
*
* @author Oliver Gierke
*/
public interface MongoConverter extends MongoWriter<Object>, MongoReader<Object> {
public interface MongoConverter extends
EntityConverter<MongoPersistentEntity<?>, MongoPersistentProperty, Object, DBObject>, MongoWriter<Object>,
EntityReader<Object, DBObject> {
/**
* Returns the underlying {@link MappingContext} used by the converter.
*
* @return never {@literal null}
*/
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> getMappingContext();
/**
* Returns the underlying {@link ConversionService} used by the converter.
*
* @return never {@literal null}.
*/
ConversionService getConversionService();
}


@@ -1,244 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Method;
import java.lang.reflect.Type;
import java.math.BigInteger;
import java.util.*;
import org.bson.types.ObjectId;
import org.springframework.beans.BeanUtils;
import org.springframework.util.Assert;
import org.springframework.util.ReflectionUtils;
/**
* An iterable of {@link MongoPropertyDescriptor}s that allows dedicated access to the {@link MongoPropertyDescriptor}
* that captures the id-property.
*
* @author Oliver Gierke
*/
public class MongoPropertyDescriptors implements Iterable<MongoPropertyDescriptors.MongoPropertyDescriptor> {
private final Collection<MongoPropertyDescriptors.MongoPropertyDescriptor> descriptors;
private final MongoPropertyDescriptors.MongoPropertyDescriptor idDescriptor;
/**
* Creates the {@link MongoPropertyDescriptors} for the given type.
*
* @param type
*/
public MongoPropertyDescriptors(Class<?> type) {
Assert.notNull(type);
Set<MongoPropertyDescriptors.MongoPropertyDescriptor> descriptors = new HashSet<MongoPropertyDescriptors.MongoPropertyDescriptor>();
MongoPropertyDescriptors.MongoPropertyDescriptor idDescriptor = null;
for (PropertyDescriptor candidates : BeanUtils.getPropertyDescriptors(type)) {
MongoPropertyDescriptor descriptor = new MongoPropertyDescriptors.MongoPropertyDescriptor(candidates, type);
descriptors.add(descriptor);
if (descriptor.isIdProperty()) {
idDescriptor = descriptor;
}
}
this.descriptors = Collections.unmodifiableSet(descriptors);
this.idDescriptor = idDescriptor;
}
/**
* Returns the {@link MongoPropertyDescriptor} for the id property.
*
* @return the idDescriptor
*/
public MongoPropertyDescriptors.MongoPropertyDescriptor getIdDescriptor() {
return idDescriptor;
}
/*
* (non-Javadoc)
*
* @see java.lang.Iterable#iterator()
*/
public Iterator<MongoPropertyDescriptors.MongoPropertyDescriptor> iterator() {
return descriptors.iterator();
}
/**
* Simple value object to have a more suitable abstraction for MongoDB specific property handling.
*
* @author Oliver Gierke
*/
public static class MongoPropertyDescriptor {
public static Collection<Class<?>> SUPPORTED_ID_CLASSES;
static {
Set<Class<?>> classes = new HashSet<Class<?>>();
classes.add(ObjectId.class);
classes.add(String.class);
classes.add(BigInteger.class);
SUPPORTED_ID_CLASSES = Collections.unmodifiableCollection(classes);
}
private static final String ID_PROPERTY = "id";
static final String ID_KEY = "_id";
private final PropertyDescriptor delegate;
private final Class<?> owningType;
/**
* Creates a new {@link MongoPropertyDescriptor} for the given {@link PropertyDescriptor}.
*
* @param descriptor
* @param owningType
*/
public MongoPropertyDescriptor(PropertyDescriptor descriptor, Class<?> owningType) {
Assert.notNull(descriptor);
this.delegate = descriptor;
this.owningType = owningType;
}
/**
* Returns whether the property is the id-property. Will be identified by name for now ({@value #ID_PROPERTY}).
*
* @return
*/
public boolean isIdProperty() {
return ID_PROPERTY.equals(delegate.getName()) || ID_KEY.equals(delegate.getName());
}
/**
* Returns whether the property is of one of the supported id types. Currently we support {@link String},
* {@link ObjectId} and {@link BigInteger}.
*
* @return
*/
public boolean isOfIdType() {
return SUPPORTED_ID_CLASSES.contains(delegate.getPropertyType());
}
/**
* Returns the key that shall be used for mapping. Will return {@value #ID_KEY} for the id property and the plain
* name for all other ones.
*
* @return
*/
public String getKeyToMap() {
return isIdProperty() ? ID_KEY : delegate.getName();
}
/**
* Returns the name of the property.
*
* @return
*/
public String getName() {
return delegate.getName();
}
/**
* Returns whether the underlying property is actually mappable. By default this will exclude the {@literal class}
* property and only include properties with a getter.
*
* @return
*/
public boolean isMappable() {
boolean isNotClassAttribute = !delegate.getName().equals("class");
boolean hasGetter = delegate.getReadMethod() != null;
boolean hasField = ReflectionUtils.findField(owningType, delegate.getName()) != null;
return isNotClassAttribute && hasGetter && hasField;
}
/**
* Returns the plain property type.
*
* @return
*/
public Class<?> getPropertyType() {
return delegate.getPropertyType();
}
/**
* Returns the type to be set. Will return the setter method's type and fall back to the getter method's return
* type in case no setter is available. Useful for further (generics) inspection.
*
* @return
*/
public Type getTypeToSet() {
Method method = delegate.getWriteMethod();
return method == null ? delegate.getReadMethod().getGenericReturnType() : method.getGenericParameterTypes()[0];
}
/**
* Returns whether we describe a {@link Map}.
*
* @return
*/
public boolean isMap() {
return Map.class.isAssignableFrom(getPropertyType());
}
/**
* Returns whether the descriptor is for a collection.
*
* @return
*/
public boolean isCollection() {
return Collection.class.isAssignableFrom(getPropertyType());
}
/**
* Returns whether the descriptor is for an {@link Enum}.
*
* @return
*/
public boolean isEnum() {
return Enum.class.isAssignableFrom(getPropertyType());
}
/*
* (non-Javadoc)
*
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
MongoPropertyDescriptor that = (MongoPropertyDescriptor) obj;
return that.delegate.equals(this.delegate);
}
/*
* (non-Javadoc)
*
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return delegate.hashCode();
}
}
}
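The removed `MongoPropertyDescriptors` class identified the id property purely by name and mapped it to Mongo's reserved `_id` key. A minimal sketch of that convention (with `ObjectId` left out of the supported id types to stay dependency-free):

```java
import java.math.BigInteger;
import java.util.Arrays;
import java.util.Collection;
import java.util.HashSet;

public class IdPropertySketch {

	// the id types MongoPropertyDescriptor supported; ObjectId is omitted here to avoid the driver dependency
	static final Collection<Class<?>> SUPPORTED_ID_CLASSES =
			new HashSet<Class<?>>(Arrays.asList(String.class, BigInteger.class));

	/** Mirrors MongoPropertyDescriptor.isIdProperty(): identified purely by name. */
	static boolean isIdProperty(String name) {
		return "id".equals(name) || "_id".equals(name);
	}

	/** Mirrors getKeyToMap(): the id property maps to Mongo's reserved "_id" key. */
	static String keyToMap(String name) {
		return isIdProperty(name) ? "_id" : name;
	}

	public static void main(String[] args) {
		if (!"_id".equals(keyToMap("id"))) throw new AssertionError();
		if (!"firstname".equals(keyToMap("firstname"))) throw new AssertionError();
		if (!SUPPORTED_ID_CLASSES.contains(BigInteger.class)) throw new AssertionError();
	}
}
```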


@@ -1,40 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import com.mongodb.DBObject;
/**
* A MongoReader is responsible for converting a native MongoDB DBObject to an object of type T.
*
* @param <T> the type of the object to convert from a DBObject
* @author Mark Pollack
* @author Thomas Risberg
* @author Oliver Gierke
*/
public interface MongoReader<T> {
/**
* Reads the native MongoDB DBObject representation into an instance of the class T. The given type has to be the
* starting point for marshalling the {@link DBObject} into it. So in case there's no real valid data inside
* {@link DBObject} for the given type, just return an empty instance of the given type.
*
* @param clazz the type of the return value. Will never be {@literal null}.
* @param dbo the {@link DBObject} to convert into a domain object. Might be {@literal null}.
* @return the converted object. Might be {@literal null}.
*/
<S extends T> S read(Class<S> clazz, DBObject dbo);
}


@@ -1,5 +1,5 @@
/*
* Copyright 2010-2011 the original author or authors.
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -13,16 +13,18 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
package org.springframework.data.mongodb.core.convert;
public class NorCriteria extends OrCriteria {
import org.springframework.data.convert.TypeMapper;
public NorCriteria(Query[] queries) {
super(queries);
}
import com.mongodb.DBObject;
/**
* Combining interface to express that Mongo specific {@link TypeMapper} implementations will be {@link TypeKeyAware} as
* well.
*
* @author Oliver Gierke
*/
public interface MongoTypeMapper extends TypeMapper<DBObject>, TypeKeyAware {
@Override
protected String getOperator() {
return "$nor";
}
}


@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.convert.EntityWriter;
import com.mongodb.DBObject;
/**
@@ -25,15 +27,7 @@ import com.mongodb.DBObject;
* @author Thomas Risberg
* @author Oliver Gierke
*/
public interface MongoWriter<T> {
/**
* Write the given object of type T to the native MongoDB object representation DBObject.
*
* @param t The object to convert to a DBObject
* @param dbo The DBObject to use for writing.
*/
void write(T t, DBObject dbo);
public interface MongoWriter<T> extends EntityWriter<T, DBObject> {
/**
* Converts the given object into one Mongo will be able to store natively. If the given object can already be stored


@@ -15,17 +15,19 @@
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.convert.TypeMapper;
/**
* Interface for components that are able to provide a {@link TypeMapper}.
*
* @author Oliver Gierke
*/
public interface TypeMapperProvider {
public interface TypeKeyAware {
/**
* Returns the {@link TypeMapper}.
*
* @return the {@link TypeMapper} or {@literal null} if none available.
*/
TypeMapper getTypeMapper();
boolean isTypeKey(String key);
}


@@ -1,60 +0,0 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.convert;
import org.springframework.data.util.TypeInformation;
import com.mongodb.DBObject;
/**
* Interface to define strategies how to store type information in a {@link DBObject}.
*
* @author Oliver Gierke
*/
public interface TypeMapper {
/**
* Returns whether the given key is the key being used as type key.
*
* @param key
* @return
*/
boolean isTypeKey(String key);
/**
* Reads the {@link TypeInformation} from the given {@link DBObject}.
*
* @param dbObject must not be {@literal null}.
* @return
*/
TypeInformation<?> readType(DBObject dbObject);
/**
* Writes type information for the given type into the given {@link DBObject}.
*
* @param type must not be {@literal null}.
* @param dbObject must not be {@literal null}.
*/
void writeType(Class<?> type, DBObject dbObject);
/**
* Writes type information for the given {@link TypeInformation} into the given {@link DBObject}.
*
* @param type must not be {@literal null}.
* @param dbObject must not be {@literal null}.
*/
void writeType(TypeInformation<?> type, DBObject dbObject);
}
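The removed `TypeMapper` interface (superseded by the generic `org.springframework.data.convert.TypeMapper`) stores type information inside the document and reads it back on the way in. A dependency-free sketch of the round trip, assuming a hypothetical `_class`-style key and plain `Map`s in place of `DBObject`s:

```java
import java.util.HashMap;
import java.util.Map;

public class TypeMapperSketch {

	// hypothetical type key; the key used by the actual implementation is an internal detail
	static final String TYPE_KEY = "_class";

	/** Writes the type's fully-qualified name into the document, as writeType would into a DBObject. */
	static void writeType(Class<?> type, Map<String, Object> document) {
		document.put(TYPE_KEY, type.getName());
	}

	/** Reads the type back; returns null when no hint is present or the class is unavailable. */
	static Class<?> readType(Map<String, Object> document) {
		Object hint = document.get(TYPE_KEY);
		if (hint == null) {
			return null;
		}
		try {
			return Class.forName(hint.toString());
		} catch (ClassNotFoundException e) {
			return null; // the hinted type might not be on the classpath
		}
	}

	public static void main(String[] args) {
		Map<String, Object> doc = new HashMap<String, Object>();
		writeType(String.class, doc);
		if (readType(doc) != String.class) throw new AssertionError();
		if (readType(new HashMap<String, Object>()) != null) throw new AssertionError();
	}
}
```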


@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.List;
import org.springframework.data.mongodb.core.mapping.Field;
import org.springframework.util.Assert;
@@ -24,7 +27,7 @@ import org.springframework.util.Assert;
* @author Mark Pollack
* @author Oliver Gierke
*/
public class Box {
public class Box implements Shape {
@Field(order = 10)
private final Point first;
@@ -53,6 +56,25 @@ public class Box {
return second;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<? extends Object> asList() {
List<List<Double>> list = new ArrayList<List<Double>>();
list.add(getLowerLeft().asList());
list.add(getUpperRight().asList());
return list;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return "$box";
}
@Override
public String toString() {
return String.format("Box [%s, %s]", first, second);


@@ -15,6 +15,9 @@
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.List;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.util.Assert;
@@ -24,33 +27,112 @@ import org.springframework.util.Assert;
* @author Mark Pollack
* @author Oliver Gierke
*/
public class Circle {
public class Circle implements Shape {
private Point center;
private double radius;
private final Point center;
private final double radius;
/**
* Creates a new {@link Circle} from the given {@link Point} and radius.
*
* @param center must not be {@literal null}.
* @param radius must be greater or equal to zero.
*/
@PersistenceConstructor
public Circle(Point center, double radius) {
Assert.notNull(center);
Assert.isTrue(radius >= 0, "Radius must not be negative!");
this.center = center;
this.radius = radius;
}
/**
* Creates a new {@link Circle} from the given coordinates and radius.
*
* @param centerX
* @param centerY
* @param radius must be greater or equal to zero.
*/
public Circle(double centerX, double centerY, double radius) {
this(new Point(centerX, centerY), radius);
}
/**
* Returns the center of the {@link Circle}.
*
* @return will never be {@literal null}.
*/
public Point getCenter() {
return center;
}
/**
* Returns the radius of the {@link Circle}.
*
* @return
*/
public double getRadius() {
return radius;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<Object> asList() {
List<Object> result = new ArrayList<Object>();
result.add(getCenter().asList());
result.add(getRadius());
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return "$center";
}
/*
* (non-Javadoc)
* @see java.lang.Object#toString()
*/
@Override
public String toString() {
return String.format("Circle [center=%s, radius=%d]", center, radius);
return String.format("Circle [center=%s, radius=%f]", center, radius);
}
/* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Circle that = (Circle) obj;
return this.center.equals(that.center) && this.radius == that.radius;
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
int result = 17;
result += 31 * center.hashCode();
result += 31 * radius;
return result;
}
}
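`Circle.asList()` above produces the MongoDB list form `[[x, y], radius]`, relying on `Point.asList()` returning `[x, y]` (the `Point` class is not shown in this diff, so that is an assumption here). A stand-alone sketch of the same construction:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class CircleSketch {

	/** Builds the MongoDB representation of a circle, as Circle.asList() does: [[x, y], radius]. */
	static List<Object> circleAsList(double centerX, double centerY, double radius) {
		if (radius < 0) {
			throw new IllegalArgumentException("Radius must not be negative!");
		}
		List<Object> result = new ArrayList<Object>();
		result.add(Arrays.asList(centerX, centerY)); // assumes Point.asList() yields [x, y]
		result.add(radius);
		return result;
	}

	public static void main(String[] args) {
		List<Object> circle = circleAsList(1.0, 2.0, 3.0);
		if (!circle.get(0).equals(Arrays.asList(1.0, 2.0))) throw new AssertionError();
		if (!circle.get(1).equals(3.0)) throw new AssertionError();
	}
}
```

Note the `toString()` fix in the diff above for the same reason: `radius` is a `double`, so the format specifier must be `%f`, not `%d`.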


@@ -0,0 +1,113 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import org.springframework.util.Assert;
/**
* Simple value object to represent a {@link Polygon}.
*
* @author Oliver Gierke
*/
public class Polygon implements Shape, Iterable<Point> {
private final List<Point> points;
/**
* Creates a new {@link Polygon} for the given Points.
*
* @param x
* @param y
* @param z
* @param others
*/
public Polygon(Point x, Point y, Point z, Point... others) {
Assert.notNull(x);
Assert.notNull(y);
Assert.notNull(z);
Assert.notNull(others);
this.points = new ArrayList<Point>(3 + others.length);
this.points.addAll(Arrays.asList(x, y, z));
this.points.addAll(Arrays.asList(others));
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#asList()
*/
public List<List<Double>> asList() {
List<List<Double>> result = new ArrayList<List<Double>>();
for (Point point : points) {
result.add(point.asList());
}
return result;
}
/*
* (non-Javadoc)
* @see org.springframework.data.mongodb.core.geo.Shape#getCommand()
*/
public String getCommand() {
return "$polygon";
}
/*
* (non-Javadoc)
* @see java.lang.Iterable#iterator()
*/
public Iterator<Point> iterator() {
return this.points.iterator();
}
/*
* (non-Javadoc)
* @see java.lang.Object#equals(java.lang.Object)
*/
@Override
public boolean equals(Object obj) {
if (this == obj) {
return true;
}
if (obj == null || !getClass().equals(obj.getClass())) {
return false;
}
Polygon that = (Polygon) obj;
return this.points.equals(that.points);
}
/*
* (non-Javadoc)
* @see java.lang.Object#hashCode()
*/
@Override
public int hashCode() {
return points.hashCode();
}
}


@@ -1,35 +1,41 @@
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.document.couchdb;
import org.springframework.dao.DataRetrievalFailureException;
public class DocumentRetrievalFailureException extends
DataRetrievalFailureException {
private String documentPath;
public DocumentRetrievalFailureException(String documentPath) {
super("Could not find document at path = " + documentPath);
this.documentPath = documentPath;
}
public String getDocumentPath() {
return documentPath;
}
}
/*
* Copyright 2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.geo;
import java.util.List;
/**
* Common interface for all shapes. Allows building MongoDB representations of them.
*
* @author Oliver Gierke
*/
public interface Shape {
/**
* Returns the {@link Shape} as a list of usually {@link Double} or {@link List}s of {@link Double}s. Wildcard bound
* to allow implementations to return a more concrete element type.
*
* @return
*/
List<? extends Object> asList();
/**
* Returns the command to be used to create the {@literal $within} criterion.
*
* @return
*/
String getCommand();
}
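The `Shape` interface above asks implementations for a coordinate list and a `$within` command keyword. A rough, self-contained sketch of how a `Box`-like implementation could satisfy that contract (the nested `Shape`/`Box` classes here are illustrative stand-ins, not the actual Spring Data source):

```java
import java.util.Arrays;
import java.util.List;

// Illustrative sketch only: a Box-like Shape producing the $box command
// and the nested point-list representation described in the javadoc above.
public class BoxSketch {

    interface Shape {
        List<? extends Object> asList();
        String getCommand();
    }

    static class Box implements Shape {
        private final List<Double> lowerLeft;
        private final List<Double> upperRight;

        Box(double x1, double y1, double x2, double y2) {
            this.lowerLeft = Arrays.asList(x1, y1);
            this.upperRight = Arrays.asList(x2, y2);
        }

        // The shape as a list of lists of doubles, ready to embed in a document.
        public List<List<Double>> asList() {
            return Arrays.asList(lowerLeft, upperRight);
        }

        // Keyword used inside the $within criterion.
        public String getCommand() {
            return "$box";
        }
    }

    public static void main(String[] args) {
        Box box = new Box(0, 0, 10, 10);
        System.out.println(box.getCommand() + " -> " + box.asList());
    }
}
```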

View File

@@ -17,14 +17,9 @@
package org.springframework.data.mongodb.core.index;
import java.lang.reflect.Field;
import java.util.Collections;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.context.ApplicationListener;
@@ -38,6 +33,10 @@ import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;
/**
* Component that inspects {@link BasicMongoPersistentEntity} instances contained in the given
* {@link MongoMappingContext} for indexing metadata and ensures the indexes to be available.

View File

@@ -18,10 +18,19 @@ package org.springframework.data.mongodb.core.mapping;
import java.util.Comparator;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.context.expression.BeanFactoryAccessor;
import org.springframework.context.expression.BeanFactoryResolver;
import org.springframework.data.mapping.PersistentEntity;
import org.springframework.data.mapping.model.BasicPersistentEntity;
import org.springframework.data.mongodb.MongoCollectionUtils;
import org.springframework.data.util.TypeInformation;
import org.springframework.expression.Expression;
import org.springframework.expression.ParserContext;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.expression.spel.support.StandardEvaluationContext;
import org.springframework.util.StringUtils;
/**
@@ -32,9 +41,11 @@ import org.springframework.util.StringUtils;
* @author Oliver Gierke
*/
public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, MongoPersistentProperty> implements
MongoPersistentEntity<T> {
MongoPersistentEntity<T>, ApplicationContextAware {
private final String collection;
private final SpelExpressionParser parser;
private final StandardEvaluationContext context;
/**
* Creates a new {@link BasicMongoPersistentEntity} with the given {@link TypeInformation}. Will default the
@@ -46,6 +57,9 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
super(typeInformation, MongoPersistentPropertyComparator.INSTANCE);
this.parser = new SpelExpressionParser();
this.context = new StandardEvaluationContext();
Class<?> rawType = typeInformation.getType();
String fallback = MongoCollectionUtils.getPreferredCollectionName(rawType);
@@ -57,13 +71,25 @@ public class BasicMongoPersistentEntity<T> extends BasicPersistentEntity<T, Mong
}
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationContextAware#setApplicationContext(org.springframework.context.ApplicationContext)
*/
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
context.addPropertyAccessor(new BeanFactoryAccessor());
context.setBeanResolver(new BeanFactoryResolver(applicationContext));
context.setRootObject(applicationContext);
}
/**
* Returns the collection the entity should be stored in.
*
* @return
*/
public String getCollection() {
return collection;
Expression expression = parser.parseExpression(collection, ParserContext.TEMPLATE_EXPRESSION);
return expression.getValue(context, String.class);
}
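The hunk above makes `getCollection()` run the configured collection name through a SpEL template expression instead of returning it verbatim. A plain-Java sketch of that resolution step, assuming a simple map-backed context in place of the real `StandardEvaluationContext` (the regex-based resolver is an illustration only; the actual code uses `SpelExpressionParser` with `ParserContext.TEMPLATE_EXPRESSION`):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative only: mimics resolving #{...} placeholders in a collection
// name, as BasicMongoPersistentEntity.getCollection() now does via SpEL.
public class CollectionNameSketch {

    private static final Pattern TEMPLATE = Pattern.compile("#\\{([^}]+)\\}");

    // Replaces every #{key} placeholder with a value from the context map;
    // plain names without placeholders pass through unchanged.
    static String resolve(String template, Map<String, String> context) {
        Matcher m = TEMPLATE.matcher(template);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String value = context.get(m.group(1));
            m.appendReplacement(sb,
                Matcher.quoteReplacement(value != null ? value : m.group(0)));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(resolve("person", Map.of()));
        System.out.println(resolve("#{tenant}_person", Map.of("tenant", "acme")));
    }
}
```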
/**

View File

@@ -19,6 +19,9 @@ package org.springframework.data.mongodb.core.mapping;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Field;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.data.mapping.context.AbstractMappingContext;
import org.springframework.data.mapping.model.SimpleTypeHolder;
import org.springframework.data.util.TypeInformation;
@@ -27,7 +30,9 @@ import org.springframework.data.util.TypeInformation;
* @author Jon Brisbin <jbrisbin@vmware.com>
* @author Oliver Gierke ogierke@vmware.com
*/
public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersistentEntity<?>, MongoPersistentProperty> {
public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersistentEntity<?>, MongoPersistentProperty> implements ApplicationContextAware {
private ApplicationContext context;
/**
* Creates a new {@link MongoMappingContext}.
@@ -46,11 +51,27 @@ public class MongoMappingContext extends AbstractMappingContext<BasicMongoPersis
return new CachingMongoPersistentProperty(field, descriptor, owner, simpleTypeHolder);
}
/* (non-Javadoc)
/*
* (non-Javadoc)
* @see org.springframework.data.mapping.BasicMappingContext#createPersistentEntity(org.springframework.data.util.TypeInformation, org.springframework.data.mapping.model.MappingContext)
*/
@Override
protected <T> BasicMongoPersistentEntity<T> createPersistentEntity(TypeInformation<T> typeInformation) {
return new BasicMongoPersistentEntity<T>(typeInformation);
BasicMongoPersistentEntity<T> entity = new BasicMongoPersistentEntity<T>(typeInformation);
if (context != null) {
entity.setApplicationContext(context);
}
return entity;
}
/*
* (non-Javadoc)
* @see org.springframework.context.ApplicationContextAware#setApplicationContext(org.springframework.context.ApplicationContext)
*/
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
this.context = applicationContext;
}
}

View File

@@ -15,6 +15,7 @@
*/
package org.springframework.data.mongodb.core.mapping;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.mapping.PersistentProperty;
/**
@@ -53,4 +54,22 @@ public interface MongoPersistentProperty extends PersistentProperty<MongoPersist
* @return
*/
DBRef getDBRef();
/**
* Simple {@link Converter} implementation to transform a {@link MongoPersistentProperty} into its field name.
*
* @author Oliver Gierke
*/
public enum PropertyToFieldNameConverter implements Converter<MongoPersistentProperty, String> {
INSTANCE;
/*
* (non-Javadoc)
* @see org.springframework.core.convert.converter.Converter#convert(java.lang.Object)
*/
public String convert(MongoPersistentProperty source) {
return source.getFieldName();
}
}
}

View File

@@ -15,6 +15,8 @@
*/
package org.springframework.data.mongodb.core.mapping;
import java.math.BigInteger;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
@@ -31,14 +33,23 @@ import com.mongodb.DBRef;
*/
public abstract class MongoSimpleTypes {
private static final Set<Class<?>> MONGO_SIMPLE_TYPES = new HashSet<Class<?>>();
public static final Set<Class<?>> SUPPORTED_ID_CLASSES;
static {
MONGO_SIMPLE_TYPES.add(DBRef.class);
MONGO_SIMPLE_TYPES.add(ObjectId.class);
MONGO_SIMPLE_TYPES.add(CodeWScope.class);
Set<Class<?>> classes = new HashSet<Class<?>>();
classes.add(ObjectId.class);
classes.add(String.class);
classes.add(BigInteger.class);
SUPPORTED_ID_CLASSES = Collections.unmodifiableSet(classes);
Set<Class<?>> simpleTypes = new HashSet<Class<?>>();
simpleTypes.add(DBRef.class);
simpleTypes.add(ObjectId.class);
simpleTypes.add(CodeWScope.class);
MONGO_SIMPLE_TYPES = Collections.unmodifiableSet(simpleTypes);
}
private static final Set<Class<?>> MONGO_SIMPLE_TYPES;
public static final SimpleTypeHolder HOLDER = new SimpleTypeHolder(MONGO_SIMPLE_TYPES, true);
private MongoSimpleTypes() {

View File

@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011 by the original author(s).
* Copyright 2011 by the original author(s).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -27,9 +27,9 @@ import org.springframework.core.GenericTypeResolver;
* @author Jon Brisbin
* @author Oliver Gierke
*/
public abstract class AbstractMongoEventListener<E> implements ApplicationListener<MongoMappingEvent<E>> {
public abstract class AbstractMongoEventListener<E> implements ApplicationListener<MongoMappingEvent<?>> {
protected final Log log = LogFactory.getLog(getClass());
protected final Log LOG = LogFactory.getLog(getClass());
private final Class<?> domainClass;
/**
@@ -43,10 +43,22 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
* (non-Javadoc)
* @see org.springframework.context.ApplicationListener#onApplicationEvent(org.springframework.context.ApplicationEvent)
*/
public void onApplicationEvent(MongoMappingEvent<E> event) {
E source = event.getSource();
public void onApplicationEvent(MongoMappingEvent<?> event) {
if (event instanceof AfterLoadEvent) {
AfterLoadEvent<?> afterLoadEvent = (AfterLoadEvent<?>) event;
if (domainClass.isAssignableFrom(afterLoadEvent.getType())) {
onAfterLoad(event.getDBObject());
}
return;
}
@SuppressWarnings("unchecked")
E source = (E) event.getSource();
// Check for matching domain type and invoke callbacks
if (source != null && !domainClass.isAssignableFrom(source.getClass())) {
return;
}
@@ -57,40 +69,38 @@ public abstract class AbstractMongoEventListener<E> implements ApplicationListen
onBeforeSave(source, event.getDBObject());
} else if (event instanceof AfterSaveEvent) {
onAfterSave(source, event.getDBObject());
} else if (event instanceof AfterLoadEvent) {
onAfterLoad((DBObject) event.getSource());
} else if (event instanceof AfterConvertEvent) {
onAfterConvert(event.getDBObject(), source);
}
}
public void onBeforeConvert(E source) {
if (log.isDebugEnabled()) {
log.debug("onBeforeConvert(" + source + ")");
if (LOG.isDebugEnabled()) {
LOG.debug("onBeforeConvert(" + source + ")");
}
}
public void onBeforeSave(E source, DBObject dbo) {
if (log.isDebugEnabled()) {
log.debug("onBeforeSave(" + source + ", " + dbo + ")");
if (LOG.isDebugEnabled()) {
LOG.debug("onBeforeSave(" + source + ", " + dbo + ")");
}
}
public void onAfterSave(E source, DBObject dbo) {
if (log.isDebugEnabled()) {
log.debug("onAfterSave(" + source + ", " + dbo + ")");
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterSave(" + source + ", " + dbo + ")");
}
}
public void onAfterLoad(DBObject dbo) {
if (log.isDebugEnabled()) {
log.debug("onAfterLoad(" + dbo + ")");
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterLoad(" + dbo + ")");
}
}
public void onAfterConvert(DBObject dbo, E source) {
if (log.isDebugEnabled()) {
log.debug("onAfterConvert(" + dbo + "," + source + ")");
if (LOG.isDebugEnabled()) {
LOG.debug("onAfterConvert(" + dbo + "," + source + ")");
}
}
}
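The listener rework above handles `AfterLoadEvent` before the source-based domain check, because that event carries a raw `DBObject` rather than a mapped domain object. A minimal self-contained sketch of that dispatch logic (the nested event and listener classes are simplified stand-ins for the Spring classes):

```java
// Illustrative only: the type-filtering dispatch AbstractMongoEventListener
// performs, with AfterLoadEvent matched on its declared type, not its source.
public class ListenerSketch {

    static class Event {
        final Object source;
        Event(Object source) { this.source = source; }
    }

    static class AfterLoadEvent extends Event {
        final Class<?> type;
        AfterLoadEvent(Object dbo, Class<?> type) { super(dbo); this.type = type; }
    }

    static class Person {}

    static class PersonListener {
        private final Class<?> domainClass = Person.class;
        String last = "none";

        void onApplicationEvent(Event event) {
            if (event instanceof AfterLoadEvent) {
                // The source is a raw document, so match on the declared type.
                if (domainClass.isAssignableFrom(((AfterLoadEvent) event).type)) {
                    last = "afterLoad";
                }
                return;
            }
            // Other events carry the domain object; skip non-matching types.
            if (event.source != null
                    && !domainClass.isAssignableFrom(event.source.getClass())) {
                return;
            }
            last = "beforeConvert";
        }
    }

    public static void main(String[] args) {
        PersonListener listener = new PersonListener();
        listener.onApplicationEvent(new Event("not a person")); // ignored
        listener.onApplicationEvent(new AfterLoadEvent(new Object(), Person.class));
        System.out.println(listener.last);
    }
}
```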

View File

@@ -16,14 +16,42 @@
package org.springframework.data.mongodb.core.mapping.event;
import org.springframework.util.Assert;
import com.mongodb.DBObject;
/**
* @author Jon Brisbin <jbrisbin@vmware.com>
* Event to be triggered after loading {@link DBObject}s to be mapped onto a given type.
*
* @author Oliver Gierke
* @author Jon Brisbin
* @author Christoph Leiter
*/
public class AfterLoadEvent<DBObject> extends MongoMappingEvent<DBObject> {
public class AfterLoadEvent<T> extends MongoMappingEvent<DBObject> {
private static final long serialVersionUID = 1L;
private final Class<T> type;
public AfterLoadEvent(DBObject dbo) {
super(dbo, null);
/**
* Creates a new {@link AfterLoadEvent} for the given {@link DBObject} and type.
*
* @param dbo must not be {@literal null}.
* @param type must not be {@literal null}.
*/
public AfterLoadEvent(DBObject dbo, Class<T> type) {
super(dbo, dbo);
Assert.notNull(type, "Type must not be null!");
this.type = type;
}
/**
* Returns the type the {@link AfterLoadEvent} shall be invoked for.
*
* @return
*/
public Class<T> getType() {
return type;
}
}

View File

@@ -20,16 +20,21 @@ import java.util.Collection;
import java.util.LinkedHashMap;
import java.util.List;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.bson.types.BasicBSONList;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
import org.springframework.data.mongodb.core.geo.Box;
import org.springframework.data.mongodb.core.geo.Circle;
import org.springframework.data.mongodb.core.geo.Point;
import org.springframework.data.mongodb.core.geo.Shape;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
/**
* Central class for creating queries. It follows a fluent API style so that you can easily chain together multiple
* criteria. Static import of the 'Criteria.where' method will improve readability.
*/
public class Criteria implements CriteriaDefinition {
/**
@@ -45,6 +50,10 @@ public class Criteria implements CriteriaDefinition {
private Object isValue = NOT_SET;
public Criteria() {
this.criteriaChain = new ArrayList<Criteria>();
}
public Criteria(String key) {
this.criteriaChain = new ArrayList<Criteria>();
this.criteriaChain.add(this);
@@ -59,7 +68,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Static factory method to create a Criteria using the provided key
*
*
* @param key
* @return
*/
@@ -69,8 +78,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Static factory method to create a Criteria using the provided key
*
* @param key
*
* @return
*/
public Criteria and(String key) {
@@ -79,7 +87,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using equality
*
*
* @param o
* @return
*/
@@ -97,7 +105,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $ne operator
*
*
* @param o
* @return
*/
@@ -108,7 +116,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $lt operator
*
*
* @param o
* @return
*/
@@ -119,7 +127,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $lte operator
*
*
* @param o
* @return
*/
@@ -130,7 +138,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $gt operator
*
*
* @param o
* @return
*/
@@ -141,7 +149,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $gte operator
*
*
* @param o
* @return
*/
@@ -152,7 +160,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $in operator
*
*
* @param o the values to match against
* @return
*/
@@ -167,7 +175,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $in operator
*
*
* @param c the collection containing the values to match against
* @return
*/
@@ -178,7 +186,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $nin operator
*
*
* @param o
* @return
*/
@@ -189,7 +197,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $mod operator
*
*
* @param value
* @param remainder
* @return
@@ -204,7 +212,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $all operator
*
*
* @param o
* @return
*/
@@ -215,7 +223,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $size operator
*
*
* @param s
* @return
*/
@@ -226,7 +234,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $exists operator
*
*
* @param b
* @return
*/
@@ -237,7 +245,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $type operator
*
*
* @param t
* @return
*/
@@ -248,7 +256,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $not meta operator which affects the clause directly following
*
*
* @return
*/
public Criteria not() {
@@ -258,7 +266,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using a $regex
*
*
* @param re
* @return
*/
@@ -269,7 +277,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using a $regex and $options
*
*
* @param re
* @param options
* @return
@@ -282,54 +290,28 @@ public class Criteria implements CriteriaDefinition {
return this;
}
/**
* Creates a geospatial criterion using a $within $center operation
*
* @param circle must not be {@literal null}
* @return
*/
public Criteria withinCenter(Circle circle) {
Assert.notNull(circle);
List<Object> list = new ArrayList<Object>();
list.add(circle.getCenter().asList());
list.add(circle.getRadius());
criteria.put("$within", new BasicDBObject("$center", list));
return this;
}
/**
* Creates a geospatial criterion using a $within $center operation. This is only available for Mongo 1.7 and higher.
*
*
* @param circle must not be {@literal null}
* @return
*/
public Criteria withinCenterSphere(Circle circle) {
public Criteria withinSphere(Circle circle) {
Assert.notNull(circle);
List<Object> list = new ArrayList<Object>();
list.add(circle.getCenter().asList());
list.add(circle.getRadius());
criteria.put("$within", new BasicDBObject("$centerSphere", list));
criteria.put("$within", new BasicDBObject("$centerSphere", circle.asList()));
return this;
}
/**
* Creates a geospatial criterion using a $within $box operation
*
* @param box
* @return
*/
public Criteria withinBox(Box box) {
Assert.notNull(box);
List<List<Double>> list = new ArrayList<List<Double>>();
list.add(box.getLowerLeft().asList());
list.add(box.getUpperRight().asList());
criteria.put("$within", new BasicDBObject("$box", list));
public Criteria within(Shape shape) {
Assert.notNull(shape);
criteria.put("$within", new BasicDBObject(shape.getCommand(), shape.asList()));
return this;
}
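The new `within(Shape)` method above delegates the command keyword and coordinates to the shape, producing a `{$within: {command: coordinates}}` criterion. A sketch of that document shape using plain `java.util` maps in place of `BasicDBObject` (illustrative only):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative only: the document Criteria.within(Shape) builds,
// e.g. {$within: {$box: [[0, 0], [10, 10]]}}.
public class WithinSketch {

    static Map<String, Object> within(String command, List<?> coordinates) {
        Map<String, Object> inner = new LinkedHashMap<>();
        inner.put(command, coordinates);
        Map<String, Object> criterion = new LinkedHashMap<>();
        criterion.put("$within", inner);
        return criterion;
    }

    public static void main(String[] args) {
        List<List<Integer>> box =
            Arrays.asList(Arrays.asList(0, 0), Arrays.asList(10, 10));
        System.out.println(within("$box", box));
    }
}
```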
/**
* Creates a geospatial criterion using a $near operation
*
*
* @param point must not be {@literal null}
* @return
*/
@@ -341,7 +323,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a geospatial criterion using a $nearSphere operation. This is only available for Mongo 1.7 and higher.
*
*
* @param point must not be {@literal null}
* @return
*/
@@ -353,7 +335,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a geospatial criterion using a $maxDistance operation, for use with $near
*
*
* @param maxDistance
* @return
*/
@@ -364,7 +346,7 @@ public class Criteria implements CriteriaDefinition {
/**
* Creates a criterion using the $elemMatch operator
*
*
* @param c
* @return
*/
@@ -374,14 +356,39 @@ public class Criteria implements CriteriaDefinition {
}
/**
* Creates an or query using the $or operator for all of the provided queries
*
* @param queries
* Creates an 'or' criteria using the $or operator for all of the provided criteria
*
* @param criteria
*/
public void or(List<Query> queries) {
criteria.put("$or", queries);
public Criteria orOperator(Criteria... criteria) {
BasicBSONList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$or").is(bsonList));
return this;
}
/**
* Creates a 'nor' criteria using the $nor operator for all of the provided criteria
*
* @param criteria
*/
public Criteria norOperator(Criteria... criteria) {
BasicBSONList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$nor").is(bsonList));
return this;
}
/**
* Creates an 'and' criteria using the $and operator for all of the provided criteria
*
* @param criteria
*/
public Criteria andOperator(Criteria... criteria) {
BasicBSONList bsonList = createCriteriaList(criteria);
criteriaChain.add(new Criteria("$and").is(bsonList));
return this;
}
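The new `orOperator`, `norOperator`, and `andOperator` methods all follow the same pattern: fold the given criteria documents into one list keyed by the operator. A self-contained sketch of that folding step, with plain maps standing in for `BasicBSONList` and `BasicDBObject` (names are illustrative):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative only: how orOperator/norOperator/andOperator combine several
// criteria documents into one {$or|$nor|$and: [...]} entry.
public class OperatorSketch {

    @SafeVarargs
    static Map<String, Object> combine(String operator,
                                       Map<String, Object>... criteria) {
        List<Map<String, Object>> list = new ArrayList<>(Arrays.asList(criteria));
        Map<String, Object> result = new LinkedHashMap<>();
        result.put(operator, list);
        return result;
    }

    public static void main(String[] args) {
        Map<String, Object> a = new LinkedHashMap<>();
        a.put("age", 42);
        Map<String, Object> b = new LinkedHashMap<>();
        b.put("name", "Ada");
        System.out.println(combine("$or", a, b));
    }
}
```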
public String getKey() {
return this.key;
}
@@ -398,7 +405,10 @@ public class Criteria implements CriteriaDefinition {
} else {
DBObject criteriaObject = new BasicDBObject();
for (Criteria c : this.criteriaChain) {
criteriaObject.putAll(c.getSingleCriteriaObject());
DBObject dbo = c.getSingleCriteriaObject();
for (String k : dbo.keySet()) {
setValue(criteriaObject, k, dbo.get(k));
}
}
return criteriaObject;
}
@@ -431,4 +441,24 @@ public class Criteria implements CriteriaDefinition {
return queryCriteria;
}
private BasicBSONList createCriteriaList(Criteria[] criteria) {
BasicBSONList bsonList = new BasicBSONList();
for (Criteria c : criteria) {
bsonList.add(c.getCriteriaObject());
}
return bsonList;
}
private void setValue(DBObject dbo, String key, Object value) {
Object existing = dbo.get(key);
if (existing == null) {
dbo.put(key, value);
}
else {
throw new InvalidMongoDbApiUsageException("Due to limitations of the com.mongodb.BasicDBObject, " +
"you can't add a second '" + key + "' expression specified as '" + key + " : " + value + "'. " +
"Criteria already contains '" + key + " : " + existing + "'.");
}
}
}

View File

@@ -1,9 +0,0 @@
package org.springframework.data.mongodb.core.query;
public class NorQuery extends Query {
public NorQuery(Query... q) {
super.nor(q);
}
}

View File

@@ -1,51 +0,0 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.bson.types.BasicBSONList;
public class OrCriteria implements CriteriaDefinition {
Query[] queries = null;
public OrCriteria(Query[] queries) {
super();
this.queries = queries;
}
/*
* (non-Javadoc)
*
* @see org.springframework.datastore.document.mongodb.query.Criteria#
* getCriteriaObject(java.lang.String)
*/
public DBObject getCriteriaObject() {
DBObject dbo = new BasicDBObject();
BasicBSONList l = new BasicBSONList();
for (Query q : queries) {
l.add(q.getQueryObject());
}
dbo.put(getOperator(), l);
return dbo;
}
protected String getOperator() {
return "$or";
}
}

View File

@@ -1,9 +1,35 @@
/*
* Copyright 2010-2011 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.core.query;
import java.util.ArrayList;
import java.util.List;
public class OrQuery extends Query {
public OrQuery(Query... q) {
super.or(q);
super(getOrCriteria(q));
}
private static Criteria getOrCriteria(Query[] queries) {
List<Criteria> criteriaList = new ArrayList<Criteria>();
for (Query q : queries) {
criteriaList.addAll(q.getCriteria());
}
return new Criteria(criteriaList, "$or");
}
}

View File

@@ -15,14 +15,17 @@
*/
package org.springframework.data.mongodb.core.query;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.data.mongodb.InvalidMongoDbApiUsageException;
public class Query {
private LinkedHashMap<String, CriteriaDefinition> criteria = new LinkedHashMap<String, CriteriaDefinition>();
private LinkedHashMap<String, Criteria> criteria = new LinkedHashMap<String, Criteria>();
private Field fieldSpec;
private Sort sort;
private int skip;
@@ -30,7 +33,7 @@ public class Query {
/**
* Static factory method to create a Query using the provided criteria
*
*
* @param criteria
* @return
*/
@@ -46,17 +49,16 @@ public class Query {
}
public Query addCriteria(Criteria criteria) {
this.criteria.put(criteria.getKey(), criteria);
return this;
}
public Query or(Query... queries) {
this.criteria.put("$or", new OrCriteria(queries));
return this;
}
public Query nor(Query... queries) {
this.criteria.put("$nor", new NorCriteria(queries));
CriteriaDefinition existing = this.criteria.get(criteria.getKey());
String key = criteria.getKey();
if (existing == null) {
this.criteria.put(key, criteria);
}
else {
throw new InvalidMongoDbApiUsageException("Due to limitations of the com.mongodb.BasicDBObject, " +
"you can't add a second '" + key + "' criteria. " +
"Query already contains '" + existing.getCriteriaObject() + "'.");
}
return this;
}
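The reworked `addCriteria` above fails fast when a second criteria is registered under a key that is already present, since `BasicDBObject` cannot hold two entries for one key. A minimal sketch of that guard, with `IllegalStateException` standing in for `InvalidMongoDbApiUsageException` (class and message are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative only: the duplicate-key guard Query.addCriteria applies.
public class DuplicateKeySketch {

    private final Map<String, Object> criteria = new LinkedHashMap<>();

    DuplicateKeySketch addCriteria(String key, Object value) {
        Object existing = criteria.get(key);
        if (existing != null) {
            // BasicDBObject cannot hold two entries for the same key.
            throw new IllegalStateException(
                "you can't add a second '" + key + "' criteria.");
        }
        criteria.put(key, value);
        return this;
    }

    public static void main(String[] args) {
        DuplicateKeySketch query =
            new DuplicateKeySketch().addCriteria("name", "Ada");
        try {
            query.addCriteria("name", "Grace");
        } catch (IllegalStateException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

In the real API, the recommended escape hatch for two conditions on the same key is the new `andOperator`/`orOperator` chaining shown earlier in this change set.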
@@ -119,4 +121,8 @@ public class Query {
public int getLimit() {
return this.limit;
}
protected List<Criteria> getCriteria() {
return new ArrayList<Criteria>(this.criteria.values());
}
}

View File

@@ -1,355 +0,0 @@
/*
* Copyright 2002-2010 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.springframework.data.mongodb.repository;
import static org.springframework.data.querydsl.QueryDslUtils.*;
import java.io.Serializable;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.data.domain.Sort;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.MongoPropertyDescriptors.MongoPropertyDescriptor;
import org.springframework.data.mongodb.core.index.Index;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.mongodb.core.query.Order;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;
import org.springframework.data.repository.Repository;
import org.springframework.data.repository.core.NamedQueries;
import org.springframework.data.repository.core.RepositoryMetadata;
import org.springframework.data.repository.core.support.QueryCreationListener;
import org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport;
import org.springframework.data.repository.core.support.RepositoryFactorySupport;
import org.springframework.data.repository.query.QueryLookupStrategy;
import org.springframework.data.repository.query.RepositoryQuery;
import org.springframework.data.repository.query.QueryLookupStrategy.Key;
import org.springframework.data.repository.query.parser.Part;
import org.springframework.data.repository.query.parser.Part.Type;
import org.springframework.data.repository.query.parser.PartTree;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;
/**
* {@link org.springframework.beans.factory.FactoryBean} to create {@link MongoRepository} instances.
*
* @author Oliver Gierke
*/
public class MongoRepositoryFactoryBean<T extends Repository<S, ID>, S, ID extends Serializable> extends
RepositoryFactoryBeanSupport<T, S, ID> {
private MongoTemplate template;
private boolean createIndexesForQueryMethods = false;
/**
* Configures the {@link MongoTemplate} to be used.
*
* @param template the template to set
*/
public void setTemplate(MongoTemplate template) {
this.template = template;
}
/**
* Configures whether to automatically create indexes for the properties referenced in a query method.
*
* @param createIndexesForQueryMethods the createIndexesForQueryMethods to set
*/
public void setCreateIndexesForQueryMethods(boolean createIndexesForQueryMethods) {
this.createIndexesForQueryMethods = createIndexesForQueryMethods;
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactoryBeanSupport
* #createRepositoryFactory()
*/
@Override
protected final RepositoryFactorySupport createRepositoryFactory() {
RepositoryFactorySupport factory = getFactoryInstance(template);
if (createIndexesForQueryMethods) {
factory.addQueryCreationListener(new IndexEnsuringQueryCreationListener(template));
}
return factory;
}
/**
* Creates and initializes a {@link RepositoryFactorySupport} instance.
*
* @param template
* @return
*/
protected RepositoryFactorySupport getFactoryInstance(MongoTemplate template) {
return new MongoRepositoryFactory(template);
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactoryBeanSupport
* #afterPropertiesSet()
*/
@Override
public void afterPropertiesSet() {
super.afterPropertiesSet();
Assert.notNull(template, "MongoTemplate must not be null!");
}
/**
* Repository to create {@link MongoRepository} instances.
*
* @author Oliver Gierke
*/
public static class MongoRepositoryFactory extends RepositoryFactorySupport {
private final MongoTemplate template;
private final EntityInformationCreator entityInformationCreator;
/**
* Creates a new {@link MongoRepositoryFactory} with the given {@link MongoTemplate} and {@link MappingContext}.
*
* @param template must not be {@literal null}
* @param mappingContext
*/
public MongoRepositoryFactory(MongoTemplate template) {
Assert.notNull(template);
this.template = template;
this.entityInformationCreator = new EntityInformationCreator(template.getConverter().getMappingContext());
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactorySupport
* #getRepositoryBaseClass()
*/
@Override
protected Class<?> getRepositoryBaseClass(RepositoryMetadata metadata) {
return isQueryDslRepository(metadata.getRepositoryInterface()) ? QueryDslMongoRepository.class
: SimpleMongoRepository.class;
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactorySupport
* #getTargetRepository
* (org.springframework.data.repository.support.RepositoryMetadata)
*/
@Override
@SuppressWarnings({ "rawtypes", "unchecked" })
protected Object getTargetRepository(RepositoryMetadata metadata) {
Class<?> repositoryInterface = metadata.getRepositoryInterface();
MongoEntityInformation<?, Serializable> entityInformation = getEntityInformation(metadata.getDomainClass());
if (isQueryDslRepository(repositoryInterface)) {
return new QueryDslMongoRepository(entityInformation, template);
} else {
return new SimpleMongoRepository(entityInformation, template);
}
}
private static boolean isQueryDslRepository(Class<?> repositoryInterface) {
return QUERY_DSL_PRESENT && QueryDslPredicateExecutor.class.isAssignableFrom(repositoryInterface);
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactorySupport
* #getQueryLookupStrategy
* (org.springframework.data.repository.query.QueryLookupStrategy.Key)
*/
@Override
protected QueryLookupStrategy getQueryLookupStrategy(Key key) {
return new MongoQueryLookupStrategy();
}
/**
* {@link QueryLookupStrategy} to create {@link PartTreeMongoQuery} instances.
*
* @author Oliver Gierke
*/
private class MongoQueryLookupStrategy implements QueryLookupStrategy {
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.query.QueryLookupStrategy
* #resolveQuery(java.lang.reflect.Method, java.lang.Class)
*/
public RepositoryQuery resolveQuery(Method method, RepositoryMetadata metadata, NamedQueries namedQueries) {
MongoQueryMethod queryMethod = new MongoQueryMethod(method, metadata, entityInformationCreator);
String namedQueryName = queryMethod.getNamedQueryName();
if (namedQueries.hasQuery(namedQueryName)) {
String namedQuery = namedQueries.getQuery(namedQueryName);
return new StringBasedMongoQuery(namedQuery, queryMethod, template);
} else if (queryMethod.hasAnnotatedQuery()) {
return new StringBasedMongoQuery(queryMethod, template);
} else {
return new PartTreeMongoQuery(queryMethod, template);
}
}
}
/*
* (non-Javadoc)
* @see org.springframework.data.repository.support.RepositoryFactorySupport#validate(org.springframework.data.repository.support.RepositoryMetadata)
*/
@Override
protected void validate(RepositoryMetadata metadata) {
Class<?> idClass = metadata.getIdClass();
if (!MongoPropertyDescriptor.SUPPORTED_ID_CLASSES.contains(idClass)) {
throw new IllegalArgumentException(String.format("Unsupported id class! Only %s are supported!",
StringUtils.collectionToCommaDelimitedString(MongoPropertyDescriptor.SUPPORTED_ID_CLASSES)));
}
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.RepositoryFactorySupport
* #getEntityInformation(java.lang.Class)
*/
@Override
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass) {
return entityInformationCreator.getEntityInformation(domainClass);
}
}
/**
* Simple wrapper to create {@link MongoEntityInformation} instances based on a {@link MappingContext}.
*
* @author Oliver Gierke
*/
static class EntityInformationCreator {
private final MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext;
public EntityInformationCreator(
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
Assert.notNull(mappingContext);
this.mappingContext = mappingContext;
}
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass) {
return getEntityInformation(domainClass, null);
}
@SuppressWarnings("unchecked")
public <T, ID extends Serializable> MongoEntityInformation<T, ID> getEntityInformation(Class<T> domainClass,
Class<?> collectionClass) {
MongoPersistentEntity<T> persistentEntity = (MongoPersistentEntity<T>) mappingContext
.getPersistentEntity(domainClass);
String customCollectionName = collectionClass == null ? null : mappingContext
.getPersistentEntity(collectionClass).getCollection();
return new MappingMongoEntityInformation<T, ID>(persistentEntity, customCollectionName);
}
}
/**
* {@link QueryCreationListener} inspecting {@link PartTreeMongoQuery}s and creating an index for the properties it
* refers to.
*
* @author Oliver Gierke
*/
static class IndexEnsuringQueryCreationListener implements QueryCreationListener<PartTreeMongoQuery> {
private static final Set<Type> GEOSPATIAL_TYPES = new HashSet<Part.Type>(Arrays.asList(Type.NEAR, Type.WITHIN));
private static final Log LOG = LogFactory.getLog(IndexEnsuringQueryCreationListener.class);
private final MongoOperations operations;
public IndexEnsuringQueryCreationListener(MongoOperations operations) {
this.operations = operations;
}
/*
* (non-Javadoc)
*
* @see
* org.springframework.data.repository.support.QueryCreationListener
* #onCreation(org.springframework.data.repository
* .query.RepositoryQuery)
*/
public void onCreation(PartTreeMongoQuery query) {
PartTree tree = query.getTree();
Index index = new Index();
index.named(query.getQueryMethod().getName());
Sort sort = tree.getSort();
for (Part part : tree.getParts()) {
if (GEOSPATIAL_TYPES.contains(part.getType())) {
return;
}
String property = part.getProperty().toDotPath();
Order order = toOrder(sort, property);
index.on(property, order);
}
// Add fixed sorting criteria to index
if (sort != null) {
for (Sort.Order order : sort) {
index.on(order.getProperty(), QueryUtils.toOrder(order));
}
}
MongoEntityInformation<?, ?> metadata = query.getQueryMethod().getEntityInformation();
operations.ensureIndex(index, metadata.getCollectionName());
LOG.debug(String.format("Created %s!", index));
}
private static Order toOrder(Sort sort, String property) {
if (sort == null) {
return Order.DESCENDING;
}
org.springframework.data.domain.Sort.Order order = sort.getOrderFor(property);
return order == null ? Order.DESCENDING : order.isAscending() ? Order.ASCENDING : Order.DESCENDING;
}
}
}
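
The query-resolution precedence implemented by `MongoQueryLookupStrategy.resolveQuery(…)` above (declared named query first, then an annotated query, then derivation from the method name) can be sketched as a self-contained miniature. `LookupSketch` and its plain-`Map` stand-in for `NamedQueries` are hypothetical simplifications, not the Spring Data API:

```java
import java.util.HashMap;
import java.util.Map;

public class LookupSketch {

    // Mirrors the precedence in resolveQuery(...): named query, then annotated
    // query, then a query derived from the method name (the PartTree case).
    static String resolveQuery(String namedQueryName, Map<String, String> namedQueries,
            String annotatedQuery, String methodName) {
        if (namedQueries.containsKey(namedQueryName)) {
            return "string-based: " + namedQueries.get(namedQueryName); // named query wins
        } else if (annotatedQuery != null) {
            return "string-based: " + annotatedQuery; // @Query annotation next
        } else {
            return "part-tree derived from: " + methodName; // fall back to derivation
        }
    }

    public static void main(String[] args) {
        Map<String, String> named = new HashMap<>();
        named.put("Person.findByLastname", "{ 'lastname' : ?0 }");
        System.out.println(resolveQuery("Person.findByLastname", named, null, "findByLastname"));
        System.out.println(resolveQuery("Person.findByAge", named, null, "findByAge"));
    }
}
```

Note that both the named-query and annotated-query branches produce a `StringBasedMongoQuery` in the real factory; only the last branch triggers query derivation.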

View File

@@ -13,11 +13,11 @@
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
-package org.springframework.data.mongodb.config;
+package org.springframework.data.mongodb.repository.config;
 import org.springframework.beans.factory.support.BeanDefinitionBuilder;
 import org.springframework.beans.factory.support.BeanDefinitionRegistry;
-import org.springframework.data.mongodb.config.SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration;
+import org.springframework.data.mongodb.repository.config.SimpleMongoRepositoryConfiguration.MongoRepositoryConfiguration;
 import org.springframework.data.repository.config.AbstractRepositoryConfigDefinitionParser;
 import org.w3c.dom.Element;

View File

@@ -13,9 +13,9 @@
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
-package org.springframework.data.mongodb.config;
+package org.springframework.data.mongodb.repository.config;
-import org.springframework.data.mongodb.repository.MongoRepositoryFactoryBean;
+import org.springframework.data.mongodb.repository.support.MongoRepositoryFactoryBean;
 import org.springframework.data.repository.config.AutomaticRepositoryConfigInformation;
 import org.springframework.data.repository.config.ManualRepositoryConfigInformation;
 import org.springframework.data.repository.config.RepositoryConfig;

View File

@@ -13,9 +13,9 @@
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
-package org.springframework.data.mongodb.repository;
+package org.springframework.data.mongodb.repository.query;
-import static org.springframework.data.mongodb.repository.QueryUtils.*;
+import static org.springframework.data.mongodb.repository.query.QueryUtils.*;
 import java.util.List;

View File

@@ -13,15 +13,14 @@
  * See the License for the specific language governing permissions and
  * limitations under the License.
  */
-package org.springframework.data.mongodb.repository;
+package org.springframework.data.mongodb.repository.query;
 import java.util.Iterator;
 import org.springframework.data.domain.Pageable;
 import org.springframework.data.domain.Sort;
 import org.springframework.data.mongodb.core.convert.MongoWriter;
-import org.springframework.data.mongodb.core.convert.TypeMapper;
-import org.springframework.data.mongodb.core.convert.TypeMapperProvider;
+import org.springframework.data.mongodb.core.convert.TypeKeyAware;
 import org.springframework.data.mongodb.core.geo.Distance;
 import org.springframework.data.mongodb.core.geo.Point;
 import org.springframework.data.repository.query.ParameterAccessor;
@@ -107,12 +106,11 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
  */
 private Object getConvertedValue(Object value) {
-if (!(writer instanceof TypeMapperProvider)) {
+if (!(writer instanceof TypeKeyAware)) {
 return value;
 }
-TypeMapper mapper = ((TypeMapperProvider) writer).getTypeMapper();
-return removeTypeInfoRecursively(writer.convertToMongoType(value), mapper);
+return removeTypeInfoRecursively(writer.convertToMongoType(value), ((TypeKeyAware) writer));
 }
/**
@@ -121,9 +119,9 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
 * @param object
 * @return
 */
-private Object removeTypeInfoRecursively(Object object, TypeMapper mapper) {
+private Object removeTypeInfoRecursively(Object object, TypeKeyAware typeKeyAware) {
-if (!(object instanceof DBObject) || mapper == null) {
+if (!(object instanceof DBObject) || typeKeyAware == null) {
 return object;
 }
@@ -131,17 +129,17 @@ public class ConvertingParameterAccessor implements MongoParameterAccessor {
 String keyToRemove = null;
 for (String key : dbObject.keySet()) {
-if (mapper.isTypeKey(key)) {
+if (typeKeyAware.isTypeKey(key)) {
 keyToRemove = key;
 }
 Object value = dbObject.get(key);
 if (value instanceof BasicDBList) {
 for (Object element : (BasicDBList) value) {
-removeTypeInfoRecursively(element, mapper);
+removeTypeInfoRecursively(element, typeKeyAware);
 }
 } else {
-removeTypeInfoRecursively(value, mapper);
+removeTypeInfoRecursively(value, typeKeyAware);
 }
 }
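
The recursion shown in this last hunk (walk a document, drop the type key at each level, recurse into nested documents and lists) can be sketched standalone. Plain `Map`/`List` stand in for `DBObject`/`BasicDBList`, and `"_class"` is an assumed type key, so this is a simplified illustration rather than the library code:

```java
import java.util.List;
import java.util.Map;

public class TypeInfoSketch {

    static final String TYPE_KEY = "_class"; // assumption: the type key used by the mapping layer

    // Removes the type key from the given document and from every nested
    // document, including documents inside list values.
    @SuppressWarnings("unchecked")
    static Object removeTypeInfoRecursively(Object object) {
        if (!(object instanceof Map)) {
            return object; // plain values carry no type info
        }
        Map<String, Object> document = (Map<String, Object>) object;
        document.remove(TYPE_KEY);
        for (Object value : document.values()) {
            if (value instanceof List) {
                for (Object element : (List<Object>) value) {
                    removeTypeInfoRecursively(element);
                }
            } else {
                removeTypeInfoRecursively(value);
            }
        }
        return document;
    }
}
```

The real implementation asks the `TypeKeyAware` writer via `isTypeKey(key)` instead of hard-coding the key, which keeps the parameter accessor decoupled from the configured type-mapping strategy.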

Some files were not shown because too many files have changed in this diff.