DATAJDBC-99 - Repository support basics #5
Conversation
Created a Maven project based on Spring Data JPA. Created JdbcRepositoryFactory, which actually creates Repositories. Wired the construction of meta data. Implemented a dummy version of saving an entity to demonstrate availability of meta data. Included a simple test for exercising the JdbcRepositoryFactory and the resulting JdbcRepository.
e1dc3a4 to 937b04a
Creation of the necessary SQL statements is now dynamic and driven in a trivial way by the entity. Known issues with the solution that need to get fixed later: - SQL-generating code is in the repository and should go somewhere else. - Mapping logic is very trivial and should go in a separate place.
No longer using batch inserts, which won't (easily) hold up for long anyway, since we need to decide between update and insert anyway. If one wants to support batch insert one also has to check if any autoid columns are present, because those seem not to work with batch inserts with many or at least some drivers. Related ticket: SPR-1836.
New instances get saved with an insert statement. Existing instances get updated. Also added some tests to find certain corner cases that I feared may cause problems: - Id properties not being editable (no setter and final). - Id properties being primitive. - Id properties not being named "id" and fixed the issues resulting from those.
The repository publishes events before and after inserting, updating and deleting entities, as well as after instantiation of entities. JdbcEvent is the common super class of all events and makes the id and the entity available (if possible). Added issue id comments to tests from previous issues.
937b04a to f4b5f2b
removed the intermediate 'Prepare feature branch' commits. Remaining 'Prepare feature branch' commit comes second because it needs at least a basic pom.xml
</executions>
</plugin>

<plugin>
I guess that's a one-to-one copy from the JPA setup? I wonder whether we really need the split-up: in JPA we have to use it because different Java agents need to be in place for different persistence providers. I'd like to avoid the additional setup if possible.
*
* @author Jens Schauder
*/
public class AfterCreationEvent extends JdbcEvent{
Space missing before {. A couple of other places, too.
private final Object id;
private final Object instance;

<T> JdbcEvent(T instance, Function<T, Object> idProvider) {
Can you elaborate on the idProvider constructor parameter? Handing that in seems unusual.
It is unusual and I'm not happy about it.
I want to make the entity and the id available in the events, because:
- it allows us to do many things in a generic way without knowing anything about the entity, for example logging changes with the id of the changed entity.
- not all events can provide a complete entity, because it would require us to actually load entities before deleting them by id.
But this leads to a constructor JdbcEvent(Object instance, Object id), which I found very disturbing. So I tried this approach, which I don't like either :-/
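The pattern under discussion can be sketched in isolation. This is a dependency-free simplification of the JdbcEvent constructor, with a hypothetical DummyEntity standing in for a real mapped entity:

```java
import java.util.function.Function;

// Simplified stand-in for JdbcEvent: the constructor takes the entity plus a
// function that knows how to extract its id, so the event can expose both
// without resorting to a JdbcEvent(Object, Object) constructor.
class JdbcEvent {

    private final Object id;
    private final Object instance;

    <T> JdbcEvent(T instance, Function<T, Object> idProvider) {
        this.instance = instance;
        this.id = idProvider.apply(instance);
    }

    Object getId() { return id; }
    Object getInstance() { return instance; }
}

// Hypothetical entity used only for this illustration.
class DummyEntity {
    final Long id;
    DummyEntity(Long id) { this.id = id; }
}

public class JdbcEventDemo {
    public static void main(String[] args) {
        JdbcEvent event = new JdbcEvent(new DummyEntity(23L), entity -> entity.id);
        System.out.println(event.getId()); // 23
    }
}
```

The idProvider indirection is what allows a single constructor to serve entities whose id is not yet known to the event infrastructure.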
As you say, I am not sure we can/should keep the requirement for every event to hold an entity in the first place. If there's an entity, wouldn't the listener just be able to obtain the id by calling a method on the entity?
What I am trying to say is: if we make sure we populate the identifiers before publishing the event, the listeners can see what they need to see, right?
*
* @return instance of the entity triggering this event.
*/
public Object getInstance() {
We should actually already have Lombok on the classpath. Feel free to go ahead and just use @Getter. JavaDoc would go onto the field declaration then.
return idColumn;
}

public Object getIdValue(T instance) {
Two things here:
- I'd like to keep the PersistentEntity implementations as metadata- and type-related as possible. The getPropertyAccessor(…) and getIdentifierAccessor(…) methods are the only ones that work with instances, so that code that's related to instances is actually in a different abstraction. I guess we may have to reintroduce the usage of PersistentEntityInformation in clients, so that the responsibilities are clear again. Also this solves…
- Access to the identifier should always be implemented using the IdentifierAccessor rather than the PersistentPropertyAccessor. The reason is as follows (and it's great that this topic comes up here, as it made me think about how we could make this more obvious): historically, PersistentProperty assumed one property to be the identifier, which unfortunately doesn't necessarily hold true for e.g. JPA with compound keys. Also, a dedicated identifier accessor allows us to provide an implementation that can make even more specific assumptions when it comes to reading an id. E.g. in the case of proxies in JPA, we can detect that the entity is a proxy and that a lookup of the field value or a call to the getter would issue another query materializing the object. With that information and a bit of provider-specific knowledge we can pull the id out of the proxy itself, as it has to have that information already to eventually issue the query.
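The separation described here can be sketched roughly as follows. This is a standalone simplification, not the real Spring Data API: the type names mimic it, but the signatures are invented for illustration.

```java
import java.util.Optional;

// Dedicated accessor for reading the identifier of one concrete instance.
interface IdentifierAccessor {
    Optional<Object> getIdentifier();
}

// Hypothetical entity used only for this illustration.
class Person {
    final Long id;
    Person(Long id) { this.id = id; }
}

// The entity metadata stays type-level. Its only instance-facing method hands
// out an accessor, so all instance-related code lives behind that abstraction.
class PersonEntityMetadata {
    IdentifierAccessor getIdentifierAccessor(Person instance) {
        return () -> Optional.ofNullable(instance.id);
    }
}

public class IdentifierAccessorDemo {
    public static void main(String[] args) {
        IdentifierAccessor accessor =
                new PersonEntityMetadata().getIdentifierAccessor(new Person(42L));
        System.out.println(accessor.getIdentifier().orElse("unset")); // 42
    }
}
```

The payoff of the indirection is that a smarter accessor implementation (e.g. one that is proxy-aware) can be substituted without the metadata model changing.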
}

@Data
static class PrimitiveIdEntity {
That's not really a primitive identifier type, is it? Did you mean long?
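The distinction matters because a wrapper type can signal "not yet assigned" with null, while a primitive silently defaults to 0. A minimal sketch (these are illustrative classes, not the actual test fixtures):

```java
// Why the declared id type matters for "is this entity new?" checks:
// the wrapper starts out null, the primitive starts out 0, which is
// indistinguishable from a legitimately assigned id of 0.
class WrapperIdEntity { Long id; }
class PrimitiveIdEntity { long id; }

public class PrimitiveIdDemo {
    public static void main(String[] args) {
        System.out.println(new WrapperIdEntity().id);   // null
        System.out.println(new PrimitiveIdEntity().id); // 0
    }
}
```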
I hate you, now the test actually fails. ;-)
I have a quick fix for now and a new issue to go: https://jira.spring.io/browse/DATAJDBC-104
*
* @author Jens Schauder
*/
public class JdbcRepositoryIntegrationTests {
For setup, see above comment.
}

@Test // DATAJDBC-95
public void canSaveAnEntity() {
Let's use something more expressive as the test name: savesEntity.
}

@Test // DATAJDBC-97
public void saveMany() {
savesManyEntities
}

@Test // DATAJDBC-97
public void count() {
countsEntities. See more instances below.
Events now use the id as source. The id is encapsulated in a value object to support null ids in BeforeInsert events. Entity references in events are now Optionals. Dropped the redundant Event suffix from events. Added @since 2.0 to classes. Replaced constructors and getters with Lombok annotations. Simplified pom.xml. Extracted interfaces JdbcPersistentEntity and JdbcPersistentProperty. Integration tests use the Spring Test framework. Fixed test for entities with primitive id type.
Different databases are now supported by means of Maven profiles and Spring profiles. There is one profile for each supported database; the default profile is equivalent to hsqldb. There is a Maven profile all-dbs, which runs the integration tests against all databases. The databases need to be started outside the build; the build assumes the default configuration as I found it after `brew install <database>`. For now we support the databases mysql, postgres and hsqldb. In order to make the new databases work, setting of properties and reading generated ids was improved to do some simple conversions. This might be considered a first step towards DATAJDBC-104. The project root contains some basic scripts for starting and stopping databases, as well as for running a build against all supported databases. Integration tests using a database now use Rules instead of JUnit runners, which gives more flexibility when adding fancy stuff to the tests in the form of other Rules. Related issue: DATAJDBC-104.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
</plugin>
Are we using this plugin for anything?
Needed by the release infrastructure. Same for all other Spring Data modules.
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
</plugin>
Same question. If we aren't doing anything with it, maybe we don't need it yet.
Needed by the release infrastructure. Same for all other Spring Data modules.
* @since 2.0
*/
public class AfterUpdate extends AfterSave {
I wasn't 100% certain updates are saves from a coding perspective. If someone wants to get triggered on Save events, does that mean they should ALSO catch Update and Insert events? I can imagine one person saying "yes" and another saying "no". Hence, I would lean toward the coder explicitly subscribing for each event type.
That's the point of having the save event and the update and insert ones. If you're interested in saves in general, listen to that. If you're interested in only insert, listen to that in particular, etc.
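That design can be illustrated with a bare-bones hierarchy. These are simplified stand-ins for the event classes, without the Spring listener machinery:

```java
// Because AfterInsert and AfterUpdate extend AfterSave, a handler typed to
// AfterSave observes both, while an insert-only handler sees inserts only.
class AfterSave {}
class AfterInsert extends AfterSave {}
class AfterUpdate extends AfterSave {}

public class EventHierarchyDemo {

    static int saveEvents = 0;
    static int insertEvents = 0;

    static void publish(AfterSave event) {
        saveEvents++;                                     // generic "save" listener
        if (event instanceof AfterInsert) insertEvents++; // insert-only listener
    }

    public static void main(String[] args) {
        publish(new AfterInsert());
        publish(new AfterUpdate());
        System.out.println(saveEvents + "/" + insertEvents); // 2/1
    }
}
```

Subtyping gives subscribers the choice: type your listener to the general event to catch everything, or to the specific one to narrow down.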
public BeforeInsert(Object instance) {
super(Unset.UNSET, instance);
}
}
Same comment as before regarding AfterSave, AfterInsert, and AfterUpdate.
* @since 2.0
*/
public class BeforeUpdate extends BeforeSave implements WithId {
Looks like WithId could be moved up to JdbcEvent, given that is where the id is, and hence be inheritable by all the event types.
}
}
}
I think you can create the same with just this: static final Identifier UNSET = () -> Optional.empty();
Actually, the more I read this interface, the more I cannot tell the difference between it and a plain old Optional. After all, there is nothing "identifying" here, like a generic "id" parameter.
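The lambda suggestion works because Identifier is, in this sketch, a single-method interface, so the unset instance needs no dedicated class. Whether that still adds value over a plain Optional is exactly the open question; the shape of the interface here is a simplified assumption, not the PR's exact code:

```java
import java.util.Optional;

// Sketch of the Identifier abstraction as a functional interface.
interface Identifier {

    Optional<Object> getOptionalId();

    // The "unset" identifier collapses to a constant lambda / method reference.
    Identifier UNSET = Optional::empty;

    // Hypothetical convenience factory for the "specified" case.
    static Identifier of(Object id) {
        return () -> Optional.of(id);
    }
}

public class IdentifierDemo {
    public static void main(String[] args) {
        System.out.println(Identifier.UNSET.getOptionalId().isPresent()); // false
        System.out.println(Identifier.of(7L).getOptionalId().get());      // 7
    }
}
```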
public JdbcEventWithEntity(Identifier id, Object entity) {
super(id, Optional.of(entity));
}
Should either use Assert.notNull or Optional.ofNullable.
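The two options differ in intent: fail fast versus tolerate null. In this dependency-free sketch, Objects.requireNonNull stands in for Spring's Assert.notNull:

```java
import java.util.Objects;
import java.util.Optional;

// Sketch of the constructor under review with both options made explicit.
class JdbcEventWithEntity {

    final Optional<Object> entity;

    JdbcEventWithEntity(Object entity) {
        // Option 1: fail fast with a clear message (Assert.notNull equivalent).
        // Optional.of would also throw on null, but with a less helpful NPE.
        this.entity = Optional.of(
                Objects.requireNonNull(entity, "Entity must not be null"));
        // Option 2 (alternative): tolerate null callers instead:
        //   this.entity = Optional.ofNullable(entity);
    }
}

public class NullCheckDemo {
    public static void main(String[] args) {
        System.out.println(new JdbcEventWithEntity("some entity").entity.isPresent()); // true
    }
}
```

Either way, the contract becomes explicit instead of depending on Optional.of's incidental NullPointerException.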
default Object getEntity() {
return ((JdbcEvent) this).getOptionalEntity()
.orElseThrow(() -> new IllegalStateException("Entity must not be NULL"));
}
I wonder if this interface shouldn't be package local, given it assumes that this is JdbcEvent.
Makes me wonder which events are concrete vs. abstract ones used to define shared behavior. If not all of the events are meant to be used directly, the abstract ones should be marked abstract IMHO to make that clear.
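One way to remove the cast entirely, which matches the direction a later commit in this PR takes (extracting a JdbcEvent interface), can be sketched like this in simplified form:

```java
import java.util.Optional;

// If the mix-in interface extends a JdbcEvent *interface*, the default method
// no longer needs the (JdbcEvent) cast, so the ClassCastException risk is gone.
interface JdbcEvent {
    Optional<Object> getOptionalEntity();
}

interface WithEntity extends JdbcEvent {

    default Object getEntity() {
        return getOptionalEntity()
                .orElseThrow(() -> new IllegalStateException("Entity must not be null"));
    }
}

public class WithEntityDemo implements WithEntity {

    @Override
    public Optional<Object> getOptionalEntity() {
        return Optional.of("entity");
    }

    public static void main(String[] args) {
        System.out.println(new WithEntityDemo().getEntity()); // entity
    }
}
```

With the supertype contract declared on the interface itself, the compiler enforces what the cast merely assumed.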
* @since 2.0
*/
public interface WithId {
For similar reasons, this interface may also need to be package local, or we run the risk of someone generating a ClassCastException at runtime in the future.
private final JdbcPersistentEntity<T> entity;
private final EntityInstantiator instantiator = new ClassGeneratingEntityInstantiator();
private final ConversionService conversions = new DefaultConversionService();
For a single conversion service, I'd call it conversion, not conversions.
Let's go with conversionService, as that's what it is in particular.
entity.doWithProperties((PropertyHandler<JdbcPersistentProperty>) property -> {
setProperty(rs, t, property);
});
Maybe it's just me, but the number of times we are casting things has me wondering if something in the type system isn't properly set up. For every cast I see, I feel there have to be at least two test cases, success and failure, to avoid surprising ClassCastExceptions.
That's not a cast here but a type hint as there's an override of the method that also takes a lambda.
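The point can be reproduced outside Spring Data: with two overloads that both accept a functional interface, a bare lambda is ambiguous, and the "cast" is really a target-type hint. The interface and class names below only mimic the real ones:

```java
// Minimal reproduction of the overload situation: both methods could accept
// the lambda, so the compiler needs the target type spelled out.
interface PropertyHandler<P> { void doWith(P property); }
interface SimplePropertyHandler { void doWith(Object property); }

class FakeEntity {

    String last;

    void doWithProperties(PropertyHandler<String> handler) { handler.doWith("typed"); }
    void doWithProperties(SimplePropertyHandler handler) { handler.doWith("simple"); }
}

public class TypeHintDemo {
    public static void main(String[] args) {
        FakeEntity entity = new FakeEntity();
        // entity.doWithProperties(p -> {});  // does not compile: ambiguous
        entity.doWithProperties((PropertyHandler<String>) p -> entity.last = p);
        System.out.println(entity.last); // typed
    }
}
```

Since the lambda's target type is resolved at compile time, no runtime ClassCastException can arise from this particular "cast".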
Let's not throw out the baby with the bathwater here. Yes, I'd also like to avoid this PR growing, which means it's probably a good idea to finish it sooner rather than later and not add more things to it. That said, I think the commits listed here look good; we can add a polishing one and then merge things unsquashed, as there's a decent relationship between tickets and commits. The only thing I am still wondering is what to do with the old code from the JDBC extensions module. I'd like to keep the commit history there. So I wonder if we can do the following things:
This approach should prevent us from having to apply a lot of Git kung fu to move the code introduced here into the sub-module right away. @schauder, @gregturn: would you mind coordinating 1 and 2? I can prepare 3 in the meantime. I'd love to do that pretty soon, and myself, just because we already had some back and forth on a lot of topics here and I want to avoid us getting lost in discussion. As it's a very fundamental PR, I'd still love to apply the polishing myself. Happy to report in a retro on what I polished and why. You can then take care of 4 and 5 if you want. I can do 4, too. I should find time for this over the weekend.
I'd be happy to bang on the details of 1 and 2. The issues with spring-data-jdbc-ext are:
This project appears principally designed to support Oracle specifics. For example, it mentions RAC failover handling. I don't even know if RAC support is an industry thing, or if that's outdated. The project also has AQ JMS support, which I would vote as out of scope for Spring Data JDBC. I can do more reading to see if this is the path to go in ensuring proper JDBC-on-Oracle support, given the code itself is kind of dated.
Looks like https://blogs.oracle.com/dev2dev/entry/how_to_get_oracle_jdbc#settings provides details on creating a proper settings.xml file with credentials to access Oracle's Maven repository. Definitely the better way to proceed, if we wanted to do that. https://blogs.oracle.com/dev2dev/entry/oracle_maven_repository_instructions_for shows the dependencies to pull in what may arguably be the same drivers Spring Data JDBC Ext has you install manually. But it looks like there is little in this project pursuant to "storing an entity". Instead, it's about infrastructure to wrap the JDBC connection with interceptors, detect RAC-based exceptions on failover, and retry the same operation; the idea being that you want to decorate your JdbcTemplate.
Jumping in here; I quickly looked through the comments, and Greg raises some good points. I think going forward it would be difficult to keep the Oracle-specific code together with the rest of the new codebase. It is all legacy support for Oracle-specific features and will most likely move slower in terms of updating to the latest JDK and Spring versions. Can we just move the …
And I don't think using Oracle's Maven repo solves any problems; you still have to register with OTN and put your credentials into Maven's config files. Plus, I couldn't find all the dependencies we require in that repo.
Okay, the original idea was to keep a single project around, as otherwise we'd have to do a lot of explaining of what the difference between all those modules with JDBC in their names is. Spring Framework providing JDBC support already complicates that matter even further. But it looks like we might be better off just staying with a new …
I can poke my nose into migrating Jira issues. I think Rob may have a script.
Thanks for the update, @schauder. I'll take it from here.
@gregturn I'm going through all the old issues, moving the still-relevant ones to GitHub issues and closing the rest, so no need for a script ATM.
I went through all old issues and moved a few to GitHub under the https://github.com/spring-projects/spring-data-jdbc-ext project and some to the SPR core project. I marked all old issues in JIRA as …
Introduced interface for JdbcPersistentEntity to be consistent with other store implementations. Introduced JdbcPersistentEntityInformation.getRequiredId(…) to return a value and throw an exception if it can't be obtained. Added serialVersionUIDs to event implementations. Extracted Unset and SpecifiedIdentifier into top level classes to be able to reduce visibility. Introduced Specified interface and factory methods on Identifier to be able to create Identifier instances in all ways needed (specified / optional). Extracted JdbcEvent interface from the previously thus named class to be able to let WithId and WithEntity extend that interface to avoid the cast in the default method of WithId. WithId now simply redeclares getId returning Specified so that the guarantees of returned object type simply adapt as soon as event types also implement that interface. Reduced the visibility of SimpleJdbcEvent as listeners could now just refer to JdbcEvent. Switched to a nullable field for the entity in SimpleJdbcEvent as ApplicationEvent already implies potential serializability and Optional is not Serializable. Reorganized integration test setup to use configuration classes over FactoryBean implementations. Switched to AssertJ for test assertions. Renamed UnableToSetIdException to UnableToSetId. A bit of Javadoc here and there. Formatting. No abbreviated variable names.
The method names in the CrudRepository have changed. Related issue: DATACMNS-944
Created a maven project based on Spring Data JPA. Created JdbcRepositoryFactory, which actually creates Repositories. Wired the construction of meta data. Implemented a dummy version of saving an entity to demonstrate availability of meta data. Included a simple test for exercising the JdbcRepositoryFactory and the resulting JdbcRepository. Related pull request: #5.
Creation of the necessary sql statements is now dynamic and driven in a trivial way by the entity. Known issues with the solution that need to get fixed later: - SQL generating code is in the repository and should go somewhere else. - Mapping logic is very trivial and should go in a separate place. Original pull request: #5.
No longer using batch inserts, which won't (easily) hold up for long anyway, since we need to decide between update and insert anyway. If one wants to support batch insert one also has to check if any autoid columns are present, because those seem not to work with batch inserts with many or at least some drivers. Related ticket: SPR-1836. Original pull request: #5.
New instances get saved with an insert statement. Existing instances get updated. Also added some test to find certain corner cases that I feared may cause problems: - ID properties being not editable (no setter and final). - ID properties being primitive. - ID properties not being named "id" and fixed the issues resulting from those. Original pull request: #5.
The repository publishes events before and after inserting, updating and deleting entities, as well as after instantiation of entities. JdbcEvent is the common super class of all events and makes the id and the entity available (if possible). Added issue id comments to tests from previous issues. Original pull request: #5.
Events now use the id as source. The id is encapsulated in a value object to support null ids in BeforeInsert events. Entity references in events are now Optionals. Dropped the redundant Event suffix from events. Added @since 2.0 to classes. Replaced constructors and getters with Lombok annotations. Simplified pom.xml. Extracted interfaces JdbcPersistentEntity and JdbcPersistentProperty. Integration tests use the Spring Test framework. Fixed test for entities with primitive id type. Original pull request: #5.
Different databases are now supported by means of Maven profiles and Spring profiles. There is one profile for each supported database; the default profile is equivalent to hsqldb. There is a Maven profile all-dbs, which runs the integration tests against all databases. The databases need to be started outside the build; the build assumes the default configuration as I found it after `brew install <database>`. For now we support the databases mysql, postgres and hsqldb. In order to make the new databases work, setting of properties and reading generated ids was improved to do some simple conversions. This might be considered a first step towards DATAJDBC-104. The project root contains some basic scripts for starting and stopping databases, as well as for running a build against all supported databases. Integration tests using a database now use Rules instead of JUnit runners, which gives more flexibility when adding fancy stuff to the tests in the form of other Rules. Related issue: DATAJDBC-104. Original pull request: #5.
Introduced interface for JdbcPersistentEntity to be consistent with other store implementations. Introduced JdbcPersistentEntityInformation.getRequiredId(…) to return a value and throw an exception if it can't be obtained. Added serialVersionUIDs to event implementations. Extracted Unset and SpecifiedIdentifier into top level classes to be able to reduce visibility. Introduced Specified interface and factory methods on Identifier to be able to create Identifier instances in all ways needed (specified / optional). Extracted JdbcEvent interface from the previously thus named class to be able to let WithId and WithEntity extend that interface to avoid the cast in the default method of WithId. WithId now simply redeclares getId returning Specified so that the guarantees of returned object type simply adapt as soon as event types also implement that interface. Reduced the visibility of SimpleJdbcEvent as listeners could now just refer to JdbcEvent. Switched to a nullable field for the entity in SimpleJdbcEvent as ApplicationEvent already implies potential serializability and Optional is not Serializable. Reorganized integration test setup to use configuration classes over FactoryBean implementations. Switched to AssertJ for test assertions. Renamed UnableToSetIdException to UnableToSetId. A bit of Javadoc here and there. Formatting. No abbreviated variable names. Original pull request: #5.
The method names in the CrudRepository have changed. Related issue: DATACMNS-944. Original pull request: #5.
JDK 11 build is no longer optional.
Event support.
The repository publishes events before and after inserting, updating and deleting entities, as well as after instantiation of entities.
JdbcEvent is the common super class of all events and makes the id
and the entity available (if possible).
Added issue id comments to tests from previous issues.
Also, includes all other issues done so far.