chore: migrate the build to Maven #322

Closed
lordnelson wants to merge 1 commit into master from lordnelson:maven-build
Conversation

@lordnelson
Contributor

lordnelson commented Jun 13, 2015

With reference to #313, this is a view of how the project would work using Maven as the build tool.
I have implemented a build profile for Java 6/7/8 and JDBC 4/4.1/4.2 respectively.

The change aims to produce a like-for-like build of the artifacts as compared with the existing Ant build. Any improvements to the build process will be submitted as separate PRs.

Still to do:

  • Create the OSGI bundle (I'd like someone familiar with OSGI to test this)
  • Compare the documentation generation (Existing doc target seems to fail with unknown property)
  • Compare the jar pre and post Maven change
  • Test the deployment to OSS (Added the components but don't have the credentials to test)
  • Generate the javadoc
  • Generate the distribution archive
  • Generate the MANIFEST.MF entries
  • Update README.md with build changes

I think this is complete now. Please check out the branch and have a look.

@lordnelson lordnelson force-pushed the lordnelson:maven-build branch from 1cb676c to a068981 Jun 13, 2015

@mkarg


Contributor

mkarg commented Jun 13, 2015

  • As much as I would love to propose a solution for your first question, I cannot, as I simply don't understand the problem you describe.
  • An idea to deal with feature sets could be to actively address them in the design, i.e. to move them into their own "Feature" classes implementing the Strategy design pattern. If a driver supports the feature, it creates an instance of that feature class; if it does not, it loads a "null strategy" instead (see the sketch below). Just my 2C.
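
A minimal sketch of that idea, assuming a hypothetical optional feature; the class names are made up for illustration and are not actual pgjdbc code:

    // Hypothetical "feature as Strategy" sketch, not actual pgjdbc code.
    interface LargeUpdateFeature {
        long executeLargeUpdate(String sql) throws java.sql.SQLException;
    }

    // Strategy used when the driver/spec level supports the feature.
    class SupportedLargeUpdateFeature implements LargeUpdateFeature {
        public long executeLargeUpdate(String sql) throws java.sql.SQLException {
            return 0L; // delegate to the real implementation here
        }
    }

    // "Null strategy" loaded when the feature is not available.
    class UnsupportedLargeUpdateFeature implements LargeUpdateFeature {
        public long executeLargeUpdate(String sql) throws java.sql.SQLException {
            throw new java.sql.SQLFeatureNotSupportedException("executeLargeUpdate");
        }
    }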
@davecramer


Member

davecramer commented Jun 14, 2015

@vlsi Can you expand on the pluginManagement comment?

@vlsi


Member

vlsi commented Jun 14, 2015

Here's the reference: http://maven.apache.org/pom.html#Plugin_Management

  1. You define plugin versions and probably some common configuration in <pluginManagement>.
  2. When you add a <plugin> to the <build> configuration, you don't specify the version.

This helps for multi-module configurations so the version numbers are not duplicated everywhere.

@@ -29,7 +29,7 @@
* @author Aaron Mulder (ammulder@chariotsolutions.com)
*/
public class PGConnectionPoolDataSource
- extends @CONN_POOL_DS_CLASS@
+ extends ${CONN_POOL_DS_CLASS}


@lordnelson

lordnelson Jun 14, 2015

Contributor

@mkarg It's easier to explain with a code sample :-)

Here is an example of the pre-processing that currently occurs in the build. In this case the class it extends can be either src/main/java/org/postgresql/ds/jdbc23/AbstractJdbc23PoolingDataSource.java or src/main/java/org/postgresql/ds/jdbc4/AbstractJdbc4PoolingDataSource.java. If the build is for JDBC4 or above the latter is chosen.

If we support JDBC4 as a minimum I believe many of these pre-processing steps can be removed. This would leave the Driver class which I'll annotate separately.


@vlsi

vlsi Jun 14, 2015

Member

JDBC4 is java 6, isn't it? I think we can safely make that a minimum. I think even java 7 as a minimum is good enough.


@lordnelson

lordnelson Jun 14, 2015

Contributor

@vlsi Looking further into the template methods in Driver.java, I think JDBC4 is already the minimum, so that change could be made to simplify that part. Maybe it would be better to make these changes in separate PRs once this is merged? That way this change purely moves the build from Ant to Maven, as the change of source folders has already generated many changed files.


@mkarg

mkarg Jun 14, 2015

Contributor

For me it would be OK to even make JRE 8 + JDBC 4.2 the minimum in the near future, as we're talking solely about the master branch, Oracle is no longer supporting Java 1.7 anyway, and even reluctant projects like Debian and IBM already have support for Java 8. If there are people out there not able to migrate to Java 8 more than one year after its first release, they simply have to stick with PGJDBC as-is, possibly plus bug fixes managed in a separate branch. I really do not see that mass of people who so badly want new features in PGJDBC but have to stick with Java 6 or 7 at the same time.


@mkarg

mkarg Jun 14, 2015

Contributor

@lordnelson +1 for doing it in separate PRs


@vlsi

vlsi Jun 14, 2015

Member

Oracle is no longer supporting Java 1.7 anyway

You know, enterprise support is still available.


@mkarg

mkarg Jun 14, 2015

Contributor

@vlsi Mum's the word! ;-)

@@ -411,7 +411,7 @@ public Connection getResult(long timeout) throws SQLException {
* @throws SQLException if the connection could not be made
*/
private static Connection makeConnection(String url, Properties props) throws SQLException {
- return new @JDBCCONNECTCLASS@(hostSpecs(props),
+ return new ${JDBCCONNECTCLASS}(hostSpecs(props),


@lordnelson

lordnelson Jun 14, 2015

Contributor

@mkarg This method returns a different instance depending on the version of JDBC we are building for. It can be either:

  • Jdbc42Connection
  • Jdbc4Connection

We need a performant way of achieving the same result. We could use a properties file embedded in the jar, containing the major and minor versions as well as the spec version, and use that to instantiate the correct class.
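
A rough sketch of that properties-file idea; the file name, property key and constructor signature here are assumptions made purely for illustration, not what the PR implements:

    import java.io.InputStream;
    import java.sql.Connection;
    import java.sql.SQLException;
    import java.util.Properties;

    final class ConnectionFactory {

        // The build writes the concrete class name into a properties file inside
        // the jar, e.g. connectClass=org.postgresql.jdbc42.Jdbc42Connection, and
        // the driver instantiates that class reflectively at run time.
        static Connection makeConnection(String url, Properties props) throws SQLException {
            try {
                Properties build = new Properties();
                InputStream in = ConnectionFactory.class
                        .getResourceAsStream("/driver-build.properties");
                try {
                    build.load(in);
                } finally {
                    in.close();
                }
                Class<? extends Connection> impl = Class
                        .forName(build.getProperty("connectClass"))
                        .asSubclass(Connection.class);
                // The real connection classes take more arguments (host specs, user,
                // database, ...); a (String, Properties) constructor is assumed here.
                return impl.getConstructor(String.class, Properties.class)
                        .newInstance(url, props);
            } catch (Exception e) {
                throw new SQLException("Could not create connection", e);
            }
        }

        private ConnectionFactory() {
        }
    }

Caching the looked-up constructor after the first call would keep the reflective overhead to a one-time cost.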


@mkarg

mkarg Jun 14, 2015

Contributor

Yes, it's easier to understand having a piece of code to look at. :-)

I think we should separate discussions here:

  • How to do preprocessing in Maven?
  • Is preprocessing a good thing at all?

The target of the current PR is Mavenization. Hence, the question is "How to do preprocessing in Maven?". That's pretty simple: Maven can filter source code automatically before compilation, so the solution as-is will work once Maven is correctly configured. What you need is called the "Templating Maven Plugin", which is currently in alpha, and its web site is moving from Codehaus to Mojohaus at the moment (bad, so you cannot do it NOW). A different approach is to apply a workaround using a combination of the "Maven Resources Plugin" (which can filter pretty much anything as long as it is text) and setting the source path to the output of that plugin. We use that approach a lot to patch version numbers and build dates into the source code, and it works pretty well. The drawback is that at least the IDEs won't understand any of these solutions and won't be able to support them. So our conclusion is that we do not like preprocessing, as it makes things complex.

The second question is whether it is a good idea to patch sources at all. I do not think this should be part of the same PR, but we should first open an issue for that and discuss in a larger audience. My personal opinion is that I would keep one single source code for one single platform. It makes development pretty easy and new people can jump right in easily. I'd call that "one code for one platform". If really multiple platforms or even a matrix shall be supported (and it looks like the project leads want that), I'd go with git branches. A different solution would be using the Strategy pattern, hence solving the complexity at runtime by simply loading different implementations.

In fact, the problem is less the different JDBC levels. We could simply always support the latest level, as the JVM will not complain about unknown classes until we really instantiate them, which is not the case if the code is kept JDK 6 compliant (you can offer an interface that uses JDK 8 without the need to load JDK 8 classes until that class is really used). Maybe this is the way you would like to go?


@mkarg

mkarg Jun 14, 2015

Contributor

I want to throw in another idea. Why not have different Driver classes? That would be dead simple, and we could then just replace the service registration in the final JARs to have the right class loaded.
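
For illustration, a version-specific driver class could look roughly like this; the jdbc4 package and the idea of extending the existing class are assumptions, not the project's actual layout. The only per-jar difference would then be which class name the META-INF/services/java.sql.Driver entry points at:

    package org.postgresql.jdbc4;

    import java.sql.DriverManager;
    import java.sql.SQLException;

    // Hypothetical spec-specific entry point. The build would list this class in
    // META-INF/services/java.sql.Driver so DriverManager discovers it automatically.
    public class Driver extends org.postgresql.Driver {
        static {
            try {
                DriverManager.registerDriver(new Driver());
            } catch (SQLException e) {
                throw new ExceptionInInitializerError(e);
            }
        }
    }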


@lordnelson

lordnelson Jun 14, 2015

Contributor

@mkarg Thanks for the pointer. I had already found the same plugin and implemented it to follow the Ant build as closely as I can. As you mention, it does mean that in an IDE the code appears to be broken. From my experience, pre-processing is not common in Java projects, so my position would be to look for alternative methods to achieve the same goal.

Thinking about your comment regarding patching the sources, I think you are right and this PR should just switch the build and try to mimic the Ant build where possible. Optimisations to the process can be added with further PRs if this one is merged.

If the project adopts a branch per platform, won't that make the backporting of fixes difficult, particularly if, say, a Java 8 branch has pushed ahead with Java 8-specific features? I know the Spring framework requires Java 8 to build, but it will still run on Java 6. I am not familiar with the source code, but would it be feasible to use some of the same patterns to achieve something similar? This will need some extensive mailing list discussion with the main committers first, so I might continue the discussion there.

The use of a single driver class is, I think, again a legacy from when driver classes were loaded via Class.forName("org.postgresql.Driver"). This kept the name of the Driver class constant whether you were using the JDBC2/3/4 version of the driver. I'm not sure if people are still using this method to load the driver, so it might need to be retained. Again, this will need to be discussed on the list to decide the best approach.
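
The legacy pattern in question, for reference; the connection details are placeholders:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class LegacyDriverLoading {
        public static void main(String[] args) throws Exception {
            // Pre-JDBC4 applications load the driver explicitly by name, so the
            // class name "org.postgresql.Driver" is effectively public API and is
            // baked into countless configuration files.
            Class.forName("org.postgresql.Driver");

            // Since JDBC4 (Java 6) the driver is discovered automatically via
            // META-INF/services/java.sql.Driver, making the Class.forName call
            // redundant. Removing or renaming the class would still break the
            // older usage above, though.
            Connection con = DriverManager.getConnection(
                    "jdbc:postgresql://localhost/test", "user", "password");
            con.close();
        }
    }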


@mkarg

mkarg Jun 14, 2015

Contributor

+1 for discussing this in separate threads.

Java 8 and fixes: shouldn't be a problem if a fix is developed in a branch and that branch is then merged or cherry-picked into the others. We do that a lot, and in 90% of all cases it works well thanks to git. The scenario where a fix is needed in a place that works differently in Java 6 vs Java 8 means duplicate work, right. But how often does that really happen? And does that small adaptation really outweigh the stress we have with the current pre-processing solution?

Single Driver: I know SQL Anywhere pretty well, and there the driver name changed more often than just with each JDBC level... They even changed the package names frequently. Nobody complained about that. In fact, I would even love to have an explicit name like "org.postgresql.jdbc3.Driver" vs "org.postgresql.jdbc4.Driver", as then it is really pretty obvious what features I can expect to get.


@mkarg

mkarg Jun 14, 2015

Contributor

@lordnelson You're right in everything you say, but I would love to get some numbers on how many people will really be angry with us for adding new driver names in a major release of the driver. I mean, all the existing docs saying "org.postgresql.Driver" in the end talk about already EXISTING versions, while we talk about a FUTURE version. I think we overestimate the negative effect of such a change and should not be reluctant just out of fear.


@vlsi

vlsi Jun 14, 2015

Member

@mkarg , can you please raise an issue with the change you have in mind?

I think we'd better move to a proper discussion thread.

PS. I have no problem with adding version-specific class names provided universal org.postgresql.Driver is still working.


@lordnelson

lordnelson Jun 14, 2015

Contributor

@vlsi I'm drafting a mailing list message right now :-)


@davecramer

davecramer Jun 14, 2015

Member

I should have published the ground rules earlier.

  1. The priority is not to build to the latest Java version. The priority is
    to make sure the largest reasonable population has the latest drivers for any
    version of PostgreSQL currently supported. What this means practically is
    that we have to have jars for 1.6, 1.7, 1.8, etc.

  2. We cannot change the code other than re-arrange the source. No surprises
    for anyone.

If this can't be accomplished with Maven then IMO this is a non-starter.
Building with ant is not that much different.

git checkout ...
ant jar

Dave Cramer



@mkarg

mkarg Jun 14, 2015

Contributor

@davecramer Sorry for the confusion -- the discussion has nothing to do with Maven actually.

@lordnelson


Contributor

lordnelson commented Jun 14, 2015

Thanks for the comments. The pom.xml is not a final version, more a quick implementation to build the project. It needs the changes you've highlighted before I believe it is suitable for inclusion in the project.

I'd like to get the build process logical, repeatable and simple to run first by addressing some of the legacy elements of the build.

@mkarg I'm not sure what CoC is? It's probably why I haven't followed it! :-)

@mkarg


Contributor

mkarg commented Jun 14, 2015

@lordnelson CoC -> https://en.wikipedia.org/wiki/Convention_over_configuration

The idea is to only write things down explicitly if you really MUST deviate from a default. This implies that the default is chosen according to the most common conventions.

Example: Don't write <packaging>jar</packaging>, as this is the default. That default is chosen by the Maven vendor because, in the majority of cases, what Maven users want to produce are JARs.

So using the convention (getting a JAR) is preferred over the configuration (explicitly specifying <packaging> in pom.xml).

A more comprehensive approach to following CoC would be to modify the project's build process and/or software design so deeply that we can live with ALL defaults of Maven, hence reducing the pom.xml to the smallest possible extent. This is what I'd go for, but it implies the "one source for one platform" pattern, which seems to be unwanted by the project leads.

@lordnelson


Contributor

lordnelson commented Jun 14, 2015

@mkarg Thanks for confirming! Indeed I agree with you and will rely on the convention where possible, if only to save myself some typing ;-)

@lordnelson lordnelson changed the title from Migrated the pgjdbc build to Maven. to feat: migrate the build to Maven Jun 14, 2015

@lordnelson lordnelson changed the title from feat: migrate the build to Maven to chore: migrate the build to Maven Jun 14, 2015

@lordnelson lordnelson force-pushed the lordnelson:maven-build branch from cf42cf3 to cdc0383 Jun 15, 2015

@mkarg


Contributor

mkarg commented Jun 20, 2015

@davecramer As the mailing list discussion seems to have come to the conclusion that we will mavenize and that it will not include any other changes, when can we expect this to be merged to master?

@vlsi


Member

vlsi commented Jun 20, 2015

I've gone through the patch. Here are my comments.

Build & test work for Java 1.8.
I did not verify the site.

  1. Not sure if coordinates are good
  2. MANIFEST.MF does not contain OSGi metadata
  3. MANIFEST.MF does not contain title/version
  4. build.properties & build.local.properties do not work. It is probably not strictly required, but it would be nice to have.
  5. build.xml should be deleted
  6. README.md needs to be updated
  7. mvn deploy to be verified (upload to maven central). It could probably be the aim of a subsequent PR.
@lordnelson


Contributor

lordnelson commented Jun 20, 2015

There are a few more changes to do before it's ready to merge. I've been a bit too busy to complete them this week. If you want to create a PR against my branch, I'll accept it so this can be finished quicker.

Briefly in answer to your points:

  1. The coordinates are already published on OSS so need to match. We already changed once a while back so it wouldn't be wise to do it again.
  2. Agree
  3. Agree
  4. Build properties are within pom.xml so additional files not really needed now.
  5. Agree
  6. Agree
  7. It needs to be confirmed as working but @davecramer already has a process for existing build.

I wanted to ensure contents of jar files are exactly the same too. Also should we use the maven release plugin?

@davecramer


Member

davecramer commented Jun 20, 2015

Yes, IMO changing the coords is a non-starter; this would affect many automated builds and tools. As Stephen pointed out, we already did it once (somewhat at our peril).

I am thinking of pushing a branch we can use, or we can just use Stephen's branch for now.

PS. Just got back from pgcon... the lack of sleep and hangover(s) is abating.

Dave Cramer


@vlsi


Member

vlsi commented Jun 20, 2015

Yes, IMO changing the coords is a non-starter; this would affect many automated builds.

I completely forgot that pgjdbc already has Maven coordinates. Shame on me.

Build properties are within pom.xml so additional files not really needed now.

In fact, the name of the database I use for testing does not match the test/test/test pattern. It was somewhat useful for me to have my local values in build.local.properties, so I do not have to provide the exact values each time.

It needs to be confirmed as working but @davecramer already has a process for existing build.

Sure. Committers should pick a way of publishing that will work for them the best.

@lordnelson


Contributor

lordnelson commented Jun 22, 2015

I've made a number of changes to add in the missing features:

  • OSGI
  • build.properties
  • Distribution generation in zip, tar.gz and tar.bz2
  • Javadoc generation
  • Modified readme

I'm not sure whether the correct GitHub etiquette is to rebase my changes and thereby force-push them, or to just add the commits. Ideally, if this gets merged in, I'd like it to be a single commit rather than my many commits.

@lordnelson lordnelson force-pushed the lordnelson:maven-build branch 5 times, most recently from d224155 to ab0642e Jun 22, 2015

@mkarg


Contributor

mkarg commented Jun 29, 2015

@lordnelson Do I see it right that you now think this PR is ready to merge, or are you still waiting for feedback?

@lordnelson


Contributor

lordnelson commented Jun 29, 2015

I think it's nearly there. I'd like someone more experienced with OSGI to test it works as before.

@vlsi


Member

vlsi commented Jun 29, 2015

If you don't object, I'll have a look shortly (let the timeout be 16 hours).
I can't validate OSGi, however I'll check mvn/etc.

@vlsi


Member

vlsi commented Jun 29, 2015

I'm not sure if the correct github etiquette is to rebase my changes and thereby force pushing them, or just add the commits?

I typically just force-push. Previous commits are still available and diffable, so I think not much harm is done by rebase & force-push.

@lordnelson


Contributor

lordnelson commented Jun 30, 2015

@vlsi Yes please. The more reviewers the better.

@vlsi


Member

vlsi commented Jun 30, 2015

Questionable

  1. The patch no longer applies since 5ec7dea introduced the new top-level class JdbcCallParseInfo.
    Could you please rebase the PR against current master?
    Once I move the org/... folder to src/..., it starts to compile for me.
  2. What is the purpose of the "zip, tar.gz and tar.bz2" distributions?
    Each seems to contain "jar + sources + pom.xml + dependencies". What is the purpose of that?
  3. Manifest contains some questionable attributes:
POOLED_CONN_CLASS: org.postgresql.ds.jdbc4.AbstractJdbc4PooledConnection
NOTIMPLEMENTEDEXCEPTION: java.sql.SQLFeatureNotSupportedException
CONN_POOL_DS_CLASS: org.postgresql.ds.jdbc4.AbstractJdbc4ConnectionPoolDataSource
DEF_PGPORT: 5432
MAKE_SSL_CLASS: org.postgresql.ssl.jdbc4.AbstractJdbc4MakeSSL
JDBCCONNECTCLASS: org.postgresql.jdbc42.Jdbc42Connection
POOLING_DS_CLASS: org.postgresql.ds.jdbc4.AbstractJdbc4PoolingDataSource
SIMPLE_DS_CLASS: org.postgresql.ds.jdbc4.AbstractJdbc4SimpleDataSource

Should (can) we remove that?

Tolerable

  1. [ERROR] MavenReportException: Error while generating Javadoc:. I think we should address javadoc/etc errors/warnings in a separate PR.

Good

  1. build.local.properties works.
  2. mvn install works, mvn install -DskipTests works, benchmarks work
@lordnelson


Contributor

lordnelson commented Jun 30, 2015

@vlsi Thanks for the comments - very helpful. Replies as follows:

Questionable

  1. I have been re-basing changes as they've merged. I've just rebased this one and will push the change shortly.
  2. This is existing behaviour in the build and used to generate the source distribution available on: https://jdbc.postgresql.org/download.html. However it only needs to provide .tar.gz so I'll drop the others for now.
  3. I need to remove those properties. Maven has helpfully added them.

Tolerable

  1. I believe the Maven build is surfacing those errors - they already exist in the source.

Nearly there :-)

@vlsi


Member

vlsi commented Nov 5, 2015

I wonder: what if we take a pre-processor approach to "multi-jdbc-version" support?
That would solve the current pain points of multiple jdbcXX packages and string replacement in Driver.

Preprocessing (e.g. conditional disabling of new methods) seems much easier from an implementation point of view: there would be just a single solid class hierarchy: Statement -> PreparedStatement -> CallableStatement

@davecramer


Member

davecramer commented Nov 5, 2015

I'm not convinced we actually have to disable new methods.

Has anyone tried compiling with 1.8 and targeting 1.6, for instance?

Dave Cramer


@vlsi


Member

vlsi commented Nov 5, 2015

Has anyone tried compiling with 1.8 and targeting 1.6, for instance?

Dave, Java 6 does NOT have java.time.Instant. Do you really need to try it to understand whether "target=1.6" is a feasible option?

@davecramer


Member

davecramer commented Nov 5, 2015

Guess that answers that...



@vlsi


Member

vlsi commented Nov 5, 2015

Do you really need to try it to understand whether "target=1.6" is a feasible option?

I mean "pre-processor + target=1.6" might be a (poor man's) option; however, a pre-processor is still required to filter out unsupported classes.

Compiling with the proper JDK would ensure pre-processing is done right, as in "all the classes not supported by the current JRE are trimmed out".

@davecramer


Member

davecramer commented Nov 5, 2015

Stubborn as I am, I actually tried it, and javac won't allow any other target besides 1.8.

Dave Cramer


@vlsi


Member

vlsi commented Nov 5, 2015

source=1.6 target=1.6 should be just fine.
So what is your take on refactoring jdbcXY class hierarchy?

@davecramer


Member

davecramer commented Nov 5, 2015

If it accomplishes the goal of only requiring one JDK to build then I would be inclined to agree.

However, I'd be less than inclined if it makes readability even worse than it is now.

Dave Cramer


@lordnelson


Contributor

lordnelson commented Nov 5, 2015

Yeah, I would like to get this completed. How should I proceed based on my earlier issue with building and packaging for multiple spec versions? Shall I go for the multi-module build where we have a different artifact for each spec version?

@davecramer


Member

davecramer commented Nov 5, 2015

That might actually work out. I'm sure there are some issues we will find, but as long as we retain the common code somewhere in a "base" module, it certainly isn't worse than what we have now.

How do you plan on dealing with classes that build upon their immediate predecessor, such as jdbc4.resultset extends jdbc3.resultset?

Dave Cramer


@vlsi


Member

vlsi commented Nov 6, 2015

Shall I go for the multi-module build

@lordnelson , I'm more inclined to try https://github.com/raydac/java-comment-preprocessor to build different specs out of a single "AbstractStatement.java".
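
Roughly how that could look; the //#if and //#endif directives are java-comment-preprocessor syntax, while the jdbcspec variable name and the method chosen here are assumptions for illustration only (the variable would have to be defined in the plugin configuration):

    // Sketch of a single abstract statement base class serving several spec levels.
    // Lines between //#if and //#endif are dropped from the generated source when
    // the condition is false, so the lower-spec build never references Java 8 API.
    public abstract class AbstractStatement {

        public int executeUpdate(String sql) throws java.sql.SQLException {
            return 0; // common implementation, identical for every spec level
        }

        //#if jdbcspec >= 42
        // java.sql.SQLType exists only from Java 8 on, so this JDBC 4.2 method has
        // to be stripped from the expanded source before a Java 6/7 compile.
        public void setObject(int parameterIndex, Object x, java.sql.SQLType targetSqlType)
                throws java.sql.SQLException {
            throw new java.sql.SQLFeatureNotSupportedException("setObject");
        }
        //#endif
    }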

@lordnelson


Contributor

lordnelson commented Nov 6, 2015

Thanks @vlsi, I'll take a look. The issue I ran up against before is that we are technically publishing a Maven artifact per JDBC spec version, i.e. the source is different, the jar is different and the javadoc is different.

As Maven only wants to publish a single artifact per pom, the problem becomes apparent. We got around it with the Ant build by naming our artifact differently at build time depending on the JDBC spec version we were building.

So I'm thinking we would need a multi-module build if we use Maven, but as @davecramer mentioned, the classes in the later spec versions depend on the earlier ones, so we'd need to solve that in a way that allows us to ship a single jar - maybe by having a later JDBC spec version of the driver build-depend on the earlier one and then re-packing the deps?

@mkarg


Contributor

mkarg commented Nov 6, 2015

It is not true that Maven only wants to publish a single artifact per pom. You can add any number of additional artifacts using maven-assembly-plugin.

-Markus


@lordnelson


Contributor

lordnelson commented Nov 6, 2015

But to publish to Sonatype there is a single source jar and a single javadoc jar to accompany the compiled driver jar. So if we want the sources and javadoc to actually match the compiled code we are limited to a single artifact per pom.

@mkarg


Contributor

mkarg commented Nov 7, 2015

Understood. But what actually is the problem with having submodules? I mean, I deal daily with projects having several tens of submodules, so what actually holds us back from doing that?


@davecramer


Member

davecramer commented Nov 7, 2015

Having not seen how preprocessing would work in practice, my druthers would be modules, if they can be made to work, the reason being that I don't think IDEs would be able to follow preprocessing.

Dave Cramer


@vlsi


Member

vlsi commented Nov 8, 2015

The reason being that I don't think IDE's would be able to follow preprocessing.

My idea is simple: make sure the code in the repository is "JDK-latest-compatible" (i.e. jdk8) and use pre-processing only for lower Java versions.
In other words, if you invoke mvn ... under java6/java7, it would remove the original java8 .java file from the compile path and pre-process it.

Well, an alternative way is to have multiple pom.xml files just for the purpose of "deploying to nexus".

@vlsi


Member

vlsi commented Nov 8, 2015

Understood. But what actually is the problem with having submodules?

@mkarg , submodules are perfectly fine. Submodule per jdbc spec version -- not fine. Do you agree?

I think it's better to have a single AbstractStatement class rather than AbstractJdbc4Statement extends AbstractJdbc3gStatement extends AbstractJdbc3Statement extends AbstractJdbc2Statement

@mkarg


Contributor

mkarg commented Nov 8, 2015

I also do not like this hierarchy, but you maybe already know my opinion about that from earlier discussions: I would use branches since I would limit master to be "latest JDK + latest PostgreSQL" only, so no new features are found in older combinations, as those are only MAINTAINED but not DEVELOPED anymore -- but this unfortunately is not wanted.


@davecramer


Member

davecramer commented Nov 8, 2015


I don't think this meets the mandate of the project, which is to provide the latest JDBC for Postgres for all Java versions from 1.6 to current, until we deem 1.6 not worth supporting.


@mkarg


Contributor

mkarg commented Nov 8, 2015

Just to understand correctly: what exactly do you mean by "the latest of JDBC for all Java versions"? It simply is impossible to provide the latest JDBC on Java 6, as the Instant class does not exist prior to Java 8 but is mandatory in JDBC 4.2. Hence the mandate is impossible to fulfil. I would rather interpret the mandate as "the latest JDBC supported on each Java version", which means JDBC 4.2 is simply not supported on Java 6 and 7; hence, new features only for JDBC 4.2 + Java 8.


@vlsi vlsi referenced this pull request Nov 20, 2015

Closed

update_translation.sh fix #430

chore: migrate the build to Maven
Use Maven as the build tool to reduce complexity of the build process. Separate source and test into
separate directories according to Maven conventions.

@lordnelson lordnelson force-pushed the lordnelson:maven-build branch from d0b9998 to 0afd6d8 Nov 26, 2015

vlsi added a commit to Gordiychuk/pgjdbc that referenced this pull request Dec 3, 2015

chore: migrate build to Maven
The change is heavily based on pgjdbc#322 by @lordnelson. This PR is a multi-module build, so I squashed the "pgjdbc#322 mavenization" and "modularization" commits to make the git history cleaner in terms of understanding where each file went.

vlsi added a commit to Gordiychuk/pgjdbc that referenced this pull request Dec 3, 2015

chore: migrate build to Maven
The change is heavily based on pgjdbc#322 by Stephen Nelson (@lordnelson). This PR is a multi-module build, so I squashed the "pgjdbc#322 mavenization" and "modularization" commits to make the git history cleaner in terms of understanding where each file went.

@vlsi vlsi closed this in f470f05 Dec 20, 2015

@lordnelson lordnelson deleted the lordnelson:maven-build branch Jul 1, 2018
