SPARK-1094 Support MiMa for reporting binary compatibility across versions. #20
Conversation
Rather than getting these from a jar file, I'd just look at the current classpath and search for org.apache.spark classes.
You can follow the code here:
http://dzone.com/snippets/get-all-classes-within-package
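A minimal sketch of what that classpath scan might look like, assuming the classes live in a directory-based classpath entry (all names here are illustrative, not from the patch):

```scala
import java.io.File

// Hypothetical helper: walk a classpath directory and collect the
// fully-qualified names of all .class files under a package prefix,
// e.g. "org.apache.spark".
def classesInPackage(classpathRoot: File, pkg: String): Seq[String] = {
  def walk(dir: File, prefix: String): Seq[String] =
    Option(dir.listFiles).toSeq.flatten.sortBy(_.getName).flatMap { f =>
      if (f.isDirectory) walk(f, prefix + f.getName + ".")
      else if (f.getName.endsWith(".class"))
        Seq(prefix + f.getName.stripSuffix(".class"))
      else Seq.empty
    }
  walk(new File(classpathRoot, pkg.replace('.', '/')), pkg + ".")
}
```

Jar entries on the classpath would need a separate ZipFile walk; the directory case above covers the compiled `target/classes` layout.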
Hey @ScrapCodes, this is looking good. I think ideally the way this should work is the following. After we run
project/MimaBuild.scala
I'd add logic here that reads from .mima-excludes and ignores the classes listed there.
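A hedged sketch of the suggested logic, assuming a simple one-class-name-per-line format for .mima-excludes (the format and helper name here are guesses, not from the patch):

```scala
import java.io.File
import scala.io.Source

// Hypothetical helper: read .mima-excludes, one fully-qualified class
// name per line; blank lines and '#' comment lines are skipped. The
// resulting set can be subtracted from the classes handed to MiMa.
def loadExcludes(file: File): Set[String] =
  if (!file.exists) Set.empty
  else {
    val src = Source.fromFile(file)
    try {
      src.getLines().map(_.trim)
        .filter(l => l.nonEmpty && !l.startsWith("#"))
        .toSet
    } finally {
      src.close()
    }
  }
```

The classes found on the classpath could then be filtered with something like `classes.filterNot(loadExcludes(excludesFile))` before the MiMa check runs.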
I am not very sure, but won't this automatic creation of excludes defeat the purpose of having MiMa at all? What would happen if a class was public in version 1 and became private[x] in 1.2? Then we would simply ignore it, but it is a violation of binary compatibility.
One or more automated tests failed
@ScrapCodes yes, you are correct. We'll miss those cases, but without proper MiMa support I don't think we can fix that. Do you see an alternative solution that would catch those types of errors?
I think maybe we should not include this file in the repo (e.g. put it in .gitignore) and just generate it automatically inside of ./dev/run-test. Otherwise we'll have to update it all the time.
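One way the regeneration step could look, as a rough sketch (the helper name and header comment are invented for illustration; the set of excluded classes would be computed elsewhere, e.g. from the classpath scan above):

```scala
import java.io.{File, PrintWriter}

// Hypothetical helper: rewrite the gitignored .mima-excludes file on
// each test run from the current list of classes to skip, so the file
// never needs to be maintained by hand in the repo.
def writeExcludes(excluded: Seq[String], out: File): Unit = {
  val pw = new PrintWriter(out)
  try {
    pw.println("# Auto-generated; do not edit (file is gitignored).")
    excluded.sorted.foreach(pw.println)
  } finally {
    pw.close()
  }
}
```

Sorting the names keeps the generated file stable across runs, which makes diffs in CI logs easier to read.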
@ScrapCodes by the way - I don't think this forfeits all of the functionality. If someone goes and makes a class
Hey @ScrapCodes FYI I think I'm gonna take a crack at this tonight using this patch as a starting point. I found a few other issues when doing more review and it's fairly complicated, so I may just submit a second patch. Details forthcoming... :)
@pwendell @ScrapCodes I think this broke the build for maven yarn - see https://spark-project.atlassian.net/browse/SPARK-1315
PR 585 from the incubator repo.