Ensure Kerberos token is valid in SparkSubmitOperator before running yarn kill
#9044
Conversation
I think this should be
```python
kinit_cmd = "kinit -kt {} {}" \
    .format(self._keytab, self._principal).split()
kinit = subprocess.Popen(kinit_cmd,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE)
```

replaced with:

```python
from airflow.security.kerberos import renew_from_kt
renew_from_kt(self._principal, self._keytab)
```
(But I'm not familiar with Kerberos, so I could be wrong)
Cool, that would renew the Kerberos token and store it at /tmp/airflow_krb5_ccache (configured in airflow.cfg), but yarn expects it at the default token location, /tmp/krb5cc_<uid>. It would work if the env variable KRB5CCNAME is set to /tmp/<airflow.cfg.kerberos.ccache> in the subprocess used for yarn kill. Will update it.
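That idea can be sketched as follows (a minimal illustration, not the merged code; `build_yarn_kill_env` is a hypothetical helper, and the ccache path is the one mentioned in this thread): copy the environment and point KRB5CCNAME at Airflow's ticket cache so yarn finds the renewed ticket instead of the default /tmp/krb5cc_<uid>.

```python
import os

def build_yarn_kill_env(ccache_path, base_env=None):
    """Return an environment for the yarn-kill subprocess with KRB5CCNAME
    pointing at Airflow's ticket cache.

    ccache_path would be the value of the airflow.cfg kerberos ccache
    option (e.g. /tmp/airflow_krb5_ccache); the helper name is
    illustrative only.
    """
    env = dict(os.environ if base_env is None else base_env)
    env["KRB5CCNAME"] = ccache_path
    return env

# The kill subprocess would then be started with this env, e.g.:
# subprocess.Popen(["yarn", "application", "-kill", app_id],
#                  env=build_yarn_kill_env("/tmp/airflow_krb5_ccache"))
```

Passing `env=` to `subprocess.Popen` affects only the child process, so the rest of the worker keeps its normal credential cache.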
@ashb - tried using renew_from_kt and it seems to do the following:

```
kinit -r 3600m -k -t my.keytab -c /tmp/airflow_krb5_ccache principal
```

and then does a workaround to renew the ticket:

```
kinit -c /tmp/airflow_krb5_ccache -R
```

The latter seems to fail for non-renewable tickets and does a sys.exit; any suggestions on how this can be handled? I think the renewal workaround won't be needed for a yarn kill.
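One way to handle that (a sketch only, not the merged change; `safe_kinit` and the injected `renew_fn` are hypothetical names) is to catch the SystemExit so a failed renewal never aborts the kill path:

```python
def safe_kinit(renew_fn, principal, keytab, log=print):
    """Attempt a kinit via renew_from_kt (passed in as renew_fn) without
    letting a failure abort the caller.

    renew_from_kt calls sys.exit() when the renewal step fails (e.g. for
    non-renewable tickets). For a yarn kill a failed kinit need not be
    fatal, since an already-valid ticket may still exist in the cache.
    """
    try:
        renew_fn(principal, keytab)
        return True
    except SystemExit as exc:
        log("kinit failed (exit code %s); attempting yarn kill anyway" % exc.code)
        return False
```

Injecting the renew function also makes this path easy to exercise in unit tests with a stub.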
This change looks good, but can you add a comment to the renew_from_kt call in SparkSubmitOperator to say why we aren't treating the errors as fatal?
Done, can you have another look please?
ashb
left a comment
Use existing renew function please
@ashb - squashed and rebased this, can you have another look please
@ashb thank you for all the help with my first PR, have made the suggested changes, can you have a look please :)
ashb
left a comment
One minor change, and please rebase on to latest master (see https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#id8 if you aren't sure how to do this) to fix the test failures.
Done, rebased on top of master. Found the answer in the same doc and squashed the commits; can you have a look please?
ashb
left a comment
LGTM. Do you think the title edit I just made is accurate?
ashb
left a comment
Please add unit tests covering this behaviour (you can use mocking to have renew_from_kt be a no-op, but please exercise this code path in tests)
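A sketch of what such a test could look like (the `FakeSparkSubmitHook` stand-in below is hypothetical; a real test would instead `mock.patch` renew_from_kt at the module where the hook imports it): stub the renewal out as a no-op Mock, trigger the kill path, and assert the kinit was attempted with the configured principal and keytab.

```python
from unittest import mock

class FakeSparkSubmitHook:
    """Hypothetical stand-in for the hook's on_kill path."""

    def __init__(self, keytab, principal, renew_fn):
        self._keytab = keytab
        self._principal = principal
        self._renew_fn = renew_fn  # renew_from_kt in the real hook
        self.yarn_killed = False

    def on_kill(self):
        # kinit first so the yarn kill below can authenticate
        if self._keytab and self._principal:
            self._renew_fn(self._principal, self._keytab)
        self.yarn_killed = True  # stands in for `yarn application -kill`

def test_on_kill_renews_kerberos_ticket():
    renew_mock = mock.Mock()  # no-op replacement for renew_from_kt
    hook = FakeSparkSubmitHook("my.keytab", "me@REALM", renew_mock)
    hook.on_kill()
    renew_mock.assert_called_once_with("me@REALM", "my.keytab")
    assert hook.yarn_killed
```

The same shape also covers the no-keytab case: with keytab and principal unset, the mock should never be called but the kill should still happen.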
Current tests need updating too:

sure, will try adding tests and fixing this 👍
hi @ashb, have added unit tests to cover the new path, can you have a look please
ashb
left a comment
Looks good
Could you rebase to latest master, hopefully that should fix the failing Kube tests

done, but failed again, is there a way to re-run them?
@ashb passed after another rebase
When the spark submit hook's `on_kill` is called, it tries to kill the YARN app, but it doesn't do a kinit even if the keytab and principal are supplied. This means that `yarn application -kill <app id>` will fail when YARN requires Kerberos authentication, and the app will continue to run. This change does an explicit kinit before killing the YARN app (if keytab and principal are provided).

Make sure to mark the boxes below before creating PR: [x]
In case of fundamental code change, Airflow Improvement Proposal (AIP) is needed.
In case of a new dependency, check compliance with the ASF 3rd Party License Policy.
In case of backwards incompatible changes please leave a note in UPDATING.md.
Read the Pull Request Guidelines for more information.