[Spark][1.0] Fix a data loss bug in MergeIntoCommand #2176
Workflow file for this run

name: "Delta Lake Tests"
on: [push, pull_request]
jobs:
build:
name: "Prepare Env"
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
- name: install java
uses: actions/setup-java@v2
with:
distribution: 'zulu'
java-version: '8'
- name: Cache Scala, SBT
uses: actions/cache@v2
with:
path: |
~/.sbt
~/.ivy2
~/.cache/coursier
key: delta-sbt-cache
- name: Install Job dependencies
shell: bash -l {0}
run: |
sudo apt-get update
sudo apt-get install -y make build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev libsqlite3-dev wget curl llvm libncurses5-dev libncursesw5-dev xz-utils tk-dev libffi-dev liblzma-dev python-openssl git
sudo apt install libedit-dev
sudo apt install python3-pip --fix-missing
sudo pip3 install pipenv
curl https://pyenv.run | bash
export PATH="~/.pyenv/bin:$PATH"
eval "$(pyenv init -)"
eval "$(pyenv virtualenv-init -)"
pyenv install 3.7.4
pyenv global system 3.7.4
pipenv --python 3.7 install
pipenv run pip install pyspark==3.1.1
pipenv run pip install flake8==3.5.0 pypandoc==1.3.3
pipenv run pip install importlib_metadata==3.10.0
- name: Run Scala/Java and Python tests
run: |
pipenv run python run-tests.py
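
For reference, the Python test environment that this workflow builds can be reproduced locally with roughly the following commands. This is a minimal sketch, assuming pyenv and pipenv are already installed and that the commands are run from the root of a delta checkout; the pinned versions are the ones the workflow installs, and run-tests.py is the same test driver the last step invokes.

    # Install the Python toolchain used by CI (assumes pyenv and pipenv are present).
    pyenv install -s 3.7.4
    pyenv local 3.7.4
    pipenv --python 3.7 install
    # Pin the same test dependencies as the workflow.
    pipenv run pip install pyspark==3.1.1 flake8==3.5.0 pypandoc==1.3.3 importlib_metadata==3.10.0
    # Run the same Scala/Java and Python test driver that CI runs.
    pipenv run python run-tests.py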