Commit 6c29070

[SPARK-23435][2.4][SPARKR][TESTS] Update testthat to >= 2.0.0
### What changes were proposed in this pull request?

This PR backports #27359:

- Update `testthat` to >= 2.0.0.
- Replace `testthat:::run_tests` with `testthat:::test_package_dir`.
- Add trivial assertions to tests without any expectations, to avoid their being skipped.
- Update related docs.

### Why are the changes needed?

The `testthat` version has been frozen by [SPARK-22817](https://issues.apache.org/jira/browse/SPARK-22817) / #20003, but 1.0.2 is quite old, and we shouldn't keep things in this state forever.

### Does this PR introduce any user-facing change?

No.

### How was this patch tested?

- Existing CI pipeline:
  - Windows build on AppVeyor, R 3.6.2, testthat 2.3.1
  - Linux build on Jenkins, R 3.1.x, testthat 1.0.2
- Additional builds with testthat 2.3.1 using [sparkr-build-sandbox](https://github.com/zero323/sparkr-build-sandbox) on c7ed64a:

  R 3.4.4 (image digest ec9032f8cf98)

  ```
  docker pull zero323/sparkr-build-sandbox:3.4.4
  docker run zero323/sparkr-build-sandbox:3.4.4 zero323 --branch SPARK-23435 --commit c7ed64a --public-key https://keybase.io/zero323/pgp_keys.asc
  ```

  R 3.5.3 (image digest 0b1759ee4d1d)

  ```
  docker pull zero323/sparkr-build-sandbox:3.5.3
  docker run zero323/sparkr-build-sandbox:3.5.3 zero323 --branch SPARK-23435 --commit c7ed64a --public-key https://keybase.io/zero323/pgp_keys.asc
  ```

  and R 3.6.2 (image digest 6594c8ceb72f)

  ```
  docker pull zero323/sparkr-build-sandbox:3.6.2
  docker run zero323/sparkr-build-sandbox:3.6.2 zero323 --branch SPARK-23435 --commit c7ed64a --public-key https://keybase.io/zero323/pgp_keys.asc
  ```

  Corresponding [asciicasts](https://asciinema.org/) are available as 10.5281/zenodo.3629431 [![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.3629431.svg)](https://doi.org/10.5281/zenodo.3629431) (a bit too large to burden asciinema.org, but they can be played locally via `asciinema play`).

----------------------------

Continued from #27328

Closes #27379 from HyukjinKwon/testthat-2.0.
Authored-by: zero323 <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
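For context beyond the diff itself: under testthat >= 2.0.0, a `test_that()` block that runs to completion without registering a single expectation is reported as skipped ("empty test"), whereas 1.0.2 counted it as passing. That is why otherwise redundant `expect_true(TRUE)` calls are added to side-effect-only tests below. A minimal sketch of the pattern, assuming only base R and testthat (the test body here is a stand-in, not taken from SparkR):

```r
library(testthat)

test_that("side-effect-only work still counts as a passing test", {
  # The real SparkR tests call things like sparkR.session.stop() purely for
  # their side effects; a plain computation stands in for that work here.
  x <- sum(1:10)

  # Under testthat >= 2.0.0, finishing with zero expectations would mark
  # this block as skipped; the trivial assertion avoids that.
  expect_true(TRUE)
})
```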
1 parent ad9f578 commit 6c29070

File tree: 8 files changed (+31, -15 lines)

R/pkg/tests/fulltests/test_context.R

Lines changed: 4 additions & 0 deletions

```diff
@@ -93,6 +93,7 @@ test_that("rdd GC across sparkR.stop", {
   countRDD(rdd3)
   countRDD(rdd4)
   sparkR.session.stop()
+  expect_true(TRUE)
 })
 
 test_that("job group functions can be called", {
@@ -105,6 +106,7 @@ test_that("job group functions can be called", {
   suppressWarnings(cancelJobGroup(sc, "groupId"))
   suppressWarnings(clearJobGroup(sc))
   sparkR.session.stop()
+  expect_true(TRUE)
 })
 
 test_that("job description and local properties can be set and got", {
@@ -143,6 +145,7 @@ test_that("utility function can be called", {
   sparkR.sparkContext(master = sparkRTestMaster)
   setLogLevel("ERROR")
   sparkR.session.stop()
+  expect_true(TRUE)
 })
 
 test_that("getClientModeSparkSubmitOpts() returns spark-submit args from whitelist", {
@@ -246,4 +249,5 @@ test_that("SPARK-25234: parallelize should not have integer overflow", {
   # 47000 * 47000 exceeds integer range
   parallelize(sc, 1:47000, 47000)
   sparkR.session.stop()
+  expect_true(TRUE)
 })
```

R/pkg/tests/fulltests/test_includePackage.R

Lines changed: 2 additions & 0 deletions

```diff
@@ -39,6 +39,7 @@ test_that("include inside function", {
     data <- lapplyPartition(rdd, generateData)
     actual <- collectRDD(data)
   }
+  expect_true(TRUE)
 })
 
 test_that("use include package", {
@@ -55,6 +56,7 @@ test_that("use include package", {
     data <- lapplyPartition(rdd, generateData)
     actual <- collectRDD(data)
   }
+  expect_true(TRUE)
 })
 
 sparkR.session.stop()
```

R/pkg/tests/fulltests/test_sparkSQL.R

Lines changed: 1 addition & 0 deletions

```diff
@@ -1409,6 +1409,7 @@ test_that("column operators", {
   c5 <- c2 ^ c3 ^ c4
   c6 <- c2 %<=>% c3
   c7 <- !c6
+  expect_true(TRUE)
 })
 
 test_that("column functions", {
```

R/pkg/tests/fulltests/test_textFile.R

Lines changed: 1 addition & 0 deletions

```diff
@@ -75,6 +75,7 @@ test_that("several transformations on RDD created by textFile()", {
   collectRDD(rdd)
 
   unlink(fileName)
+  expect_true(TRUE)
 })
 
 test_that("textFile() followed by a saveAsTextFile() returns the same content", {
```

R/pkg/tests/run-all.R

Lines changed: 17 additions & 6 deletions

```diff
@@ -20,7 +20,6 @@ library(SparkR)
 
 # SPARK-25572
 if (identical(Sys.getenv("NOT_CRAN"), "true")) {
-
   # Turn all warnings into errors
   options("warn" = 2)
 
@@ -60,11 +59,23 @@ if (identical(Sys.getenv("NOT_CRAN"), "true")) {
 if (identical(Sys.getenv("NOT_CRAN"), "true")) {
   # set random seed for predictable results. mostly for base's sample() in tree and classification
   set.seed(42)
-  # for testthat 1.0.2 later, change reporter from "summary" to default_reporter()
-  testthat:::run_tests("SparkR",
-                       file.path(sparkRDir, "pkg", "tests", "fulltests"),
-                       NULL,
-                       "summary")
+
+  # TODO (SPARK-30663) To be removed once testthat 1.x is removed from all builds
+  if (grepl("^1\\..*", packageVersion("testthat"))) {
+    # testthat 1.x
+    test_runner <- testthat:::run_tests
+    reporter <- "summary"
+
+  } else {
+    # testthat >= 2.0.0
+    test_runner <- testthat:::test_package_dir
+    reporter <- testthat::default_reporter()
+  }
+
+  test_runner("SparkR",
+              file.path(sparkRDir, "pkg", "tests", "fulltests"),
+              NULL,
+              reporter)
 }
 
 SparkR:::uninstallDownloadedSpark()
```
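The core of the run-all.R change is this runner dispatch: `packageVersion()` returns a `package_version` object, which `grepl()` coerces to character, so the regex `"^1\\..*"` matches any testthat 1.x release. A standalone sketch of just that check (the helper name and version strings are illustrative, not from the patch):

```r
# Decide whether the legacy testthat 1.x runner (testthat:::run_tests) or
# the 2.x one (testthat:::test_package_dir) should be used, based on the
# installed version string, as in the patched run-all.R.
uses_legacy_runner <- function(version_string) {
  grepl("^1\\..*", version_string)
}

uses_legacy_runner("1.0.2")  # TRUE  -> testthat:::run_tests
uses_legacy_runner("2.3.1")  # FALSE -> testthat:::test_package_dir
```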

appveyor.yml

Lines changed: 3 additions & 4 deletions

```diff
@@ -42,10 +42,9 @@ install:
   # Install maven and dependencies
   - ps: .\dev\appveyor-install-dependencies.ps1
   # Required package for R unit tests
-  - cmd: R -e "install.packages(c('knitr', 'rmarkdown', 'devtools', 'e1071', 'survival'), repos='https://cloud.r-project.org/')"
-  # Here, we use the fixed version of testthat. For more details, please see SPARK-22817.
-  - cmd: R -e "devtools::install_version('testthat', version = '1.0.2', repos='https://cloud.r-project.org/')"
-  - cmd: R -e "packageVersion('knitr'); packageVersion('rmarkdown'); packageVersion('testthat'); packageVersion('e1071'); packageVersion('survival')"
+  - cmd: R -e "install.packages(c('knitr', 'rmarkdown', 'e1071', 'survival', 'arrow'), repos='https://cloud.r-project.org/')"
+  - cmd: R -e "install.packages(c('crayon', 'praise', 'R6', 'testthat'), repos='https://cloud.r-project.org/')"
+  - cmd: R -e "packageVersion('knitr'); packageVersion('rmarkdown'); packageVersion('testthat'); packageVersion('e1071'); packageVersion('survival'); packageVersion('arrow')"
 
 build_script:
   - cmd: mvn -DskipTests -Psparkr -Phive package
```

docs/README.md

Lines changed: 1 addition & 2 deletions

````diff
@@ -22,9 +22,8 @@ $ sudo gem install jekyll jekyll-redirect-from pygments.rb
 $ sudo pip install Pygments
 # Following is needed only for generating API docs
 $ sudo pip install sphinx pypandoc mkdocs
-$ sudo Rscript -e 'install.packages(c("knitr", "devtools", "rmarkdown"), repos="https://cloud.r-project.org/")'
+$ sudo Rscript -e 'install.packages(c("knitr", "devtools", "testthat", "rmarkdown"), repos="https://cloud.r-project.org/")'
 $ sudo Rscript -e 'devtools::install_version("roxygen2", version = "5.0.1", repos="https://cloud.r-project.org/")'
-$ sudo Rscript -e 'devtools::install_version("testthat", version = "1.0.2", repos="https://cloud.r-project.org/")'
 ```
 
 Note: If you are on a system with both Ruby 1.9 and Ruby 2.0 you may need to replace gem with gem2.0.
````

docs/building-spark.md

Lines changed: 2 additions & 3 deletions

```diff
@@ -58,7 +58,7 @@ This will build Spark distribution along with Python pip and R packages. For mor
 You can specify the exact version of Hadoop to compile against through the `hadoop.version` property.
 If unset, Spark will build against Hadoop 2.6.X by default.
 
-You can enable the `yarn` profile and optionally set the `yarn.version` property if it is different
+You can enable the `yarn` profile and optionally set the `yarn.version` property if it is different
 from `hadoop.version`.
 
 Examples:
@@ -236,8 +236,7 @@ The run-tests script also can be limited to a specific Python version or a speci
 
 To run the SparkR tests you will need to install the [knitr](https://cran.r-project.org/package=knitr), [rmarkdown](https://cran.r-project.org/package=rmarkdown), [testthat](https://cran.r-project.org/package=testthat), [e1071](https://cran.r-project.org/package=e1071) and [survival](https://cran.r-project.org/package=survival) packages first:
 
-    Rscript -e "install.packages(c('knitr', 'rmarkdown', 'devtools', 'e1071', 'survival'), repos='https://cloud.r-project.org/')"
-    Rscript -e "devtools::install_version('testthat', version = '1.0.2', repos='https://cloud.r-project.org/')"
+    Rscript -e "install.packages(c('knitr', 'rmarkdown', 'devtools', 'testthat', 'e1071', 'survival'), repos='https://cloud.r-project.org/')"
 
 You can run just the SparkR tests using the command:
 
```