Tuesday, October 2, 2018

Executors and Futures in Java

This is part of an experiment. It is "code as blog". This entire blog post is just documented Java code.

package software.coop.know.future;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.Future;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
import java.util.stream.DoubleStream;


/**
 * In this class, we will look at the most common way to interact with Futures -- via executors.
 */
public class FuturesWithExecutors {

    public static void main(String... args) throws Exception {
       doExecutorService();
       doFutures();
    }

    /** This is just a utility method to sleep without a checked exception.
     *
     * @param ms Number of milliseconds to sleep.
     */
    private static void sleepWithoutException(long ms) {
        try {
            Thread.sleep(ms);
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
    }

    /** A look at the ExecutorService class and how it is used...
     *
     * @throws InterruptedException
     */
    private static void doExecutorService() throws InterruptedException {
        // An ExecutorService is a service that does work off the thread you call it from. They come in many forms, but
        // generally they have a pool of threads that pull units of work off a queue, execute them, then pull the next
        // one.
        //
        // There is also the Executors class, which has some utility methods for quickly creating executor services.
        //
        // Let's start with the dead simple example...

        ExecutorService executorService = Executors.newSingleThreadExecutor();

        // This created an executor service with a single thread to do work. So if we do...

        System.out.println("Submitting Job 1 from " + Thread.currentThread().getName());
        executorService.submit(() -> {
            System.out.println("Starting Job 1 on " + Thread.currentThread().getName());
            sleepWithoutException(2000);
            System.out.println("Finishing Job 1");
        });
        System.out.println("Submitting Job 2 from " + Thread.currentThread().getName());
        executorService.submit(() -> {
            System.out.println("Starting Job 2 on " + Thread.currentThread().getName());
            sleepWithoutException(2000);
            System.out.println("Finishing Job 2");
        });
        System.out.println("Submitted Job 2");


        executorService.shutdown();
        executorService.awaitTermination(1, TimeUnit.MINUTES);

        System.out.println("---------------------------------------------------------------------------------");

        // This will give us the following output :
        //
        // Submitting Job 1 from main          // #1 goes into the queue
        // Submitting Job 2 from main          // #1 is finished submitting
        // Starting Job 1 on pool-1-thread-1   // #1 begins running
        // Submitted Job 2                     // submit() for #2 has returned; where this prints relative to the
        //                                        worker thread's output is just scheduling
        // Finishing Job 1                     // #1 finishes
        // Starting Job 2 on pool-1-thread-1   // #2 begins running
        // Finishing Job 2                     // #2 finishes
        //
        // Now we call shutdown() and awaitTermination() so the example finishes cleanly. The worker thread the
        // default executor creates is a non-daemon thread, which means the JVM will NOT exit when "main" is done --
        // it would hang around until the executor is shut down.
        //
        // shutdown() tells the Executor to stop accepting new work and lets its threads die once the queue drains.
        // awaitTermination() blocks for up to the given amount of time until all the jobs already in the queue
        // have finished, so we don't race ahead into the next example.
        //
        // This was perhaps the simplest example possible. Now let's look at perhaps the most complex...

        final AtomicInteger index = new AtomicInteger(0);
        executorService = new ThreadPoolExecutor(
                1,                              // the core (minimum) number of threads
                5,                              // the maximum number of threads
                2, TimeUnit.SECONDS,            // how long idle threads beyond the core size live before being reclaimed
                new ArrayBlockingQueue<>(10),   // queue of tasks
                r -> {                          // a custom ThreadFactory
                    int thread = index.getAndIncrement();
                    System.out.println("Creating thread "+thread);
                    Thread t = new Thread(r);
                    t.setName("Custom Thread " + thread);
                    t.setDaemon(true);
                    return t;
                },
                new ThreadPoolExecutor.CallerRunsPolicy() // A policy for jobs that are rejected from the queue
                                                          // "CallerRunsPolicy" means that if you can't accept a new
                                                          // task, run it immediately on the calling thread.

        );

        for (int i = 0; i < 40; i++) {
            final int idx = i;
            executorService.submit(
                    () -> {
                        long runtime = 1900 + Math.round(Math.random() * 200D); // sleep for random time.
                        System.out.println("Starting " + idx + " for " + runtime + " ms on "
                                + Thread.currentThread().getName());
                        sleepWithoutException(runtime);
                        System.out.println("Finishing " + idx);
                    }
            );
            sleepWithoutException(100);
        }

        executorService.shutdown();
        executorService.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println("---------------------------------------------------------------------------------");

        // This gives us output akin to the following...
        //
        // Creating thread 0                           < create the first thread
        // Starting 0 for 1986 ms on Custom Thread 0   < start task
        // Creating thread 1                           < grow the thread pool.
        // Starting 11 for 1928 ms on Custom Thread 1  < start the next task. Notice this ISN'T #1, it is the first job
        //                                               that won't fit in the queue
        // Creating thread 2                           < grow again.
        // Starting 12 for 2010 ms on Custom Thread 2
        // Creating thread 3
        // Starting 13 for 2001 ms on Custom Thread 3
        // Creating thread 4                           < Max thread pool size
        // Starting 14 for 2074 ms on Custom Thread 4
        // Starting 15 for 1999 ms on main             < Since we are now at max threads, and the queue is full,
        //                                               job 15 executes on the "main" thread inline with our call
        //                                               to submit it.
        // Finishing 0
        // Starting 1 for 2042 ms on Custom Thread 0   < We just now pull the second job off the queue
        // Finishing 11
        // Starting 2 for 1993 ms on Custom Thread 1

        // As you can see, jobs are not necessarily executed in a FIFO manner, especially if you have a variable sized
        // thread pool.
    }

    /** Using Futures with Executors
     *
     */
    private static void doFutures() throws InterruptedException {
        // In the previous example, we looked entirely at submitting "Runnables" to our ExecutorService. But sometimes,
        // you want to get a result back from a task running off thread. Let's look at that.
        System.out.println("doFutures() ---------------------------------------------------------------------");
        ExecutorService executorService = Executors.newFixedThreadPool(2);

        List<Double> doubles = Arrays.asList( 0D, 1D, 2D, 3D, 4D, 5D);

        List<Future<String>> futures = doubles.stream()
                .map(d-> executorService.submit(()->{
                        long runtime =  Math.round(Math.random() * 2000D);
                        System.out.println("Running on "+Thread.currentThread().getName()+ " for "+runtime);
                        sleepWithoutException(runtime);
                        return Double.toString(d * Math.PI) +" from "+Thread.currentThread().getName();
                    })
                ).collect(Collectors.toList());
        futures.forEach(future -> {
            try {
                System.out.println(future.get());
            } catch (ExecutionException |InterruptedException e) {
                e.printStackTrace();
            }
        });

        executorService.shutdown();
        executorService.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println("---------------------------------------------------------------------------------");

        // This gives us the output:
        //
        // Running on pool-1-thread-2 for 695
        // Running on pool-1-thread-1 for 173
        // 0.0 from pool-1-thread-1
        // Running on pool-1-thread-1 for 1135
        // 3.141592653589793 from pool-1-thread-2
        // Running on pool-1-thread-2 for 915
        // 6.283185307179586 from pool-1-thread-1
        // Running on pool-1-thread-1 for 316
        // 9.42477796076938 from pool-1-thread-2
        // Running on pool-1-thread-2 for 409
        // 12.566370614359172 from pool-1-thread-1
        // 15.707963267948966 from pool-1-thread-2

        // You can see that the results print in the order of our list of Doubles, but the work still happens "as
        // fast as possible" on two threads. Why? Because we iterate over the mapped Futures in list order, while the
        // ExecutorService keeps pulling and running tasks regardless of which Future we happen to be blocked on. The
        // value of each Callable is captured in its Future, so if a later task finishes before the one we are
        // currently waiting on, its result just sits there until we reach it -- a classic long-pole problem.
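
        // An aside: Future also gives us a non-blocking check and a bounded wait, which helps when a "long pole"
        // job should not be allowed to hold the caller up forever. Everything above has already completed by this
        // point, so isDone() is true and the timed get() below returns immediately.
        Future<String> firstFuture = futures.get(0);
        System.out.println("First future done? " + firstFuture.isDone());
        try {
            System.out.println("Timed get: " + firstFuture.get(100, TimeUnit.MILLISECONDS));
        } catch (ExecutionException | java.util.concurrent.TimeoutException e) {
            e.printStackTrace();
        }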

        // In all of the examples so far, we have created an ExecutorService to control threads, or queue size or
        // whatever. Java does have a default one we can use that should have some reasonable defaults:
        // The ForkJoinPool.

        // The ForkJoinPool is used when you use language-level parallelism. For example:

        DoubleStream.of(1D, 2D, 3D, 4D, 5D).parallel()
                .forEach(d-> {
                    sleepWithoutException(100);
                    System.out.println( d + " from "+Thread.currentThread().getName());
                });
        System.out.println("---------------------------------------------------------------------------------");

        // This gives us something like:
        //
        // 5.0 from ForkJoinPool.commonPool-worker-2
        // 2.0 from ForkJoinPool.commonPool-worker-1
        // 4.0 from ForkJoinPool.commonPool-worker-4
        // 3.0 from main
        // 1.0 from ForkJoinPool.commonPool-worker-3

        // 3.0 on main? That happens because the calling thread participates in the parallel work alongside the
        // common pool, which Java sizes based on the number of cores on my machine (8).
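
        // For the curious, you can ask the common pool what Java picked. The default parallelism is typically the
        // number of processors minus one, because the thread that kicks off the parallel stream (here, "main")
        // also participates in the work.
        System.out.println("Processors: " + Runtime.getRuntime().availableProcessors()
                + ", common pool parallelism: " + ForkJoinPool.commonPool().getParallelism());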

        // The important thing here is you can get at this "generic" executor service the same way .parallel() does...

        ForkJoinPool forkJoin = ForkJoinPool.commonPool();
        List<Future<Double>> futureDoubles = new ArrayList<>(20);
        for(double d = 0; d < 20D; d++) {
            double finalD = d;
            futureDoubles.add(forkJoin.submit(() -> {
                sleepWithoutException(2000);
                System.out.println("Computing on " + Thread.currentThread().getName());
                return finalD * Math.PI;
            }));
        }
        futureDoubles.forEach(f-> {
            try {
                System.out.println(f.get());
            } catch (ExecutionException|InterruptedException e) {
                e.printStackTrace();
            }
        });

        System.out.println("---------------------------------------------------------------------------------");

        // This gives us something like:

        // Computing on ForkJoinPool.commonPool-worker-5
        // Computing on ForkJoinPool.commonPool-worker-6
        // 0.0
        // Computing on ForkJoinPool.commonPool-worker-4
        // Computing on ForkJoinPool.commonPool-worker-2
        // Computing on ForkJoinPool.commonPool-worker-3
        // Computing on ForkJoinPool.commonPool-worker-1
        // Computing on ForkJoinPool.commonPool-worker-7
        // 3.141592653589793
        // 6.283185307179586
        // 9.42477796076938

    }
}

Wednesday, August 29, 2018

"Unit Testing" and Third Party Software

It is legit not my intention to make this a "testing" blog, but as I started this, I found myself in a testing role for the first time, so this is the stuff at the top of my mind. One thing I want to talk about, though, is "What is 'Unit Testing'?"

So one of the prime directives from the "Unit Testing" world is "Don't test software that isn't yours". This is a fine idea, but there is a trap around it into which you don't want to fall, and it is the one I specifically want to discuss here:

Your configuration information IS YOUR SOFTWARE.

Let's pick an easy example: Hibernate. If you are building Java software, some form of JPA (probably Hibernate) is in your stack.

So let's talk about queries. Maybe you are using something with a dynamic proxy system. Maybe you are using compiled queries directly with your EntityManager. It doesn't matter. Your ANNOTATIONS are code, and should be tested.

Do you have to test the dynamic proxy generation? No. But if you have a DAO that looks like:

@Query("SELECT o FROM Foo o WHERE o.value LIKE %:bar%")
List<Foo> fooValuesWithBar(@Param("bar") String bar);

Should you be writing unit tests around whether the dynamic proxy correctly interprets your query? No. "Noy my yob mah." But making sure that all the configuration information in the annotation you wrote is correct is your job. If you are not writing a unit test that covers the annotation as code, you don't really have coverage.

The long and the short of this is: if you are using a tool to dynamically generate DAOs and your unit tests aren't going all the way to the database, you aren't actually covering your code, because the metadata about how the DAO framework will construct your queries is your code. Again, let's consider the unit test you should have around fooValuesWithBar()...  If you insert a stray character into the @Query, then your unit test should fail. The fact that you aren't writing the actual implementation of fooValuesWithBar() on this interface doesn't matter. You have defined the functionality with the annotation, so you need a test around it. The annotation is code.

So let's not just bitch about it, let's solve some problems.

So if you are using Hibernate/TopLink/EclipseLink, your code should be database portable. Is it? Do you care if it is not? At my current gig we are using Flyway as part of Spring Boot to do database migrations, but that involves writing SQL files. As soon as you get into writing SQL files, you have given up on DB portability. That said, the other option is trusting the JPA provider to update your schema. As much as people SAY that is a thing that can happen, I personally don't trust it.

That said, for the purposes of unit tests, there is no reason you can't rely on your JPA provider to create a schema it thinks is reasonable. That is, you can write DAO/Entity tests against in-memory Hypersonic/Derby/JavaDB and feel those are good tests. Database migration with a tool like Flyway can still be a thing, but you can pass that off to an "integration test" without feeling like you have lost something...  mostly.
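
To make that concrete, here is a minimal sketch of what such a DAO test could look like with Spring Boot's @DataJpaTest, which stands up an embedded database (assuming H2 or Derby is on the test classpath) along with the real entity mappings and @Query annotations. Foo, FooRepository, and fooValuesWithBar() are the hypothetical types from the example above, so treat this as a shape rather than drop-in code:

import static org.junit.Assert.assertEquals;

import java.util.List;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@DataJpaTest // boots an embedded database plus the real JPA metadata
public class FooRepositoryTest {

    @Autowired
    private FooRepository repository; // the hypothetical DAO interface from the example above

    @Test
    public void findsValuesContainingBar() {
        // Persist through the real DAO so the annotations get exercised rather than mocked.
        repository.save(new Foo("this value has bar in it"));
        repository.save(new Foo("nothing relevant here"));

        List<Foo> found = repository.fooValuesWithBar("bar");

        // If the JPQL in the @Query annotation is broken, this fails. That is the point.
        assertEquals(1, found.size());
    }
}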

So let's go with some rules:


  1. Don't mock a DAO unless you REALLY know what you are doing. Mocking things that are loaded with configuration is, IMHO, fraught. Does your test provide value? Well, that assumes the things you are mocking behave like the production systems. Mocking an external service for which you have a contract test is OK. Mocking a DAO/Service/other dynamic proxy where you aren't sure your annotations are correct? Not so much.
  2. Use your DAO to persist and read outside of your test, rather than use verify calls. Something in a database is real. save(any(Foo.class)) is a crutch. Create a transient database if you need to.
  3. This doesn't just apply to DAOs. Anything with annotation-specified behavior should be unit tested. This means custom XML/JSON (de)serialization rules, too; a sketch of what that looks like follows this list.
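
For example, here is a minimal sketch of a test around a Jackson date-format annotation. Order is a hypothetical class, and this assumes jackson-databind and the jackson-datatype-jsr310 module are on the classpath:

import static org.junit.Assert.assertEquals;

import com.fasterxml.jackson.annotation.JsonFormat;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import java.time.LocalDate;
import org.junit.Test;

public class OrderSerializationTest {

    static class Order {
        // The pattern in this annotation is configuration, and configuration is code.
        @JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd")
        public LocalDate placedOn = LocalDate.of(2018, 8, 29);
    }

    @Test
    public void serializesDateUsingTheAnnotationPattern() throws Exception {
        ObjectMapper mapper = new ObjectMapper().registerModule(new JavaTimeModule());
        // A typo in the pattern above should make this assertion fail.
        assertEquals("{\"placedOn\":\"2018-08-29\"}", mapper.writeValueAsString(new Order()));
    }
}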

Quick Tip: Images in React-Native on Android Not Loading

So something I ran into recently that I never found any good tips around.

We had a problem with static asset images not painting on Android. It appears that if there is a state-triggered repaint while the image is being spun up from a drawable, it never fully paints the image on the screen.

Typically people (read: the react-native docs) tell you to do your images something like:

<Image source={require('./my-icon.png')} />

That mostly works, but require returns a Promise, and it seems like, when something goes weird in the paint lifecycle, stuff can go bad. There is lots of discussion out there about using the resolve-asset functions from the image library, but this causes much weirdness between the debug and release variants of your app. You can do Promise.all([]) from UNSAFE_componentWillMount. But there is an easier way!

import myIcon from './my-icon.png';

Why is this better than const myIcon = require('./my-icon.png')? Well, import still does the same thing under the hood that require() does. The difference is import demands that all the required things are fully resolved before it begins evaluating the script at all. This means that your image assets are guaranteed to be loaded before the script evaluates. Lemon squeezy.


Thursday, May 31, 2018

Mobile BDD with Appium and Cucumber: Capturing Testing Data (Part 3)

Of a series: Part 1, Part 2.

The code for this exercise is available on the WITH_GIF branch.

One of the problems with doing automated UI testing in a CI environment is understanding failures. Today we are going to look at extending our Cucumber drivers to help with that. We are going to make a recording of what we are doing on the client side, and capture the log information from the client when there is a failure.

Cucumber for Java, like JUnit or TestNG or whatever else you might use for testing, has @Before and @After annotations that you can use to set up state for a test. The thing is, the "test" here is going to be a Scenario in your Cucumber tests. We are going to start, though, with a before- and after-Step bit of code, so we need to do that ourselves. Revisiting our BaseSteps class...


private void beforeStep() {
    
}

private void afterStep() {
    
}


private void doStep(ThrowRunnable runnable) throws Exception {
    beforeStep();
    try {
        runnable.run();
    } finally {
        afterStep();
    }
}

private interface ThrowRunnable {
    void run() throws Exception;
}

Here we have created a method we can use to wrap a step with generic beforeStep() and afterStep() calls. We will need to get these invoked, but with Java 8+ closures, this is easy. We simply wrap the body of each of our step methods in a call to doStep().

@Then("the \"(.*)\" is gone")
public void assertMissing(String text) throws Exception {
    doStep(()->strategy.assertMissing(text));
}

Now, let's start by getting a screenshot and logs before and after each step. We will create a Recorder class with some static fields we will use to capture this information.

public class Recorder {
    private static final Logger LOGGER = Logger.getLogger(
                Recorder.class.getCanonicalName()
    );
    private static List<File> IMAGES;
    private static List<LogEntry> LOGS;
    public static void record(File file) {
        IMAGES.add(file);
    }

    public static void log(List<LogEntry> logs){
        LOGS = logs;
    }
}


Now, let's instrument our platform strategies to give us this information. For Android:

@Override
public List<LogEntry> getLogEntries() {
    return getDriver().manage().logs().get("logcat").filter(Level.ALL);
}
@Override
public File getScreenshotAsFile() {
    return getDriver().getScreenshotAs(OutputType.FILE);
}

... and iOS:

@Override
public List<LogEntry> getLogEntries() {
    List<LogEntry> allEntries = new ArrayList<>();
    getDriver().manage().logs().getAvailableLogTypes()
            .stream()
            .filter(Objects::nonNull)
            .flatMap(s -> {
                try {
                    return getDriver().manage().logs().get(s)
                                      .filter(Level.ALL).stream();
                } catch (Exception e) {
                    return Stream.empty();
                }
            })
            .filter(Objects::nonNull)
            .forEach(allEntries::add);
    allEntries.sort((o1, o2) -> Long.compare(o2.getTimestamp(), 
                                             o1.getTimestamp()));
    return allEntries;
}

public File getScreenshotAsFile() {
    return getDriver().getScreenshotAs(OutputType.FILE);
}

Since iOS has a few different log files, we need to merge them all together into a single sorted list. For Android, hey, "logcat" is probably what we want anyway. Each of the drivers will give us a screenshot to a temp file.

Now, let's revisit the beforeStep() and afterStep() we created earlier, and capture all this information.

private void beforeStep() {
    Recorder.record(strategy.getScreenshotAsFile());
}

@SuppressWarnings("unchecked")
private void afterStep() {
    Recorder.log(strategy.getLogEntries());
    Recorder.record(strategy.getScreenshotAsFile());
}

So we get a screenshot before and after each step, and record the logs after each step.

Now let's bring it all together and persist our information for failing Scenarios. We can do this by adding the @Before and @After hook annotations to our recorder class. This will create a new instance of the class, but we can still refer to the static variables.

@Before
public void initialize() {
    IMAGES = new ArrayList<>();
    LOGS = new ArrayList<>();
}

@After
public void finalize(Scenario scenario) throws IOException {
    if (scenario.isFailed()) {
        File outDir = new File("build/cucumber-images");
        outDir.mkdirs();
        outDir.mkdir();
        if (IMAGES.isEmpty()) {
            return;
        }
        BufferedImage first = ImageIO.read(IMAGES.iterator().next());
        File destination = new File(outDir,
                scenario.getName().replaceAll("[^\\w]", "_") + ".gif");
        try (ImageOutputStream outputStream = new FileImageOutputStream(destination);
             AnimatedGIFEncoder encoder = new AnimatedGIFEncoder(outputStream, first.getType(), 750, true)) {
            IMAGES.stream()
                    .map(f -> {
                        try {
                            return ImageIO.read(f);
                        } catch (Exception e) {
                            throw new RuntimeException(e);
                        }
                    })
                    .forEach(i -> {
                        try {
                            encoder.writeToSequence(i);
                        } catch (IOException e) {
                            throw new RuntimeException(e);
                        }
                    });
        }
        LOGGER.info("Wrote scenario animation to " + 
                    destination.getAbsolutePath());
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ByteStreams.copy(new FileInputStream(destination), baos);
        scenario.embed(baos.toByteArray(), "image/gif");
        scenario.embed(logFile(), "text/plain");
    }
}

private byte[] logFile(){
    StringBuilder sb = new StringBuilder();
    LOGS.stream()
            .map(e-> new Date(e.getTimestamp()) + "," + 
                e.getLevel().getName() + ", " + e.getMessage()
            )
            .forEach(line-> sb.append(line).append("\n"));
    return sb.toString().getBytes(Charsets.UTF_8);
}


So in our @Before we initialize the static members. Then in the @After we finalize everything. If there are no images we can bounce. If there are, we will create an AnimatedGIFEncoder and add all the images to it. I'm not going to get into the image processing itself, but you should pay attention to the last two lines of the finalize() method: because the Cucumber Scenario object is passed into the method, we can embed other data in the results by MIME type.

Now if we want to see the data we collect, we can add a reporting plugin to our build.gradle file:

buildscript {
    repositories {
        maven {
            url "http://repo.bodar.com"
        }
        maven {
            url "https://plugins.gradle.org/m2/"
        }
    }
    dependencies {
        classpath "com.github.samueltbrown:gradle-cucumber-plugin:0.9"
        classpath "gradle.plugin.com.github.spacialcircumstances:" +
              "gradle-cucumber-reporting:0.0.11"
    }
}

plugins {
    id 'java'
    id "com.github.samueltbrown.cucumber" version "0.9"
    id 'idea'
    id "com.github.spacialcircumstances.gradle-cucumber-reporting" version "0.0.11"
}

cucumberReports {
    outputDir = file("$project.buildDir/reports")
    buildName = '0'
    reports = files("$project.buildDir/cucumber.json")
}
// stuff here


tasks.cucumber.finalizedBy generateCucumberReports

Now when our gradle cucumber task runs, we will get a report telling us what failed, like so:



[Screenshot: the generated Cucumber report, with the recorded GIF embedded for the failed Scenario (the image is not animated here).]

Monday, May 7, 2018

Multi-Platform BDD with Cucumber and Appium (Part 2)

Previously, on Battlestar Galactica...

In the last post, we set up a simple BDD test for an Android app. We defined some general BaseSteps that allowed us to look for text on the screen and click it. In this exercise we are going to abstract this out so that we can perform the same test on multiple platforms using Guice injection.

Source code for this version with the changes from Part 1 is available on the WITH_GUICE branch on GitHub.

As you saw in the last article, the binding between step definitions and feature files is handled via Java annotations. By adding the cucumber-guice library we can provide a platform strategy that will let us define these common step definitions across platforms based on execution-time parameters.

If you are like me, you are going to be wondering about the decisions that have been made here from an architecture standpoint. So let's cover some of the weirdness here...

  1. cucumber-guice doesn't seem to know how to introspect abstract classes for step definitions. You need a concrete class for that.
  2. If you have two concrete classes with matching @Override methods, it is going to yell at you about duplicate step definitions.
  3. I have elected to overcome these difficulties by using a "Strategy" pattern. We have preserved our BaseSteps class from the previous example, but now it simply delegates calls to a strategy implementation @Inject-ed when it is created.
  4. We're going to spend more time looking at design patterns for BDD tests in the next post, but for now we just want to add multi-platform support to our testing scenario.
All that said, let's get started...

The first thing we want to do is update our dependencies in the integration module from the previous example. We are adding two lines here:

dependencies {
    cucumberRuntime 'info.cukes:cucumber-java:1.2.5'
    cucumberCompile 'info.cukes:cucumber-java:1.2.5'
    cucumberCompile 'info.cukes:cucumber-guice:1.2.5'
    cucumberCompile 'com.google.inject:guice:4.2.0'
    cucumberCompile 'junit:junit:4.12'
    cucumberCompile 'io.appium:java-client:3.3.0'
}


The cucumber-guice dependency is new. Because, for some FSM damned reason, the cucumber-guice module doesn't have a declared dependency on Guice, we also add a dependency on Guice. I really got nothing here. You have to do it or it just won't work.

Now we have the option to inject dependencies into our step definitions. If all you want is some simple state definitions, you can do that with the regular javax.inject annotation, or the Cucumber annotations, but we will get to that. Right now, though, we want to change our execution based on platform, so we need to modify our build.gradle to pass a new parameter to the test execution.

cucumber {
    formats = ['pretty', 'json:build/cucumber.json', 'junit:build/cucumber.xml']
    jvmOptions {
        maxHeapSize = '512m'
        environment 'apk', rootProject.project(":app").buildDir.getAbsolutePath() +
                "/outputs/apk/debug/app-debug.apk"
        environment 'platform', System.getProperty("platform") == null ?
                "android" :
                System.getProperty("platform");
    }
}

So we are just going to read a system property here and default to android if it is undefined, then pass that value to the execution environment for Cucumber.

Now, we have a bit of boilerplate to create. Again getting into interesting design decisions: cucumber-guice doesn't allow you to simply declare modules to include. However, you can provide a factory class for the Guice injector it will use. Just so you understand the subtleties of this, let's be clear about what is happening here: Cucumber scans the classpath for method implementations with a step definition (@Given, @When, @Then), and then, with cucumber-guice, asks the Injector for an implementation of that class. This is why you can't have multiple simple implementations of a step definition floating around, and why we have opted for a strategy pattern to provide multi-platform implementations. Again, we will look at this in more detail in the next post. All this hand-waving aside, we need to create an InjectorSource implementation:

public class ConfiguredInjectorSource implements InjectorSource {
    @Override
    public Injector getInjector() {
        return Guice.createInjector(CucumberModules.SCENARIO,
           new CucumberModule());
    }
}

When we create our Guice injector, we need to include the SCENARIO module. This defines a @ScenarioScoped annotation that we can use later when creating our injected classes. We aren't going to use it for this exercise, but it is required to bootstrap cucumber-guice. Here we have added this default module and our module.
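
Just so you can picture it, here is a minimal sketch of what a @ScenarioScoped class could look like once CucumberModules.SCENARIO is installed. ScenarioState and its field are hypothetical, and we won't use anything like it in this exercise:

import cucumber.runtime.java.guice.ScenarioScoped;

@ScenarioScoped
public class ScenarioState {
    // One instance of this class exists per Scenario. Inject it into any step class
    // that needs to share state, and it is thrown away when the Scenario ends.
    private String lastClickedText;

    public void setLastClickedText(String text) {
        this.lastClickedText = text;
    }

    public String getLastClickedText() {
        return lastClickedText;
    }
}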

Finally, we need to declare the CucumberModule class. We are going to read the environment variable we defined in our Gradle file and do an implementation swap.

public class CucumberModule extends AbstractModule {

    private static final Logger LOGGER = Logger.getLogger(
       CucumberModule.class.getCanonicalName()
    );
    @Override
    protected void configure() {
        String platform = System.getenv("platform");
        LOGGER.info("Configuring run for platform: "+platform);
        switch(platform){
            case "android":
                bind(BaseStepsStrategy.class)
                   .to(AndroidBaseSteps.class)
                   .in(Singleton.class);
                break;
            case "ios":
                bind(BaseStepsStrategy.class)
                   .to(IOSBaseSteps.class)
                   .in(Singleton.class);
                break;
            default:
                throw new RuntimeException(
                   "Unknown platform environment variable: " +
                   platform);
        }
    }
}


This feels like a crazy number of lines for what we are doing here, but you know, cope. We read the environment variable, we log it so the user knows what is going on, then we do a switch around the BaseStepsStrategy to provide an implementation.

This is all well and good, but our test execution is still going to depend on BaseSteps from the previous exercise. So what we are going to do is rework this class to delegate to our strategy implementations, declare itself as a @Singleton, and get the strategy implementation injected into it.

@Singleton
public class BaseSteps {
    private final BaseStepsStrategy strategy;
    @Inject
    public BaseSteps(BaseStepsStrategy baseSteps){
        this.strategy = baseSteps;
    }

    @Given("I have launched the application")
    public  void startApp() throws IOException{
        strategy.startApp();
    }

    @When("I click the \"(.*)\" button")
    public void clickByText(String text){
        strategy.clickByText(text);
    }

    @Then("the \"(.*)\" is gone")
    public void assertMissing(String text){
        strategy.assertMissing(text);
    }
}

What about all the code we had from the previous exercise? Well, we're gonna copy and paste that into our AndroidBaseSteps class, but first, we're gonna create a generic abstract strategy here.

public abstract class BaseStepsStrategy<T extends WebDriver> {

    private T driver;
    BaseStepsStrategy() throws MalformedURLException {
        this.driver = createDriver();
    }

    protected abstract T createDriver() throws MalformedURLException;
    T getDriver(){
        return this.driver;
    }

    public abstract void startApp() throws IOException;
    public abstract void clickByText(String text);
    public abstract void assertMissing(String text);
}

I know that WebDriver seems like a weird thing to extend from here, but MobileDriver for Appium actually extends from WebDriver -- I guess no one has gone back and just made "AppDriver" as a thing from the Selenium group.  Because we want the "driver" to continue to be a singleton in the runtime, we have kicked it out to a createDriver() method. The reason is obvious when we create our AndroidBaseSteps class.

public class AndroidBaseSteps extends BaseStepsStrategy<AndroidDriver> {

    public AndroidBaseSteps() throws MalformedURLException {
        super();
    }

    @Override
    protected AndroidDriver createDriver() throws MalformedURLException {
        File app = new File(System.getenv("apk"));
        DesiredCapabilities capabilities = new DesiredCapabilities();
        capabilities.setCapability("deviceName","Android Emulator");
        capabilities.setCapability("app", app.getAbsolutePath());
        capabilities.setCapability("appPackage", "net.kebernet.appium_cucumber");
        capabilities.setCapability("appActivity", ".MainActivity");
        return new AndroidDriver<>(new URL("http://127.0.0.1:4723/wd/hub"),
           capabilities);
    }

    @Override
    public void startApp() throws IOException {
        getDriver().resetApp();
    }

    @Override
    public void clickByText(String text) {
        getDriver().findElementByAndroidUIAutomator(
                      "new UiSelector().textContains(\""+text+"\")")
                    .click();
    }

    @Override
    public void assertMissing(String text) {
        MobileElement element = null;
        try {
            element = (MobileElement) getDriver()
               .findElementByAndroidUIAutomator(
                   "new UiSelector().textContains(\"" + text + "\")");
        } catch(NoSuchElementException e){
            //expected exception;
        }
        assertTrue(element == null);
    }
}


Aside from a bit of boilerplate, this looks pretty much exactly like our BaseSteps class from the first example. We are creating the driver, by delegation, in the constructor, but since this is a generic implementation, we can use getDriver() everywhere and know we are starting with an AndroidDriver. We do have a new cast in the assertMissing() method, but that is a small price to pay.

We are now back to a "known good" state for our application and tests. Now, let's look at making our "integration" suite work with iOS.

We will start with our iOS app, which is basically the same as our Android app...

Next we need to implement our IOSBaseSteps class. This is going to look largely like our Android version we already have, only this time we are going to use the XPath selector. We are also adding a new configuration to the environment configuration in the build.gradle to pass a path to our .app build.

public class IOSBaseSteps extends BaseStepsStrategy<IOSDriver<MobileElement>> {


    public IOSBaseSteps() throws MalformedURLException {
        super();
    }

    @Override
    protected IOSDriver<MobileElement> createDriver() throws MalformedURLException {
        File app = new File(System.getenv("app"));
        DesiredCapabilities capabilities = new DesiredCapabilities();
        capabilities.setCapability(MobileCapabilityType.PLATFORM_NAME, "iOS");
        capabilities.setCapability(MobileCapabilityType.PLATFORM_VERSION, "11.3");
        capabilities.setCapability(MobileCapabilityType.DEVICE_NAME, "iPhone Simulator");
        capabilities.setCapability(MobileCapabilityType.APP, app.getAbsolutePath());
        return new IOSDriver<>(new URL("http://127.0.0.1:4723/wd/hub"), capabilities);
    }

    @Override
    public void startApp() throws IOException {
        getDriver().resetApp();
    }

    @Override
    public void clickByText(String text) {
        MobileElement element = getDriver().findElementByXPath(
              "//*[contains(@label, '"
              + text + "')]");
        element.click();
    }

    @Override
    public void assertMissing(String text) {
        MobileElement element = null;
        try {
            element = getDriver().findElementByXPath(
               "//*[contains(@label, '"
               + text + "')]");
        } catch(NoSuchElementException e){
            //expected exception;
        }
        assertNull(element);
    }
}

And our change to the build.gradle file:

cucumber {
    formats = ['pretty', 'json:build/cucumber.json', 'junit:build/cucumber.xml']
    jvmOptions {
        maxHeapSize = '512m'
        environment 'apk', rootProject.project(":app").buildDir.getAbsolutePath() +
                "/outputs/apk/debug/app-debug.apk"
        environment 'platform', System.getProperty("platform") == null ?
                "android" :
                System.getProperty("platform")

        environment 'app', rootProject.projectDir.getAbsolutePath() +
                "/ios/DerivedData/appium-cucumber/Build/Products/" +
                "Debug-iphonesimulator/appium-cucumber.app"
    }
}

Now we can run our sample tests on the iOS app:

$ ../gradlew -Dplatform=ios cucumber



> Task :integration:cucumber 
Gradle now uses separate output directories for each JVM language, but this build assumes a single directory for all classes from a source set. This behaviour has been deprecated and is scheduled to be removed in Gradle 5.0
May 07, 2018 12:10:39 PM cucumber.inject.CucumberModule configure
INFO: Configuring run for platform: ios
Feature: Click the button
  Clicking buttons is clever

  Scenario: I see a button and click it.  # features/hello.feature:4
    Given I have launched the application # BaseSteps.startApp()
    When I click the "Click Me" button    # BaseSteps.clickByText(String)
    Then the "Click Me" is gone           # BaseSteps.assertMissing(String)

1 Scenarios (1 passed)
3 Steps (3 passed)
2m37.075s



BUILD SUCCESSFUL in 2m 57s

Friday, May 4, 2018

BDD For Android Hello World


Today I am going to demonstrate a very basic bootstrap of BDD for Android using Cucumber and Appium.

You can find all of the code on GitHub, natch.

The first step is to get Appium installed and working. I found this to be the best set of startup instructions, but it is a little old, so YMMV.

First, the app. We have a very simple Android app here with a button. You click it, and it goes away.


Now, we want to test our app. We need to set up a new Gradle module parallel to our application project, as the plugin environment we need to make this work isn't going to play well with the Android plugin. Here I have called the module "integration" to indicate it is for performing integration testing.

Let's break down the build.gradle real fast:

Apply our plugins:

buildscript {
    repositories {
        maven {
            url "http://repo.bodar.com"
        }
        maven {
            url "https://plugins.gradle.org/m2/"
        }
    }
    dependencies {
        classpath "com.github.samueltbrown:gradle-cucumber-plugin:0.9"
    }
}
plugins {
    id 'java'
    id "com.github.samueltbrown.cucumber" version "0.9"
    id 'idea'
}

Next we need to make sure our app gets built for the Cucumber plugin to run.

tasks.cucumber.dependsOn(":app:assembleDebug")


And point the cucumber plugin to the debug build of the application, so we can find it. Here we are going to pass an environment variable pointing to the APK file we want to test.

cucumber {
    formats = ['pretty', 'json:build/cucumber.json', 'junit:build/cucumber.xml']
    jvmOptions {
        maxHeapSize = '512m'
        environment 'apk', rootProject.project(":app").buildDir.getAbsolutePath() +
                "/outputs/apk/debug/app-debug.apk"    
     }
}

repositories {
    jcenter()
    maven {
        url 'https://repository-saucelabs.forge.cloudbees.com/release'
    }
}

Next we set up the dependencies we need for the Cucumber environment. These come in as two new Gradle configuration scopes. Finally, we are just going to use a little config magic to make sure IntelliJ IDEA can properly resolve the dependencies while we are editing these things.

dependencies {
    cucumberRuntime 'info.cukes:cucumber-java:1.2.5'
    cucumberCompile 'info.cukes:cucumber-java:1.2.5'
    cucumberCompile 'junit:junit:4.12'
    cucumberCompile 'io.appium:java-client:3.3.0'
}

idea {
    module {
        testSourceDirs += file('src/cucumber/java')
        scopes.TEST.plus.add(configurations.cucumberCompile)
    }
}

sourceCompatibility = "1.8"
targetCompatibility = "1.8"

Cucumber allows us to write test directives using a natural language format. The idea here is that the natural language syntax is easier to maintain as your requirements change. Our sample .feature file in src/cucumber/resources looks like so:


Feature: Click the button
  Clicking buttons is clever

  Scenario: I see a button and click it.
    Given I have launched the application
    When I click the "Click Me" button
    Then the "Click Me" is gone

The first line is the name of the feature we are testing with this file. The second line is just a descriptor. It is then followed by 1..n "Scenarios" that are basically sequences of test steps. Here, the scenario is named "I see a button and click it."

Next we have a number of steps. These are in the format of "Given/When/Then". Given steps are basically there to establish the baseline for the test. Here, we are just launching the app, but you could have a "Given" that navigates to a page, or logs in, or whatever. When steps are basically your interaction operations. Then steps are your assertion operations.

These steps, however, are going to require some code to work. So let's get into that. Cucumber is going to look for your steps to be defined in some Java classes with methods annotated with the step definition. You can use regular expressions to pull data from the step line, as well as some other {}-type templating.

So let's look at the first one: "Given I have launched the application." This is a simple method call which we annotate with the @Given annotation...


public class BaseSteps {
    protected AndroidDriver<MobileElement> driver;
    @Given("I have launched the application")
    public void startApp() throws IOException {
        File app = new File(System.getenv("apk"));
        DesiredCapabilities capabilities = new DesiredCapabilities();
        capabilities.setCapability("deviceName","Android Emulator");
        capabilities.setCapability("app", app.getAbsolutePath());
        capabilities.setCapability("appPackage", "net.kebernet.appium_cucumber");
        capabilities.setCapability("appActivity", ".MainActivity");
        driver = new AndroidDriver<>(new URL("http://127.0.0.1:4723/wd/hub"),
                   capabilities);
        driver.resetApp();
    }

Here we start by reading the apk environment variable we configured in the Gradle file, configure the DesiredCapabilities object, and create an AndroidDriver.  For prophylactic reasons, we do a resetApp() call to just kill and restart the app so we know we have a good state. This will connect to the Appium node.js server, which in turn will install the apk file on the running emulator and spin it up.

Next we want to click our button, so let's look at that one.

@When("I click the \"(.*)\" button")
public void clickByText(String text){
    driver.findElementByAndroidUIAutomator(
              "new UiSelector().textContains(\"" +
              text + "\")"
           ).click();
}

Here we are selecting the text inside the quotation marks using a regex selector. Then we take the text and go to the driver and use findElementByAndroidUIAutomator() to select it. Here we are constructing a small Java snippet in a String to perform the UI selection on the device, then we call click() to perform the action.

Finally, we want to validate the behavior: that the button goes away. To do that we look for the button again, but expect it to be gone...
@Then("the \"(.*)\" is gone")
public void assertMissing(String text){
    MobileElement element = null;
    try {
        element = driver.findElementByAndroidUIAutomator(
               "new UiSelector().textContains(\"" + 
                text + "\")"
               );
    } catch(NoSuchElementException e) {
        //expected exception;
    }

    assertTrue(element == null);
}

If the driver fails to select the element, it will throw a NoSuchElementException, but we want that to happen here.

Finally we run our feature suite by doing gradle cucumber:


Executing tasks: [cucumber]

Configuration on demand is an incubating feature.
:app:preBuild UP-TO-DATE
:app:preDebugBuild UP-TO-DATE
:app:compileDebugAidl UP-TO-DATE
:app:compileDebugRenderscript UP-TO-DATE
:app:checkDebugManifest UP-TO-DATE
:app:generateDebugBuildConfig UP-TO-DATE
:app:prepareLintJar UP-TO-DATE
:app:mainApkListPersistenceDebug UP-TO-DATE
:app:generateDebugResValues UP-TO-DATE
:app:generateDebugResources UP-TO-DATE
:app:mergeDebugResources UP-TO-DATE
:app:createDebugCompatibleScreenManifests UP-TO-DATE
:app:processDebugManifest UP-TO-DATE
:app:splitsDiscoveryTaskDebug UP-TO-DATE
:app:processDebugResources UP-TO-DATE
:app:generateDebugSources UP-TO-DATE
:app:javaPreCompileDebug UP-TO-DATE
:app:compileDebugJavaWithJavac UP-TO-DATE
:app:compileDebugNdk NO-SOURCE
:app:compileDebugSources UP-TO-DATE
:app:mergeDebugShaders UP-TO-DATE
:app:compileDebugShaders UP-TO-DATE
:app:generateDebugAssets UP-TO-DATE
:app:mergeDebugAssets UP-TO-DATE
:app:transformClassesWithDexBuilderForDebug UP-TO-DATE
:app:transformDexArchiveWithExternalLibsDexMergerForDebug UP-TO-DATE
:app:transformDexArchiveWithDexMergerForDebug UP-TO-DATE
:app:mergeDebugJniLibFolders UP-TO-DATE
:app:transformNativeLibsWithMergeJniLibsForDebug UP-TO-DATE
:app:processDebugJavaRes NO-SOURCE
:app:transformResourcesWithMergeJavaResForDebug UP-TO-DATE
:app:validateSigningDebug UP-TO-DATE
:app:packageDebug UP-TO-DATE
:app:assembleDebug UP-TO-DATE
:integration:compileJava NO-SOURCE
:integration:processResources NO-SOURCE
:integration:classes UP-TO-DATE
:integration:jar UP-TO-DATE
:integration:assemble UP-TO-DATE
:integration:compileTestJava NO-SOURCE
:integration:processTestResources NO-SOURCE
:integration:testClasses UP-TO-DATE
:integration:compileCucumberJava
:integration:processCucumberResources UP-TO-DATE
:integration:cucumberClasses
:integration:cucumber
Gradle now uses separate output directories for each JVM language, but this build assumes a single directory for all classes from a source set. This behaviour has been deprecated and is scheduled to be removed in Gradle 5.0
Feature: Click the button
  Clicking buttons is clever

  Scenario: I see a button and click it.  # features/hello.feature:4
    Given I have launched the application # BaseSteps.startApp()
    When I click the "Click Me" button    # BaseSteps.clickByText(String)
    Then the "Click Me" is gone           # BaseSteps.assertMissing(String)

1 Scenarios (1 passed)
3 Steps (3 passed)
0m12.263s


BUILD SUCCESSFUL in 16s
30 actionable tasks: 2 executed, 28 up-to-date
1:18:07 PM: Task execution finished 'cucumber'.