Lab 6: Unit Testing and Integration Testing for Enigma

A. Introduction

Pull the files for lab 6 from the skeleton.

git fetch shared
git merge shared/lab6 -m "Start lab6"
git push

Writing Tests

In this lab, we will go over how to test Project 1 (Enigma). There are two components to testing: unit testing and integration testing. Unit tests make sure that a function or subset of functions work properly when given a specific input. Integration tests ensure that the entire project works properly as a whole. For example, you may write unit tests for the permute method in the Permutation class for Enigma, but you would write an integration test to ensure that given a set of configurations and string input, your Enigma machine produces the expected output. Both are important because while individual functions may work properly in an isolated environment, once they start taking in values from other parts of the program, they could break. Here's a very short visual demonstration of why we need integration tests.

Lab Structure

This lab will consist of two parts:

  1. Writing unit tests for the Permutation class in Enigma.
  2. Writing integration tests for Enigma.

You will only need to complete one of these two parts to get full credit on the lab. For this lab, we recommend you do the unit testing portion if you have not yet started writing unit tests for the project, and do the integration test portion if you have already made significant headway in completing and testing the project or if you do not plan on using the provided skeleton for Enigma.

This lab is designed to help you get started with writing tests for Enigma, but getting full credit should not require an extensive amount of time or exhaustive thoroughness.

We highly encourage you to work with a partner to understand Enigma, Permutations, and the existing testing methods and utilities, and to brainstorm tests and edge cases. HOWEVER, if you are planning on using the tests you write in this lab in your project, you will need to do the actual code-writing by yourself, since code for projects (including tests) must be written by you alone as per our Project Collaboration Policy.

B. Unit Testing for Enigma

You may skip this section if you do not plan on writing unit tests for Permutation in this lab.

Before You Start

Before you start, please read over the spec for Enigma. You don't need to completely understand everything for this lab, but it will help give context. The part relevant to what you will be testing in this lab is under "Describing Permutations" which will also be described below under "Tl;dr of Permutations".

Tl;dr of Permutations

Skip this section if you already understand Permutations.

Alphabet

You are given an abstract class Alphabet which you can assume behaves correctly. In your actual project, it will be a concrete class you will have to implement yourself. Its constructor takes in a string of unique characters that becomes an "alphabet". The index of each character in the string is its index in the Alphabet. For example, if you pass in the String "ABCDEFGHIJKLMNOPQRSTUVWXYZ", 'A' would be at index 0 and 'Z' at index 25; the character at index 1 would be 'B' and the character at index 24 would be 'Y'. As another example, if you pass in the string "MICH", 'M' would be at index 0 and 'H' at index 3; the character at index 1 would be 'I' and the character at index 2 would be 'C'.
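If it helps, the same index/character relationship can be seen with an ordinary Java String (this is just an illustration of the indexing; in the lab itself you create Alphabet objects through the provided helper methods rather than directly):

String chars = "MICH";
char c = chars.charAt(2);    // 'C', the character at index 2
int i = chars.indexOf('H');  // 3, the index of 'H'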

Permutations

You are also given an abstract class Permutation. Like Alphabet, in your actual project it will be a concrete class you will have to implement yourself. For now, however, it is the class you are writing tests for, and Permutation could either be correct (i.e. follow the spec and docstrings) or buggy. The Permutation constructor takes in a String cycles and an Alphabet alphabet. You can think of alphabet as the total set of characters you can work with. cycles must be a String that contains only characters from alphabet, '(', ')', and whitespace between cycles, and should follow the format (<character(s)>) (<character(s)>). For example, for the alphabet "HILFNGR", its cycles could be "(HIL) (FNGR)", "(GRL)(HI)(FN)", "(FHNIGLR)", "" (the empty string), "(HILF) (R)", etc.

Each grouping of characters enclosed in parentheses represents one cycle. A Permutation should permute a character at index i of a cycle to the character at index i+1 (wrapping around to the beginning of the cycle if it is the last character) and should invert a character at index i of a cycle to the character at index i-1 (wrapping around to the end of the cycle if it is the first character). A character in the alphabet but not in cycles maps to itself, and a character may not appear more than once in cycles.

For example, if your alphabet was HILFNGR and cycles was (HIG)(NF) (L), the following are the permutations (x -> y means x permutes to y):

H -> I
I -> G
G -> H
N -> F
F -> N
L -> L
R -> R (R is in the alphabet but not in cycles, so it maps to itself)

And the inversions would be I -> H, G -> I, etc. (reversing the above arrows).
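To make the wrap-around rule concrete, here is a minimal sketch (not the project's Permutation implementation, just an illustration using a single cycle stored as a plain String without parentheses):

static char permuteInCycle(String cycle, char c) {
    int i = cycle.indexOf(c);
    if (i == -1) {
        return c;  // character is not in this cycle, so it maps to itself
    }
    return cycle.charAt((i + 1) % cycle.length());  // wrap around to the start
}

static char invertInCycle(String cycle, char c) {
    int i = cycle.indexOf(c);
    if (i == -1) {
        return c;  // character is not in this cycle, so it maps to itself
    }
    return cycle.charAt((i - 1 + cycle.length()) % cycle.length());  // wrap around to the end
}

For instance, permuteInCycle("HIG", 'G') returns 'H' and invertInCycle("HIG", 'H') returns 'G'.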

Writing Unit Tests

Now let's start writing your own unit tests in PermutationTest. Take a look at the existing files for helper methods and utilities you can use, but you should only have to make additions to PermutationTest. Note that you cannot instantiate Alphabet or Permutation objects directly and must use the helpers in PermutationTest.

Your goal for this lab will be to understand a general approach to writing your own unit tests. There are two approaches for writing unit tests: (1) testing per-method, and (2) testing per-case. Testing per-method would be approaching each method one-by-one and writing a test for each kind of input and output you would expect that method to take and give. For example, you could write a test for the permute method and then a test for the invert method. Testing per-case would be approaching the class Permutation as a whole and thinking about what kinds of Permutations could be created and then testing all relevant methods for it. This could be writing a test for a very simple case, e.g. creating a Permutation with one cycle and testing all/most methods on it. In the end, you will want to do a mixture of both to make sure you hit all edge cases.
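For instance, a per-case style test might create one simple Permutation and check several of its methods at once. Here is a minimal sketch, assuming a permute(char) method that mirrors invert(char) (check the docstrings in Permutation for the exact methods available to you):

@Test
public void testSingleCycle() {
    Permutation p = getNewPermutation("(ABC)", getNewAlphabet("ABCD"));
    assertEquals('B', p.permute('A'));  // A permutes to the next character in its cycle
    assertEquals('A', p.permute('C'));  // the last character wraps around to the start
    assertEquals('D', p.permute('D'));  // D is not in any cycle, so it maps to itself
    assertEquals('C', p.invert('A'));   // inverting reverses the arrows
}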

Let's start with a simple test that will test the ability to invert a character.

@Test
public void testInvertChar() {
    Permutation p = getNewPermutation("(BACD)", getNewAlphabet("ABCD"));
    /* TODO: Add additional assert statements here! */
}

Recall that the invert function is defined as follows:

/** Return the result of applying the inverse of this permutation to C. */
char invert(char c) { ... }

Given the permutation above, we know the inverse of 'A' should be 'B'. We can add an assert statement to test this:

assertEquals('B', p.invert('A'));

Write a few more of your own test cases for this permutation. For example, what should a call to p.invert('B') return?
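For instance, tracing the cycle (BACD) by hand (B -> A -> C -> D -> back to B), the inverse of 'B' should be 'D'. A few more assertions along these lines (a sketch; add whichever cases you find useful):

assertEquals('D', p.invert('B'));  // B is first in the cycle, so its inverse wraps to D
assertEquals('A', p.invert('C'));  // A permutes to C, so C inverts to A
assertEquals('C', p.invert('D'));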

Then continue writing more tests for the expected behavior and edge cases of Permutation. You can come up with edge cases by looking at the docstrings for each method in Permutation and rereading the Enigma spec or the section "Tl;dr of Permutations". You may find the provided method checkPerm and JUnit methods such as assertEquals, assertTrue, assertFalse, assertNull, and others useful. If you would like to write a test to make sure a method call throws an exception (not needed for this lab, but useful in Enigma), you can write a test in the format:

@Test(expected = <exception class>) 
public void test() { 
    <code that should throw an exception>
} 

For the example above, you could ensure that calling p.invert('F') throws an EnigmaException with the following test:

@Test(expected = EnigmaException.class) 
public void testNotInAlphabet() { 
    Permutation p = getNewPermutation("(BACD)", getNewAlphabet("ABCD"));
    p.invert('F');
} 

While getting tests to pass requires that you eventually implement the functions being tested completely, you don't have to pass them at the time of writing them. Instead, you should create scenarios where you know what is expected to happen. For this lab, you will not be able to actually run your tests on any code - and that's the point. We want you to be able to look at a spec and write tests according to that spec, before actually implementing it. This is a common paradigm known as Test-Driven Development (TDD). It's very helpful because it forces you to know the expected behavior of a method for a given input - knowing this will give you an idea of what you should implement and how.

After writing tests for Enigma, don't forget to run them as you implement each class to make sure your methods are correct.

Grading

For this part of the lab, we will be testing your tests against a correct implementation of Permutation and 12 buggy implementations. For full credit, all of your tests must pass on the correct implementation, and at least one test must fail on at least 9 of the 12 buggy implementations. Your tests must pass on the correct implementation to get any credit for this part of the lab. For reference, one solution that earned full credit used about 35 lines of tests, but you will likely be writing more (and should be writing more). We highly recommend you try to get your tests to fail on all 12 buggy implementations!

Using Your Tests for Project 1

If you would like to use your PermutationTest.java in Project 1, copy it into proj1/enigma and remove the abstract keyword from the class declaration and from the three abstract methods. You can then implement each of the three formerly-abstract methods by having them call the respective constructors of Alphabet and Permutation.
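For example, if the three abstract helpers are getNewAlphabet(String), a no-argument getNewAlphabet(), and getNewPermutation(String, Alphabet), the implementations could be as simple as the sketch below (check the actual signatures in your copy of PermutationTest.java; the no-argument Alphabet constructor is assumed here to produce the default A-Z alphabet):

Alphabet getNewAlphabet(String chars) {
    return new Alphabet(chars);
}

Alphabet getNewAlphabet() {
    return new Alphabet();  // assumed default constructor for the A-Z alphabet
}

Permutation getNewPermutation(String cycles, Alphabet alphabet) {
    return new Permutation(cycles, alphabet);
}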

C. Integration Testing for Enigma

Please see Running Enigma in IntelliJ in the project spec for how to run these integration tests on your own code.

Before You Start

Before you start, please read over the spec for Enigma and understand how the Enigma machine works.

IMPORTANT: For Windows users, please run the following command:

git config --global core.autocrlf true

Windows and Unix systems use different line endings in files, and this setting ensures that your Windows line endings are always converted to Unix line endings when you commit.

Generating Integration Tests

For Enigma, an integration test will test the functionality of the machine as a whole (as opposed to any of the individual parts). Almost all of the autograder tests for Enigma will be integration tests.

Integration tests will be in the proj1/testing/correct or proj1/testing/error folders. We will first talk about tests that go in the proj1/testing/correct folder.

Each integration test should have a .conf file, a .in file, and a .out file, where the .in and .out files share the same base name. The provided default.conf file is:

 ABCDEFGHIJKLMNOPQRSTUVWXYZ
 5 3
 I MQ      (AELTPHQXRU) (BKNW) (CMOY) (DFG) (IV) (JZ) (S)
 II ME     (FIXVYOMW) (CDKLHUP) (ESZ) (BJ) (GR) (NT) (A) (Q)
 III MV    (ABDHPEJT) (CFLVMZOYQIRWUKXSG) (N)
 IV MJ     (AEPLIYWCOXMRFZBSTGJQNH) (DV) (KU)
 V MZ      (AVOLDRWFIUQ)(BZKSMNHYC) (EGTJPX)
 VI MZM    (AJQDVLEOZWIYTS) (CGMNHFUX) (BPRK) 
 VII MZM   (ANOUPFRIMBZTLWKSVEGCJYDHXQ) 
 VIII MZM  (AFLSETWUNDHOZVICQ) (BKJ) (GXY) (MPR)
 Beta N    (ALBEVFCYODJWUGNMQTZSKPR) (HIX)
 Gamma N   (AFNIRLBSQWVXGUZDKMTPCOYJHE)
 B R       (AE) (BN) (CK) (DQ) (FU) (GY) (HW) (IJ) (LO) (MP)
           (RX) (SZ) (TV)
 C R       (AR) (BD) (CO) (EJ) (FN) (GT) (HK) (IV) (LM) (PW)
           (QZ) (SX) (UY)

The provided trivial.in file is:

* B Beta I II III AAAA
HELLO WORLD
* B Beta I II III AAAA
ILBDA AMTAZ

and the provided trivial.out file is:

ILBDA AMTAZ
HELLO WORLD

.conf files are config files that describe the machine and its available rotors, and they can be shared across multiple tests. The .in files describe the machine's specific configuration for a given test along with the input we feed into the machine, and the .out files contain the expected output of the machine.

For integration testing, you will want to try many different cases, but the most tedious part is figuring out the expected output given some machine configuration and input. Luckily, you have something that can generate the output for you! When writing integration tests, you will need to make your own .conf and .in files, but you can use the staff solution (staff-enigma) on the instructional machines to generate the output. On the instructional machine, you can run:

staff-enigma <config file> <input file>

and the expected output will be generated and printed. For example, after navigating to the folder with the files default.conf and trivial.in, you can run:

ashby [302] ~ $ staff-enigma default.conf trivial.in

and the output will be:

ILBDA AMTAZ
HELLO WORLD

You can take advantage of this to generate your own test cases! You can also save the printed output of staff-enigma to a file with a third argument:

staff-enigma <config file> <input file> <output file>

for example:

staff-enigma default.conf trivial.in trivial.out

will save the results to trivial.out. Note that nothing will be printed to your terminal in this case. However, you can view the contents of a file easily in the terminal with the command cat:

cat trivial.out

Use this methodology to generate new expected output files based on the config and input files you create. The easiest way to get your config and input files onto the instructional machines is either to write them locally, store them in your testing folder, commit, push, ssh into your instructional account, and pull, OR to work on a lab computer directly.
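For example, suppose you write a new input file (here called setting.in, a made-up name) that reuses default.conf but starts the rotors at BBBB instead of AAAA:

* B Beta I II III BBBB
HELLO WORLD

On an instructional machine you could then generate and inspect the matching expected-output file with:

staff-enigma default.conf setting.in setting.out
cat setting.out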

Designing Integration Tests

It is recommended that you write integration tests that have increasing levels of complexity. Some variables that can make a test vary in complexity include the alphabet in the .conf file, the number and types of rotors used, the rotors' initial settings, and the length and content of the input.

You will want tests that are very simple (i.e. pick the simplest possible value for each of the variables above); these will sanity check your machine and be easy to debug. You will then want slightly more complex tests that exercise just one of these "complexity" variables at a time (i.e. keep everything else as simple as possible and crank up the complexity of that one variable), which will help ensure your program handles each case while still being easy to interpret and debug. Lastly, you will want to create very complex tests that are complex in several or all of these variables. These will be the hardest to debug, but if your previous tests were thorough, it is less likely that these complex tests will fail while your others pass.

It is helpful to note in the name of each test what you are testing, and try to get your own tests to pass from least complex to most complex when you are debugging your program.

Grading

Please put the tests you write in the lab6/testing/correct folder. By default, your .in files will be run with default.conf unless you provide a .conf file with the same name as your .in file (e.g. abc.conf for abc.in). For this part of the lab, we will be testing your integration tests against our correct implementation of Enigma and 8 buggy implementations of Enigma. For full credit, all of your tests must pass on the correct implementation of Enigma, and at least one test must fail on at least 6 of the 8 buggy implementations. Your tests must pass on the correct implementation to get any credit for this part of the lab. We highly recommend you try to get your tests to fail on all 8 buggy implementations!

Error Integration Tests

We will not be testing integration tests in the testing/error folder for this lab.

Integration tests that expect your Enigma program to throw an EnigmaException rather than produce output belong in the proj1/testing/error folder. These tests should contain only .conf and .in files, not .out files. You can verify that your tests are correct by running them on staff-enigma in the same manner as for the correctness tests:

staff-enigma <config file> <input file>

However, rather than capturing the staff solution's output, you would verify that the staff solution also reports an error for your input.
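For example, one scenario the Enigma spec requires your program to reject is a settings line that names a rotor not listed in the configuration file. A hypothetical error test (say, badrotor.in, run against default.conf) might therefore contain:

* B Beta I II IX AAAA
HELLO WORLD

Running staff-enigma default.conf badrotor.in should then report an error rather than printing translated output.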

D. Submission and Grading

Deliverables

For full credit, you must submit at least one of:

  1. Unit tests for the Permutation class in PermutationTest (Part B).
  2. Integration tests in the testing/correct folder (Part C).

Your final score will be the higher of your scores on these two parts.

Also, remember to submit your partner.txt (left unchanged if you did not work with a partner).

There is no style check for this lab.

Submission

You should be able to submit the same as always:

<git add/commit>
git tag lab6-0 # or the next highest submission number
git push
git push --tags