Integration testing with Spock


In chapter 6, you saw various techniques used to mock collaborator classes so that only the class under test affects the outcome of the tests. In some cases, however, you don’t want to mock collaborators but want to test multiple real classes together.

An integration test spans multiple Java classes (instead of just one) and examines the communication among them. Common scenarios that need integration tests are database mappings, security constraints, communication with external systems, and any other cases where testing is focused on a module rather than a single class.

(continued)

…can use your favorite Java testing libraries and learn new tricks with Groovy-based testing utilities. The testing tools I show here are my personal selection. I tried to find simple examples that anybody can understand. All the examples are contrived. If you want to learn more about integration testing for Java applications in general, you should consult books that focus on the specific technology of your application.

Figure 7.3 Source code contains three projects: REST testing, web testing, and integration testing

7.2.1 Testing a Spring application

To start, you'll look at a stand-alone application powered by a Swing user interface that manages the warehouse inventory of an e-shop (see figure 7.4).

Figure 7.4 A simple database application with a Swing user interface

The application is based on the Spring framework and saves all its data in a JPA/Hibernate database, which in this case is an HSQLDB file (see the HyperSQL website at http://hsqldb.org/ for more information on HSQLDB). The design of the application is straightforward. In the middle is the Spring dependency injection container, and all other classes revolve around it, as shown in figure 7.5.

Figure 7.5 The Spring context initializes all Java classes.

The Spring context that binds all other classes is an XML file defined in src/main/resources/spring-context.xml. You want to write a Spock test that examines the Hibernate mappings for the Product class. To achieve that, you need to test the whole chain of database loading and saving. The following classes should be tested:

■ The ProductLoader class, which is the DAO

■ The JPA entity manager that manages database mappings

■ The DataSource that provides access to the real database

The good news is that Spock already contains a Spring extension that instantly recognizes the @ContextConfiguration annotation provided by the Spring test facilities (as well as the @SpringApplicationConfiguration annotation from Spring Boot), as shown in the following listing.

Listing 7.1 Access Spring context from a Spock test

@ContextConfiguration(locations = "classpath:spring-context.xml")  // Marks the test with the Spring ContextConfiguration annotation
class RealDatabaseSpec extends spock.lang.Specification {

    @Autowired
    ProductLoader productLoader                                     // Wires a Spring bean as in normal production code

    @Sql("clear-db.sql")                                            // Spring facility to initialize a database
    def "Testing hibernate mapping of product class"() {
        given: "the creation of a new product"
        productLoader.createDefaultProduct()                        // Saves something to the database

        when: "we read back that product"
        List<Product> allProducts = productLoader.getAllProducts(); // Reads back again from the database

        then: "it should be present in the db"
        allProducts.size() == 1                                     // Verifies the database read

        and: "it should start with zero quantity"
        allProducts[0].getStock() == 0
    }
}


In this listing, you can see that all testing facilities and annotations are already offered by Spring. Spock automatically understands that this file uses a Spring context and allows you to obtain and use Spring beans (in this case, ProductLoader) as in normal Java code.

The important line here is the @ContextConfiguration annotation because it’s used by Spock to understand that this is a Spring-based integration test. Notice also that you use the Spring @Sql annotation, which allows you to run an SQL file before the test runs. This is already offered by Spring and works as expected in the Spock test.
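The contents of clear-db.sql aren't reproduced in the text. Judging by the PRODUCT table used later in this chapter, a minimal sketch of such a script could be as simple as the following (the exact statements are an assumption, not the book's actual file):

-- Hypothetical contents of clear-db.sql: wipe any leftover rows so the
-- feature method always starts from an empty PRODUCT table.
DELETE FROM PRODUCT;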

The resulting test is an integration test, because the real database is initialized, and a product is saved on it and then read back. Nothing is mocked here, so if your database is slow, this test will also run slowly.

A nice facility offered by Spring is the automatic rollback of database changes during a unit test, as shown in the following listing. This is an effective way to keep your unit tests completely independent from one another. Activating this behavior is (unsurprisingly) done by using standard Spring facilities that apply automatically, even in the case of a Spock test.

Listing 7.2 Rolling back database changes automatically

@ContextConfiguration(locations = "classpath:spring-context.xml")
@Transactional                        // Makes this test honor transactions
class RealDatabaseSpec extends spock.lang.Specification {

    @Autowired
    ProductLoader productLoader

    @Rollback                         // Database changes will be reverted once the test finishes.
    @Sql("clear-db.sql")
    def "Testing hibernate mapping of product class"() {
        [...code redacted for brevity...]
    }
}

The test code in this listing is exactly the same as in listing 7.1. I have only added two extra Spring annotations. The @Transactional annotation notifies Spring that this test will use database transactions. The @Rollback annotation instructs Spring to revert all database changes performed inside the Spock feature method after the test finishes. (Reverting all transactions is actually the default behavior of Spring; I show the @Rollback annotation for emphasis only.)

Options for Spring testing

The Spring framework contains a gazillion options when it comes to testing. Explaining them all is outside the scope of this book. You should consult the official Spring documentation (https://spring.io/docs). This chapter presents some techniques that prove that Spock and Spring play well together.


Even if your Spock test deletes or changes data in the database, these changes won’t be persisted at the end of the test suite. Again, this capability is offered by Spring, and Spock is completely oblivious to it.

In summary, Spock support for Spring tests is as easy as marking a test with the Spring test annotations. If you've written JUnit tests by using SpringJUnit4ClassRunner, you'll feel right at home.
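Getting there typically just needs the spock-spring module and the Spring test utilities on the test classpath. The following Gradle snippet is an illustrative sketch; the artifact coordinates are real, but the version numbers are assumptions that you should align with your own Spock and Spring versions.

// Illustrative test dependencies for Spring-backed Spock tests
dependencies {
    testCompile 'org.spockframework:spock-core:1.0-groovy-2.4'   // Spock itself
    testCompile 'org.spockframework:spock-spring:1.0-groovy-2.4' // the Spring extension
    testCompile 'org.springframework:spring-test:4.1.7.RELEASE'  // @ContextConfiguration, @Sql, and friends
}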

7.2.2 Narrowing down the Spring context inside Spock tests

If you’ve written Spring integration tests before, you should have noticed two serious flaws of the Spock tests shown in listings 7.1 and 7.2. Both tests use the same Spring context as the production code. The two flaws are as follows:

1 Tests use the same database as production code. This isn't desirable and sometimes not even possible because of security constraints.

2 The Spring context initializes all Java classes even though not all of them are used in the Spock test.

For example, in the Swing application, the Spock test also creates the Swing class for the GUI even though you never test the GUI. The Spock tests shown in listings 7.1 and 7.2 might not run easily in a headless machine (and build servers are typically headless machines).

The recommended way to solve these issues is to use a different Spring context for the tests. The production context contains all classes of the application, and the test context contains a reduced set of the classes tested. A second XML file is created, as shown in figure 7.6.

With the reduced context, you're free to redefine the beans that are active during the Spock test. Two common techniques are replacing the real database with a memory-based one and removing beans that aren't needed for the test. If you look at the contents of the reduced test context file, you'll see that I've removed the GUI class and replaced the file-based datasource with an in-memory H2 database (see www.h2database.com/html/main.html for more information about H2) with the following line:

<jdbc:embedded-database id="dataSource" type="H2"/>
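To give a feel for what such a file contains, here's a rough sketch of a reduced test context. Bean names and package names are assumptions, and the JPA entityManagerFactory and transactionManager beans are omitted for brevity; the actual file is linked from GitHub later in this section.

<!-- Hypothetical reduced-test-context.xml (simplified sketch) -->
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:jdbc="http://www.springframework.org/schema/jdbc"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/jdbc
           http://www.springframework.org/schema/jdbc/spring-jdbc.xsd">

    <!-- In-memory H2 database replaces the file-based HSQLDB datasource -->
    <jdbc:embedded-database id="dataSource" type="H2"/>

    <!-- Only the beans the tests need; the Swing MainWindow bean is gone -->
    <bean id="productLoader" class="com.example.eshop.ProductLoader"/>
</beans>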


Figure 7.6 Creating a second Spring context just for tests

The in-memory database is much faster than a real database, but it works for only small datasets (you can’t easily use it as a clone of a real database). Because unit tests use specific datasets (small in size) and also need to run fast, an in-memory database is a good candidate for DB testing.

The context for the Spock test is now simplified, as shown in figure 7.7.

Figure 7.7 Spring context for tests uses an in-memory database and no GUI classes.

To run the test, you inform Spring of the alternative context file. Spock automatically picks up the change, as shown in the following listing.

Listing 7.3 Using a reduced Spring context for unit testing

@ContextConfiguration(locations = "classpath:reduced-test-context.xml")  // Defines an alternative Spring context
@Transactional
class DummyDatabaseSpec extends spock.lang.Specification {

    @Autowired
    ProductLoader productLoader

    def "Testing hibernate mapping of product class - mem db"() {
        given: "the creation of a new product"
        productLoader.createDefaultProduct()                         // Data is written to an in-memory database.

        when: "we read back that product"
        List<Product> allProducts = productLoader.getAllProducts(); // Data is fetched from an in-memory database, making the test fast.

        then: "it should be present in the db"
        allProducts.size() == 1

        and: "it should start with zero quantity"
        allProducts[0].stock == 0
    }
}

You can find the reduced Spring context on GitHub at https://github.com/kkapelon/java-testing-with-spock/blob/master/chapter7/spring-standalone-swing/src/test/resources/reduced-test-context.xml. Because this test runs with an in-memory database, it's much faster than the original test shown in listing 7.2. Also, you removed the GUI class from the context, so this unit test can run in any Unix/Linux system in a shell environment (the typical case for build servers).

You need to examine your own application and decide what you’ll discard/replace in the test context. A good starting point is to remove all beans that aren’t used in your tests.

7.2.3 Directly accessing the database with Groovy SQL

At this point, you've seen that a Spock test has access to Spring beans without any special configuration (JUnit tests need the special SpringJUnit4ClassRunner in order to access Spring beans), and that you can use common Spring features as testing aids.

The additional advantage of Spock tests is that you also have all Groovy tools and libraries at your disposal. I introduced you to some essential Groovy facilities back in chapter 2, but you should spend some extra time exploring the full Groovy documentation to see what's available to help you while writing Spock tests for your application needs.

A handy Groovy feature not mentioned in chapter 2 (because it's mainly a nice-to-have feature) is the Groovy SQL interface. The Groovy SQL class is a thin abstraction over JDBC that allows you to access a database in a convenient way. You can think of it as a Spring JDBC template on steroids. (The takeaway of this section is that you can easily use Groovy libraries to do what you want; Groovy SQL is used only as an example. Details of the Groovy SQL interface are provided at http://docs.groovy-lang.org/latest/html/api/groovy/sql/Sql.html.)
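Besides executing statements, the same class can query data just as conveniently. The following is a small self-contained sketch, not code from the book's project: the connection URL, driver, and PRODUCT columns are assumptions, and it presumes the H2 driver is on the classpath.

import groovy.sql.Sql

// Open a connection to a throwaway in-memory database
Sql sql = Sql.newInstance("jdbc:h2:mem:scratch", "sa", "", "org.h2.Driver")
sql.execute("CREATE TABLE PRODUCT (id INT PRIMARY KEY, name VARCHAR(50), price INT)")
sql.execute("INSERT INTO PRODUCT (id, name, price) VALUES (1, 'bravia', 1200)")

def row = sql.firstRow("SELECT count(*) AS total FROM PRODUCT")   // single-row query
assert row.total == 1

sql.eachRow("SELECT name, price FROM PRODUCT WHERE price > ?", [300]) { r ->
    println "${r.name} costs ${r.price}"                          // iterate over matching rows
}
sql.close()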

Let’s assume that you want to verify that the DAO of the e-shop brings back all products in alphabetical order. You can initialize your database by using Groovy SQL, as shown in the next listing.

Listing 7.4 Using Groovy SQL to prepare the DB in a Spock test

@ContextConfiguration(locations = "classpath:reduced-test-context.xml")
class DummyDatabaseGroovySqlWriteSpec extends spock.lang.Specification {

    @Autowired
    DataSource dataSource            // Gets the underlying datasource from Spring

    @Autowired
    ProductLoader productLoader

    def "Testing ordering of products"() {
        given: "the creation of 3 new products"
        Sql sql = new Sql(dataSource)        // Groovy SQL creation over an existing datasource
        sql.execute("DELETE FROM PRODUCT")   // Clears the DB
        sql.execute("INSERT INTO PRODUCT (id,name,price,weight,stock) VALUES (1, 'samsung',400,1,45);")   // Inserts data directly on the database
        sql.execute("INSERT INTO PRODUCT (id,name,price,weight,stock) VALUES (2, 'bravia',1200,3,2);")
        sql.execute("INSERT INTO PRODUCT (id,name,price,weight,stock) VALUES (3, 'canon',500,5,23);")

        when: "we read back the products"
        List<Product> allProducts = productLoader.getAllProducts();

        then: "they should be ordered by name"
        allProducts.size() == 3
        allProducts[0].name == "bravia"
        allProducts[1].name == "canon"
        allProducts[2].name == "samsung"

        cleanup: "remove inserted data"
        sql.execute("DELETE FROM PRODUCT")   // Clean up so that other tests are unaffected.
        sql.close()                          // Always a good practice
    }
}

The Groovy SQL interface is a powerful feature. It supports all SQL statements you'd expect (schema creation/data writing/data querying), and explaining all its capabilities is beyond the scope of this book. It can be used both in production code and in Spock tests.

I tend to use it when I want to do something strange on the DB (perhaps re-create an error condition) that’s normally not possible via the DAOs of the application. Be careful when using it in your Spock tests, because as you’ve seen in listing 7.4, it gets direct access to the database, so it acts outside the caches of JPA/Hibernate.

Despite these shortcomings, it's a natural Groovy way to access the DB, and you'll find its code compact and comfortable. The last example can be further improved by extracting the common SQL statement into its own string, as shown in the next listing.

def "Testing ordering of products - improved"() { given: "the creation of 3 new products"

Sql sql = new Sql(dataSource) sql.execute("DELETE FROM PRODUCT")

String insertProduct = "INSERT INTO PRODUCT (id,name,price, weight,stock) VALUES (?, ?,?,?,?);"

sql.execute(insertProduct,[1, 'samsung',400,1,45]) sql.execute(insertProduct,[2, 'bravia',1200,3,2]) sql.execute(insertProduct,[3, 'canon',500,5,23]) when: "we read back the products"

List<Product> allProducts = productLoader.getAllProducts();

then: "they should be ordered by name"

allProducts.size() == 3 allProducts[0].name =="bravia"

allProducts[1].name =="canon"

allProducts[2].name =="samsung"

cleanup: "remove inserted data"

sql.execute("DELETE FROM PRODUCT") sql.close()

}


A final note regarding Groovy SQL is that if you use it in multiple test methods, it’s best to make it a @Shared field so that it’s created only once. Otherwise, performance of your unit tests will suffer.
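One possible shape for this is sketched below. It isn't from the book's source code: note that Spring injection happens on each test instance after setupSpec() has already run, so the shared Sql object is created lazily inside setup() instead.

import groovy.sql.Sql
import javax.sql.DataSource
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.test.context.ContextConfiguration
import spock.lang.Shared

@ContextConfiguration(locations = "classpath:reduced-test-context.xml")
class SharedSqlSpec extends spock.lang.Specification {

    @Autowired
    DataSource dataSource

    @Shared Sql sql                     // one Sql instance reused by all feature methods

    def setup() {
        if (sql == null) {
            sql = new Sql(dataSource)   // created only once, on the first feature method
        }
    }

    def cleanupSpec() {
        sql?.close()                    // release the connection when the spec ends
    }
}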

7.2.4 Integration testing with other containers (Java EE and Guice)

The example application in the previous sections was based on the Spring container because the Spring framework is mature and popular among Java developers. If you're not using Spring, chances are that your application is based on Java EE. In that case, the respective facilities offered by Spring in integration tests can be replicated by using Arquillian, a test framework for Java EE applications that acts as a testing container and allows access to EJBs, CDI injection (the Java spec for dependency injection), and other enterprise services.

Arquillian (http://arquillian.org/) natively supports JUnit tests, but for Spock tests, you need the Spock-Arquillian extension (https://github.com/arquillian/arquillian-testrunner-spock). The extension has its own repository and a different lifecycle than Spock releases. It works by creating a special runner that brings the Arquillian facilities inside the Spock test.

Apart from Spring, the core Spock distribution also includes support for the Guice dependency injection framework (https://github.com/google/guice). In a similar manner, it allows you to access Guice services/beans inside the Spock test.
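The Guice support works through the @UseModules annotation. Here's a minimal sketch (not from the book's project): ProductModule and ProductLoader stand in for your own Guice module and service, and the spock-guice module is assumed to be on the test classpath.

import spock.guice.UseModules
import javax.inject.Inject

@UseModules(ProductModule)              // bootstraps a Guice injector from the given module
class GuiceInjectionSpec extends spock.lang.Specification {

    @Inject
    ProductLoader productLoader         // injected by Guice, not instantiated manually

    def "loader starts empty"() {
        expect: "no products before anything is created"
        productLoader.getAllProducts().isEmpty()
    }
}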

If the dependency injection framework you use is something else (other than Spring, Guice, and Java CDI), and there isn’t a Spock extension for that by the time you’re reading this book, you have two choices:

■ Manually initialize and inject your services in the Spock setupSpec() method. (You can find sample code in ManualInjectionSpec on GitHub; see the second test method.)

■ Find a way to initialize the DI container programmatically inside the Spock test.

The first option isn’t practical because you have to write a lot of boilerplate code that’s usually not needed, going against the mentality of writing Spock tests in the first place (compact and readable tests).

The second way depends on the capabilities of the container you use and whether it supports declarative or programmatic configuration. As an example, assume that the Spock-Spring extension didn't exist. The Spring container can still be created programmatically, as shown in the next listing.

Listing 7.6 Manual Spring context creation

class ManualInjectionSpec extends spock.lang.Specification {

    def "Testing hibernate mapping of product class - mem db"() {
        given: "a product DAO"
        ApplicationContext ctx =
            new ClassPathXmlApplicationContext("reduced-test-context.xml");  // Creates a Spring context from an XML file
        ProductLoader productLoader = ctx.getBean(ProductLoader.class)       // Manually initializes a Spring bean

        when: "we read products from the DB"
        List<Product> allProducts = productLoader.getAllProducts();

        then: "the db is empty"
        allProducts.size() == 0
    }
}

This Spock test still has access to the Spring context because it creates one manually.

Notice the lack of any extra annotations in this listing. If your dependency injection framework supports programmatic initialization, you can still write Spock integration tests without needing a special extension.
