People, suggest a good solution. There is a data-validation layer, the application's business logic, and a layer that works with the database. Some rules impose restrictions (for example, certain entities have two identifiers [a Guid and a string of a fixed length], and validation must ensure neither is duplicated). When writing tests, we need cases where validation fails (because such an object already exists in the database) and cases where it passes (because it does not exist yet). And here the problem arises: we do not know what is and is not in the test database. A possible solution is to pre-fill the database with known entities and always work with them, but I do not like this approach, because when writing a test you have to remember what is already in the database (and, worse, other tests may create new entities). I would rather have each test create the N objects convenient for that test and work with them. With an ordinary database this does not work: the only way I see is to clean the database in every test, and that is slow. What comes to mind is creating mock database objects that can be cleaned or recreated in each test, but that does not look like an elegant solution either. So how do I solve this elegantly? P.S. C#, tests built into Visual Studio, Entity Framework.
- That's right, you need to use mocks. Initialize them before each test. - Alexander Petrov
2 answers
Your unit tests should not connect to the database at all. You should test only the class responsible for validation; all external dependencies of that class should be replaced by mocks. A mock pretends to connect to the database, but in reality it connects nowhere and simply returns whatever values suit the test.
For example, one test. Let's say you check that no two elements in the database have the same id. Then the test should look something like this:
- create a mock of the class that accesses the database
- configure the required method of this mock so that it returns obviously invalid data
- pass this mock to the validator and make sure that validation fails
The code looks roughly like this (the example uses NUnit and Moq):
```csharp
[Test]
public void Test1()
{
    // arrange
    var dbRepositoryMock = new Mock<IDbRepository>();
    dbRepositoryMock
        .Setup(x => x.GetItems())
        .Returns(new[] { new Item { Id = 1 }, new Item { Id = 1 } });
    var validator = new Validator();

    // act
    var validationResult = validator.ValidateItemsUnique(dbRepositoryMock.Object);

    // assert
    Assert.IsFalse(validationResult);
}
```

So the idea is this: you create a real instance ONLY of the class under test. In our case, the tested class is the validator. For all dependencies you create mocks, and on those mocks you configure ONLY the methods used by the tested functionality; simply ignore all the others. For example, your repository may have 20 more methods returning different entities from the database, but if you are checking the uniqueness of Item, you set up only the return value of the GetItems() method and ignore everything else.
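For completeness, the class under test might look like the following. This is only a minimal sketch: the `IDbRepository`, `Item`, and `Validator` names are taken from the test above, and your real implementation will certainly differ:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical repository interface matching the mock in the test above.
public interface IDbRepository
{
    IEnumerable<Item> GetItems();
}

// Hypothetical entity with a single numeric id, as in the test.
public class Item
{
    public int Id { get; set; }
}

public class Validator
{
    // Returns true when every item id occurs exactly once.
    public bool ValidateItemsUnique(IDbRepository repository)
    {
        var items = repository.GetItems().ToList();
        return items.Select(i => i.Id).Distinct().Count() == items.Count;
    }
}
```

Note that the validator depends only on the `IDbRepository` interface, never on a concrete database class; this is exactly what makes it possible to hand it a mock in the test.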
Forget about connecting to the database; your unit tests should run without it. If you find it difficult to write code in this style, without a real database connection, that means the code is poorly suited for unit testing and needs refactoring.
Trying to write unit tests against a real database is a typical mistake for people new to unit testing. I tried to do it too, and I have watched others try the same. This mistake indicates that you do not yet fully understand the point of unit testing, and it is worth spending time on a book on the topic. Kent Beck's "Test-Driven Development: By Example" would be an excellent choice.
Strictly speaking, you are not writing unit tests, but rather integration tests.
If you are not satisfied with the answer "tests against the database are bad, avoid them if possible", the easiest way to solve the cleanup problem in integration tests is to roll back a transaction in every test:
- create a clean database once at the start of the test run (using DropCreateDatabaseAlways, or manually with a static flag)
- create a new TransactionScope in TestInitialize
- dispose of this TransactionScope in TestCleanup without calling Complete
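The steps above can be sketched as a base class for the tests. This is only an outline: the class and method names are placeholders, and in a real MSTest project the methods would carry the `[TestInitialize]` and `[TestCleanup]` attributes indicated in the comments:

```csharp
using System.Transactions;

// Base class for database integration tests. Each test runs inside a
// TransactionScope that is disposed WITHOUT Complete(), so every change
// the test made to the database is rolled back automatically.
public class DatabaseTestBase
{
    private TransactionScope _scope;

    // In MSTest, mark this with [TestInitialize].
    public void SetUp()
    {
        _scope = new TransactionScope();
    }

    // In MSTest, mark this with [TestCleanup].
    public void TearDown()
    {
        // Dispose without calling Complete() => the transaction is
        // rolled back, undoing the test's inserts and updates.
        _scope.Dispose();
    }
}
```

Any Entity Framework context opened inside a test method then automatically enlists in the ambient transaction, so the test code itself needs no changes.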
It will not be instant; the overhead is roughly 100-200 ms per test, but for existing code (already written without tests) this is the fastest option.
If you plan to refactor the code toward better testability with mocks, it is best to have tests for the existing code ready before the refactoring starts, even if those tests are slow and make real queries to the database. Only Chuck Norris can refactor code without test coverage. For everyone else it means slipped deadlines, crowds of new bugs, and "I couldn't care less about your tests and refactoring" from the nearest non-technical boss up the hierarchy.
By the way, your validation method is unreliable. Nothing prevents another thread from inserting a duplicate into the database between the check that returns "all ok" and your own insert. Such checks still need to be duplicated with constraints at the database level, such as a unique index.
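Such a constraint can be sketched with EF6's model configuration. This is only an illustration under assumed names: `MyEntity`, `Code`, the length of 32, and the index name are all placeholders for whatever your schema actually uses:

```csharp
using System;
using System.ComponentModel.DataAnnotations.Schema;
using System.Data.Entity;
using System.Data.Entity.Infrastructure.Annotations;

// Hypothetical entity with the two identifiers from the question.
public class MyEntity
{
    public Guid Id { get; set; }      // primary key, unique by definition
    public string Code { get; set; }  // the fixed-length string identifier
}

public class MyContext : DbContext
{
    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Unique index on the string identifier: now the database itself
        // rejects duplicates, even if two threads pass validation at once.
        modelBuilder.Entity<MyEntity>()
            .Property(e => e.Code)
            .HasMaxLength(32)
            .HasColumnAnnotation(
                IndexAnnotation.AnnotationName,
                new IndexAnnotation(new IndexAttribute("IX_MyEntity_Code") { IsUnique = true }));
    }
}
```

With the constraint in place, the second of two racing inserts fails with a DbUpdateException, which the application can catch and report; the validation layer then remains a convenience for the user, not the only line of defense.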