imo it’s the SE’s job to write up the base test cases and the SDET’s job to expand on them and make them more comprehensive. SDETs with enough domain knowledge of the platform should be able to create comprehensive test plans for e2e testing.
The QAE/SDET should familiarize themselves with the product the team is building. They can do so by talking with a product manager and having them walk through all of the product's functionality. Once they have had a chance to use the product, they can come up with a test plan that covers all the test cases and what should be tested; it could be different parts of a webpage or application. Whenever engineers build a new feature, what they should hand to the SDET is the new functionality of that feature. The SDET should then take that and integrate it into the test plan.
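A test plan like the one described above can be kept as structured data so it's easy to extend when engineers hand over new functionality. Here is a minimal sketch; the "signup" feature, case IDs, and fields are invented for illustration:

```javascript
// Illustrative test plan for a hypothetical "signup" feature.
// All names, IDs, and priorities are made up for the example.
const testPlan = [
  { id: 'TC-1', area: 'signup form', desc: 'valid email creates account', automated: true,  priority: 'high' },
  { id: 'TC-2', area: 'signup form', desc: 'invalid email shows error',   automated: true,  priority: 'high' },
  { id: 'TC-3', area: 'signup form', desc: 'password strength meter',     automated: false, priority: 'low'  },
];

// When engineers hand over a new feature, fold its cases into the plan.
function addFeatureCases(plan, newCases) {
  return plan.concat(newCases);
}

// Pick the cases worth automating first: high priority, not yet automated.
function automationBacklog(plan) {
  return plan.filter((tc) => tc.priority === 'high' && !tc.automated);
}

const updated = addFeatureCases(testPlan, [
  { id: 'TC-4', area: 'signup form', desc: 'duplicate email rejected', automated: false, priority: 'high' },
]);
console.log(automationBacklog(updated).map((tc) => tc.id)); // [ 'TC-4' ]
```

The point is less the code than the habit: every new feature adds rows, and the backlog query tells the SDET what to automate next.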
Now they should have a comprehensive test plan with all the test cases for that product. If they are planning to create automated e2e tests, having that test plan helps tremendously. However, you have to think about which framework is right for your requirements. Cypress is great for quick UI/API tests, but it falls short on cross-browser testing. So create a POC, compare the pros and cons of the different frameworks, and see which one is the best fit for the team.
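One lightweight way to run that pros-vs-cons comparison is to score each candidate framework against weighted requirements. The criteria, weights, and scores below are entirely made up; a real POC would fill them in from hands-on trials:

```javascript
// Illustrative POC scorecard. Criteria, weights, and per-framework
// scores (1-5) are invented for the example, not real benchmarks.
const criteria = { crossBrowser: 3, apiTesting: 2, speedOfAuthoring: 2, ciIntegration: 1 };

const candidates = {
  Cypress:    { crossBrowser: 2, apiTesting: 5, speedOfAuthoring: 5, ciIntegration: 4 },
  Playwright: { crossBrowser: 5, apiTesting: 4, speedOfAuthoring: 4, ciIntegration: 4 },
};

// Sum each criterion's score times its weight.
function weightedScore(scores) {
  return Object.entries(criteria)
    .reduce((total, [name, weight]) => total + weight * scores[name], 0);
}

// Rank candidates by weighted score and return the top one.
function bestFit(options) {
  return Object.entries(options)
    .map(([name, scores]) => [name, weightedScore(scores)])
    .sort((a, b) => b[1] - a[1])[0][0];
}

console.log(bestFit(candidates)); // 'Playwright' with these made-up weights
```

If cross-browser coverage matters to your team, weight it heavily; if fast authoring matters more, a different framework may come out on top. The scorecard just makes the trade-off explicit.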
Lastly, having a QA process is a must if the SDET wants to succeed. Without a proper process the SDET will just be testing things blindly.
You could take it another step back and have the product manager provide acceptance criteria. That'll help both SWE and QA fulfill their end.
Also, automation is key.
We do this too!
But inevitably, we miss something and a customer reports a bug. I'd like to try capturing that knowledge somehow to prevent it from happening again - I suppose automation is the answer for that 😕
Automate the full regression suite. Run it every day or every other day. Keep pushing new functionality and get it tested against the acceptance criteria in the story. Proper requirements and a good automated regression suite help a lot.
This is a good rule of thumb. Once the automated tests for a given piece of functionality are done, they will let you know when something breaks. Make sure you add the additional test cases to the scope as well so you don't skip anything. Also, once an issue is found and fixed, it's a good idea to cover that scenario in the relevant test case and automate it if needed. This is how it works, folks!
It depends on your application. If it's a backend web app, then functional specs and API docs should be all that QA needs to perform edge case analysis, etc.
If it's a UI application, you should already have product specs and docs on how to use it. QA should be trained (ideally self-trained) to understand the app, and should be able to do proper testing themselves.
It is not the developer's job to baby-bird QAs. In fact, it's a horrible idea from every angle.
The main value in having a QA is for someone who didn't write the code, who won't make the same assumptions as the dev, to hammer away at it.
If I just write the test cases and QA adds a couple, there's a greater chance of misses.
Otherwise, QA ends up not understanding the application, which benefits nobody.
Source: I've worked in every kind of QA environment.