A
-
Artifacts – Files created or used during testing, such as logs, images, and documents.
-
Automation Testing – Testing performed using software tools that execute tests automatically and verify that actual results match expected results.
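The core of any automated check can be sketched in a few lines; the `add` function and `run_check` helper below are hypothetical stand-ins for an application behavior and the comparison a test tool performs.

```python
# Minimal sketch of an automated check (all names here are hypothetical):
# a tool executes a step and compares the actual result with the expected one.
def add(a, b):
    return a + b

def run_check(actual, expected):
    # The automated check passes only when actual and expected match.
    return "PASS" if actual == expected else "FAIL"
```

For example, `run_check(add(2, 3), 5)` reports `PASS`, while `run_check(add(2, 2), 5)` reports `FAIL`.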
-
Accessibility – Ensuring that an application is usable by people with disabilities (color blindness, hearing impairment, etc.).
-
Alpha Testing – A type of testing performed to identify bugs before the product is released to real users.
-
Ad-hoc Testing – Unplanned, random testing performed to explore and break the application without formal test planning or documentation.
-
Agile methodology – An incremental approach to managing a project by dividing it into multiple phases (iterations).
-
API Testing – A type of testing performed on an application's programming interfaces (APIs) to validate that they meet expectations.
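An API test typically checks the status code and the shape of the response. In this sketch, `fetch_user` is a hypothetical stand-in for a real HTTP call (e.g. made with an HTTP client library); it returns a canned response so the example stays self-contained.

```python
# Hypothetical stand-in for a real HTTP call; returns a canned response.
def fetch_user(user_id):
    return {"status": 200, "body": {"id": user_id, "name": "Alice"}}

def check_user_api(user_id):
    response = fetch_user(user_id)
    # An API test validates the status code and the response shape.
    return (response["status"] == 200
            and response["body"]["id"] == user_id
            and "name" in response["body"])
```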
B
-
Bug – A defect identified by a tester during testing and accepted as valid by the development team.
-
Beta testing – A type of testing performed on an application by real users for end-user validation.
-
Black box testing – A type of testing that is performed without access to the source code of the application.
C
-
Crash logs – Logs generated when a crash occurs in the native application due to a code failure.
-
Compatibility testing – A type of testing performed to verify if the application is capable of running on various platforms, devices, browsers etc.
-
Crowd Testing – An outsourced testing model in which a group of voluntary users (testers) with relevant skill sets is given access to a product or service to test it.
D
-
Defect – The deviation between the actual and expected output.
-
Debugging – The process of identifying and correcting the existing errors in an application under test.
-
Deliverables – All the items to be delivered as part of the project per the contract, including agreed changes.
E
-
Exploratory Testing – A type of testing in which the tester explores an application to identify and document defects.
-
Error – A mistake in coding.
F
-
Functional Testing – Testing performed to verify that the application's features behave according to the functional requirements.
-
Failure – The application's inability to meet the expected requirements during execution.
-
Fault – A condition that causes a system or software failure.
G
-
Guided Exploratory – Exploratory testing guided by reference documents, used to assess the functionality of the features that are in scope for testing.
-
Gray box testing – A combination of black box and white box testing, used to verify how inputs affect the application's outputs.
H
-
High level test scenarios – A document that consists of high-level, business-critical scenarios to execute for the application under test.
-
Human Error – A mistake resulting from insufficient knowledge of the application, lack of skill, incorrect installation, or negligence.
I
-
Issue – Error or flaw that is observed in an application that needs to be resolved.
-
Integration Testing – Testing the application end to end after combining the units or components.
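Integration testing can be illustrated with two hypothetical units that are combined and then verified as a whole; the function names below are illustrative, not from any real codebase.

```python
# Two units that may each pass their own unit tests...
def parse_price(text):            # unit 1: price string -> cents
    return int(float(text) * 100)

def apply_discount(cents, pct):   # unit 2: discount in whole percent
    return cents - cents * pct // 100

# ...are combined, and the integrated flow is verified end to end.
def checkout(price_text, discount_pct):
    return apply_discount(parse_price(price_text), discount_pct)
```

An integration test would then assert, for example, that `checkout("10.00", 10)` yields `900` cents.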
J
-
Jira – A project management tool for managing projects across multiple teams.
K
-
Known Error – An error that is already known and for which a temporary or permanent workaround has been defined.
-
Kick-off Meeting – A meeting conducted before the start of a project to determine and finalize its scope, goals, and objectives.
L
-
Localization Testing – A type of testing designed for a specific locale, to verify the application's functional support for that locale (language, region, formats, etc.).
-
Load Testing – Testing an application under heavy load to verify that it can handle a large amount of data or traffic.
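A very small load-test sketch can be built with a thread pool that fires many concurrent requests at an operation and measures the total time; `handle_request` below is a hypothetical stand-in for the real operation under load.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(n):
    # Hypothetical stand-in for an application operation under load.
    time.sleep(0.001)
    return n * 2

def load_test(num_requests=50, workers=10):
    # Fire num_requests concurrent calls and measure the elapsed time.
    start = time.time()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(handle_request, range(num_requests)))
    elapsed = time.time() - start
    return len(results), elapsed
```

A real load test would also track error rates and response-time percentiles, not just totals.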
M
-
Multi Device Testing – Testing an application on multiple devices to verify the behavior on different screens, platforms, and operating systems.
-
Manual Testing – A testing type in which testers execute test cases manually, without automation tools.
-
Mobile application testing – Testing an application on a mobile device, to verify its usability and functionality.
N
-
Non-Functional Testing – Testing the application for its non-functional requirements like the behavior of the system, performance, and reliability.
-
Negative Testing – Testing an application with invalid, random, or unexpected inputs to verify that it handles them gracefully.
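A negative test passes when invalid input is rejected rather than silently accepted. The `set_age` validator below is a hypothetical example of application code under test.

```python
def set_age(age):
    # Hypothetical application code that validates its input.
    if not isinstance(age, int) or age < 0 or age > 150:
        raise ValueError("invalid age")
    return age

def negative_test(value):
    # The negative test passes when the invalid input is rejected.
    try:
        set_age(value)
        return "not rejected"
    except ValueError:
        return "rejected"
```

Here `negative_test(-5)` and `negative_test("ten")` both report `rejected`, while a valid input like `30` is `not rejected`.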
O
-
O-primes – India’s largest UX and quality assurance SaaS platform.
-
Out of Scope – The features or issues that are not included in the current release / build for testing.
-
Outcome – The result produced after a test has been executed.
P
-
Priority – An indicator of how soon a defect should be fixed; higher-priority issues are resolved first.
-
Performance testing – A testing type used to determine whether the application meets performance requirements such as page load time and responsiveness.
-
POC – Proof of Concept: a document or demonstration in which an idea, concept, or solution is showcased as proof of feasibility.
-
Production Environment – The live environment in which real users access the application, as opposed to the development environment. It's also called "live".
Q
-
Quality Assurance – A process to verify that the software or application meets the required level of quality.
-
Quality Control – A technique used to verify that the developed product meets the quality requirements.
R
-
Regression Testing – A testing type to verify that the new functionality is not affecting the current features of the application.
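In practice, regression testing means re-running an existing suite of checks after every change. This sketch uses a hypothetical `slugify` feature and a small table of previously passing cases.

```python
# Hypothetical existing feature that must keep working after changes.
def slugify(title):
    return title.strip().lower().replace(" ", "-")

# Previously passing cases, re-run after each change to catch regressions.
REGRESSION_SUITE = [
    ("Hello World", "hello-world"),
    ("  Trim Me ", "trim-me"),
]

def run_regression():
    return all(slugify(inp) == out for inp, out in REGRESSION_SUITE)
```

If a later change breaks trimming or lowercasing, `run_regression()` flips to `False` and flags the regression.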
-
Release – A version of an application that has been tested completely and is ready to be delivered to customers.
-
Retest – Testing a particular functionality again, after a defect has been fixed, to verify its correctness.
S
-
Sanity Testing – A type of testing performed to verify if the major functionality of the application is working as expected.
-
Scope of Testing – The features or functionalities that are included in the current test build or project.
-
Severity – An indicator of the impact an issue has on the application.
T
-
Test Case – A document containing a defined set of steps to be executed to verify the flow or features of the application while testing.
-
Test Scenario – A document which describes the details of objectives that a user must verify during testing.
-
Test Coverage – A measure of how much of the application and its target environments (devices, operating systems, OEMs, platforms, browsers, etc.) a test cycle has exercised.
-
Test Data – The set of inputs used during testing an application.
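Test data is often kept as a table of inputs and expected outputs that the same test steps iterate over (data-driven testing). The `is_valid_email` check below is a simplified, hypothetical example.

```python
# Hypothetical function under test: a deliberately simple email check.
def is_valid_email(s):
    return "@" in s and "." in s.split("@")[-1]

# Test data: pairs of input and expected result, separate from the steps.
TEST_DATA = [
    ("user@example.com", True),
    ("no-at-sign", False),
    ("user@nodot", False),
]

# The same test steps run once per row of test data.
results = [is_valid_email(s) == expected for s, expected in TEST_DATA]
```

Keeping the data separate from the steps makes it easy to add cases without touching the test logic.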
U
-
Unit Testing – A type of testing which is used to verify the smallest modules of an application, individually.
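A unit test exercises one small module in isolation. This sketch uses Python's standard `unittest` framework on a hypothetical `word_count` function.

```python
import unittest

def word_count(text):
    # Smallest testable unit: counts whitespace-separated words.
    return len(text.split())

class WordCountTest(unittest.TestCase):
    def test_simple_sentence(self):
        self.assertEqual(word_count("unit tests are fast"), 4)

    def test_empty_string(self):
        self.assertEqual(word_count(""), 0)
```

Running the module with `python -m unittest` discovers and executes both cases.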
-
Use Case – A document describing, in sequence, the steps a user performs to accomplish a goal with the system.
-
Usability Testing – A testing type used to verify how easy and intuitive the application is to use.
-
User Acceptance testing – A testing type performed by end users before delivery, to confirm the application meets their requirements.
V
-
Verification – A process of checking that the product is being built correctly (it conforms to its specifications).
-
Validation – A process of checking that the correct product has been built (it meets the users' needs).
-
Volume Testing – Testing to verify how an application behaves with a large volume of data.
W
-
Walkthrough – A team meeting in which the design and requirements are presented to team members for understanding.
-
White box testing – A type of testing that is performed with access to the source code of the application.
-
Web application testing – Testing an application in web browsers to verify its usability and functionality.