First Factory Software Testing Procedures

August 1, 2019

All clients have unique development needs and different budgets. In the best-case scenario, with a client who understands the need for thorough testing, our software testing procedures are as follows:

Automated Deployments

By automatically deploying software builds across different stages of system environments (Development, QA, Staging, Production), we perform system testing. This ensures that the software works within different environments and that the deployment has everything it needs to function properly in the final stage: production. Let's break it down by individual stage to give you a better sense of what this all means.

  • Development is where the developers do their work and their independent testing
  • QA is where a dedicated QA resource will test the build
  • Staging is where we perform a smoke test and where you perform your beta and acceptance testing
  • Production is where the live product resides

Developer Environment

In the development environment, the developer does their individual unit testing, the team does code reviews, and automated processes run regression tests.

Unit testing is the process of testing each component, or group of components, in isolation to ensure they behave as expected. For a given set of inputs, the component(s) should always return the same results. These tests are written in code, so the whole suite can be run and evaluated automatically at the push of a button. This allows us to do automated regression testing.
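As a sketch of what such a unit test looks like (the `calculate_discount` component here is a hypothetical example for illustration, not code from an actual project):

```python
import unittest


def calculate_discount(subtotal, discount_rate):
    """Hypothetical component under test: apply a percentage discount."""
    if not 0 <= discount_rate <= 1:
        raise ValueError("discount_rate must be between 0 and 1")
    return round(subtotal * (1 - discount_rate), 2)


class CalculateDiscountTest(unittest.TestCase):
    """For a given set of inputs, the component should always return the same results."""

    def test_applies_discount(self):
        self.assertEqual(calculate_discount(100.00, 0.25), 75.00)

    def test_zero_discount_returns_subtotal(self):
        self.assertEqual(calculate_discount(49.99, 0), 49.99)

    def test_rejects_invalid_rate(self):
        with self.assertRaises(ValueError):
            calculate_discount(100.00, 1.5)
```

Saved as `test_discount.py`, a single command (`python -m unittest`) re-runs every test like this in the project, which is what makes push-button regression testing possible.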

We strive for 80-90% code coverage, meaning that 80-90% of the code that we write can be automatically tested to make sure that it continues to behave as expected with each new piece of code that we introduce.

Code review means that every line of code that is delivered to the build is reviewed by a peer on the team. This second set of eyes validates that the code is well written, is clear, and does what the requirements specify. Doing code reviews in this fashion helps find bugs early in the process and produces higher-quality code.

After code review, comments are logged in the source control system and the developer makes the appropriate adjustments. Then we come to continuous integration.

Continuous integration is the process of automatically building and testing the software with every push of code to source control. Doing this ensures that the automated regression tests are run frequently, so bugs introduced by new code are identified immediately.
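At its core, a continuous-integration step just runs the full suite and fails the build on any error. A minimal sketch of that gate, using Python's `unittest` API programmatically (the test case here is a hypothetical stand-in for a real regression suite):

```python
import unittest


class SmokeTest(unittest.TestCase):
    """Stand-in for the project's real regression suite."""

    def test_addition_still_works(self):
        self.assertEqual(1 + 1, 2)


def run_ci_gate():
    """Run the suite and report whether the build should pass."""
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(SmokeTest)
    result = unittest.TextTestRunner(verbosity=0).run(suite)
    return result.wasSuccessful()


if __name__ == "__main__":
    # A real CI server would use this result to block or allow the merge.
    print("BUILD PASSED" if run_ci_gate() else "BUILD FAILED")
```

A CI server (Jenkins, GitHub Actions, and the like) wires a step like this to every push, so the suite runs without anyone having to remember to run it.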

Quality Assurance Environment

Using an Agile Scrum methodology, we strive to deliver workable software with each two-week Sprint. This workable software can now be tested as a whole (Integration Testing) and tested to ensure that it meets the original requirements (Functional Testing).

A dedicated QA resource (not a software developer) works with the requirements to write a detailed test plan for every feature of a project. As these features are delivered, the QA resource can manually execute each plan by following its steps.

It is possible to automate a lot of the functional and integration testing, but some level of manual testing is always required.
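As a sketch of what automating one of those manual test plans can look like — the shopping-cart feature and its steps here are hypothetical, but the pattern is the same: drive the feature through its public interface, step by step, exactly as the written plan does:

```python
import unittest


class ShoppingCart:
    """Hypothetical feature under functional test."""

    def __init__(self):
        self._items = {}

    def add(self, sku, qty=1):
        self._items[sku] = self._items.get(sku, 0) + qty

    def remove(self, sku):
        self._items.pop(sku, None)

    def total_items(self):
        return sum(self._items.values())


class CartFunctionalTest(unittest.TestCase):
    def test_plan_add_then_remove(self):
        # Step 1: start with an empty cart.
        cart = ShoppingCart()
        self.assertEqual(cart.total_items(), 0)
        # Step 2: add two units of one product and one of another.
        cart.add("SKU-1", qty=2)
        cart.add("SKU-2")
        self.assertEqual(cart.total_items(), 3)
        # Step 3: remove the first product; only the second should remain.
        cart.remove("SKU-1")
        self.assertEqual(cart.total_items(), 1)
```

Each numbered comment mirrors a step the QA resource would otherwise perform by hand, so the test plan and the automated test stay in sync.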

The QA resource then either passes the delivery or opens tickets with detailed failures for the developers to fix before the build is passed. If the build passes functional and integration testing, the build is moved to staging where it is prepped for delivery.

Staging Environment

This is a shared environment where we give the product one last look-over before handing it off to the client for beta and user acceptance testing. Before making any delivery to our clients, the quality assurance individual will do a manual smoke test to ensure major functionality is still working in the new environment that the software has been pushed to.

Early on with any project delivery, as an extra precaution, either the COO or the CEO will perform a smoke test to make sure that the project can be delivered to the client.

Once delivered to staging, it is the client’s responsibility to perform beta testing with a subset of their users, and ultimately user acceptance testing to sign off that the build is a success and should be pushed to production, where it will be used.

As the project becomes more mature, the frequency of deployments should be very regular and quick. Automated testing in prior stages should give us increased confidence that working functionality is not broken with each subsequent deployment.

Other Special Case Testing

In some cases, depending on the project, additional specific testing needs to be done.

  • Usability Testing: This is when we make sure that the product being developed is intuitive to the user. If it is not, we can make adjustments to the user interface so that the application is more usable. Early on, usability testing can be done with paper prototypes, clickable mockups, or a shell of the application. As the project progresses, usability comments should always be addressed in each revision of the software.
  • Vulnerability Testing: All public-facing applications will be tested for vulnerabilities such as SQL injection and other attacks that can potentially give an attacker access to data or the ability to corrupt it. Vulnerability testing can be done through automated test cases and through exploratory testing.
  • Stress & Performance Testing: For applications that can be used by large numbers of people over a short amount of time, we can automate stress and performance testing. This ensures the project can handle larger-than-expected usage.
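As an illustration of the SQL-injection class of vulnerability mentioned above — a hypothetical login lookup against an in-memory SQLite database, showing how a query built from string concatenation is exploitable where a parameterized one is not:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

malicious = "' OR '1'='1"  # classic injection payload supplied as a "password"

# Vulnerable: user input concatenated straight into the SQL text.
unsafe = conn.execute(
    "SELECT name FROM users WHERE name = 'alice' AND password = '" + malicious + "'"
).fetchall()

# Safe: the driver binds the input as data, never as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ? AND password = ?", ("alice", malicious)
).fetchall()

print(unsafe)  # the injected predicate matches every row: [('alice',)]
print(safe)    # the payload is treated as a literal password: []
```

An automated vulnerability test simply asserts the second behavior: known payloads fed through the application's inputs must never change which rows a query returns.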
