Navigating Common Misconceptions of Transitioning from Manual to Automation

Katalon
9 Aug 2024 · 11:32

Summary

TL;DR: In this discussion, the speakers reflect on the early challenges and misconceptions of test automation in the late 1990s: skepticism about its value, the assumption that automation could solve everything, and the belief that automation was a one-time effort. They share how they proved the financial benefits of automation, learned to write maintainable and reusable scripts, and came to understand that ongoing human oversight is crucial for successful automation. The conversation highlights how their understanding of, and strategic approach to, automation evolved over time.

Takeaways

  • 🔍 Early automation efforts faced skepticism, with doubts about its value compared to manual testing.
  • ⏳ One major concern was the time investment required for automation, which many believed wouldn't pay off in the long run.
  • 📊 Proving the financial benefits of automation was crucial in gaining buy-in from management, showing significant time and cost savings.
  • 🧩 A key lesson learned was that automation isn't a one-size-fits-all solution; not everything should or can be automated effectively.
  • 🔍 Early misconceptions included the belief that automation could replace all manual testing, which proved to be overly optimistic.
  • 🛠 Developing a strategy for automation was essential, particularly in creating smaller, reusable test scripts to ease maintenance.
  • ⚙️ Maintaining automation scripts is an ongoing effort; changes in the application can require frequent updates to the automation code.
  • 🚦 It’s important to have a dedicated testing environment for automation to avoid interference from other ongoing testing activities.
  • 💡 Waiting to automate areas of an application that are still undergoing frequent changes can prevent unnecessary rework and maintenance.
  • 🔄 Automation requires ongoing human oversight, particularly in maintaining and updating scripts as the application evolves.

Q & A

  • What was one of the main fears expressed by the speakers when they started with automation in the late 90s?

    -One of the main fears was that automation would take too much time initially, and the upfront work required would not be worth the effort on the back end.

  • How did the speaker demonstrate the value of automation to their company?

    -The speaker demonstrated the value by showing that automation reduced the time needed for a quarterly release from a week to three days, which also decreased the cost significantly.

  • What misconception did people have about automation that the speaker had to overcome?

    -People believed that automation would not provide any real value and that the upfront effort required to automate would not pay off.

  • How did the speaker prove to their company that automation was worth the investment?

    -The speaker showed that by automating a significant portion of a large Excel spreadsheet, they were able to reduce the time and manpower required for a quarterly release, proving the financial benefits of automation.

  • What was the initial approach to writing automated tests according to the speakers?

    -Initially, the speakers created large tests with many conditions, which made it difficult to identify which part of the application was failing when tests did not pass.

  • What did the speakers learn about test strategy as they began to automate?

    -They learned that each automated test should result in one outcome for one functionality, leading to the adoption of atomic tests and a more strategic approach to test automation.

  • Why did the speakers emphasize the importance of having a separate environment for automation?

    -Having a separate environment ensured that tests were not broken by changes made by others in the shared testing environment, which was crucial for maintaining the reliability of automated tests.

  • What was a key lesson the speakers learned about the maintenance aspect of automation?

    -They learned that automation scripts need to be written in a way that allows for easy maintenance, such as creating reusable test chunks that can be updated in one place rather than in multiple tests.

  • What misconception did the speakers address regarding the ability to automate everything?

    -They addressed the misconception that automation can replace all manual testing, learning that it's more about having a strategic approach to what should be automated and what should remain manual.

  • How did the speakers approach the issue of changes in the application affecting automated tests?

    -They learned to create smaller, more focused tests and to wait to automate areas of the code that were likely to change, reducing the need for constant script maintenance.

  • What advice would the speakers give to someone new to automation based on their experiences?

    -They would advise focusing on strategic test automation, maintaining separate environments for testing, and writing maintainable scripts with reusable components to handle changes in the application.

Outlines

00:00

😨 Overcoming Early Fears and Misconceptions in Automation

The speaker reflects on the early challenges and misconceptions surrounding automation in the late 90s, particularly the skepticism about its value and the belief that it would require too much upfront time and effort. They share how they were able to demonstrate the financial benefits and time savings of automation by reducing the time required for quarterly releases from a week to just three days. This success helped convince the company of the value of automation, despite initial resistance. The speaker emphasizes the importance of proving both the effectiveness and economic sense of automation to gain managerial buy-in.

05:01

🤖 Realizing the Limits of Automation

The speaker discusses the initial misconception that everything could be automated, leading to overly complex and error-prone tests. They learned that effective automation requires a well-thought-out test strategy, where each test should be focused on a single functionality. Collaboration with manual testers and test leads became crucial to determine which areas of an application were worth automating. The speaker also mentions the need for a separate, stable environment for automation to avoid interference from ongoing development, highlighting the importance of smaller, reusable test components to facilitate easier maintenance.

10:02

🛠 The Ongoing Need for Maintenance in Automation

The speaker categorizes three major misconceptions about automation: the initial belief that automation wouldn't work, the assumption that automation could solve everything, and the idea that once automated, no further attention would be needed. They emphasize that automation requires continuous human oversight, particularly in maintaining and updating tests as the application evolves. The speaker advises waiting to automate areas of code that are still undergoing significant changes to avoid excessive rework. This underscores the balance needed between automating efficiently and ensuring that the automated scripts remain reliable over time.


Keywords

💡Automation

Automation refers to the use of technology to perform tasks that were previously done manually. In the context of the video, automation is discussed as a transformative approach in testing processes, particularly in software development. The speakers mention initial skepticism about its value, challenges in implementation, and the realization that automation is not a one-time effort but requires ongoing maintenance and strategic planning.

💡Manual Testing

Manual testing involves human testers executing test cases without the help of automation tools. The video contrasts this traditional approach with automation, highlighting the initial reliance on manual testing and the industry's eventual shift towards automation. The transition from manual to automated testing is a central theme, reflecting broader changes in the software development process.

💡Misconceptions

Misconceptions refer to incorrect beliefs or assumptions. The video discusses several misconceptions about automation, including the idea that it wouldn't be valuable, that it could automate everything, and that it was a 'one and done' process. These misconceptions were barriers to adopting automation and were only overcome through experience and demonstration of automation's long-term benefits.

💡Test Strategy

A test strategy is a plan that outlines the testing approach, resources, schedule, and scope. In the video, the speakers emphasize the importance of developing a solid test strategy when implementing automation. They learned that effective automation requires careful planning to ensure that each test focuses on a specific functionality and that tests are maintainable over time.
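
To make the "one result for one functionality" idea concrete, here is a minimal pytest-style sketch; the App class, its methods, and the fixture are hypothetical stand-ins, not anything shown in the video.

    import pytest

    class App:
        """Hypothetical stand-in for the application under test."""
        def login(self, user, password):
            return user == "qa" and password == "secret"

        def search(self, term):
            return [term] if term else []

    @pytest.fixture
    def app():
        return App()

    def test_login_succeeds_with_valid_credentials(app):
        # one functionality, one result: a failure here points straight at login
        assert app.login("qa", "secret") is True

    def test_search_returns_results_for_a_nonempty_term(app):
        # kept separate so a search bug never hides behind a login failure
        assert app.search("release") == ["release"]

Because each test asserts a single behavior, a failing run names the broken functionality directly instead of requiring the test itself to be triaged.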

💡Maintenance

Maintenance in the context of automation refers to the ongoing effort required to keep automated tests functional as the application evolves. The video highlights that automation is not a set-and-forget solution; instead, it requires continuous updates to accommodate changes in the application. Proper maintenance is crucial for the long-term success of automation efforts.

💡Regression Testing

Regression testing is a type of testing that ensures that new changes or updates to an application do not negatively affect existing functionality. The video mentions the importance of automating tests in areas that are likely to change to prevent regression issues. This type of testing is a key reason for adopting automation, as it helps maintain software quality over time.
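
One common way to keep regression coverage runnable on its own, offered here as a general sketch rather than something the speakers describe, is to tag tests with a marker and select them at run time, for example with "pytest -m regression"; the marker name is hypothetical.

    import pytest

    # "regression" is a hypothetical marker; registering it under "markers ="
    # in pytest.ini keeps pytest from warning that it is unknown.

    @pytest.mark.regression
    def test_existing_admin_settings_still_save():
        # placeholder assertion standing in for a check on existing behavior
        assert True

    def test_new_story_not_yet_in_regression_scope():
        # newer, still-changing functionality stays out of the regression run
        assert True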

💡Reusable Test Scripts

Reusable test scripts are pieces of code or tests that can be used across multiple test cases or scenarios. The video explains that creating reusable scripts helps reduce maintenance effort by allowing a single change to propagate across all relevant tests. This concept is central to efficient and scalable automation practices.
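
A minimal sketch of the "change it once" idea, assuming a hypothetical FakeApp driver and login_flow helper (neither comes from the video): every test calls the same helper, so a change to the login flow is fixed in one place instead of in 15 separate tests.

    class FakeApp:
        """Stub standing in for a real UI driver (e.g. Selenium or Katalon)."""
        def __init__(self):
            self.logged_in = False
            self.page = "/"
        def open(self, path):
            self.page = path
        def submit_login(self, user, password):
            self.logged_in = (user == "qa" and password == "secret")

    def login_flow(app, user="qa", password="secret"):
        # if the login screen changes, only this helper needs updating
        app.open("/login")
        app.submit_login(user, password)

    def test_dashboard_reachable_after_login():
        app = FakeApp()
        login_flow(app)
        app.open("/dashboard")
        assert app.logged_in and app.page == "/dashboard"

    def test_settings_reachable_after_login():
        app = FakeApp()
        login_flow(app)
        app.open("/settings")
        assert app.logged_in and app.page == "/settings"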

💡Financial Impact

The financial impact refers to the cost savings or economic benefits of implementing automation. In the video, one speaker describes how they demonstrated the value of automation by showing the company how it reduced the time and resources required for testing, leading to significant cost savings. This financial argument was crucial in gaining management's support for automation.
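
The back-of-the-envelope argument can be sketched in a few lines; every figure below is a hypothetical placeholder except the reduction from a week to three days and the two remaining testers, which come from the discussion.

    # Hypothetical release-cost comparison for one quarterly release.
    people_before, days_before = 10, 5      # assumed size of the "all hands" week
    people_after, days_after = 2, 3         # two testers for three days
    daily_cost_per_person = 400             # assumed loaded daily cost, in dollars

    cost_before = people_before * days_before * daily_cost_per_person   # 20000
    cost_after = people_after * days_after * daily_cost_per_person      # 2400

    print("cost before automation:", cost_before)
    print("cost after automation: ", cost_after)
    print("savings per quarterly release:", cost_before - cost_after)   # 17600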

💡Testing Environment

A testing environment is a controlled setting where tests are executed. The video highlights the need for a separate testing environment for automation, as having a clean, stable environment ensures that automated tests are not disrupted by ongoing changes in the main application. This was a lesson learned through experience, emphasizing the importance of an isolated environment for reliable automation.
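
As a sketch of one way to enforce this in practice (not something the speakers spell out), the suite can read its target from configuration so it always runs against the dedicated environment; the variable name and URL below are made up for illustration.

    import os

    DEFAULT_AUTOMATION_ENV = "https://automation.example.internal"  # assumed URL

    def base_url():
        # hypothetical hook: point the whole suite at the clean environment
        return os.environ.get("AUTOMATION_BASE_URL", DEFAULT_AUTOMATION_ENV)

    def test_suite_targets_the_isolated_environment():
        # guard test: fail fast if the suite gets pointed at the shared env
        assert "automation" in base_url()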

💡Conditional Statements

Conditional statements are programming constructs that execute different actions based on whether certain conditions are met. The video discusses the use of complex conditional statements in early automation efforts, which led to difficulties in troubleshooting and maintaining tests. The speakers learned to simplify their test scripts to make them more manageable and less prone to errors.
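
For contrast, a sketch of the two shapes described here: a monolithic test whose nested conditionals obscure which feature failed, next to a small focused test. check_feature and the feature names are hypothetical stubs.

    def check_feature(name):
        # stub standing in for driving the real application
        return True

    def test_everything_in_one_script():
        # the monolithic shape: when any branch fails, the test itself has to
        # be triaged just to learn which functionality broke
        if check_feature("login"):
            if check_feature("search"):
                assert check_feature("reports")
            else:
                assert False, "search failed"
        else:
            assert False, "login failed"

    def test_reports_generation():
        # the simplified shape: a failure immediately names the broken area
        assert check_feature("reports")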

Highlights

Overcoming initial fears and misconceptions about automation in the late 90s.

The challenge of proving the value of automation to skeptics within the company.

The misconception that automation would take too much time and not be worth the effort.

Demonstrating the benefits of automation through a quarterly release scenario.

Reducing the all-hands-on-deck period from a week to three days through automation.

The importance of showing the financial benefits of automation to gain management support.

The common misconception that automation could replace all manual testing.

Learning the necessity of a clear test strategy for effective automation.

The realization that not all tests need to be automated due to their complexity or lack of change.

The need for a separate, clean environment for automated tests to avoid interference.

The evolution from writing large, complex scripts to creating smaller, reusable test components.

The importance of maintenance in automation and the need for scripts that are easy to update.

The learning process involved in determining which parts of an application should be automated.

The strategic approach to automation, including when to hold off on automating certain areas.

The role of human supervision in ensuring the effectiveness and accuracy of automated tests.

The economic impact of automation on reducing costs and increasing efficiency within a company.

The evolution of the speakers' understanding of automation from a tool to a strategic business asset.

The importance of aligning automation efforts with the overall business goals and objectives.

Transcripts

Interviewer (00:02): You talked about the fears, and about some of the things you tried to overcome, and it sounds like your boss telling you "go ahead and get this thing going, or we're done here" must have been one of them. But along with those fears, what were the misconceptions about automation back then? I'm guessing there's a lot of correlation to what we're seeing today with AI in testing, but we're not going to go there yet. I'm just curious: back in the late 90s, when you were both jumping into the deep end of automation, what were some of the things people misunderstood, things that turned out to be more myth than reality?

David (00:47): For me personally, because it was new and people had just been doing manual testing, I don't think anyone believed there was going to be any real value coming from it. There was also the idea that automating would take too much time, that the initial upfront work would not be worth it on the back end. So it was really about getting managers to see that it was worth the time and effort. One way I was able to prove that: we used to do quarterly releases, and a quarterly release meant all hands on deck within the company for about a week. We had this huge Excel spreadsheet of everything that needed to be done for the release to be deemed good enough to go out. Once I got the green light, I got a fair amount of that spreadsheet automated, and I was able to break it down for the company: this is roughly how much you're paying for that week of all hands on deck, because it wasn't just QA, we were bringing in other members of the company, analysts and implementation specialists and so on. And I was able to show that we got it down to three days by automating a certain percentage of that spreadsheet, and that we only needed two testers handling some of the newer stories that had come in and had not been automated yet. Look how much we were able to cover with just the tool. That was probably the biggest thing I was able to do to get them to see that this is worth it. Getting them to really buy in for that initial part meant I was kind of left alone: I didn't do any manual testing, my job was just to do everything I could to automate what we had. Once I was able to prove it from a financial standpoint and show them that we could do it faster and for less money, that was the big impact that allowed automation to become a huge part of who we are now as a company.

Interviewer (03:21): So it does look like you not only had to show that these work, but also that the economics of them make sense.

David (03:32): Yeah, right. Otherwise the investment wasn't going to be worth it, regardless of how fast we could do it. And with a business, that's ultimately what they're going to want to see a lot of the time. It really comes down to how we can save the company money by doing this.

David (03:49): So yeah, that was it for me. Alex, I don't know what you have.

Interviewer (03:53): What misconceptions did you face, Alex, when you were trying to, well, there is a technology that both of you mentioned at the very beginning whose name I don't recall right now, but as you were learning and implementing these things, did you find anything, or were there any objections, when you were trying to adopt or implement it?

Alex (04:20): Yeah, I'd say the common misconception at the time, when I was starting, was that we could automate everything. It's like, "oh great, so this tool can run everything we're doing manually, and we'll be able to test everything super quick." Initially, as we were learning, we thought so too. We would create tests, and Dave, you'll probably relate to this, huge tests full of conditions: if-then, if-then-else, choose this, choose that. And it worked, you could make it work, but very quickly, when failures in the application caused the automated tests to fail, we would go, "huh, okay, so why did it fail? Which functionality is actually not working?" Because our tests automated so many things, we literally had to triage the test itself to find which part of the application was throwing the error. So very quickly we learned that automation is not just about automation; it's more about test strategy than about actually automating. We learned that each automated test should produce one result for one functionality, basically an atomic test, and that helped us drive the test strategy for the overall automation effort. As we started learning how to do that, we found ourselves working a lot closer with the manual testing team and with the test lead, to ask: does this even make sense to automate? Is this one of those admin areas of the application that is never going to change in the foreseeable future, so why would we spend time automating it? Or will the surrounding areas change, so it's important to have it automated because of regression concerns? It started helping us drive the strategy around testing as a whole, and around automation specifically. That was the big misconception: it's not because you can that you actually should. It was a valuable lesson that cost us, well, not wasted time, but a lot of time that we spent. And then eventually we learned.

David (07:05): Yeah, and another misconception I had to deal with was, "oh, you can just build the scripts in this environment." But there were other people in that environment, it wasn't just me, so I had to get the company to see that I needed my own completely separate environment that no one else would be touching. We had a very configurable application, and people could come along and completely remove or add things, and that was breaking the tests too. Initially I had started creating the tests in the system everybody was testing against, and I had to say, I'm going to need my own completely separate, completely clean environment that no one else is going to be in. That has been crucial. And then, like you said, Alex, initially I wrote scripts that were, geez, 200, 300, I don't know how many hundreds of lines of code, and you learn that you have to make those smaller. The other important piece was learning how to create a reusable test that you could call for things that were going to be used on a regular basis, because regardless of how much we'd like an application to stay the same, there are going to be enhancements that change it. Learning how to create these chunks, reusable tests, or the keywords, the methods, or something like that, where you only have to make a change once and it gets pushed out to every other test that calls it, was another important piece of learning how to write them. Maintenance is a huge part of automation. It's not just going out and writing the scripts and then that's it, you never have to worry about that one again and you just keep creating new ones. No, there is a maintenance aspect. So you learn to write the scripts in a way that, if you notice an area that is likely to have more changes, you use those smaller reusable test chunks, so you only have to make that call once, or make that change once, and you don't have to worry about going and changing it in 15 other places. That was another thing I learned along the way that made maintenance easier, because, like you said, Alex, we had these huge scripts with all these conditional statements, and when one area broke, if you had a bunch of tests like that, you'd have to go find that chunk of code in every single one of them and fix it. That took time for me to learn as well.

Interviewer (09:53): So it looks like, if we categorize the misconceptions: the first one, David, that you faced was "this is not going to work." The second one, Alex, was "automation will solve everything." And the third one, from what I'm learning from David, sounds like "you automated it, and it's one and done: now you let it run and you never have to touch it again." But it looks like, ironically, automation also needs some human supervision.

David (10:31): Oh, 100%, and in this case that means maintenance, making sure things stay sound. What I found is that you definitely want to make sure the area of the code you're automating is, for lack of a better word, sound, meaning it's not going to have a lot more changes coming down the pipe. If you know an area of the code is being worked on and is going to have a lot of changes, I have found it's better to wait on that automation, so that you're not writing or maintaining that script over and over and over again because they keep changing it. That was another thing I learned: sometimes it's better to hold off on an area you know is going to have drastic changes until the code gets a little more solid and sound.


Related tags
Automation, Testing, Misconceptions, Strategic Implementation, Maintenance, Efficiency, ROI, AI Testing, Manual to Automated, QA Insights