Automata theory is a fascinating field of study, closely related to formal language theory, and it forms the bedrock of the theory of computation. My attempt at a simple explanation: automata theory describes very limited, theoretical computers that can recognize different classes of languages. Regular languages can be recognized by finite state machines, context-free languages can be recognized by pushdown automata, and the most powerful automata are Turing machines. By studying Turing machines, we can figure out which problems they can solve and which they cannot. When we call a programming language "Turing complete", we mean that the language can simulate a Turing machine, and thus is technically able to solve any problem a Turing machine can solve (no comment on how difficult it would be to write that solution!).
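To make the first level of that hierarchy concrete, here's a minimal sketch of a deterministic finite automaton in Python. The language it recognizes (binary strings with an even number of 1s) and the transition-table representation are just illustrative choices, not anything from the comment above:

```python
def accepts(s):
    """DFA sketch: accepts binary strings containing an even number of '1's.

    States: 0 = even count of '1's seen so far (accepting), 1 = odd count.
    The whole machine is just a lookup table plus a loop -- no parsing logic.
    """
    transitions = {(0, "0"): 0, (0, "1"): 1,
                   (1, "0"): 1, (1, "1"): 0}
    state = 0
    for ch in s:
        state = transitions[(state, ch)]
    return state == 0
```

Note that the machine has no memory beyond its current state, which is exactly why finite automata can handle regular languages but not, say, balanced parentheses (that takes a pushdown automaton's stack).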
A basic understanding of automata theory, while not necessary for software development, gives one a better understanding of computation as an abstract concept. Finite state machines are seeing a resurgence in popularity, as people have discovered that they model UI states very well. This whole field wouldn't exist (at least not as we know it) without the work of formal linguists!
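As a sketch of the UI-state idea: the states and events below (for a hypothetical data-fetching widget) are made up for illustration, but the pattern is the general one: every (state, event) pair maps to at most one next state, so impossible transitions simply can't happen.

```python
# Hypothetical UI states for a data-fetching widget, modeled as an FSM.
UI_TRANSITIONS = {
    ("idle", "FETCH"): "loading",
    ("loading", "SUCCESS"): "success",
    ("loading", "FAILURE"): "error",
    ("error", "RETRY"): "loading",
}

def next_state(state, event):
    # Events that make no sense in the current state are simply ignored,
    # instead of putting the UI into an inconsistent combination of flags.
    return UI_TRANSITIONS.get((state, event), state)
```

Compare this with juggling separate `isLoading` / `hasError` booleans, where contradictory combinations (loading *and* errored) are representable by accident.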
The main influence of automata theory on programming has been in compiler theory.
It is used in lexical analysis and in the construction of parsers and compilers.
Finite automata make it possible to implement these things in a generic way.
As you say, the field wouldn't exist as we know it, though perhaps it would exist in some other form.
The parsers and compilers would probably be very ugly, with code full of conditional constructs (if, then, else).
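To illustrate the generic, table-driven style the thread is pointing at: the token names and patterns below are invented for the example, but the shape is the classic one used in lexical analysis. Each token category is declared as a regular expression (which a regex engine compiles to a finite automaton internally), instead of being hand-coded as nested if/else character tests:

```python
import re

# Token categories declared as data, not as conditional logic.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),        # whitespace: matched, then discarded
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    # One generic loop handles every token kind; adding a new kind
    # means adding a table entry, not another branch of an if-chain.
    for m in MASTER.finditer(text):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())
```

Tokenizing `"x = 42"` this way yields `IDENT`, `OP`, and `NUMBER` tokens from a single loop, which is the payoff of the automata-based approach over a thicket of conditionals.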