interpreter


Definition

An interpreter is a software program that reads instructions written in a programming language, a man-made language with strict syntax and grammar, and translates them into actions the computer performs directly. Many of these operations together make up the software applications we use daily.

Another kind of program that performs this translation is a compiler. The main difference between an interpreter and a compiler is when and how the translation happens. A compiler converts the instructions once into a standalone program, much like the apps on your computer that you can double-click to run at any time. An interpreter, on the other hand, translates the instructions into something the computer understands as the program runs, which means you always need the interpreter in order to run the software you have created.
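To make the idea concrete, here is a minimal sketch of an interpreter for a hypothetical three-instruction language, written in Python (itself an interpreted language). The command names `set`, `add`, and `print` are invented for this example. Notice that it reads and executes instructions one by one and never produces a standalone program as output:

```python
# A toy interpreter: each line of the source is parsed and executed
# immediately, rather than being compiled into a separate executable.
def interpret(source, env=None):
    env = {} if env is None else env
    for line in source.strip().splitlines():
        op, *args = line.split()
        if op == "set":            # e.g. "set x 5" stores 5 in variable x
            env[args[0]] = int(args[1])
        elif op == "add":          # e.g. "add x 3" adds 3 to variable x
            env[args[0]] += int(args[1])
        elif op == "print":        # e.g. "print x" shows the value of x
            print(env[args[0]])
        else:
            raise ValueError(f"unknown instruction: {op}")
    return env

program = """
set x 5
add x 3
print x
"""
interpret(program)  # prints 8
```

Every time you want to run `program`, the interpreter must parse and execute it again; there is no saved executable to double-click later.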

Use cases and Examples

Because an interpreter runs through the instructions directly and does not produce a program you can execute later, it is perfect for scenarios where you need to interact with the computer and see its output in real time; an example of this is shell scripting.
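This interactive style is often called a read-eval-print loop (REPL): the interpreter evaluates each line as it arrives and shows the result immediately. A minimal sketch in Python (the `repl` helper and its input list are invented for illustration):

```python
# A tiny read-eval-print loop: evaluate each expression as it arrives
# and display the result right away, with no build step in between.
def repl(lines):
    results = []
    for line in lines:
        # eval is used here only on our own trusted expressions;
        # never run it on untrusted input.
        value = eval(line, {"__builtins__": {}}, {})
        print(f"> {line} = {value}")
        results.append(value)
    return results

repl(["1 + 2", "3 * 4"])  # prints each expression with its result
```

Python's own interactive prompt and Unix shells work on the same principle: type an instruction, see the effect immediately.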

Most programming languages used for developing web applications are interpreted, largely because web development involves making changes many times over; developers do not want to sit around waiting for a compiler to produce an executable program before testing whether everything works as intended. This is especially true for large web applications.

That said, interpreted programs are generally slower than compiled programs, simply because an interpreter has to translate the instructions each and every time you want the computer to perform an operation, whereas a compiler converts your instructions into a program once and that's it.
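You can observe this cost directly in Python, which exposes both steps through its built-in `compile()` and `eval()` functions. The sketch below times re-parsing the same source text on every call (mimicking pure interpretation) against compiling it once to a code object and reusing it; the expression and iteration count are arbitrary choices for this example:

```python
import timeit

source = "sum(i * i for i in range(100))"

# Re-parsing the source text on every call mimics pure interpretation:
# the translation work is repeated for each execution.
interpreted = timeit.timeit(lambda: eval(source), number=2000)

# Compiling once to a code object and reusing it mimics the
# compile-once model: the translation cost is paid a single time.
code = compile(source, "<expr>", "eval")
compiled = timeit.timeit(lambda: eval(code), number=2000)

print(f"parse every time: {interpreted:.4f}s")
print(f"compile once:     {compiled:.4f}s")
```

On a typical machine the pre-compiled version is noticeably faster, because the per-run parsing overhead disappears.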

Examples of interpreted programming languages include, but are not limited to, PHP, Python, Ruby, and JavaScript.
