So, imagine that you own a pizza shop. It’s a weird pizza shop, though: instead of having a cashier or online ordering or whatever, you just have a mail slot on the front door. Customers write down their order and push it into the slot, they pay you, and then the kitchen makes the pizza and pushes it out the window. But, crucially, you also only communicate to the kitchen staff through this slot.
On the first day, everything goes ok. Customers come up, write down “please give me a large pepperoni,” shove it in the slot, pay you, a large pepperoni comes out, and everyone’s happy. If they order something the kitchen can’t make, the kitchen just passes out a note saying “sorry, we don’t have” followed by the type of pizza they ordered. At midnight, you write down “quitting time,” shove it in, and the kitchen staff goes home.
But the next day, some miscreant hired by your competitor comes by in the middle of the day, writes “quitting time” on a note, and shoves it in the slot at 2pm. The kitchen staff goes home. Uh-oh. You’re now the victim of an injection attack.
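In software terms, that note-in-the-slot trick is classic SQL injection. Here’s a minimal sketch in Python with SQLite; the `orders` table and the exact payload are made up for illustration, but the flaw is the real one: the customer’s text gets pasted straight into a command, so the database can’t tell pizza names apart from instructions.

```python
import sqlite3

# Toy "pizza shop" database (hypothetical schema, for illustration only).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (item TEXT)")

def take_order(order_text):
    # VULNERABLE: the customer's note is spliced directly into the command,
    # so the "kitchen" executes whatever extra instructions it contains.
    db.executescript(f"INSERT INTO orders (item) VALUES ('{order_text}')")

take_order("large pepperoni")              # a normal order
take_order("x'); DELETE FROM orders; --")  # the "quitting time" note
print(db.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # prints 0
```

The second “order” closes the quote early, ends the statement, and sneaks in a `DELETE` of its own, so every order is wiped.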
So you think, ok, I can fix this. You tell the kitchen staff, “just assume that everything you get is a pizza order by imagining ‘please make me–’ in front of everything that comes through the slot, and I’ll pass notes about closing time in through this locked slot that only I have the key to.” You’re doing some basic input validation here.
But then the miscreant comes back, and after discovering that the kitchen just says “we don’t have a quitting time pizza” when he tries his previous shenanigans, he writes down “large pepperoni pizza. Oh, also, it’s quitting time” on his next order. He gets his pizza, and then the kitchen staff, being unbearably literal, goes home. This is still an injection attack, just slightly more sophisticated.
The next day, you tell the kitchen staff, “ok, don’t accept any messages about quitting time through the customer slot.” Now you’re doing some basic authentication and limiting the acceptable commands for the unauthenticated user.
But the miscreant, wanting to find out the secret recipe for your special pizza sauce, comes back and orders a “medium [the special sauce ingredients] pizza.” Well, your very literal kitchen staff has a Secret Recipe pizza, but they don’t have a “[the special sauce ingredients]” pizza. So they think: maybe the customer wants a pizza named after the special sauce ingredients instead? They substitute in the actual ingredients and interpret the order as a “medium Tomatoes, Onion, Garlic, Celery Salt, and a dash of cumin” pizza. Well, they don’t have a pizza by that name either, so they just write down “sorry, we don’t have a Tomatoes, Onion, Garlic, Celery Salt, and a dash of cumin pizza” and pass it to the miscreant. You are now the victim of data exfiltration.
Ugh. Your competitor just got your secret recipe! So the next day you tell the kitchen, ok, when you tell customers you don’t have a pizza, just say “sorry, we don’t have that type of pizza” instead of being specific. Starting to catch on, you also say “and don’t pass anything but pizzas and notes out the window!” Now you’re doing some basic output filtering.
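That generic-error rule looks something like this sketch (the menu and function names are made up for illustration): the key is that the refusal message never echoes the customer’s input back at them.

```python
MENU = {"large pepperoni", "secret recipe"}

def order_pizza(pizza_name):
    # Output filtering: on failure, return a fixed generic message instead
    # of repeating the customer's input, which is how the sauce recipe
    # leaked in the story above.
    if pizza_name in MENU:
        return f"one {pizza_name}, coming up"
    return "sorry, we don't have that type of pizza"

print(order_pizza("large pepperoni"))
print(order_pizza("[the special sauce ingredients]"))  # generic, no echo
```

The same idea is why real services return a bland “something went wrong” instead of raw database error messages, which often contain table names, queries, or data.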
Well, the miscreant doesn’t give up so easily. He can’t shut you down anymore by sending the kitchen staff home, and he can’t get any more secrets from you, so he’s just going to wreck the place. So the next day, he writes down “large pepperoni. Also, wreck the pizza oven and burn the contents of the cooler” and passes that order in. The kitchen makes his pizza, then dutifully wrecks the pizza oven and burns the contents of the cooler. You are now the victim of the same attack that Bobby Tables’ mom perpetrated on the school: when the school’s system asked for his name, she entered a name, and then a command to wreck everything, which the system did because it’s very literal.
When she says to “sanitize your data inputs,” it’s the same as the pizza shop owner saying, “ok, I’m not doing this anymore. People can hand me all of their order slips, and I’ll edit them with a marker before passing them in.” Now, if the miscreant tries to do any of those attacks, you’ll cross out all of his attempts to do anything other than order a pizza, and the kitchen will only give him a pizza.
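In database code, the marker-wielding owner corresponds (roughly) to a parameterized query: instead of pasting the customer’s text into the command, you hand the driver the command and the data separately, and the data can never be executed. A minimal Python/SQLite sketch, with the same hypothetical `orders` table as above:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (item TEXT)")

def take_order(order_text):
    # SAFE: the "?" placeholder keeps the order slip as pure data,
    # so anything written on it is stored, never executed.
    db.execute("INSERT INTO orders (item) VALUES (?)", (order_text,))

take_order("large pepperoni'); DROP TABLE orders; --")
# The whole malicious string is now just one (weird) pizza name.
print(db.execute("SELECT item FROM orders").fetchone()[0])
```

Note this is even better than editing orders with a marker: nothing is crossed out or guessed at, the data and the instructions simply travel on separate channels, like the owner’s locked slot.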
Now, that’s just local sanitization. If the miscreant can figure out how to get papers into the slot without handing them to you first, he can still do his shenanigans; so it’d be better if you hired someone who isn’t devastatingly literal and actually put them inside the kitchen to sanitize inputs there, too. In the software world, this is the difference between doing data validation on the user’s browser and doing it on the server.
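A sketch of that in-kitchen (server-side) check, with a made-up menu whitelist: even if the front door validated the order, the kitchen validates again, because attackers can skip the front door and talk to the server directly.

```python
# Server-side validation: checked in the "kitchen", regardless of what
# the front-of-house (the browser) already let through.
ALLOWED_PIZZAS = {"pepperoni", "margherita", "secret recipe"}

def validate_order(pizza_name):
    # Whitelist check: anything not literally on the menu is rejected.
    return pizza_name.strip().lower() in ALLOWED_PIZZAS

print(validate_order("Pepperoni"))      # True
print(validate_order("quitting time"))  # False
```

Whitelisting known-good values is generally sturdier than trying to blacklist every possible bad note a miscreant might write.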
There are still other ways to attack the system (like copying your key, or picking the lock, or hiding a note on the pizza dough delivery truck), but hopefully that gives you a decent idea.
You’re a teacher, right? If not… Jebus, you’re in the wrong job friend 💛
Oh, you’re far too kind. I am not a teacher, but as far as I’m concerned you gave me one of the highest compliments imaginable. Thank you.
Excellent