It's not about sorting: it's about teaching algorithms (the sorts), data structures (arrays, heaps) and Big-O notation.
They're all important, but Big-O notation especially so. If nothing else, and this is real low-hanging fruit, interviews index heavily on Big-O notation. If a candidate isn't familiar with at least one O(n*log(n)) sort, they're going to get a no. Not because it's very important day to day, but when it does come up, it matters to know that sorting can be done in less than O(n²) time.
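For the curious, here's a minimal sketch of one such O(n*log(n)) sort, merge sort (split the list, recursively sort each half, merge the sorted halves):

```python
def merge_sort(xs):
    """Classic O(n log n) sort: split, recurse, merge."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    # Merge the two sorted halves in a single linear pass.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

The recursion halves the list log(n) times and each level does O(n) merging work, which is where the n*log(n) comes from.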
Here's a pretty real example, it's both an interview question and can be a real-life coding analogue:
You have two people, Alice and Bert, and you have a list of all their friends on Facebook (don't worry about how we got this data...). You want to find all the friends they have in common.
If a candidate or co-worker suggests an O(n²) approach, they'll get a hard no.
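To make that concrete, here's a sketch of both approaches (the friend lists are made up for illustration):

```python
alice_friends = ["carol", "dave", "erin", "frank"]
bert_friends = ["dave", "grace", "erin"]

def common_quadratic(a, b):
    """O(n*m): `x in b` is a linear scan over a list, inside a loop."""
    return [x for x in a if x in b]

def common_linear(a, b):
    """O(n+m): build a hash set once, then do O(1)-average lookups."""
    a_set = set(a)
    return [x for x in b if x in a_set]

print(common_linear(alice_friends, bert_friends))  # ['dave', 'erin']
```

With a few hundred friends either version is instant, but the same shape of mistake on lists of millions of items is the difference between milliseconds and hours.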
Showing algorithms vs intuition
The way most humans sort things, like a deck of cards, is via insertion sort (take one card, go through the sorted part of the deck, and put it where it belongs). This feels fine and works okay in real life, where scanning through the sorted deck is fast and we can intuit where the card should go.
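That human procedure translates almost directly into code, which also makes its O(n²) worst case visible (an outer loop over the unsorted cards, an inner scan through the sorted part):

```python
def insertion_sort(cards):
    """Sort the way a human sorts a hand of cards. O(n^2) worst case."""
    for i in range(1, len(cards)):
        card = cards[i]
        j = i - 1
        # Shift larger cards one slot right until we find this card's spot.
        while j >= 0 and cards[j] > card:
            cards[j + 1] = cards[j]
            j -= 1
        cards[j + 1] = card
    return cards

print(insertion_sort([7, 3, 10, 2, 8]))  # [2, 3, 7, 8, 10]
```

A computer has no intuition about where a card "should go", so in the worst case the inner scan touches everything already sorted, every time.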
Finally, you might be seeing sorting a lot because some teachers think it's beginner material while others think it's advanced, but they all think it's important, so they all include it.
If you're more interested in seeing beefier algorithms, I highly recommend The Algorithm Design Manual.