Those who tell the truth need a fast steed.

Having studied mathematics for a while now, there are a few things that repeatedly get on my nerves, mainly because some notations are missing, or exist only in a strange form, so that they either blow up your "preliminary definitions" section or make things less clear.

One thing that happens surprisingly often is that you have a set A and want a set B such that B\A contains only one element a, but you don't want to overblow your definitions (and give B an extra name), so you write A∪{a} instead - which is annoying and overloaded. Some people use the dirty (and, at least in common set theory, wrong) notation A∪a instead; some write A,a. Anyway, this problem occurs comparably often, and yet no notation for it has spread.
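For LaTeX documents, one can at least hide the choice behind a macro, so the notation can be changed in one place later. A minimal sketch; the macro name \adjoin is my own invention, not a standard command:

```latex
% Hypothetical macro for adjoining a single element to a set.
% Needs nothing beyond plain LaTeX math mode.
\newcommand{\adjoin}[2]{#1 \cup \{#2\}}
% Usage: $\adjoin{A}{a}$ currently renders as A ∪ {a};
% if a better notation ever spreads, only this one line
% needs to change.
```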

Something similar is the set-XOR operation between sets, usually called the symmetric difference: for sets A and B it can be written as (A∪B)\(A∩B). Some people use an overlined or dotted ∪ for it, but that can be confused with the disjoint union (which is itself a dirty notation for indicating that A and B were already disjoint). Some people also use a Δ. Anyway, there is no common symbol.
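In LaTeX, one can again pick a symbol once and define it centrally; a sketch using the triangle, with \symdiff as a hypothetical macro name:

```latex
% Hypothetical macro for the symmetric difference.
\newcommand{\symdiff}{\mathbin{\triangle}}
% Usage: $A \symdiff B$ renders as A △ B, and its definition
% can be typeset as $(A \cup B) \setminus (A \cap B)$.
```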

Also from set theory, there is the cardinality of sets. There are essentially two notations for the cardinality of A, namely |A| and #A, and as long as A is finite, everyone means the number of elements. But as soon as A becomes infinite, the notations diverge, and some people just write |A|=∞ when they actually mean |A|≥ω. I mostly use #A for the number of elements of A if A is finite and ∞ otherwise, and |A| for the actual cardinality, which can never be ∞, because ∞ is not a cardinal. This makes sense because in many cases the actual cardinality is not interesting, only whether the set is infinite or not. Anyway, I have seen both notations used both ways already.
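Spelled out as a definition, the convention I use would look like this (a sketch, using amsmath's cases environment):

```latex
% Requires \usepackage{amsmath} (and amssymb for \mathbb).
\[
  \#A :=
  \begin{cases}
    |A|    & \text{if } A \text{ is finite,} \\
    \infty & \text{otherwise,}
  \end{cases}
\]
% while |A| always denotes the actual cardinal, e.g.
% $|\mathbb{N}| = \aleph_0 = \omega$, never $\infty$.
```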

I already suggested lambda-abstractions in analysis and algebra. No notation for such abstractions has spread, and I think this is sometimes annoying, especially because in more advanced topics the level of abstraction gets very high.
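To illustrate, here is the anonymous squaring function in the two notations that do exist but have not really caught on in analysis and algebra texts (a sketch):

```latex
% Lambda notation versus the \mapsto arrow for the
% anonymous function that squares its argument.
% Requires \usepackage{amsmath} for \text.
\[
  \lambda x.\, x^2
  \qquad\text{versus}\qquad
  x \mapsto x^2
\]
```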

Somehow related is the problem that there is no common notation for algorithms in mathematics, so one sees "pseudocode" in some textbooks. Algorithms are not that important there, but I wonder why no notation has spread. Maybe with programming courses becoming compulsory, and with theoretical computer science reuniting with mathematics, this will change.
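At least for LaTeX, packages such as algpseudocode already typeset pseudocode in a fairly uniform style; a self-contained sketch, with Euclid's algorithm as a placeholder example:

```latex
\documentclass{article}
\usepackage{algpseudocode} % pseudocode environments
\begin{document}
\begin{algorithmic}
  \Function{Gcd}{$a, b$}
    \While{$b \neq 0$}
      \State $(a, b) \gets (b, a \bmod b)$
    \EndWhile
    \State \Return $a$
  \EndFunction
\end{algorithmic}
\end{document}
```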

In a lot of lectures, I have seen the problem that one wants to show how some equation is derived from other equations, or is reduced to something that is obviously wrong. In many cases it is not yet clear that an equation holds, so writing the equation on a blackboard always involves some additional natural language, which could be avoided if there were some common character for "this equation maybe holds". Some people use ≟. Such a character is not strictly needed, but then, we have characters like ∴ and ⇒. Some people think words would make mathematical texts more readable - I think that depends on the content, and well-arranged typesetting can make texts a lot more readable than using words instead of special characters.
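The ≟ sign is easy to produce in LaTeX; a sketch, with \qeq as a hypothetical macro name:

```latex
% Requires \usepackage{amsmath} for \overset.
\newcommand{\qeq}{\overset{?}{=}}
% Usage: $a^2 + b^2 \qeq c^2$ marks an equality that is
% still to be verified or refuted.
```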

Something that happens regularly is the confusion about ℕ. Is 0∊ℕ? The answer is: sometimes yes, sometimes no. This is annoying, and it always gets on my nerves when somebody uses ℕ. There are unambiguous alternatives: for example, one can define ℕi:={n∊ℤ|n≥i} and then always write ℕ0 or ℕ1. So please, use that instead!
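A sketch of how this could look in LaTeX; the macro name \Nat is my own:

```latex
% Requires \usepackage{amssymb} for \mathbb.
\newcommand{\Nat}[1]{\mathbb{N}_{#1}}
% $\Nat{0}$ then denotes {0, 1, 2, ...} and $\Nat{1}$
% denotes {1, 2, 3, ...}, following the definition
% $\mathbb{N}_i := \{ n \in \mathbb{Z} \mid n \geq i \}$.
```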

The same goes for ⊂. Some people use ⊂ for the proper subset, some do not. There are ⊆ and ⊊, which are unambiguous - so use those.
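Both unambiguous symbols are readily available in LaTeX:

```latex
\documentclass{article}
\usepackage{amssymb} % provides \subsetneq; \subseteq is built in
\begin{document}
$A \subseteq B$ (subset or equal), $A \subsetneq B$ (proper subset).
\end{document}
```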