The first example that comes to mind for a recent notation that has caught on in its field is siteswap in juggling, which was only invented in the 1980s. I am a juggler and can confirm that all the technical juggling nerds know what it is, and it is used for crazy tricks. One example is 5551, which I heard was the first trick found through the notation.
Juggling Lab is software for rendering these.
EDIT: I probably misremembered with 5551; the Wikipedia article mentions 441.
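The math behind siteswap makes "finding tricks through the notation" concrete: a sequence of throws is a valid siteswap exactly when no two throws land on the same beat, and the average throw height equals the number of balls. A minimal sketch in Python (the helper names are my own, not from any standard library):

```python
def is_valid_siteswap(throws):
    """A siteswap is valid iff the landing beats (i + t_i) mod n are all distinct."""
    n = len(throws)
    return len({(i + t) % n for i, t in enumerate(throws)}) == n

def ball_count(throws):
    """For a valid siteswap, the average throw height equals the number of balls."""
    return sum(throws) / len(throws)

print(is_valid_siteswap([5, 5, 5, 1]), ball_count([5, 5, 5, 1]))  # True 4.0
print(is_valid_siteswap([4, 4, 1]), ball_count([4, 4, 1]))        # True 3.0
print(is_valid_siteswap([5, 4, 3]))                               # False: every throw lands on the same beat
```

This is why the notation generates tricks: enumerate digit strings, keep the valid ones, and you discover patterns (like 441) nobody had juggled before.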
Many popular languages today (notably the C family) ultimately descend from ALGOL, which is from 1958.
"Structured programming", i.e. writing code as syntactically-delimited blocks, functions, and procedures rather than with numbered lines and GOTOs, was pioneered in ALGOL.
Popular languages today such as Python, Java, JavaScript, Go, and Rust may diverge pretty widely in features (and syntax), but all of them are ultimately ALGOL descendants, albeit with influences from other language families too.
(If your language has for loops, it's an ALGOL descendant.)
Lisp and Fortran are also pre-1960.
Simula (and thus object-orientation) is from '62, but influenced by ALGOL. Smalltalk is a Simula descendant. C++ is what you get if you try to build Simula ideas on top of a C compiler (and go a bit gaga for operator overloading).
There are some languages a little later than that, that look pretty different. For instance, APL is from '68. Forth is from 1970. ML, which gave rise to Haskell, is from '73.
I thought "new notation" included new symbols. Almost all programming languages use only ASCII characters for their keywords, and those characters are pretty old.
Various kinds of tensor networks might be an example. Wikipedia claims that Penrose's graphical tensor notation is from 1971. Its descendant, ZX calculus is from as late as 2008. Arguably the first tensor networks were Feynman diagrams though, and 1948 is before your cutoff of 1960. (Actually, now that I think about it, it's kind of funny that the infinite dimensional case came before the finite dimensional one here.)
Relatedly: string diagrams (with Penrose's tensor notation apparently being seen as a precursor)
many of these are skeuomorphic. perhaps it can be argued that they have history from the 60s. but at this point the digital interfaces have supplanted any real world metaphor. for example, the idea of showing a reticle moving along a line to represent "where you are in this song/movie" is a universal notation. i don't believe it was common before personal computers.
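That reticle-on-a-line notation is simple enough to sketch in a few lines of Python (the function and its text rendering are my own illustration, not any standard):

```python
def progress_bar(position, duration, width=30):
    """Render the 'where you are in this song/movie' notation as text:
    a track of fixed width with a marker at the proportional position."""
    filled = round(width * position / duration)
    return "[" + "=" * filled + ">" + " " * (width - filled) + f"] {position}/{duration}s"

print(progress_bar(90, 180))   # marker halfway along the track
```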
I do think there's some innovation on notation, but it mostly happens with existing typographic symbols because extending typography is harder than it used to be. Previously, you could just come up with whatever you wanted because work started out hand written. Then you'd pay to get the printer to make whatever weird symbol you wanted for publication, or, if on a budget, come up with some weird approximation using simpler symbols.
It seems like it should be easier on computers, and in theory it is, but lots of things drive us toward making default choices. The worst of these is probably that Unicode is already full of so many symbols that LaTeX can render, so it's much easier to pick an existing symbol than to go through all the work of cooking up a new one.
I separately suspect there's some effect from computer code, too, where people are trending toward longer symbols that resemble descriptive function and variable names. These feel less like notation but are easier to read at first glance, even if they cost some efficiency once you're familiar with them.
Some conjectures:
Possibly one factor is that the evident versatility of using ASCII in nearly all programming languages (and also for stuff like LaTeX) made people less inclined to invent new notation.
Emojis are a major potential example, as shown by the fact that the Unicode standard has been considerably extended to include them. However, it's debatable whether these are notations in the sense you mean (presumably technical symbols).
In avant-garde music there have indeed been notations invented and to some extent adopted since 1960. Back then it was quite common for composers to devise new notations for obscure techniques etc. in their own works, though there were usually existing (often better) ones, albeit not standardised. A 1974 attempt to set standards with a conference in Ghent only partially worked.
The rise of music notation software since the 1990s has increased standardisation, as composers now use such software (rather than pen & manuscript paper), which somewhat constrains what fanciful notations they can use.
Huh, I did a bit of a search, and indeed very few examples show up, even if we allow those right at the 1960 cutoff.
Siteswap notation for juggling is the most common, and dates to 1981. New tricks have even been discovered through it.
Chess's PGN notation is from 1993, even if FEN is like a century older. Allegedly it took until around the 1980s for Anglosphere chess publications to switch to predominantly using algebraic notation, though the Germans were using it a century earlier and spread it to the Russians!
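FEN is a nice example of a compact notation: the piece-placement field packs a whole board into one run of characters, with digits standing for runs of empty squares. A short sketch of expanding it (a toy decoder, not a full FEN parser):

```python
def fen_board(fen):
    """Expand the piece-placement field of a FEN string into 8 ranks of 8 squares."""
    placement = fen.split()[0]          # first space-separated field is the board
    ranks = []
    for row in placement.split("/"):    # ranks are separated by slashes
        squares = []
        for ch in row:
            if ch.isdigit():
                squares.extend("." * int(ch))   # a digit is that many empty squares
            else:
                squares.append(ch)              # a letter is a piece
        ranks.append("".join(squares))
    return ranks

start = "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1"
for rank in fen_board(start):
    print(rank)
```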
Rubik's cube notation exists too, though mostly it's just standardization on which letters are used for the obvious concepts like moves.
There's a common core of markdown notation used very often, stuff like asterisks for italics (and two for bold), etc., which apparently came from informal Usenet conventions.
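That common core is small enough to approximate with a couple of regex rules; a toy sketch (nothing like a full markdown parser, and the rule order matters so bold isn't consumed by the italics rule):

```python
import re

def mini_markdown(text):
    """Convert the asterisk core of markdown to HTML: ** for bold, * for italics."""
    # Handle ** before * so "**bold**" isn't misread as two italic spans.
    text = re.sub(r"\*\*(.+?)\*\*", r"<strong>\1</strong>", text)
    text = re.sub(r"\*(.+?)\*", r"<em>\1</em>", text)
    return text

print(mini_markdown("some *italic* and **bold** text"))
# some <em>italic</em> and <strong>bold</strong> text
```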
Similarly, UML diagrams and the conventions of informal diagrams would date past 1960.
Commutative diagrams in category theory were probably invented pre-1960, but Categories for the Working Mathematician dates to 1971. My guess is that using them all over the place only became common after 1960? String diagrams, Penrose notation, etc. are past 1960 too.
Combinatorial game theory is from 1960 and the years after. I don't think there's much novel notation, but maybe the standard notation for game values counts? Similarly, you'd probably count BNF notation as before 1960, since it was invented in 1959. For other things around the cutoff, integer floor/ceiling notation comes from Iverson right before 1960.
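Iverson's floor/ceiling notation (and his related convention that a truth value is the number 0 or 1, now called the Iverson bracket) maps directly onto code, which is probably part of why it stuck:

```python
import math

# Iverson introduced the now-standard floor/ceiling bracket notation.
print(math.floor(2.7), math.ceil(2.7))    # 2 3
print(math.floor(-2.7), math.ceil(-2.7))  # -3 -2

# The Iverson bracket [P] is 1 if P is true, else 0; Python booleans already
# behave that way, so a counting sum reads much like the notation.
count_even = sum(n % 2 == 0 for n in range(10))
print(count_even)  # 5
```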
Also, data-flow diagrams seem to come from the 1970s: https://en.wikipedia.org/wiki/Data-flow_diagram
Although visual dataflow programming seems to go back to the 1960s: https://en.wikipedia.org/wiki/Dataflow_programming
So yes, a bit later than 1960, but my examples are still quite old.
Writing consists of language and also notations, systems of marks that communicate meaning in a specialized domain. Examples of fields with their own highly developed notation are music, mathematics, architecture, electronics and chemistry. There are also more minor types of notation, for example, welding, meteorology and finite state machines. Here's the question: all the notations I'm aware of were invented before about 1960. Over the past few decades, people have invented all sorts of fancy notations, but none of them have caught on in the applicable field. Why not?
Some answers: