The more familiar we are with a particular technique, the harder it is to see its limitations. For those who have taken the appropriate courses, the machinery of real and complex numbers, vectors, matrices, and so on, seems to be the inevitable "right" way of doing things. Yet there is another way, pioneered in the nineteenth century by Grassmann (1844) and Clifford (1878) and called (by Clifford) "geometric algebra" (GA). For over a century, GA was buried by the "standard" approach of Hamilton, Gibbs, et al., but it has recently been resurrected by a small but dedicated group of people, perhaps most notably by David Hestenes in "New Foundations for Classical Mechanics" (1999).
"Geometric Algebra for Physicists" performs the monumental task of presenting almost all of basic physics, from the Lorentz transformation to the Schwarzschild metric, with an excursion into quantum theory on the way, in the notation of GA. Since most physicists seem to be fairly happy with the notations that they grew up with, it is fair to ask, "Why bother?" The authors' answer is "In [GA] much of the standard subject matter taught to physicists can be formulated in an elegant and highly condensed fashion". Here's just one example: on page 230, Maxwell's equations are given in the single form "del F = J" (in which del is a GA operator, not the standard operator used to define div, grad, etc.).
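To give a flavour of what GA actually computes, here is a minimal sketch of its central operation, the geometric product, in two dimensions. It assumes only the standard decomposition ab = a·b + a∧b (a scalar part plus a bivector part); the class name `Multivector2D` and this tiny API are illustrative, not taken from the book.

```python
class Multivector2D:
    """A 2D multivector: scalar, vector (e1, e2), and bivector (e12) parts."""
    def __init__(self, s=0.0, e1=0.0, e2=0.0, e12=0.0):
        self.s, self.e1, self.e2, self.e12 = s, e1, e2, e12

def geometric_product(a, b):
    """Geometric product of two pure vectors a and b in 2D.

    ab = (a . b) + (a ^ b) e12: the symmetric part is the familiar
    dot product; the antisymmetric part is the oriented area the
    two vectors span.
    """
    dot = a.e1 * b.e1 + a.e2 * b.e2      # inner product (scalar part)
    wedge = a.e1 * b.e2 - a.e2 * b.e1    # outer product (bivector part)
    return Multivector2D(s=dot, e12=wedge)

# Basis behaviour: a unit vector squares to the scalar 1, while two
# orthogonal unit vectors multiply to a pure bivector e12.
e1 = Multivector2D(e1=1.0)
e2 = Multivector2D(e2=1.0)
```

The point of the example is the one GA makes everywhere: a single product carries both the metric information (the dot product) and the orientation information (the wedge product), which is exactly why equations like "del F = J" can absorb what are usually four separate Maxwell equations.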
This is not a book for bedtime reading: you will need at least an undergraduate degree in mathematics or physics to get much out of it, and even then you will have to master an unfamiliar, but very powerful, notation. With that background, you could skim through and say "Wow!" or you could spend a few years studying the development in detail; either way, the book is worth every penny. You might also want to consider "Geometric Algebra for Computer Science" by Dorst et al.