Counting From Zero
Whole numbers, counting numbers, natural numbers. Which of those sets has the number 0 in it? You'd think mathematics would be precise, but naming things is hard. Depending on who you talk to, the natural numbers include 0, or they don't. It depends.
What about array indexes? They're zero-based, right? Not exactly. Yes, the C family, Lisp, anything JVM-based, ECMAScript, Go, and now Rust are zero-based. Other well-known languages such as FORTRAN, COBOL, R, and Matlab are one-based. But why?
The standard reason on the zero-based side is that it's easier for the computer/compiler. Just multiply the size of an element by the index, add it to the base address, and you have the address of the element. That makes sense, and back when compilers were big and slow and there were no optimizers, saving time and memory was important. And look at who uses those zero-based languages: if you're using one of them, you're probably either writing frameworks/libraries for others to use or really concerned about performance.
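Here's a minimal C sketch of that address arithmetic; the array name, values, and element type are just placeholders for illustration:

```c
#include <stdio.h>
#include <stddef.h>

int main(void) {
    int data[4] = {10, 20, 30, 40};   /* made-up element values   */
    int *base = &data[0];             /* the array's base address */

    for (size_t i = 0; i < 4; i++) {
        /* With a zero-based index, element i sits at
           base_address + i * sizeof(int); a one-based scheme
           would need an extra "- 1" on every access.          */
        int *addr = base + i;
        printf("data[%zu] = %d, offset %zu bytes from base\n",
               i, *addr, i * sizeof(int));
    }
    return 0;
}
```

No adjustment, no special case: the index is the offset.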
Contrast that with who's using the one-based languages. For many folks using that set of languages, the computer is a tool that does simple math quickly. They're concerned with lift and drag, or modulus of elasticity, or probability, or transactional cash flow. And when they think of the first item in a list, they're counting things, so the first thing (1st) goes by the number 1. They're optimizing for cognitive load in their own heads while they work on the problems they're actually trying to solve.
So who's right? As with everything else in engineering, the answer is: it depends. Figure out what you're optimizing for, make a choice, and stick with it. Or, if you're Edsger W. Dijkstra, you could go back to first principles, ask how best to denote a sequence of numbers, and make your decision based on that.
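For the curious, Dijkstra's argument (laid out in his note "Why numbering should start at zero") comes down to half-open ranges: describe a subsequence as lo ≤ i < hi, so its length is hi − lo, and a zero-based sequence of length N is simply 0 ≤ i < N. A small C sketch of that convention, with made-up values:

```c
#include <stdio.h>

int main(void) {
    int values[5] = {2, 3, 5, 7, 11};
    int n = 5;

    /* The half-open range [0, n) visits exactly n elements. */
    int sum = 0;
    for (int i = 0; i < n; i++) {
        sum += values[i];
    }
    printf("sum of %d values = %d\n", n, sum);

    /* Splitting [0, n) at any mid gives [0, mid) and [mid, n):
       the pieces share a bound but never overlap, and their
       lengths are just mid - 0 and n - mid.                   */
    int mid = 2;
    printf("[0, %d) has %d elements, [%d, %d) has %d\n",
           mid, mid - 0, mid, n, n - mid);
    return 0;
}
```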