Basically, 0^0 and 0/0 are both undefined under a strict reading of the rules of arithmetic, but in specific contexts it is convenient to assign them particular values. In the case of 0^0, it turns out to be convenient in a lot of situations to define it as 1, because many standard formulas (the binomial theorem, power series, counting functions between finite sets) only come out right with that choice; even so, this is a convention adopted for those contexts, and as a limit form 0^0 is still indeterminate. With 0/0, on the other hand, no particular choice buys you anything: any value c satisfies 0·c = 0, so nothing singles out one answer, and whatever value you pick breaks the usual rules for fractions. That's why it's simply left undefined.
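
To make the 0^0 = 1 convention concrete, here is one of the standard examples (the exponential power series evaluated at x = 0):

$$
e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!}
\quad\Longrightarrow\quad
e^0 = \frac{0^0}{0!} + \frac{0^1}{1!} + \frac{0^2}{2!} + \cdots = 0^0,
$$

and since e^0 = 1, the series identity only works at x = 0 if you take 0^0 = 1. The binomial theorem does the same thing: set y = 0 in (x + y)^n = sum_k C(n,k) x^k y^(n-k), and the right-hand side only collapses to x^n when the k = n term treats 0^0 as 1.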