Specific technologies are less important than technology concepts. For example, knowing how to write database queries is commonly useful for a tester. Whether you learn on Oracle, SQL Server, DB2, or Sybase is mostly irrelevant if you are comfortable enough with the concepts to apply them elsewhere by picking up a manual. The same holds true for programming. Whether you know Java or C# doesn't really matter. What matters is that you understand how functions and procedures work, you understand what object orientation and data abstraction are, and (with the help of a syntax guide) you can write basic unit tests, parse strings, output data to a file, create loops, and apply conditional logic. Beyond that, if you are comfortable reading code and can write pseudo-code for things you don't know how to program, you will be in good shape most of the time.
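To make that list concrete, here is a minimal sketch in Python (any language would do, which is the point) that touches each basic: a loop, conditional logic, string parsing, file output, and a unit test. The function names and the log format are hypothetical, invented purely for illustration.

```python
import unittest

def extract_error_codes(log_text):
    """Parse a block of log text and collect the codes on lines marked ERROR."""
    codes = []
    for line in log_text.splitlines():      # loop over each line
        if line.startswith("ERROR"):        # conditional logic
            codes.append(line.split()[1])   # string parsing: second token is the code
    return codes

def write_report(codes, path):
    """Output the extracted codes to a file, one per line."""
    with open(path, "w") as f:
        for code in codes:
            f.write(code + "\n")

class TestExtractErrorCodes(unittest.TestCase):
    def test_picks_only_error_lines(self):
        log = "INFO start\nERROR E42 disk full\nERROR E7 timeout\nINFO done"
        self.assertEqual(extract_error_codes(log), ["E42", "E7"])
```

Run the test with `python -m unittest`. If you can read and adapt a sketch like this, the jump from one language's syntax to another's is mostly a matter of keeping a reference handy.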
Additional technology concepts that I have found valuable in my career include the following:
- Database design and data modeling
- Networks and communication protocols
- Enough familiarity with hardware to know, for example, what you might want to test differently on a RAID 5 server vs. a RAID 0 server
- Mastery of spreadsheets and data presentation
- Experimental design
- Human psychology (specifically in terms of what people expect and how they respond to unexpected conditions in front of a computer)
- Human factors (specifically in terms of technology usability)