This is a trickier question than it looks. TNet primarily uses TCP, and TCP retransmits lost packets to guarantee reliable, in-order delivery, so from the application's point of view packet loss never shows up as missing data, only as extra delay. To realistically simulate packet loss under TCP I think you'd need to use a raw socket and mimic TCP's behaviour yourself. That would give you control over retransmissions and a proper test bed to work with.
Latency, I think, is slightly easier to simulate. A stack with delayed popping probably wouldn't work, since a stack is LIFO and would release messages in the wrong order, but a FIFO queue where each message carries a release timestamp might do the trick.
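To make the queue idea concrete, here's a minimal sketch (the class name and API are my own invention, not anything from TNet): each outgoing message is stamped with a release time, and a poll call hands back only the messages whose simulated latency has elapsed. Release times are clamped to be monotonic so ordering is preserved, matching TCP's in-order guarantee even when jitter is applied.

```python
import collections
import random
import time

class LatencySimulator:
    """Hypothetical sketch: buffer messages and release them only after a
    simulated one-way latency plus random jitter, preserving order."""

    def __init__(self, latency_ms=100.0, jitter_ms=20.0):
        self.latency_ms = latency_ms
        self.jitter_ms = jitter_ms
        self._queue = collections.deque()  # (release_time, message)
        self._last_release = 0.0

    def send(self, message):
        delay = (self.latency_ms + random.uniform(0.0, self.jitter_ms)) / 1000.0
        # Never release earlier than the previous message: TCP is ordered.
        release = max(self._last_release, time.monotonic() + delay)
        self._last_release = release
        self._queue.append((release, message))

    def poll(self):
        """Return every buffered message whose simulated latency has elapsed."""
        now = time.monotonic()
        ready = []
        while self._queue and self._queue[0][0] <= now:
            ready.append(self._queue.popleft()[1])
        return ready
```

You'd call `send` wherever you currently hand a message to the socket, and drain `poll` from your update loop, forwarding whatever comes out to the real send path.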
Maybe I'm over-thinking it and there's a far simpler approach, like simulating the *effects* of latency and packet loss without actually simulating them at the network layer. Who knows.
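One way the "simulate the effects" idea could look in practice: under TCP, a lost packet is retransmitted, so the application never sees missing data, only a latency spike roughly one retransmission timeout long. So instead of dropping anything, you could occasionally add that spike to the normal per-message delay. The function below is a rough sketch of that idea; all the numbers (base delay, loss rate, timeout) are illustrative assumptions, not measured values.

```python
import random

def simulated_delay_ms(base_ms=50.0, loss_rate=0.02, rto_ms=200.0):
    """Rough sketch: return a simulated one-way delay in milliseconds.
    With probability loss_rate the 'packet' is treated as lost, which
    under TCP just means one retransmission timeout of extra delay.
    Each retransmission can itself be lost, hence the loop."""
    delay = base_ms
    while random.random() < loss_rate:
        delay += rto_ms
    return delay
```

You could feed the result into something like the delay queue above, so that an occasional "loss" shows up to the game as a stall rather than a missing message, which is exactly how it looks over real TCP.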
