I would have used tinsert() or patterns[#patterns+1] instead of juggling odd and even indices, and #patterns instead of limit * 2 as the loop bound.
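To illustrate the appending idiom I mean (a minimal sketch, with placeholder values):

```lua
local patterns = {}
-- patterns[#patterns + 1] = v appends v, exactly like table.insert(patterns, v),
-- so there is no odd/even index bookkeeping to get wrong
patterns[#patterns + 1] = "pattern one"
patterns[#patterns + 1] = "pattern two"
-- and the loop bound is simply the table length, not limit * 2:
for i = 1, #patterns do
    -- process patterns[i] here
end
```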
If I were to write this code, I would probably follow Slakah's approach, because in my opinion it makes the code much more readable: we have one table holding all the checks and simply iterate over it, instead of cluttering the function with the lengthy variable name and gsub() calls. That is also why I used the two locals in my first snippet instead of writing it all on one line.
It can also be more efficient: we build the table of gsub() patterns once at load time and then only index it. Per call, that avoids two gsub() invocations, a string concatenation ("DRUNK_MESSAGE_OTHER"..i) and a global table lookup, replacing them with a single access to our local table. So it can be more CPU-efficient, though the effect is barely noticeable, if at all.
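A sketch of that load-time approach; the DRUNK_MESSAGE_OTHER* globals and their formats are stand-ins here, not the actual game strings:

```lua
-- Stand-in globals for this sketch (in WoW these come from the client):
DRUNK_MESSAGE_OTHER1 = "%s looks tipsy."
DRUNK_MESSAGE_OTHER2 = "%s looks drunk."

-- Built once at load: escape pattern magic characters ("%p" matches any
-- punctuation), then turn the %s placeholder into a capture.
local patterns = {}
for i = 1, 2 do
    local msg = _G["DRUNK_MESSAGE_OTHER" .. i]
    patterns[#patterns + 1] = msg:gsub("%p", "%%%0"):gsub("%%%%s", "(.+)")
end

-- Per call: no gsub(), no concatenation, no global lookup --
-- just one walk over the local table.
local function MatchDrunkMessage(text)
    for _, p in ipairs(patterns) do
        local name = text:match(p)
        if name then return name end
    end
end

-- MatchDrunkMessage("Bob looks drunk.") --> "Bob"
```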
Conclusion: it's simply a matter of preference and of what you find more readable.