-- Using a real global here to make sure anything stashed in here (and
-- in `wincent.g`) survives even after the last reference to it goes away.
_G.wincent = wincent
What would the consequences be if this variable were declared at the top level but not added to _G? What would cause references to it to "go away"?
In most cases there is no difference between _G.foo = 'foo' and foo = 'foo'. There are two caveats, though: if a local named foo is in scope, foo = 'foo' assigns to that local, whereas _G.foo always writes the global; and the _G table lets you use names that are not valid Lua identifiers, such as _G['@-/$*'] = 'foo'. I have no idea why you would want to do such a thing, though. You can also generate a variable name dynamically and assign through it:
-- Read a name at runtime and create a global with that name:
local varname = io.read()
_G[varname] = 'lol'
As for why the user in your link did what he did, it’s best to ask him. The wincent variable is local, so once the script finishes running and nothing else references it, it becomes eligible for garbage collection. I guess that is why he also assigned it to a global variable. I personally would not want to mutate the global environment, but maybe he has a good reason. Or maybe it’s just an overengineered config, I don’t know.
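To make the lifetime point concrete, here is a small sketch in plain Lua (the names are made up for illustration) that uses a weak-valued table to observe whether a value survives collection. A weak-valued table does not keep its values alive, so its entries disappear once nothing else references them:

```lua
-- A weak-valued table lets us probe whether a value is still alive.
local weak = setmetatable({}, { __mode = 'v' })

do
  local obj = { name = 'wincent' }
  weak.probe = obj -- weak reference only
  obj = nil        -- drop the last strong reference
end

collectgarbage('collect')
collectgarbage('collect')
assert(weak.probe == nil) -- collected: nothing kept it alive

do
  local obj = { name = 'wincent' }
  weak.probe = obj
  _G.wincent = obj -- stash it in a real global, as in the config
  obj = nil
end

collectgarbage('collect')
collectgarbage('collect')
assert(weak.probe ~= nil) -- still alive: _G.wincent references it
```

This is exactly the situation in the config: once the chunk defining the local finishes, only the _G.wincent assignment keeps the table reachable.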
I reached out to the author and it turns out it was to satisfy a linter rule that warned of a possibly unintentional global if it saw a non-local variable assignment, like:
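A minimal sketch of the kind of assignment a linter such as luacheck flags (the value here is a placeholder; in the real config it is the table built earlier in the file):

```lua
-- Bare assignment: no `local`, no `_G.` prefix. A linter cannot tell
-- whether this global is intentional or a forgotten `local`, so it warns.
wincent = {}

-- Writing through `_G` makes the global intent explicit, which
-- satisfies the linter:
_G.wincent = wincent
```

Both forms end up writing the same global; the _G. spelling only exists to signal "yes, this global is deliberate".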