fix functionalization handling for mixed functional/nonfunctional tensorlists (#82326)
There's an existing assert in functionalization that is probably too restrictive: when an op receives a list of tensors containing a mix of functional and nonfunctional tensors, we should selectively unwrap the functional tensors and call the op, rather than erroring.
I added a test for it in `test_functionalization.py`. This behavior can also show up when tracing with `make_fx()`: constants baked in as module properties don't get wrapped when you functionalize the module's forward function, so they reach ops as nonfunctional tensors alongside functional ones.
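A minimal sketch of the intended unwrapping behavior. `FunctionalTensor`, `sync_and_unwrap`, and `call_op_on_tensorlist` below are hypothetical stand-ins for functionalization's internal wrapper and dispatch logic, not PyTorch's real API:

```python
# Sketch: selectively unwrap functional tensors in a mixed tensorlist
# instead of asserting that every element is functional.

class FunctionalTensor:
    """Toy wrapper standing in for a functionalized tensor (hypothetical)."""
    def __init__(self, elem):
        self.elem = elem  # the underlying "plain" value

def sync_and_unwrap(t):
    # Unwrap functional tensors; pass nonfunctional ones through untouched.
    return t.elem if isinstance(t, FunctionalTensor) else t

def call_op_on_tensorlist(op, tensors):
    # Previously: assert all(isinstance(t, FunctionalTensor) for t in tensors)
    # Now: selectively unwrap and call the op on the mixed list.
    return op([sync_and_unwrap(t) for t in tensors])

# Mixed list: one functional tensor, one baked-in constant (nonfunctional).
mixed = [FunctionalTensor([1, 2]), [3, 4]]
result = call_op_on_tensorlist(lambda ts: ts[0] + ts[1], mixed)
print(result)  # [1, 2, 3, 4]
```

The key point is that the presence of a nonfunctional tensor in the list no longer aborts the call; it simply participates in the op as-is.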
Should fix the last of https://github.com/pytorch/torchdynamo/issues/88#issuecomment-1193059940
Pull Request resolved: https://github.com/pytorch/pytorch/pull/82326
Approved by: https://github.com/ezyang