openai[patch]: pass through with_structured_output kwargs (#31518)
Supports passing additional kwargs (e.g. `tools`, `strict`, `tool_choice`, `parallel_tool_calls`) through `with_structured_output`:
```python
from langchain.chat_models import init_chat_model
from pydantic import BaseModel


class ResponseSchema(BaseModel):
    response: str


def get_weather(location: str) -> str:
    """Get weather"""
    pass


llm = init_chat_model("openai:gpt-4o-mini")

structured_llm = llm.with_structured_output(
    ResponseSchema,
    tools=[get_weather],
    strict=True,
    include_raw=True,
    tool_choice="required",
    parallel_tool_calls=False,
)

structured_llm.invoke("whats up?")
```