feat(flux): add scheduler selection for Flux models #8704
Conversation
Add support for alternative diffusers Flow Matching schedulers:
- Euler (default, 1st order)
- Heun (2nd order, better quality, 2x slower)
- LCM (optimized for few steps)

Backend:
- Add `schedulers.py` with scheduler type definitions and class mapping
- Modify `denoise.py` to accept an optional scheduler parameter
- Add a scheduler InputField to the `flux_denoise` invocation (v4.2.0)

Frontend:
- Add `fluxScheduler` to Redux state and `paramsSlice`
- Create `ParamFluxScheduler` component for the Linear UI
- Add the scheduler to `buildFLUXGraph` for generation
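The scheduler selection described above can be sketched roughly as a name-to-class mapping. This is a hypothetical sketch, not the actual contents of `invokeai/backend/flux/schedulers.py`; the real class names and structure may differ.

```python
# Hypothetical sketch of a scheduler type definition and class mapping,
# as described in the PR summary. Names here are illustrative assumptions.
from typing import Literal

FluxSchedulerType = Literal["euler", "heun", "lcm"]

# Maps the user-facing scheduler name to a diffusers scheduler class name.
SCHEDULER_CLASS_MAP: dict[str, str] = {
    "euler": "FlowMatchEulerDiscreteScheduler",  # 1st order, default
    "heun": "FlowMatchHeunDiscreteScheduler",    # 2nd order, ~2x model evals
    "lcm": "FlowMatchLCMScheduler",              # tuned for few steps
}


def resolve_scheduler(name: FluxSchedulerType) -> str:
    """Return the scheduler class name for a user-facing scheduler choice."""
    if name not in SCHEDULER_CLASS_MAP:
        raise ValueError(f"Unknown Flux scheduler: {name}")
    return SCHEDULER_CLASS_MAP[name]
```

A mapping like this lets the invocation expose a simple enum field while the denoise code instantiates the matching scheduler class.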
I did not test the scheduler changes yet. The frontend looks good.
lstein left a comment:
I'm getting a validation error on the step progress callback for Euler and LCM. Heun is working ok. Something about the way steps are being counted, I'd guess?
[2025-12-27 21:47:18,123]::[InvokeAI]::ERROR --> Error while invoking session 8e1ec091-0679-473b-9a59-8b883e99537b, invocation 2c23b3e7-2fdc-4b7b-aac2-a691d36f0916 (flux_denoise): 1 validation error for InvocationProgressEvent
percentage
Input should be less than or equal to 1 [type=less_than_equal, input_value=1.1666666666666667, input_type=float]
For further information visit https://errors.pydantic.dev/2.12/v/less_than_equal
[2025-12-27 21:47:18,124]::[InvokeAI]::ERROR --> Traceback (most recent call last):
File "/home/lstein/Projects/InvokeAI/invokeai/app/services/session_processor/session_processor_default.py", line 130, in run_node
output = invocation.invoke_internal(context=context, services=self._services)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/Projects/InvokeAI/invokeai/app/invocations/baseinvocation.py", line 244, in invoke_internal
output = self.invoke(context)
^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/invokeai-main/.venv/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/Projects/InvokeAI/invokeai/app/invocations/flux_denoise.py", line 171, in invoke
latents = self._run_diffusion(context)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/Projects/InvokeAI/invokeai/app/invocations/flux_denoise.py", line 425, in _run_diffusion
x = denoise(
^^^^^^^^
File "/home/lstein/Projects/InvokeAI/invokeai/backend/flux/denoise.py", line 222, in denoise
step_callback(
File "/home/lstein/Projects/InvokeAI/invokeai/app/invocations/flux_denoise.py", line 921, in step_callback
context.util.flux_step_callback(state)
File "/home/lstein/Projects/InvokeAI/invokeai/app/services/shared/invocation_context.py", line 626, in flux_step_callback
diffusion_step_callback(
File "/home/lstein/Projects/InvokeAI/invokeai/app/util/step_callback.py", line 185, in diffusion_step_callback
signal_progress("Denoising", percentage, image, (width, height))
File "/home/lstein/Projects/InvokeAI/invokeai/app/services/shared/invocation_context.py", line 687, in signal_progress
self._services.events.emit_invocation_progress(
File "/home/lstein/Projects/InvokeAI/invokeai/app/services/events/events_base.py", line 72, in emit_invocation_progress
self.dispatch(InvocationProgressEvent.build(queue_item, invocation, message, percentage, image))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lstein/Projects/InvokeAI/invokeai/app/services/events/events_common.py", line 149, in build
return cls(
^^^^
File "/home/lstein/invokeai-main/.venv/lib/python3.12/site-packages/pydantic/main.py", line 250, in __init__
validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for InvocationProgressEvent
percentage
Input should be less than or equal to 1 [type=less_than_equal, input_value=1.1666666666666667, input_type=float]
For further information visit https://errors.pydantic.dev/2.12/v/less_than_equal
LCM scheduler may have more internal timesteps than user-facing steps, causing user_step to exceed total_steps. This resulted in progress percentage > 1.0, which caused a pydantic validation error. Fix: Only call step_callback when user_step <= total_steps.
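The guard described above can be sketched as follows. This is an assumed shape of the fix, not the exact code in `invokeai/backend/flux/denoise.py`; the real denoise loop differs in detail.

```python
# Minimal sketch (assumed structure) of the fix: skip the progress
# callback whenever the scheduler's internal step index runs past the
# user-requested step count, so percentage never exceeds 1.0.
from typing import Callable


def report_progress(
    user_step: int,
    total_steps: int,
    step_callback: Callable[[float], None],
) -> None:
    # LCM may run more internal timesteps than user-facing steps, so
    # user_step can exceed total_steps; guarding here keeps the reported
    # percentage <= 1.0 and avoids the pydantic validation error above.
    if user_step <= total_steps:
        step_callback(user_step / total_steps)
```

With 6 requested steps, an internal step index of 7 would previously produce 7/6 ≈ 1.167, the exact value in the traceback; the guard simply drops that call.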
All the schedulers now work without crashing. Tested in both the linear view and the workflow editor. However, the step count does not seem right. For Euler and LCM, when I request six denoising steps I get seven, which is inconsistent with previous behavior. For Heun, I get 11 steps, which is consistent with the behavior in SDXL.
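The Heun count is consistent with a second-order scheduler: each step except the last takes two model evaluations, so N requested steps show 2N - 1 evaluations. A small sketch of that arithmetic (an illustration of the pattern, not code from the PR):

```python
def displayed_steps(requested: int, order: int = 2) -> int:
    """Model evaluations for an order-k scheduler where each step except
    the final one takes `order` evaluations: k*N - (k - 1)."""
    return order * requested - (order - 1)


# For Heun (2nd order) with 6 requested steps: 2*6 - 1 = 11,
# matching the observed count above.
```

The seven-step count for Euler and LCM, by contrast, points to an extra off-by-one callback rather than scheduler order.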
Reply in #8705 (comment).
Remove the initial step_callback at step=0 to match SD/SDXL behavior. Previously Flux showed N+1 steps (step 0 + N denoising steps), while SD/SDXL showed only N steps. Now all models display N steps consistently.
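The corrected progress behavior can be sketched as a loop that emits exactly one callback per denoising step, with no extra callback at step 0. This is an assumed structure for illustration, not the actual loop in `denoise.py`.

```python
# Sketch (assumed structure): a request for N steps reports exactly N
# progress events, ending at 100%, matching SD/SDXL behavior. The old
# behavior added a callback at step 0, yielding N + 1 events.
from typing import Callable


def run_denoise_loop(total_steps: int, step_callback: Callable[[float], None]) -> None:
    for i in range(total_steps):
        # ... one denoising step would run here ...
        step_callback((i + 1) / total_steps)
```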
Summary
Add support for alternative diffusers Flow Matching schedulers for Flux models:
The scheduler can be selected in both the Linear UI (Generation Settings → Advanced) and the Workflow Editor (Flux Denoise node).
Backend changes:
- Add `invokeai/backend/flux/schedulers.py` with scheduler type definitions and class mapping
- Modify `denoise.py` to accept an optional diffusers scheduler, with automatic detection of `sigmas` parameter support
- Add a `scheduler` InputField to the `flux_denoise` invocation (version 4.1.0 → 4.2.0)

Frontend changes:
- Add `fluxScheduler` to Redux state in `paramsSlice`
- Add `ParamFluxScheduler` component for the Linear UI dropdown
- Add the scheduler to `buildFLUXGraph`

Related Issues / Discussions
QA Instructions
Merge Plan
Standard merge, no special considerations.
Checklist
What's New copy (if doing a release after this PR)