OAuth2 Standard Auth Flow Returns 404 Then Returns 200 on Hard Reload

I am trying to configure a basic OAuth2 client to authenticate users for JupyterHub. I have it working with one major annoyance:


  1. Unauthorized User enters this address into the browser:
    https://datalab.example.com

  2. Unauthorized User is redirected to the following URL. This location initially returns 404, but then returns 200 when the page is hard refreshed in Firefox (a “hard refresh” means holding down the Shift key while refreshing the URL: Shift+Cmd+R on Mac, Shift+Ctrl+R elsewhere). In Safari and Chrome, there is no getting past the 404:
    https://auth.example.com/realms/research/protocol/openid-connect/auth?response_type=code&redirect_uri=https%3A%2F%2Fblue-sea-697d.quartiers047.workers.dev%3A443%2Fhttps%2Fdatalab.example.com%2Fhub%2Foauth_callback&client_id=example-datalab&code_challenge=GeWmRK8vA1_O6NjPXrv4kNcCCYgxVTloQS6LVqloYkA&code_challenge_method=S256&state=eyJzdGF0ZV9pZCI6ICJhZGQ3ZDc2ODgyODA0MzEyYjgyZjZkYmFhYjEwYzc2NCJ9&scope=openid+email+profile+roles

  3. After User successfully authenticates with Keycloak, User is redirected to this callback URL. This location initially returns 404, but
    returns 200 when the page is hard refreshed in Firefox:
    https://datalab.example.com/hub/oauth_callback?state=eyJzdGF0ZV9pZCI6ICJhZGQ3ZDc2ODgyODA0MzEyYjgyZjZkYmFhYjEwYzc2NCJ9&session_state=87f068e0-e2da-453e-85c7-452a856502cc&iss=https%3A%2F%2Fblue-sea-697d.quartiers047.workers.dev%3A443%2Fhttps%2Fauth.example.com%2Frealms%2Fresearch&code=320c0ce2-743b-4633-a705-6bf751108a6e.87f068e0-e2da-453e-85c7-452a856502cc.2a977078-9579-45a8-8764-9438aec50ba9
    After the hard refresh here, the User arrives in their newly-spawned JupyterLab environment as expected. Everything is fully functional from there on out.

I have debugged the browser cache thoroughly, to no avail.

The Keycloak instance is responsible for SSL termination; it sits behind a Traefik proxy with TLS passthrough. Keycloak and Traefik each run in their own Docker container, deployed via Docker Compose.

Logging in to the Keycloak Admin Console at auth.example.com works just fine (no 404 responses). I can’t for the life of me understand what’s going on here, including why I can ultimately get it to work in Firefox but not in Safari or Chrome. Any suggestions would be greatly appreciated.

Could you share JupyterHub’s logs in DEBUG mode from when this happens, along with JupyterHub’s config?

Happy to!

Apparently these Discourse posts are limited to 32,000 characters, but including both the config and the logging here ran to 52,642 characters, so I will put the DEBUG log in a reply to this message.

I cleared out all previous logging and restarted the containers. I also cleared out everything in my browser cache and cookies for the website before starting the new flow.

Here is my jupyterhub_config.py:


# Configuration file for jupyterhub.

from collections import namedtuple
import os
import pprint
import pwd

# This is used by the c.DockerSpawner.mounts, near the end of the config.
from docker.types import DriverConfig  # noqa  # type: ignore

c = get_config()  # noqa  # type: ignore

c.Application.log_level = 'DEBUG'

# Convert boolean-like strings in environment variable values to proper boolean types in python
def getenv_bool(envar, default=None):
    value = os.getenv(envar, default)

    if isinstance(value, bool):
        return value
    elif value is None:
        raise ValueError(f"{envar} is not set")
    else:
        value = value.lower()

    if value in ('y', 'yes', 't', 'true', 'on', '1'):
        return True
    elif value in ('n', 'no', 'false', 'off', '0'):
        return False
    else:
        raise ValueError(f"{envar} does not evaluate to a boolean value: {value}")


def get_admins():
    admins = os.getenv('ADMIN_USERS')
    if admins is None:
        return set()
    else:
        admins = admins.split(',')
        return {admin.strip() for admin in admins}
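
# Example (hypothetical value): ADMIN_USERS="alice, bob" yields {'alice', 'bob'};
# if ADMIN_USERS is unset, no user gets admin rights.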

# ------------------------------------------------------------------------------
# Authenticator configuration
# ------------------------------------------------------------------------------
## Set of users that will have admin rights on this JupyterHub.
#
#   Note: As of JupyterHub 2.0, full admin rights should not be required, and more
#   precise permissions can be managed via roles.
#
#   Admin users have extra privileges:
#    - Use the admin panel to see list of users logged in
#    - Add / remove users in some authenticators
#    - Restart / halt the hub
#    - Start / stop users' single-user servers
#    - Can access each individual users' single-user server (if configured)
#
#   Admin access should be treated the same way root access is.
#
#   Defaults to an empty set, in which case no user has admin access.
#
#   Default: set()
c.Authenticator.admin_users = get_admins()

## Delete any users from the database that do not pass validation
#
#   When JupyterHub starts, `.add_user` will be called
#   on each user in the database to verify that all users are still valid.
#
#   If `delete_invalid_users` is True,
#   any users that do not pass validation will be deleted from the database.
#   Use this if users might be deleted from an external system,
#   such as local user accounts.
#
#   If False (default), invalid users remain in the Hub's database
#   and a warning will be issued.
#   This is the default to avoid data loss due to config changes.
#
#   Default: False
c.Authenticator.delete_invalid_users = getenv_bool('DELETE_INVALID_USERS', False)

# OIDC/OAuth configuration for authentication via Keycloak/TroveAuth
# ------------------------------------------------------------------

# Base class for implementing an authentication provider for JupyterHub
c.JupyterHub.authenticator_class = 'local-generic-oauth'

# OAuth2 application info
# -----------------------
c.LocalGenericOAuthenticator.client_id = os.environ['OAUTH_CLIENT']
c.LocalGenericOAuthenticator.client_secret = os.environ['OAUTH_CLIENT_SECRET']

# Identity provider info
# ----------------------
c.LocalGenericOAuthenticator.authorize_url = os.environ['OAUTH_AUTHORIZE_URL']
c.LocalGenericOAuthenticator.oauth_callback_url = os.environ['OAUTH_CALLBACK_URL']
c.LocalGenericOAuthenticator.token_url = os.environ['OAUTH_TOKEN_URL']
c.LocalGenericOAuthenticator.userdata_url = os.environ['OAUTH_USERDATA_URL']

# What we request about the user
# ------------------------------
# scope represents requested information about the user, and since we configure
# this against an OIDC based identity provider, we should request "openid" at
# least.
# c.LocalGenericOAuthenticator.scope = ["openid", "email", "profile", "roles"]
c.LocalGenericOAuthenticator.scope = ["openid", "email"]
c.LocalGenericOAuthenticator.username_claim = "preferred_username"
c.LocalGenericOAuthenticator.manage_groups = True
c.LocalGenericOAuthenticator.manage_roles = True
c.LocalGenericOAuthenticator.auth_state_groups_key = "oauth_user.groups"

# Authorization
# ----------------------
# Preserve auth_state between requests in Auth Flow
c.LocalGenericOAuthenticator.enable_auth_state = True
# Allow all authenticated users to login
c.LocalGenericOAuthenticator.allow_all = True
# Automatically begin the login process, rather than starting with a “Login with…” link at /hub/login
c.LocalGenericOAuthenticator.auto_login = True
# Create system users that don't exist yet, once authenticated by Keycloak
c.LocalAuthenticator.create_system_users = True
# ALL users should be assigned to the 'users' group (GID=100). This will prevent a good many permissions errors.
c.LocalAuthenticator.add_user_cmd = ['useradd', '-m', '-g', 'users', '-s', '/bin/bash']


# This is a critical function in this config.  This is how we get the single-user server to run as the
# actual authorized user, rather than the infernal 'jovyan' user. This function is set to trigger upon
# successful authentication. We get the username from the auth_model and use it to look up the UID in
# /etc/passwd.  We then set the NB_USER and NB_UID environment variables, which are then passed into
# the single-user server by the DockerSpawner via the `c.DockerSpawner.env_keep` directive, further below.
def set_nb_env(authenticator, handler, auth_model):
    try:
        user_data = pwd.getpwnam(auth_model['name'])
    except KeyError:
        print(f"User {auth_model['name']} doesn't exist. Creating account now...")
        UserData = namedtuple('UserData',['pw_name', 'pw_passwd', 'pw_uid', 'pw_gid', 'pw_gecos', 'pw_dir', 'pw_shell'])
        user_data = UserData(auth_model['name'], 'bmw420!1150', 1800, 1800, '', f"/home/{auth_model['name']}", '/bin/bash')
    print("HERE WE ARE!!!")
    os.environ['NB_USER'] = user_data.pw_name
    os.environ['NB_UID'] = str(user_data.pw_uid)
    os.environ['HOME'] = user_data.pw_dir
    spawn_data = {
        'pw_data': user_data,
        'gid_list': os.getgrouplist(auth_model['name'], user_data.pw_gid)
    }

    if auth_model['auth_state'] is None:
        auth_model['auth_state'] = {}
    auth_model['auth_state']['spawn_data'] = spawn_data

    # pprint.pprint(auth_model)

    return auth_model

c.Authenticator.post_auth_hook = set_nb_env


## ------------------------------------------------------------------------------
# JupyterHub configuration
# ------------------------------------------------------------------------------
## An Application for starting a Multi-User Jupyter Notebook server.

####################
# Database
####################

POSTGRES_DB = os.getenv('POSTGRES_DB')
POSTGRES_USER = os.getenv('POSTGRES_USER')
POSTGRES_PASSWORD = os.getenv('POSTGRES_PASSWORD')
POSTGRES_SERVER = os.getenv('POSTGRES_SERVER')
POSTGRES_PORT = os.getenv('POSTGRES_PORT')

c.JupyterHub.db_url = (
    f"postgresql://{POSTGRES_USER}:{POSTGRES_PASSWORD}"
    f"@{POSTGRES_SERVER}:{POSTGRES_PORT}/{POSTGRES_DB}"
)

## log all database transactions. This has A LOT of output.
#  Default: False
c.JupyterHub.debug_db = getenv_bool('DEBUG_DB', False)

## Purge and reset the database.
#  Default: False
c.JupyterHub.reset_db = getenv_bool('RESET_DB', False)

## Upgrade the database automatically on start.
#
#   Only safe if database is regularly backed up.
#   Only SQLite databases will be backed up to a local file automatically.
#
#   Default: False
c.JupyterHub.upgrade_db = getenv_bool('UPGRADE_DB', False)


####################
# Hub
####################

## Maximum number of concurrent servers that can be active at a time.
#
#   Setting this can limit the total resources your users can consume.
#
#   An active server is any server that's not fully stopped. It is considered
#   active from the time it has been requested until the time that it has
#   completely stopped.
#
#   If this many user servers are active, users will not be able to launch new
#   servers until a server is shutdown. Spawn requests will be rejected with a 429
#   error asking them to try again.
#
#   If set to 0, no limit is enforced.
#
#   Default: 0
c.JupyterHub.active_server_limit = int(os.getenv('ACTIVE_SERVER_LIMIT', 0))

## The public facing URL of the whole JupyterHub application.
#
#   This is the address on which the proxy will bind.
#   Sets protocol, ip, base_url
#
#   Default: 'http://:8000'
c.JupyterHub.bind_url = os.getenv('BIND_URL', 'http://:8000')

# Whether to shutdown the proxy when the Hub shuts down.
#
#   Disable if you want to be able to teardown the Hub while leaving the proxy running.
#
#   Only valid if the proxy was started by the Hub process.
#
#   If both this and cleanup_servers are False, sending SIGINT to the Hub will
#   only shutdown the Hub, leaving everything else running.
#
#   The Hub should be able to resume from database state.
#
#   Default: True
c.JupyterHub.cleanup_proxy = getenv_bool('CLEANUP_PROXY', True)

## Whether to shutdown single-user servers when the Hub shuts down.
#
#   Disable if you want to be able to teardown the Hub while leaving the single-user servers running.
#
#   If both this and cleanup_proxy are False, sending SIGINT to the Hub will
#   only shutdown the Hub, leaving everything else running.
#
#   The Hub should be able to resume from database state.
#
#   Default: True
c.JupyterHub.cleanup_servers = getenv_bool('CLEANUP_SERVERS', True)

## The config file to load
#   Default: 'jupyterhub_config.py'
c.JupyterHub.config_file = os.getenv('CONFIG_FILE', 'jupyterhub_config.py')

## Number of days for a login cookie to be valid.
#   Default is two weeks.
#   Default: 14
c.JupyterHub.cookie_max_age_days = int(os.getenv('COOKIE_MAX_AGE_DAYS', 14))

## The URL on which the Hub will listen.
#   This is a private URL for internal communication.
#   Typically set in combination with hub_connect_url.
#   If a unix socket, hub_connect_url **must** also be set.
#
#   For example:
#
#       "http://127.0.0.1:8081"
#       "unix+http://%2Fsrv%2Fjupyterhub%2Fjupyterhub.sock"
#
#   Default: ''
c.JupyterHub.hub_bind_url = os.getenv('HUB_BIND_URL', '')

## Downstream proxy IP addresses to trust.
#
#   This sets the list of IP addresses that are trusted and skipped when processing
#   the `X-Forwarded-For` header. For example, if an external proxy is used for TLS
#   termination, its IP address should be added to this list to ensure the correct
#   client IP addresses are recorded in the logs instead of the proxy server's IP
#   address.
#
#   Default: []
c.JupyterHub.trusted_downstream_ips = ['traefik']


####################
# Internal SSL
####################

## Generate certs used for internal ssl
#   Default: False
c.JupyterHub.generate_certs = getenv_bool('GENERATE_CERTS', False)

## Enable SSL for all internal communication
#
#   This enables end-to-end encryption between all JupyterHub components.
#   JupyterHub will automatically create the necessary certificate
#   authority and sign notebook certificates as they're created.
#
#   Default: False
c.JupyterHub.internal_ssl = getenv_bool('INTERNAL_SSL', False)

## Recreate all certificates used within JupyterHub on restart.
#
#   Note: enabling this feature requires restarting all notebook servers.
#
#   Use with internal_ssl
#
#   Default: False
c.JupyterHub.recreate_internal_certs = getenv_bool('RECREATE_INTERNAL_CERTS', False)

## Path to SSL certificate file for the public facing interface of the proxy
#
#   When setting this, you should also set ssl_key
#
#   Default: ''
c.JupyterHub.ssl_cert = os.getenv('INTERNAL_SSL_CERT', '')

## Path to SSL key file for the public facing interface of the proxy
#
#   When setting this, you should also set ssl_cert
#
#   Default: ''
c.JupyterHub.ssl_key = os.getenv('INTERNAL_SSL_KEY', '')

## Names to include in the subject alternative name.
#
#   These names will be used for server name verification. This is useful
#   if JupyterHub is being run behind a reverse proxy or services using ssl
#   are on different hosts.
#
#   Use with internal_ssl
#
#   Default: []
TRUSTED_ALT_NAMES = os.getenv('TRUSTED_ALT_NAMES')

if TRUSTED_ALT_NAMES:
    c.JupyterHub.trusted_alt_names = [name.strip() for name in TRUSTED_ALT_NAMES.split(',')]


####################
# Logging
####################

## Instead of starting the Application, dump configuration to stdout
c.JupyterHub.show_config = getenv_bool('SHOW_LOGGING_CONFIG', False)

## Instead of starting the Application, dump configuration to stdout (as JSON)
c.JupyterHub.show_config_json = getenv_bool('SHOW_LOGGING_CONFIG_JSON', False)


####################
# Metrics
####################

## Authentication for prometheus metrics
#  Default: True
c.JupyterHub.authenticate_prometheus = getenv_bool('AUTHENTICATE_PROMETHEUS', True)

## Host to send statsd metrics to. An empty string (the default) disables sending metrics.
#  Default: ''
c.JupyterHub.statsd_host = os.getenv('STATSD_HOST', '')

## Port on which to send statsd metrics about the hub
#  Default: 8125
c.JupyterHub.statsd_port = int(os.getenv('STATSD_PORT', 8125))

## Prefix to use for all metrics sent by jupyterhub to statsd
#  Default: 'datalab'
c.JupyterHub.statsd_prefix = os.getenv('STATSD_PREFIX', 'datalab')


# ------------------------------------------------------------------------------
# Spawner configuration
# ------------------------------------------------------------------------------
## Base class for spawning single-user notebook servers.
#   Should be a subclass of :class:`jupyterhub.spawner.Spawner`.
#
#   Subclass this, and override the following methods:
#       - load_state
#       - get_state
#       - start
#       - stop
#       - poll
#
#   Currently installed:
#       - default: jupyterhub.spawner.LocalProcessSpawner
#       - localprocess: jupyterhub.spawner.LocalProcessSpawner
#       - simple: jupyterhub.spawner.SimpleLocalProcessSpawner
#       - docker: dockerspawner.DockerSpawner
#       - docker-swarm: dockerspawner.SwarmSpawner
#       - docker-system-user: dockerspawner.DockerSpawner
#
#   As JupyterHub supports multiple users, an instance of the Spawner subclass
#   is created for each user. If there are 20 JupyterHub users, there will be 20
#   instances of the subclass.
#
#   Default: 'jupyterhub.spawner.LocalProcessSpawner'
c.JupyterHub.spawner_class = os.getenv('SPAWNER_CLASS', 'jupyterhub.spawner.LocalProcessSpawner')

## Maximum number of consecutive failures to allow before shutting down JupyterHub.
#
#   This helps JupyterHub recover from a certain class of problem preventing
#   launch in contexts where the Hub is automatically restarted (e.g. systemd,
#   docker, kubernetes).
#
#   A limit of 0 means no limit and consecutive failures will not be tracked.
#
#   Default: 0
c.Spawner.consecutive_failure_limit = int(os.getenv('SPAWNER_CONSECUTIVE_FAILURE_LIMIT', 0))

## Enable debug-logging of the single-user server
#   Default: False
# c.Spawner.debug = False

## The URL the single-user server should start in.
#
#   `{username}` will be expanded to the user's username
#
#   Example uses:
#       - You can set this to `/lab` to have JupyterLab start by default, rather than Jupyter Notebook.
#
#  Default: ''
c.Spawner.default_url = '/lab'

## Run Jupyter Lab by default in lab containers
c.Spawner.environment = {
    'JUPYTER_ENABLE_LAB': '1'
}

# The image to use for single-user servers.
#
#   This image should have the same version of jupyterhub as the Hub itself installed.
#
#   If the default command of the image does not launch jupyterhub-singleuser,
#   set `c.Spawner.cmd` to launch jupyterhub-singleuser, e.g.
#
#   Any of the jupyter docker-stacks should work without additional config, as
#   long as the version of jupyterhub in the image is compatible.
#
#   Default: 'quay.io/jupyterhub/singleuser:latest'
c.Spawner.image = os.getenv('DOCKER_JUPYTERLAB_IMAGE', 'quay.io/jupyterhub/singleuser:latest')

# List of environment variables for the single-user server to inherit from the JupyterHub process.
# This is IMPORTANT to ensure correct permissions.
c.DockerSpawner.env_keep = ['NB_UID', 'NB_USER', 'HOME']
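# NB_UID, NB_USER and HOME are set by the set_nb_env post_auth_hook above, so the
# spawned container can pick up the authenticated user's identity rather than the
# default 'jovyan'.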

# The command used for starting notebooks.
c.Spawner.cmd = ['jupyterhub-singleuser']

## The IP address (or hostname) the single-user server should listen on.
#
#   Usually either '127.0.0.1' (default) or '0.0.0.0'.
#
#   The JupyterHub proxy implementation should be able to send packets to this interface.
#
#   Subclasses which launch remotely or in containers should use '0.0.0.0'.
#
#   Default: '0.0.0.0'
c.Spawner.ip = os.getenv('SPAWNER_IP', '0.0.0.0')

## Path to the notebook directory for the single-user server.
#
#   The user sees a file listing of this directory when the notebook interface is
#   started. The current interface does not easily allow browsing beyond the
#   subdirectories in this directory's tree.
#
#   `~` will be expanded to the home directory of the user, and {username} will be
#   replaced with the name of the user.
#
#   Note that this does *not* prevent users from accessing files outside of this path!
#   They can do so with many other means.
#
#   Default: ''
c.Spawner.notebook_dir = os.getenv('NOTEBOOK_DIR', '~')

## Prefix for container names.
#
#  See name_template for full container name for a particular user’s server.
#  Default: 'jupyter'
c.Spawner.prefix = os.getenv('SPAWNER_PREFIX', 'jupyter')


# ------------------------------------------------------------------------------
# DockerSpawner configuration
# ------------------------------------------------------------------------------
# See https://jupyterhub-dockerspawner.readthedocs.io/en/latest/api/index.html

## MINIMUM number of cpu-cores a single-user notebook server is guaranteed to have available.
#
#   If this value is set to 0.5, allows use of 50% of one CPU.
#   If this value is set to 2, allows use of up to 2 CPUs.
#
#   **This is a configuration setting. Your spawner must implement support for the limit to work.**
#   The default spawner, `LocalProcessSpawner`, does **not** implement this support.
#   A custom spawner **must** add support for this setting for it to be enforced.
#
#   Default: None
# CPU_GUARANTEE = os.getenv('CPU_GUARANTEE')
#
# if CPU_GUARANTEE:
#     c.DockerSpawner.cpu_guarantee = float(CPU_GUARANTEE)

## MAXIMUM number of cpu-cores a single-user notebook server is allowed to use.
#
#   If this value is set to 0.5, allows use of 50% of one CPU.
#   If this value is set to 2, allows use of up to 2 CPUs.
#
#   The single-user notebook server will never be scheduled by the kernel to use
#   more cpu-cores than this. There is no guarantee that it can access this many
#   cpu-cores.
#
#   **This is a configuration setting. Your spawner must implement support for the limit to work.**
#   The default spawner, `LocalProcessSpawner`, does **not** implement this support.
#   A custom spawner **must** add support for this setting for it to be enforced.
#
#   Default: None
# CPU_LIMIT = os.getenv('CPU_LIMIT')
#
# if CPU_LIMIT:
#     c.Spawner.cpu_limit = float(CPU_LIMIT)

## Spawn the new lab container as root
# Actually needed to set the owner to the authenticated user, rather than the infernal 'jovyan' user.
c.DockerSpawner.extra_create_kwargs = {'user': 'root'}

# The URL the single-user server should connect to the Hub.
c.DockerSpawner.hub_connect_url = os.getenv('HUB_CONNECT_URL')

# Specify docker link mapping to add to the container.
# If the Hub is running in a Docker container, this can simplify routing because
# all traffic will be using docker hostnames.
c.DockerSpawner.links = {'jupyterhub': 'datalab'}

## MINIMUM number of bytes a single-user notebook server is guaranteed to have available.
#
#   Allows the following suffixes:
#     - K -> Kilobytes
#     - M -> Megabytes
#     - G -> Gigabytes
#     - T -> Terabytes
#
#   **This is a configuration setting. Your spawner must implement support for the limit to work.**
#   The default spawner, `LocalProcessSpawner`, does **not** implement this support.
#   A custom spawner **must** add support for this setting for it to be enforced.
#
#   Default: None
# c.DockerSpawner.mem_guarantee = None

## MAXIMUM number of bytes a single-user notebook server is allowed to use.
#
#   Allows the following suffixes:
#     - K -> Kilobytes
#     - M -> Megabytes
#     - G -> Gigabytes
#     - T -> Terabytes
#
#   If the single user server tries to allocate more memory than this, it will fail.
#   There is no guarantee that the single-user notebook server will be able
#   to allocate this much memory - only that it can not allocate more than this.
#
#   **This is a configuration setting. Your spawner must implement support for the limit to work.**
#   The default spawner, `LocalProcessSpawner`, does **not** implement this support.
#   A custom spawner **must** add support for this setting for it to be enforced.
#
#   Default: None
# c.DockerSpawner.mem_limit = None

## Run the containers on this docker network.
#
#   If it is an internal docker network, the Hub should be on the same network,
#   as internal docker IP addresses will be used. For bridge networking,
#   external ports will be bound.
#   Default: 'bridge'
c.DockerSpawner.network_name = os.getenv('DOCKER_NETWORK_NAME', 'bridge')

# If True, delete containers when servers are stopped.
#
#   This will destroy any data in the container not stored in mounted volumes.
#
#   Default: False
c.DockerSpawner.remove = getenv_bool('SPAWNER_REMOVE_STOPPED_CONTAINERS', False)

## Configure Docker Networking
c.DockerSpawner.use_internal_ip = getenv_bool('DOCKER_USE_INTERNAL_IP', True)

# We're NOT specifying `c.DockerSpawner.volumes` like we did previously, because
# that config spec only allows `bind` mounts to the host platform.  We want to
# instead mount the named docker volume that is used by JupyterHub for
# users' home directories.  This is where the user home directories are created
# by the `c.LocalAuthenticator.create_system_users` functionality.
etc_volume = os.getenv('GUEST_ETC_VOLUME', 'trove_datalab_etc')
etc_dir = os.getenv('GUEST_ETC_DIR', '/etc')

home_volume = os.getenv('GUEST_HOME_VOLUME', 'trove_datalab_home')
home_dir = os.getenv('GUEST_HOME_DIR', '/home')

c.DockerSpawner.mounts = [
    {
        'source': etc_volume,
        'target': etc_dir,
        'type': 'volume',
        'driver_config': DriverConfig('local')
    },
    {
        'source': home_volume,
        'target': home_dir,
        'type': 'volume',
        'driver_config': DriverConfig('local')
    }
]

# ------------------------------------------------------------------------------
# Collaboration configuration
# ------------------------------------------------------------------------------
# Enable collaboration
c.LabApp.collaborative = True

Here is my DEBUG log:

[I 2025-07-22 15:50:50.407 JupyterHub app:3354] Running JupyterHub version 5.3.0
[I 2025-07-22 15:50:50.407 JupyterHub app:3384] Using Authenticator: oauthenticator.generic.LocalGenericOAuthenticator-17.3.0
[I 2025-07-22 15:50:50.407 JupyterHub app:3384] Using Spawner: dockerspawner.dockerspawner.DockerSpawner-14.0.0
[I 2025-07-22 15:50:50.407 JupyterHub app:3384] Using Proxy: jupyterhub.proxy.ConfigurableHTTPProxy-5.3.0
/usr/local/lib/python3.10/dist-packages/jupyter_events/schema.py:68: JupyterEventsVersionWarning: The `version` property of an event schema must be a string. It has been type coerced, but in a future version of this library, it will fail to validate. Please update schema: https://schema.jupyter.org/jupyterhub/events/server-action
  validate_schema(_schema)
[D 2025-07-22 15:50:50.414 JupyterHub app:1875] Generating new cookie_secret
[I 2025-07-22 15:50:50.414 JupyterHub app:1880] Writing cookie_secret to /srv/jupyterhub_config/jupyterhub_cookie_secret
[D 2025-07-22 15:50:50.422 JupyterHub app:2006] Connecting to db: postgresql://datalab:[redacted]@db:5432/datalab
[D 2025-07-22 15:50:50.454 JupyterHub orm:1509] database schema version found: 4621fec11365
[I 2025-07-22 15:50:50.503 JupyterHub proxy:556] Generating new CONFIGPROXY_AUTH_TOKEN
[D 2025-07-22 15:50:50.504 JupyterHub app:2346] Loading roles into database
[D 2025-07-22 15:50:50.525 JupyterHub app:2693] Purging expired APITokens
[D 2025-07-22 15:50:50.527 JupyterHub app:2693] Purging expired OAuthCodes
[D 2025-07-22 15:50:50.529 JupyterHub app:2693] Purging expired Shares
[D 2025-07-22 15:50:50.530 JupyterHub app:2693] Purging expired ShareCodes
[D 2025-07-22 15:50:50.532 JupyterHub app:2467] Loading role assignments from config
[D 2025-07-22 15:50:50.546 JupyterHub app:2978] Initializing spawners
[D 2025-07-22 15:50:50.553 JupyterHub app:3128] Loaded users:

[I 2025-07-22 15:50:50.553 JupyterHub app:3424] Initialized 0 spawners in 0.007 seconds
[I 2025-07-22 15:50:50.557 JupyterHub metrics:425] Found 1 active users in the last ActiveUserPeriods.twenty_four_hours
[I 2025-07-22 15:50:50.558 JupyterHub metrics:425] Found 1 active users in the last ActiveUserPeriods.seven_days
[I 2025-07-22 15:50:50.559 JupyterHub metrics:425] Found 1 active users in the last ActiveUserPeriods.thirty_days
[W 2025-07-22 15:50:50.559 JupyterHub proxy:748] Running JupyterHub without SSL.  I hope there is SSL termination happening somewhere else...
[I 2025-07-22 15:50:50.559 JupyterHub proxy:752] Starting proxy @ http://datalab:8000
[D 2025-07-22 15:50:50.559 JupyterHub proxy:753] Proxy cmd: ['configurable-http-proxy', '--ip', 'datalab', '--port', '8000', '--api-ip', '127.0.0.1', '--api-port', '8001', '--error-target', 'http://datalab:8081/hub/error', '--log-level', 'info']
[D 2025-07-22 15:50:50.560 JupyterHub proxy:670] Writing proxy pid file: jupyterhub-proxy.pid
[D 2025-07-22 15:50:50.560 JupyterHub utils:277] Waiting 10s for server at datalab:8000
[D 2025-07-22 15:50:50.560 JupyterHub utils:121] Server at datalab:8000 not ready: [Errno 111] Connection refused
[D 2025-07-22 15:50:50.560 JupyterHub utils:277] Waiting 10s for server at 127.0.0.1:8001
[D 2025-07-22 15:50:50.560 JupyterHub utils:121] Server at 127.0.0.1:8001 not ready: [Errno 111] Connection refused
[D 2025-07-22 15:50:50.662 JupyterHub utils:121] Server at datalab:8000 not ready: [Errno 111] Connection refused
15:50:50.706 [ConfigProxy] info: Proxying http://datalab:8000 to (no default)
15:50:50.707 [ConfigProxy] info: Proxy API at http://127.0.0.1:8001/api/routes
[D 2025-07-22 15:50:50.724 JupyterHub utils:285] Server at 127.0.0.1:8001 responded in 0.16s
[D 2025-07-22 15:50:50.934 JupyterHub utils:285] Server at datalab:8000 responded in 0.37s
[D 2025-07-22 15:50:50.934 JupyterHub proxy:832] Proxy started and appears to be up
[D 2025-07-22 15:50:50.935 JupyterHub proxy:925] Proxy: Fetching GET http://127.0.0.1:8001/api/routes
[I 2025-07-22 15:50:50.959 JupyterHub app:3747] Hub API listening on http://datalab:8081/hub/
[D 2025-07-22 15:50:50.959 JupyterHub proxy:389] Fetching routes to check
[D 2025-07-22 15:50:50.959 JupyterHub proxy:925] Proxy: Fetching GET http://127.0.0.1:8001/api/routes
15:50:50.959 [ConfigProxy] info: 200 GET /api/routes
15:50:50.962 [ConfigProxy] info: 200 GET /api/routes
[D 2025-07-22 15:50:50.962 JupyterHub proxy:392] Checking routes
[I 2025-07-22 15:50:50.963 JupyterHub proxy:477] Adding route for Hub: / => http://datalab:8081
[D 2025-07-22 15:50:50.963 JupyterHub proxy:925] Proxy: Fetching POST http://127.0.0.1:8001/api/routes/
15:50:50.969 [ConfigProxy] info: Adding route / -> http://datalab:8081
15:50:50.971 [ConfigProxy] info: Route added / -> http://datalab:8081
15:50:50.972 [ConfigProxy] info: 201 POST /api/routes/
[I 2025-07-22 15:50:50.972 JupyterHub app:3778] JupyterHub is now running at http://datalab:8000
[D 2025-07-22 15:50:50.973 JupyterHub app:3347] It took 0.579 seconds for the Hub to start
[D 2025-07-22 15:55:50.973 JupyterHub proxy:925] Proxy: Fetching GET http://127.0.0.1:8001/api/routes
15:55:50.979 [ConfigProxy] info: 200 GET /api/routes
[D 2025-07-22 15:55:50.981 JupyterHub proxy:392] Checking routes
[I 2025-07-22 15:57:18.631 JupyterHub log:192] 302 GET / -> /hub/ (@74.96.66.129) 0.41ms
[I 2025-07-22 15:57:18.671 JupyterHub log:192] 302 GET /hub/ -> /hub/login?next=%2Fhub%2F (@74.96.66.129) 1.49ms
[I 2025-07-22 15:57:18.713 JupyterHub log:192] 302 GET /hub/login?next=%2Fhub%2F -> /hub/oauth_login?next=%2Fhub%2F (@74.96.66.129) 2.46ms
[I 2025-07-22 15:57:18.747 JupyterHub oauth2:126] OAuth redirect: https://datalab.example.com/hub/oauth_callback
[D 2025-07-22 15:57:18.748 JupyterHub base:681] Setting cookie oauthenticator-state: {'httponly': True, 'expires_days': 1}
[I 2025-07-22 15:57:18.750 JupyterHub log:192] 302 GET /hub/oauth_login?next=%2Fhub%2F -> https://auth.example.com/realms/research/protocol/openid-connect/auth?response_type=code&redirect_uri=https%3A%2F%2Fblue-sea-697d.quartiers047.workers.dev%3A443%2Fhttps%2Fdatalab.example.com%2Fhub%2Foauth_callback&client_id=jupyter&code_challenge=[secret]&code_challenge_method=[secret]&state=[secret]&scope=openid+email (@74.96.66.129) 3.96ms
[E 2025-07-22 15:57:36.918 JupyterHub oauth2:1245] The auth_state_groups_key oauth_user.groups does not exist in the auth_model. Available keys are: dict_keys(['access_token', 'refresh_token', 'id_token', 'scope', 'token_response', 'oauth_user'])
[D 2025-07-22 15:57:36.941 JupyterHub roles:326] Assigning default role to User btf
[D 2025-07-22 15:57:36.954 JupyterHub base:681] Setting cookie jupyterhub-session-id: {'httponly': True, 'path': '/'}
[D 2025-07-22 15:57:36.954 JupyterHub base:685] Setting cookie for btf: jupyterhub-hub-login
[D 2025-07-22 15:57:36.955 JupyterHub base:681] Setting cookie jupyterhub-hub-login: {'httponly': True, 'path': '/hub/'}
[I 2025-07-22 15:57:36.955 JupyterHub _xsrf_utils:130] Setting new xsrf cookie for b'4X9bEARTyEAREsSQTI9BuVcgp-p8xAqTKmmRULCBb2E=:9648a06bb6524b5f8ad16d4974419dc1' {'path': '/hub/'}
[I 2025-07-22 15:57:36.955 JupyterHub base:973] User logged in: btf
[I 2025-07-22 15:57:36.956 JupyterHub log:192] 302 GET /hub/oauth_callback?state=[secret]&session_state=[secret]&iss=https%3A%2F%2Fblue-sea-697d.quartiers047.workers.dev%3A443%2Fhttps%2Fauth.example.com%2Frealms%2Fresearch&code=[secret] -> /hub/ ([email protected]) 319.78ms
[D 2025-07-22 15:57:37.054 JupyterHub user:496] Creating <class 'dockerspawner.dockerspawner.DockerSpawner'> for btf:
[I 2025-07-22 15:57:37.059 JupyterHub log:192] 302 GET /hub/ -> /hub/spawn ([email protected]) 34.44ms
[D 2025-07-22 15:57:37.145 JupyterHub scopes:1013] Checking access to /hub/spawn via scope servers!server=btf/
[D 2025-07-22 15:57:37.147 JupyterHub pages:216] Triggering spawn with default options for btf
[D 2025-07-22 15:57:37.148 JupyterHub base:411] Refreshing auth for btf
[E 2025-07-22 15:57:37.188 JupyterHub oauth2:1245] The auth_state_groups_key oauth_user.groups does not exist in the auth_model. Available keys are: dict_keys(['access_token', 'refresh_token', 'id_token', 'scope', 'token_response', 'oauth_user'])
[D 2025-07-22 15:57:37.188 JupyterHub roles:326] Assigning default role to User btf
[D 2025-07-22 15:57:37.198 JupyterHub base:1097] Initiating spawn for btf
[D 2025-07-22 15:57:37.198 JupyterHub base:1101] 0/100 concurrent spawns
[D 2025-07-22 15:57:37.198 JupyterHub base:1106] 0/10 active servers
[I 2025-07-22 15:57:37.233 JupyterHub provider:661] Creating oauth client jupyterhub-user-btf
[D 2025-07-22 15:57:37.246 JupyterHub user:913] Calling Spawner.start for btf
[D 2025-07-22 15:57:37.286 JupyterHub dockerspawner:1034] Getting container 'example-datalab-btf'
[I 2025-07-22 15:57:37.288 JupyterHub dockerspawner:1040] Container 'example-datalab-btf' is gone
[D 2025-07-22 15:57:37.288 JupyterHub dockerspawner:1212] Starting host with config: {'auto_remove': True, 'binds': {}, 'links': {'jupyterhub': 'datalab'}, 'mounts': [{'Target': '/etc', 'Source': 'example_datalab_etc', 'Type': 'volume', 'ReadOnly': False, 'VolumeOptions': {'DriverConfig': {'Name': 'local'}}}, {'Target': '/home', 'Source': 'example_datalab_home', 'Type': 'volume', 'ReadOnly': False, 'VolumeOptions': {'DriverConfig': {'Name': 'local'}}}], 'mem_limit': 0, 'cpu_period': 100000, 'cpu_quota': 0, 'network_mode': 'backend'}
[I 2025-07-22 15:57:37.331 JupyterHub dockerspawner:1318] Created container example-datalab-btf (id: 4e4314a) from image elmo:5001/scipy-notebook:latest
[I 2025-07-22 15:57:37.331 JupyterHub dockerspawner:1342] Starting container example-datalab-btf (id: 4e4314a)
[D 2025-07-22 15:57:37.484 JupyterHub spawner:1693] Polling subprocess every 30s
[D 2025-07-22 15:57:37.484 JupyterHub dockerspawner:977] Persisting state for btf: container name=example-datalab-btf, id=4e4314a30c7cff35e4da5c3877efed7313334c274da88e66054c82487a8112e8
[D 2025-07-22 15:57:37.487 JupyterHub utils:297] Waiting 30s for server at http://10.1.0.13:8888/user/btf/api
[I 2025-07-22 15:57:38.148 JupyterHub log:192] 302 GET /hub/spawn -> /hub/spawn-pending/btf ([email protected]) 1022.63ms
[D 2025-07-22 15:57:38.215 JupyterHub scopes:1013] Checking access to /hub/spawn-pending/btf via scope servers!server=btf/
[I 2025-07-22 15:57:38.215 JupyterHub pages:400] btf is pending spawn
[D 2025-07-22 15:57:38.216 JupyterHub _xsrf_utils:161] xsrf id mismatch b'4X9bEARTyEAREsSQTI9BuVcgp-p8xAqTKmmRULCBb2E=:9648a06bb6524b5f8ad16d4974419dc1' != b'1d0a96bfdea040a3beb2a11d070496fb:9648a06bb6524b5f8ad16d4974419dc1'
[I 2025-07-22 15:57:38.216 JupyterHub _xsrf_utils:130] Setting new xsrf cookie for b'1d0a96bfdea040a3beb2a11d070496fb:9648a06bb6524b5f8ad16d4974419dc1' {'path': '/hub/'}
[I 2025-07-22 15:57:38.246 JupyterHub log:192] 200 GET /hub/spawn-pending/btf ([email protected]) 36.33ms
[D 2025-07-22 15:57:38.324 JupyterHub log:192] 200 GET /hub/static/css/style.min.css?v=90495ca2cd6745c4b19a42dfd4b244ac5ca697ae76bf6f58a465da54045d2e0032f25207e2ebe4df838e4d7bd40c183228f28bbacc2456fe706797438809f749 (@74.96.66.129) 3.26ms
[D 2025-07-22 15:57:38.327 JupyterHub log:192] 200 GET /hub/static/components/bootstrap/dist/js/bootstrap.bundle.min.js?v=ecf8bfa2d7656db091f8b9d6f85ecfc057120c93ae5090773b1b441db838bd232fcef26375ee0fa35bf8051f4675cf5a5cd50d155518f922b9d70593f161741a (@74.96.66.129) 0.66ms
[D 2025-07-22 15:57:38.328 JupyterHub log:192] 200 GET /hub/static/js/darkmode.js?v=2fd9a7d11ad78df9351fed40ab35eab52e1e6a3d516f188b652120e6faf57b8e387a30aae8f52a6fb51563d06d04545c7005da0b77a98c21b0bd28f6d1cdfa11 (@74.96.66.129) 0.53ms
[D 2025-07-22 15:57:38.329 JupyterHub log:192] 200 GET /hub/static/components/jquery/dist/jquery.min.js?v=bf6089ed4698cb8270a8b0c8ad9508ff886a7a842278e98064d5c1790ca3a36d5d69d9f047ef196882554fc104da2c88eb5395f1ee8cf0f3f6ff8869408350fe (@74.96.66.129) 0.64ms
[D 2025-07-22 15:57:38.330 JupyterHub log:192] 200 GET /hub/static/components/requirejs/require.js?v=1ff44af658602d913b22fca97c78f98945f47e76dacf9800f32f35350f05e9acda6dc710b8501579076f3980de02f02c97f5994ce1a9864c21865a42262d79ec (@74.96.66.129) 0.83ms
[D 2025-07-22 15:57:38.332 JupyterHub log:192] 200 GET /hub/logo (@74.96.66.129) 0.55ms
[D 2025-07-22 15:57:38.463 JupyterHub log:192] 200 GET /hub/static/favicon.ico?v=fde5757cd3892b979919d3b1faa88a410f28829feb5ba22b6cf069f2c6c98675fceef90f932e49b510e74d65c681d5846b943e7f7cc1b41867422f0481085c1f (@74.96.66.129) 0.63ms
[D 2025-07-22 15:57:38.468 JupyterHub log:192] 200 GET /hub/static/components/@fortawesome/fontawesome-free/webfonts/fa-solid-900.woff2 (@74.96.66.129) 0.72ms
[D 2025-07-22 15:57:38.479 JupyterHub scopes:1013] Checking access to /hub/api/users/btf/server/progress via scope read:servers!server=btf/
[I 2025-07-22 15:57:39.993 JupyterHub log:192] 200 GET /hub/api (@10.1.0.13) 0.49ms
[D 2025-07-22 15:57:40.102 JupyterHub base:366] Recording first activity for <APIToken('215c...', user='btf', client_id='jupyterhub')>
[D 2025-07-22 15:57:40.108 JupyterHub scopes:1013] Checking access to /hub/api/users/btf/activity via scope users:activity!user=btf
[D 2025-07-22 15:57:40.112 JupyterHub users:1006] Activity for user btf: 2025-07-22T15:57:39.976288Z
[D 2025-07-22 15:57:40.112 JupyterHub users:1024] Activity on server btf/: 2025-07-22T15:57:39.976288Z
[I 2025-07-22 15:57:40.115 JupyterHub log:192] 200 POST /hub/api/users/btf/activity ([email protected]) 14.89ms
[D 2025-07-22 15:57:40.116 JupyterHub utils:333] Server at http://10.1.0.13:8888/user/btf/api responded in 2.63s
[D 2025-07-22 15:57:40.117 JupyterHub _version:73] jupyterhub and jupyterhub-singleuser both on version 5.3.0
[I 2025-07-22 15:57:40.117 JupyterHub base:1126] User btf took 2.919 seconds to start
[I 2025-07-22 15:57:40.117 JupyterHub proxy:331] Adding user btf to proxy /user/btf/ => http://10.1.0.13:8888
[D 2025-07-22 15:57:40.117 JupyterHub proxy:925] Proxy: Fetching POST http://127.0.0.1:8001/api/routes/user/btf
15:57:40.120 [ConfigProxy] info: Adding route /user/btf -> http://10.1.0.13:8888
15:57:40.120 [ConfigProxy] info: Route added /user/btf -> http://10.1.0.13:8888
15:57:40.120 [ConfigProxy] info: 201 POST /api/routes/user/btf
[I 2025-07-22 15:57:40.121 JupyterHub users:899] Server btf is ready
[I 2025-07-22 15:57:40.121 JupyterHub log:192] 200 GET /hub/api/users/btf/server/progress?_xsrf=[secret] ([email protected]) 1646.15ms
[D 2025-07-22 15:57:40.201 JupyterHub scopes:1013] Checking access to /hub/spawn-pending/btf via scope servers!server=btf/
[I 2025-07-22 15:57:40.201 JupyterHub log:192] 302 GET /hub/spawn-pending/btf -> /user/btf/ ([email protected]) 4.76ms
[D 2025-07-22 15:57:40.291 JupyterHub provider:421] Validating client id jupyterhub-user-btf
[D 2025-07-22 15:57:40.292 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:362] Validating redirection uri /user/btf/oauth_callback for client jupyterhub-user-btf.
[D 2025-07-22 15:57:40.292 oauthlib.oauth2.rfc6749.grant_types.base base:231] Using provided redirect_uri /user/btf/oauth_callback
[D 2025-07-22 15:57:40.292 JupyterHub provider:497] validate_redirect_uri: client_id=jupyterhub-user-btf, redirect_uri=/user/btf/oauth_callback
[D 2025-07-22 15:57:40.293 oauthlib.oauth2.rfc6749.grant_types.base base:172] Validating access to scopes ['read:users:groups!user', 'access:servers!server=btf/', 'read:users:name!user'] for client 'jupyterhub-user-btf' (<OAuthClient(identifier='jupyterhub-user-btf')>).
[D 2025-07-22 15:57:40.294 JupyterHub provider:624] Allowing request for scope(s) for jupyterhub-user-btf:  read:users:groups!user,access:servers!server=btf/,read:users:name!user
[D 2025-07-22 15:57:40.294 JupyterHub auth:326] Skipping oauth confirmation for <User(btf 1/1 running)> accessing Server at /user/btf/
[D 2025-07-22 15:57:40.295 oauthlib.oauth2.rfc6749.endpoints.authorization authorization:98] Dispatching response_type code request to <oauthlib.oauth2.rfc6749.grant_types.authorization_code.AuthorizationCodeGrant object at 0x7f1d5be18790>.
[D 2025-07-22 15:57:40.295 JupyterHub provider:421] Validating client id jupyterhub-user-btf
[D 2025-07-22 15:57:40.295 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:362] Validating redirection uri /user/btf/oauth_callback for client jupyterhub-user-btf.
[D 2025-07-22 15:57:40.295 oauthlib.oauth2.rfc6749.grant_types.base base:231] Using provided redirect_uri /user/btf/oauth_callback
[D 2025-07-22 15:57:40.295 JupyterHub provider:497] validate_redirect_uri: client_id=jupyterhub-user-btf, redirect_uri=/user/btf/oauth_callback
[D 2025-07-22 15:57:40.296 oauthlib.oauth2.rfc6749.grant_types.base base:172] Validating access to scopes {'read:users:groups!user', 'access:servers!server=btf/', 'read:users:name!user'} for client 'jupyterhub-user-btf' (<OAuthClient(identifier='jupyterhub-user-btf')>).
[D 2025-07-22 15:57:40.297 JupyterHub provider:624] Allowing request for scope(s) for jupyterhub-user-btf:  read:users:groups!user,access:servers!server=btf/,read:users:name!user
[D 2025-07-22 15:57:40.297 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:245] Pre resource owner authorization validation ok for <oauthlib.Request SANITIZED>.
[D 2025-07-22 15:57:40.297 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:170] Created authorization code grant {'code': 'fLgcRK02MRNrngXaul7nMN9nLYonsd', 'state': 'qBrKBCsSa-TW340ZvVVrCA'} for request <oauthlib.Request SANITIZED>.
[D 2025-07-22 15:57:40.297 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:277] Saving grant {'code': 'fLgcRK02MRNrngXaul7nMN9nLYonsd', 'state': 'qBrKBCsSa-TW340ZvVVrCA'} for <oauthlib.Request SANITIZED>.
[D 2025-07-22 15:57:40.297 JupyterHub provider:247] Saving authorization code jupyterhub-user-btf, fLg..., (), {}
[I 2025-07-22 15:57:40.303 JupyterHub log:192] 302 GET /hub/api/oauth2/authorize?client_id=jupyterhub-user-btf&redirect_uri=%2Fuser%2Fbtf%2Foauth_callback&response_type=code&state=[secret] -> /user/btf/oauth_callback?code=[secret]&state=[secret] ([email protected]) 15.96ms
[D 2025-07-22 15:57:40.349 oauthlib.oauth2.rfc6749.endpoints.token token:112] Dispatching grant_type authorization_code request to <oauthlib.oauth2.rfc6749.grant_types.authorization_code.AuthorizationCodeGrant object at 0x7f1d5be18790>.
[D 2025-07-22 15:57:40.350 JupyterHub provider:58] authenticate_client <oauthlib.Request SANITIZED>
[D 2025-07-22 15:57:40.380 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:533] Using provided redirect_uri /user/btf/oauth_callback
[D 2025-07-22 15:57:40.380 JupyterHub provider:117] confirm_redirect_uri: client_id=jupyterhub-user-btf, redirect_uri=/user/btf/oauth_callback
[D 2025-07-22 15:57:40.380 oauthlib.oauth2.rfc6749.grant_types.authorization_code authorization_code:301] Token request validation ok for <oauthlib.Request SANITIZED>.
[D 2025-07-22 15:57:40.380 JupyterHub provider:345] Saving bearer token {'access_token': 'REDACTED', 'expires_in': 1209600, 'token_type': 'Bearer', 'scope': 'read:users:groups!user access:servers!server=btf/ read:users:name!user', 'refresh_token': 'REDACTED'}
[D 2025-07-22 15:57:40.389 JupyterHub provider:205] Deleting oauth code fLg... for jupyterhub-user-btf
[I 2025-07-22 15:57:40.396 JupyterHub log:192] 200 POST /hub/api/oauth2/token ([email protected]) 53.87ms
[D 2025-07-22 15:57:40.399 JupyterHub base:366] Recording first activity for <APIToken('HQpU...', user='btf', client_id='jupyterhub-user-btf')>
[I 2025-07-22 15:57:40.403 JupyterHub log:192] 200 GET /hub/api/user ([email protected]) 5.42ms
[D 2025-07-22 15:58:08.749 JupyterHub dockerspawner:1034] Getting container 'example-datalab-btf'
[D 2025-07-22 15:58:08.757 JupyterHub dockerspawner:1017] Container 4e4314a status: {"Status": "running", "Running": true, "Paused": false, "Restarting": false, "OOMKilled": false, "Dead": false, "Pid": 1574376, "ExitCode": 0, "Error": "", "StartedAt": "2025-07-22T15:57:37.350591935Z", "FinishedAt": "0001-01-01T00:00:00Z", "Health": {"Status": "unhealthy", "FailingStreak": 9, "Log": [{"Start": "2025-07-22T11:57:54.828141405-04:00", "End": "2025-07-22T11:57:54.929299273-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:57:57.930585836-04:00", "End": "2025-07-22T11:57:57.974467745-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:00.975511789-04:00", "End": "2025-07-22T11:58:01.029303895-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:04.030299103-04:00", "End": "2025-07-22T11:58:04.125957029-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:07.12701605-04:00", "End": "2025-07-22T11:58:07.230189926-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}]}}
[D 2025-07-22 15:58:10.037 JupyterHub dockerspawner:1034] Getting container 'example-datalab-btf'
[D 2025-07-22 15:58:10.044 JupyterHub dockerspawner:1017] Container 4e4314a status: {"Status": "running", "Running": true, "Paused": false, "Restarting": false, "OOMKilled": false, "Dead": false, "Pid": 1574376, "ExitCode": 0, "Error": "", "StartedAt": "2025-07-22T15:57:37.350591935Z", "FinishedAt": "0001-01-01T00:00:00Z", "Health": {"Status": "unhealthy", "FailingStreak": 9, "Log": [{"Start": "2025-07-22T11:57:54.828141405-04:00", "End": "2025-07-22T11:57:54.929299273-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:57:57.930585836-04:00", "End": "2025-07-22T11:57:57.974467745-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:00.975511789-04:00", "End": "2025-07-22T11:58:01.029303895-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:04.030299103-04:00", "End": "2025-07-22T11:58:04.125957029-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:07.12701605-04:00", "End": "2025-07-22T11:58:07.230189926-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}]}}
[I 2025-07-22 15:58:10.069 JupyterHub log:192] 200 GET /hub/home ([email protected]) 54.16ms
[D 2025-07-22 15:58:10.196 JupyterHub log:192] 200 GET /hub/static/js/home.js?v=20250722155050 (@74.96.66.129) 2.28ms
[D 2025-07-22 15:58:10.244 JupyterHub log:192] 200 GET /hub/static/components/moment/moment.js?v=20250722155050 (@74.96.66.129) 3.02ms
[D 2025-07-22 15:58:10.249 JupyterHub log:192] 200 GET /hub/static/js/jhapi.js?v=20250722155050 (@74.96.66.129) 2.26ms
[D 2025-07-22 15:58:10.322 JupyterHub log:192] 200 GET /hub/static/js/utils.js?v=20250722155050 (@74.96.66.129) 0.70ms
[D 2025-07-22 15:58:12.086 JupyterHub scopes:1013] Checking access to /hub/api/users/btf/server via scope delete:servers!server=btf/
[D 2025-07-22 15:58:12.100 JupyterHub dockerspawner:1034] Getting container 'example-datalab-btf'
[D 2025-07-22 15:58:12.106 JupyterHub dockerspawner:1017] Container 4e4314a status: {"Status": "running", "Running": true, "Paused": false, "Restarting": false, "OOMKilled": false, "Dead": false, "Pid": 1574376, "ExitCode": 0, "Error": "", "StartedAt": "2025-07-22T15:57:37.350591935Z", "FinishedAt": "0001-01-01T00:00:00Z", "Health": {"Status": "unhealthy", "FailingStreak": 10, "Log": [{"Start": "2025-07-22T11:57:57.930585836-04:00", "End": "2025-07-22T11:57:57.974467745-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:00.975511789-04:00", "End": "2025-07-22T11:58:01.029303895-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:04.030299103-04:00", "End": "2025-07-22T11:58:04.125957029-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:07.12701605-04:00", "End": "2025-07-22T11:58:07.230189926-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:10.23172262-04:00", "End": "2025-07-22T11:58:10.326737919-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}]}}
[I 2025-07-22 15:58:12.107 JupyterHub proxy:356] Removing user btf from proxy (/user/btf/)
[D 2025-07-22 15:58:12.108 JupyterHub proxy:925] Proxy: Fetching DELETE http://127.0.0.1:8001/api/routes/user/btf
15:58:12.111 [ConfigProxy] info: Removing route /user/btf
15:58:12.113 [ConfigProxy] info: 204 DELETE /api/routes/user/btf
[D 2025-07-22 15:58:12.114 JupyterHub user:1097] Stopping btf
[D 2025-07-22 15:58:12.114 JupyterHub dockerspawner:1034] Getting container 'example-datalab-btf'
[D 2025-07-22 15:58:12.120 JupyterHub dockerspawner:1017] Container 4e4314a status: {"Status": "running", "Running": true, "Paused": false, "Restarting": false, "OOMKilled": false, "Dead": false, "Pid": 1574376, "ExitCode": 0, "Error": "", "StartedAt": "2025-07-22T15:57:37.350591935Z", "FinishedAt": "0001-01-01T00:00:00Z", "Health": {"Status": "unhealthy", "FailingStreak": 10, "Log": [{"Start": "2025-07-22T11:57:57.930585836-04:00", "End": "2025-07-22T11:57:57.974467745-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:00.975511789-04:00", "End": "2025-07-22T11:58:01.029303895-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:04.030299103-04:00", "End": "2025-07-22T11:58:04.125957029-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:07.12701605-04:00", "End": "2025-07-22T11:58:07.230189926-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}, {"Start": "2025-07-22T11:58:10.23172262-04:00", "End": "2025-07-22T11:58:10.326737919-04:00", "ExitCode": 1, "Output": "/bin/sh: 1: /etc/jupyter/docker_healthcheck.py: not found\n"}]}}
[I 2025-07-22 15:58:12.121 JupyterHub dockerspawner:1431] Stopping container example-datalab-btf (id: 4e4314a)
[I 2025-07-22 15:58:12.371 JupyterHub log:192] 302 GET /user/btf/lab/api/workspaces?1753199892304 -> /hub/user/btf/lab/api/workspaces?1753199892304 (@74.96.66.129) 1.90ms
[I 2025-07-22 15:58:12.376 JupyterHub log:192] 302 GET /user/btf/api/terminals?1753199892304 -> /hub/user/btf/api/terminals?1753199892304 (@74.96.66.129) 1.76ms
[W 2025-07-22 15:58:12.469 JupyterHub base:1637] Failing suspected API request to not-running server: /hub/user/btf/lab/api/workspaces
[W 2025-07-22 15:58:12.470 JupyterHub log:192] 424 GET /hub/user/btf/lab/api/workspaces?1753199892304 ([email protected]) 2.18ms
[W 2025-07-22 15:58:12.472 JupyterHub base:1637] Failing suspected API request to not-running server: /hub/user/btf/api/terminals
[W 2025-07-22 15:58:12.473 JupyterHub log:192] 424 GET /hub/user/btf/api/terminals?1753199892304 ([email protected]) 2.34ms
[I 2025-07-22 15:58:12.808 JupyterHub dockerspawner:1070] Removing container 4e4314a30c7cff35e4da5c3877efed7313334c274da88e66054c82487a8112e8
[D 2025-07-22 15:58:12.819 JupyterHub dockerspawner:1076] Already removing container: 4e4314a30c7cff35e4da5c3877efed7313334c274da88e66054c82487a8112e8
[D 2025-07-22 15:58:12.826 JupyterHub user:1119] Deleting oauth client jupyterhub-user-btf
[D 2025-07-22 15:58:12.832 JupyterHub user:1122] Finished stopping btf
[I 2025-07-22 15:58:12.838 JupyterHub base:1349] User btf server took 0.731 seconds to stop
[I 2025-07-22 15:58:12.838 JupyterHub log:192] 204 DELETE /hub/api/users/btf/server?_xsrf=[secret] ([email protected]) 770.92ms
[I 2025-07-22 15:58:13.607 JupyterHub log:192] 302 GET /user/btf/api/collaboration/room/JupyterLab:globalAwareness -> /hub/user/btf/api/collaboration/room/JupyterLab:globalAwareness (@74.96.66.129) 1.74ms
[I 2025-07-22 15:58:14.611 JupyterHub log:192] 302 GET /user/btf/api/collaboration/room/JupyterLab:globalAwareness -> /hub/user/btf/api/collaboration/room/JupyterLab:globalAwareness (@74.96.66.129) 1.75ms
[I 2025-07-22 15:58:15.329 JupyterHub login:46] User logged out: btf
[D 2025-07-22 15:58:15.349 JupyterHub _xsrf_utils:161] xsrf id mismatch b'1d0a96bfdea040a3beb2a11d070496fb:9648a06bb6524b5f8ad16d4974419dc1' != b'1d0a96bfdea040a3beb2a11d070496fb:4X9bEARTyEAREsSQTI9BuVcgp-p8xAqTKmmRULCBb2E='
[I 2025-07-22 15:58:15.350 JupyterHub _xsrf_utils:130] Setting new xsrf cookie for b'1d0a96bfdea040a3beb2a11d070496fb:4X9bEARTyEAREsSQTI9BuVcgp-p8xAqTKmmRULCBb2E=' {'path': '/hub/', 'max_age': 3600}
[I 2025-07-22 15:58:15.356 JupyterHub log:192] 200 GET /hub/logout (@74.96.66.129) 44.50ms
[I 2025-07-22 15:58:15.680 JupyterHub log:192] 302 GET /user/btf/api/collaboration/room/JupyterLab:globalAwareness -> /hub/user/btf/api/collaboration/room/JupyterLab:globalAwareness (@74.96.66.129) 1.70ms
[I 2025-07-22 15:58:16.780 JupyterHub log:192] 302 GET /user/btf/api/collaboration/room/JupyterLab:globalAwareness -> /hub/user/btf/api/collaboration/room/JupyterLab:globalAwareness (@74.96.66.129) 1.66ms
[I 2025-07-22 15:58:18.450 JupyterHub log:192] 302 GET /user/btf/api/collaboration/room/JupyterLab:globalAwareness -> /hub/user/btf/api/collaboration/room/JupyterLab:globalAwareness (@74.96.66.129) 1.68ms

I also generated a DEBUG log for Keycloak for this test, but it ran to over 18,000 lines and you didn’t ask for it, so I’m leaving it out for now. I can attach it here if that would help.

Thanks for the config and logs.

There are no 404s or any major errors in your JupyterHub logs. The only thing I noticed is this line:

[E 2025-07-22 15:57:36.918 JupyterHub oauth2:1245] The auth_state_groups_key oauth_user.groups does not exist in the auth_model. Available keys are: dict_keys(['access_token', 'refresh_token', 'id_token', 'scope', 'token_response', 'oauth_user'])

Your config currently sets c.LocalGenericOAuthenticator.auth_state_groups_key = "oauth_user.groups", and it seems oauth_user.groups does not exist in the user data. Perhaps, judging from the logs, it should be oauth_user.oauth_user.groups?
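
If it helps to confirm what Keycloak is actually returning, here is a minimal sketch (assumptions: it is merged into the jupyterhub_config.py you posted, and the hook name is made up) that prints the auth_state keys and the raw userinfo response, so you can see where, if anywhere, a groups claim lives:

import pprint

def debug_groups_claim(authenticator, handler, auth_model):
    # Print the auth_state keys plus the raw userinfo ('oauth_user') returned by
    # Keycloak, then pass the auth_model through unchanged.
    auth_state = auth_model.get('auth_state') or {}
    print("auth_state keys:", sorted(auth_state.keys()))
    pprint.pprint(auth_state.get('oauth_user'))
    return auth_model

# JupyterHub only supports a single post_auth_hook, so this logic would need to
# be merged into the existing set_nb_env hook rather than registered separately:
# c.Authenticator.post_auth_hook = debug_groups_claim

One thing that might also matter: your scope is currently ["openid", "email"], while the commented-out line above it also requests "profile" and "roles"; whether Keycloak includes any group/role information in the userinfo may depend on that.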

You might find the root cause of the issue on the Keycloak side. A good test might be to remove the Traefik proxy and TLS (if possible) from your deployment and see if that helps. The idea is to remove as many variables as you can to help identify the issue.
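
For that isolation test, a rough sketch like this (standard-library Python only; the URL is the authorization URL from your first post, minus the one-time state and PKCE parameters, so Keycloak may not render a login page, but the interesting signal is whether it answers 404 repeatedly) could show whether the endpoint 404s consistently outside of any browser:

import urllib.error
import urllib.request

# Authorization endpoint taken from the original post, without the per-request
# state/code_challenge values.
AUTH_URL = (
    "https://auth.example.com/realms/research/protocol/openid-connect/auth"
    "?response_type=code"
    "&client_id=example-datalab"
    "&redirect_uri=https%3A%2F%2Fblue-sea-697d.quartiers047.workers.dev%3A443%2Fhttps%2Fdatalab.example.com%2Fhub%2Foauth_callback"
    "&scope=openid+email"
)

for attempt in range(5):
    try:
        with urllib.request.urlopen(AUTH_URL) as resp:
            print(attempt, resp.status)
    except urllib.error.HTTPError as err:
        # A repeated 404 here would point at Keycloak/Traefik rather than
        # JupyterHub or browser caching.
        print(attempt, err.code)

If that 404s every time, the problem is in front of JupyterHub; if it never does, the browsers are doing something different with the redirect.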