According to this, PySpark notebooks do not support parameterized linked services. If you attempt to connect to ADLS through a parameterized linked service, you may encounter the following error:
An error occurred while calling z:mssparkutils.fs.mount. : com.microsoft.spark.notebook.msutils.InvalidCredentialsException: fetch Token from linkedService failed with POST failed with 'Bad Request' (400) and message: {"result":"DependencyError","errorId":"BadRequest","errorMessage":"[Code=LinkedServiceParametersNotSupported, Target=AzureDataLakeStorage1, Message=Failed to load LinkedService, Exception: Linked Services using parameters are not supported yet, LinkedServiceName: AzureDataLakeStorage1]. TraceId : 76484a03-fd79-43cd-aa8d-b54db5a3a7f5 | client-request-id : 302d35df-97ae-4cfc-85b7-e12568582071. Error Component : LSR"}, no any user credential info available for authorization
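For reference, this error typically surfaces from a mount call like the minimal sketch below. The linked service name matches the one in the error message; the container and storage account names are placeholders:

```python
from notebookutils import mssparkutils  # built into Synapse Spark notebooks

# Mounting ADLS Gen2 via a parameterized linked service fails with the
# 400 "LinkedServiceParametersNotSupported" error shown above.
mssparkutils.fs.mount(
    "abfss://<container>@<storageaccount>.dfs.core.windows.net",
    "/mnt/adls",
    {"linkedService": "AzureDataLakeStorage1"},  # parameterized linked service
)
```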
The error message states that "Linked Services using parameters are not supported yet," so this could be raised as a feature request for Synapse notebooks. As a workaround, instead of parameterizing the linked service, create a separate storage account linked service for each environment, manually entering the storage account name and using managed identity authentication, as shown below:

Use the linked service that corresponds to the target environment, for example as in the sketch that follows.
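A minimal sketch of this approach, assuming two hypothetical non-parameterized linked services (AzureDataLakeStorage_Dev and AzureDataLakeStorage_Prod) and an environment flag supplied to the notebook:

```python
from notebookutils import mssparkutils  # built into Synapse Spark notebooks

# Hypothetical linked service names, one per environment, each pointing at the
# corresponding storage account and configured with managed identity authentication.
linked_services = {
    "dev": "AzureDataLakeStorage_Dev",
    "prod": "AzureDataLakeStorage_Prod",
}

env = "dev"  # e.g. passed in as a notebook parameter

# Mount using the environment-specific (non-parameterized) linked service.
mssparkutils.fs.mount(
    "abfss://<container>@<storageaccount>.dfs.core.windows.net",
    "/mnt/adls",
    {"linkedService": linked_services[env]},
)
```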