I'm looking for a way to replicate the functionality of SQL Server's JSON_VALUE function using a SQLAlchemy query. I'm using metadata.reflect to define my existing db tables in SQLAlchemy.

SQL:

SELECT Id,
JSON_VALUE(BankDataJSON,'$.AccountName')
FROM BankData

SQLAlchemy Model:

db = SQLAlchemy()
db.Model.metadata.reflect(db.engine)

class BankData(db.Model):
    __table__ = db.Model.metadata.tables['BankData'] 

Endpoint / Query:

@cust_accts_bp.route('/api/CustomerAccts')
def get_cust_accts():
    custId = request.args.get('custId')

    db = SQLAlchemy(app)
    BankData = models.bank_data.BankData
    BankAccounts = models.bank_accounts.BankAccounts

    qry = db.session.query(BankAccounts.Id, BankAccounts.AccountNumber, BankAccounts.BankName,
                           BankData.AppId, BankData.CustomerId, BankAccounts.Filename, BankData.BankDataJSON) \
                           .filter(
                               and_(BankData.Id == BankAccounts.BankDataId, BankData.CustomerId == custId)
                            )

    engine = app.config['SQLALCHEMY_DATABASE_URI']
    df = pd.read_sql(qry.statement, engine)

    df['BankDataJSON'] = df['BankDataJSON'].apply(json.loads) # convert string representation of JSON
    df['BankDataJSON'] = df['BankDataJSON'].map(lambda x: [x[i] for i in x if i == 'AccountName'][0])  # pull out the 'AccountName' value from each dict
    df = df.rename(columns={'BankDataJSON':'BusinessName'})

    response = json.loads(df.to_json(orient="records"))

    return(json.dumps(response))

Using this method, I have to manually deserialize the JSON string (BankDataJSON) into a Python dict and then pick out the value I want ('AccountName'). If I were to use SQL Server's JSON_VALUE function, this would all be done for me on the server.

JSON response:

[
  {
    "Id": 3003,
    "AccountNumber": "111111111",
    "BankName": "Wells Fargo",
    "AppId": 111111,
    "CustomerId": "555555",
    "Filename": "some filename.pdf",
    "BusinessName": "Some BusinessName"
  },
  {
    "Id": 3004,
    "AccountNumber": "22222222",
    "BankName": "Wells Fargo",
    "AppId": 111111,
    "CustomerId": "555555",
    "Filename": "Some filename",
    "BusinessName": "Some Businessname"
  }
]

How can I go about doing this? I also want to be able to replicate SQL Server's CROSS APPLY OPENJSON functionality for working with arrays of JSON objects in the future. Do I need to define the BankDataJSON column as a JSON type in my model? When I do this, I get an error regarding pyodbc's inability to deserialize JSON in the MSSQL dialect.

1 Answer

Maybe you can try calling the server's function in your query, something like this:

from sqlalchemy.sql import func

db = SQLAlchemy(app)
BankData = models.bank_data.BankData

qry = db.session.query(
    BankData.Id,
    func.JSON_VALUE(BankData.BankDataJSON, '$.AccountName')
)
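
If the aim is also to drop the manual json.loads / rename step on the pandas side, labeling the expression should give the result set the column name you want up front. A sketch reusing the models, filter and engine from the question (so BankAccounts, custId and engine are assumed to be set up as shown there):

from sqlalchemy import and_
from sqlalchemy.sql import func
import pandas as pd

# Let SQL Server extract the value and name the column, so no
# json.loads()/rename is needed after pd.read_sql.
qry = db.session.query(
    BankAccounts.Id,
    BankAccounts.AccountNumber,
    BankAccounts.BankName,
    BankData.AppId,
    BankData.CustomerId,
    BankAccounts.Filename,
    func.JSON_VALUE(BankData.BankDataJSON, '$.AccountName').label('BusinessName'),
).filter(
    and_(BankData.Id == BankAccounts.BankDataId, BankData.CustomerId == custId)
)

df = pd.read_sql(qry.statement, engine)  # 'BusinessName' arrives as a plain column

For the CROSS APPLY OPENJSON part mentioned in the question, one option is a textual query until you need something more composable. This is only a sketch: the '$.Accounts' path and the AccountName property are made-up placeholders for whatever shape the JSON array actually has:

from sqlalchemy import text

sql = text("""
    SELECT b.Id, acct.AccountName
    FROM BankData AS b
    CROSS APPLY OPENJSON(b.BankDataJSON, '$.Accounts')
         WITH (AccountName NVARCHAR(200) '$.AccountName') AS acct
    WHERE b.CustomerId = :cust_id
""")
rows = db.session.execute(sql, {"cust_id": custId}).fetchall()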

1 Comment

Thanks, this works! Though I'd still be interested in finding a way to do it without calling the JSON_VALUE function directly.
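
One possibility that avoids calling JSON_VALUE at all is a TypeDecorator: store the column as plain NVARCHAR text and let SQLAlchemy deserialize it into a dict as rows are fetched. The whole document still crosses the wire, but the parsing moves out of the endpoint code, and it sidesteps the pyodbc/MSSQL JSON error mentioned in the question. A sketch only, with the column override shown using SQLAlchemy 1.4-style reflection and names chosen for illustration:

import json

from sqlalchemy import Column, Table, types


class JSONText(types.TypeDecorator):
    """Persist JSON as NVARCHAR text; (de)serialize it in Python."""
    impl = types.UnicodeText
    cache_ok = True

    def process_bind_param(self, value, dialect):
        return json.dumps(value) if value is not None else None

    def process_result_value(self, value, dialect):
        return json.loads(value) if value is not None else None


# Hypothetical: override only this column's type while reflecting the rest
# of the table ("overriding reflected columns" pattern).
bank_data_table = Table(
    'BankData',
    db.Model.metadata,
    Column('BankDataJSON', JSONText),
    autoload_with=db.engine,
    extend_existing=True,
)

class BankData(db.Model):
    __table__ = bank_data_table

With this in place, BankData.BankDataJSON comes back as a Python dict, so the endpoint can read row.BankDataJSON['AccountName'] directly instead of calling json.loads itself.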
