emilhe / dash-extensions

The dash-extensions package is a collection of utility functions, syntax extensions, and Dash components that aim to improve the Dash development experience.

Home Page: https://www.dash-extensions.com/

AttributeError: 'Output' object has no attribute 'backend' when activating memoization

abubakar12 opened this issue · comments

When I activate memoization, the following error occurs: AttributeError: 'Output' object has no attribute 'backend'. My logs are attached below.
[screenshot of the error traceback]

@abubakar12 I attempted a fix as part of the 0.1.4 release. If you are still seeing an issue after upgrading, please post an MWE demonstrating it.
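For anyone hitting the same thing, a bare-bones sketch of what such an MWE could look like (based on the enrich API used later in this thread; the component ids and the iris sample DataFrame are placeholders, not from this issue):

import plotly.express as px
from dash_extensions.enrich import (DashProxy, Output, Input, ServersideOutput,
                                    ServersideOutputTransform, html, dcc)

app = DashProxy(transforms=[ServersideOutputTransform()])
app.layout = html.Div([
    html.Button("Load", id="btn"),  # placeholder ids
    dcc.Store(id="store"),
    html.Div(id="out"),
])

@app.callback(ServersideOutput("store", "data"), Input("btn", "n_clicks"),
              memoize=True, prevent_initial_call=True)
def load(n_clicks):
    # Return a non-JSON-serializable object; it is kept server side.
    return px.data.iris()

@app.callback(Output("out", "children"), Input("store", "data"), prevent_initial_call=True)
def show(df):
    return f"{len(df)} rows loaded"

if __name__ == "__main__":
    app.run_server()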

@emilhe, thank you for the fix. I upgraded, but it installs Flask-Caching 2.0 along with it. When I run the program, an error occurs in the _get_filename function.

def _get_filename(self, key: str) -> str:
    if isinstance(key, str):
        bkey = key.encode("utf-8")  # XXX unicode review
        bkey_hash = self._hash_method(bkey).hexdigest()
    return os.path.join(self._path, bkey_hash)

The error says that a local variable is referenced before assignment. I think the isinstance() check is not evaluating to True. With Flask-Caching 1.10.1, the corresponding line (bkey_hash = self._hash_method(bkey).hexdigest()) was outside the isinstance() condition, so I have reverted back to dash-extensions 0.1.3. After Flask-Caching 1.10.1, Flask-Caching uses cachelib for cache storage.

Following is the place where error occurs:
https://github.com/pallets-eco/cachelib/blob/0cc195a3eaf06f6031c1b93a6744f240ccfb3a26/src/cachelib/file.py#L198
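For illustration only, a simplified sketch of that failure mode (not the actual cachelib code): if the key is not a str, bkey_hash is never assigned and the return line blows up.

import hashlib
import os


def _get_filename_sketch(key, path="/tmp/cache"):
    # Simplified stand-in for the linked cachelib method, for illustration only.
    if isinstance(key, str):
        bkey = key.encode("utf-8")
        bkey_hash = hashlib.md5(bkey).hexdigest()
    # If key was not a str, bkey_hash was never assigned and the next line raises
    # UnboundLocalError ("local variable 'bkey_hash' referenced before assignment").
    return os.path.join(path, bkey_hash)


_get_filename_sketch("a-str-key")      # fine
_get_filename_sketch(b"a-bytes-key")   # raises UnboundLocalError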

Yes, that upgrade is intended (flask-caching has changed its implementation). Could you try clearing the file system cache before running the code (with the 0.1.4 release)? If you still see issues, please post an MWE here demonstrating the issue and I'll take a look.
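For reference, clearing the cache could look something like the sketch below. The "file_system_store" directory name is an assumption based on the default FileSystemStore location; adjust the path if a custom FileSystemStore(cache_dir=...) is configured.

import os
import shutil

# Assumed default cache directory of dash-extensions' FileSystemStore;
# change this if a custom FileSystemStore(cache_dir=...) is used.
CACHE_DIR = "file_system_store"

if os.path.isdir(CACHE_DIR):
    shutil.rmtree(CACHE_DIR)  # drop entries written by the previous flask-caching version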

Emil, thank you for the solution, it is working now. I reinstalled dash-extensions 0.1.4 and now it works fine. I cannot understand what the problem was. I repeated the same procedure three times yesterday and the unexpected error kept occurring in cachelib, but now it works fine.
Following is the code that worked for me; I hope it is fine.

1) data_refresh_code() loads the data from the database. The data consists of approximately 3.1 million rows.
2) table_code() displays rows based on the user's selection; after the first load, there is no need to reload the data thanks to memoization.

import numpy as np
import plotly.express as px
import pandas as pd
from dash_extensions.enrich import (DashProxy, Output, Input, State, ServersideOutput, html, dcc,
                                    ServersideOutputTransform, callback, FileSystemStore,
                                    callback_context, MultiplexerTransform)
import plotly.graph_objects as go
from plotly.subplots import make_subplots
import plotly.io as pio
pio.renderers.default='browser'
from datetime import date, timedelta
import datetime
import urllib.parse
import urllib
import dash_bootstrap_components as dbc
import sqlalchemy
import base64 
from Crypto.Cipher import AES
from Crypto.Util.Padding import pad,unpad
# import dask.dataframe as dd
# from pandarallel import pandarallel
from dash.exceptions import PreventUpdate
import dash
selected_chart_template='simple_white'
ai_green="#228779"
ai_gray="#bab0ac"
color_sch=px.colors.sequential.Greens
color_sch.reverse()  
import pyodbc


dictionary={}

def data_pull_sql(client_id):
    # Pull all SalesAggregates rows for the given client in chunks and track
    # load progress in the module-level `dictionary`.
    global dictionary
    server = 'something.database.windows.net'
    database = 'something'
    username = 'something'
    password = 'something'   
    driver= '{ODBC Driver 13 for SQL Server}'
    
    
    db=pd.DataFrame()
    with pyodbc.connect('DRIVER='+driver+';SERVER=tcp:'+server+';PORT=1433;DATABASE='+database+';UID='+username+';PWD='+ password) as conn:
        with conn.cursor() as cursor:
            cursor.execute("SELECT Total_Rows= SUM(st.row_count) FROM sys.dm_db_partition_stats st \
                WHERE \
        object_name(object_id) = 'SalesAggregates' AND (index_id < 2)")
        
            row = cursor.fetchone()
            count=row[0]
            chunksize=100000
            tot=int(count/chunksize)
            # tot=1
              
            cursor.execute(f"select Date,CustomerID,product_type,quantity,amount,price,product_id,variant_id,sku\
            ,variant_title,revenue,total_products,sum_qty,sum_dollar_value,avg_selling_price,count_orders,unique_customer\
                  ,revenue_id,total_products_id,sum_qty_id,sum_dollar_value_id,avg_selling_price_id,count_orders_id,unique_customer_id\
                  ,revenue_var,total_products_var,sum_qty_var,sum_dollar_value_var,avg_selling_price_var,count_orders_var,unique_customer_var\
                  ,unique_sku,sum_qtys,avg_product_per_customer from SalesAggregates where cid = {client_id} and quantity IS NOT NULL")
            count=0
            progress=0
            for i in (range(tot)):
                count=count+1
                
                row=cursor.fetchmany(chunksize)
                progress=int((count/(tot))*100)
                print("row : {} ---count: {}---progress : {}....total :{}".format(len(row),count,progress,tot))
                # df = pd.DataFrame(row, columns = ['Date', 'CustomerID'])
                df = pd.DataFrame.from_records(row, columns=[col[0] for col in cursor.description])
                db=pd.concat([db,df])
                
                dictionary[str(client_id)]=progress
            return db
        
    

###########################################################################################
# Gather all callbacks

option_selected = dbc.Container([
        dcc.Store(id="store-data", storage_type='session'),
        dbc.Row(
            [
                
                dbc.Col(
                    html.Div([
                    html.H6(id="data_refresh"),
                    dbc.Button("Refresh Data",id="refresh_button_init",n_clicks=None,color="primary"),
                    html.H6("table_length_dropdown"),
                    dcc.Dropdown(
                    id='table_length',
                    options=[2,3,4,5,6,7,8],
                    value=2,
                    optionHeight=60)       
                         ])
                    ),
            ]
        ),
        html.Hr(),
        html.Hr(),
        dbc.Spinner(dbc.Row(dbc.Col(id="table_init")))
        
        ],
    fluid=True
)


app = DashProxy(
    __name__,
    transforms=[ServersideOutputTransform()],
    external_stylesheets=[dbc.themes.SIMPLEX],
    suppress_callback_exceptions=True,
)

app.title = "Shopify Data Analysis"    
app.layout=dbc.Container(option_selected)  
    
@app.callback(
    ServersideOutput("store-data", "data", arg_check=False, session_check=False),
    Input("refresh_button_init", "n_clicks"),
    State("store-data", "data"),
    Input("table_length", "value"),
    prevent_initial_call=True, memoize=True
)
def data_refresh_code(refresh_button_init, df, rows):
    # Pull the full dataset from the database. With memoize=True, the query only
    # runs when the inputs change; otherwise the cached result is served.
    tab_file = data_pull_sql(100)
    return tab_file
        
            
   
@app.callback(
    Output("table_init", "children"),
    Input("store-data", "data"),
    State("table_length", "value"),
    prevent_initial_call=True
)
def table_code(tableau_file, rows):
    # Build a table from the cached DataFrame; only the first `rows` rows are shown.
    table = dbc.Table.from_dataframe(tableau_file[:rows], striped=True, bordered=True, hover=True)
    return table
            

app.config['suppress_callback_exceptions'] = True
if __name__ == '__main__':
    app.run_server(debug=True,port=3200)