Kang-Dong-Hwi / pytorch0730


pytorch0730



Normalization functions



Replaced the function that normalized the data directly with np.mean and np.std
with sklearn.preprocessing's StandardScaler, which standardizes
the data to mean 0 and variance 1.
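A minimal numpy-only sketch of what StandardScaler computes with its default settings (a hypothetical toy matrix stands in for the spectrogram data): subtract each column's mean and divide by its standard deviation, so every feature ends up with mean 0 and variance 1.

```python
import numpy as np

# toy data standing in for the real (514, 382) spectrogram matrix
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=3.0, size=(100, 4))

# column-wise z-score: equivalent to StandardScaler().fit_transform(X)
Z = (X - X.mean(axis=0)) / X.std(axis=0)

print(Z.mean(axis=0))  # ~0 for every column
print(Z.std(axis=0))   # ~1 for every column
```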


0730

0729
import numpy as np
from sklearn.preprocessing import StandardScaler

def dB( magnitude ):
    # convert magnitude to decibels; eps keeps log10 away from zero
    decibel = 20*np.log10( np.abs(magnitude) + np.finfo(float).eps )
    return decibel
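As a quick sanity check of the dB conversion, a self-contained copy of the function with a few spot values:

```python
import numpy as np

def dB(magnitude):
    # magnitude -> decibels; eps guards against log10(0)
    return 20 * np.log10(np.abs(magnitude) + np.finfo(float).eps)

print(dB(10.0))  # ~ 20.0
print(dB(1.0))   # ~ 0.0
print(dB(0.0))   # large negative but finite, thanks to eps
```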


def scaler( L, R ):
    LR = np.concatenate( (L, R), axis=0 )  # (514, 382)

    """normalization"""
    #z = MinMaxScaler().fit_transform(LR)
    z = StandardScaler().fit_transform(LR)
    #z = RobustScaler().fit_transform(LR)
    #z = MaxAbsScaler().fit_transform(LR)

    z = z.reshape(2, 257, 382)  # split back into (L, R)
    return z[0], z[1]
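A rough numpy-only sketch of the scaler above, with StandardScaler replaced by an explicit column-wise z-score so it runs without scikit-learn (the function name `scaler_np` is hypothetical):

```python
import numpy as np

def scaler_np(L, R):
    # stack the two channels along the frequency axis: (2*F, T)
    LR = np.concatenate((L, R), axis=0)
    # column-wise z-score, what StandardScaler().fit_transform does by default
    z = (LR - LR.mean(axis=0)) / LR.std(axis=0)
    # split back into the two channels
    return z.reshape(2, L.shape[0], L.shape[1])

rng = np.random.default_rng(1)
L = rng.normal(size=(257, 382))
R = rng.normal(size=(257, 382))
zL, zR = scaler_np(L, R)
print(zL.shape, zR.shape)  # (257, 382) (257, 382)
```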

def dB( magnitude ):
    # float32 variant: uses the (larger) single-precision eps
    return 20*np.log10( np.abs(magnitude) + np.finfo(np.float32).eps )

def Mag_normalization( L, R ):

    Mag = np.asarray( [ L, R ] )  #(2, 257, 382)
    mu = np.mean( Mag )
    sigma = np.std( Mag )
    z = ( Mag - mu ) / sigma
    return z[0], z[1]


def Phase_normalization( phase ):
    mu = np.mean( phase )
    sigma = np.std( phase )
    
    z = ( phase - mu ) / sigma
    return z
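A quick check of the manual global z-score used in Mag_normalization and Phase_normalization above. Note it uses a single mu/sigma over the whole array, unlike StandardScaler's per-column statistics; the toy phase array is an assumption for illustration:

```python
import numpy as np

def Phase_normalization(phase):
    # global z-score: one mean/std over the entire array
    mu = np.mean(phase)
    sigma = np.std(phase)
    return (phase - mu) / sigma

rng = np.random.default_rng(2)
phase = rng.uniform(-np.pi, np.pi, size=(257, 382))
z = Phase_normalization(phase)
print(z.mean(), z.std())  # ~0.0, ~1.0 over the whole array
```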




Screenshots


epoch=100
batch_size=20
lr=0.00002



0729 (np.mean, np.std)
training accuracy: 82%
validation accuracy: 11%


0730 (sklearn.preprocessing StandardScaler)
training accuracy: 91.250%
validation accuracy: 44.5%

Languages

Language: Jupyter Notebook 100.0%