
Self-dependent Locality Preserving Projections Based on Transformed-Space Neighborhood Graph


Qiao Lishan 1,2, Zhang Limei 1,2, Sun Zhonggui 2

(1. Department of Computer Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China; 2. School of Mathematical Sciences, Liaocheng University, Liaocheng 252000, China)

    INTRODUCTION

In practice, high-dimensional data such as face images and gene expression microarrays are often encountered. Dimensionality reduction (DR) is a principal way to mine and understand such high-dimensional data by mapping them into another (usually low-dimensional) space. Classical DR methods include PCA, ICA, LDA, and so on [1]. However, these methods cannot discover the nonlinear structure in data. To address this issue, researchers have developed many nonlinear manifold learning algorithms in the past decade, such as LLE [2], ISOMAP [3], and Laplacian eigenmaps [4]. They give more flexibility in data modeling, but generally suffer from high computational cost and the so-called "out-of-sample" problem. Ref. [5] showed that although such nonlinear techniques perform well on selected artificial data sets, they generally do not yet outperform the traditional PCA on real-world tasks.

In recent years, there has been an increasing interest in linearized locality-oriented DR methods, e.g., locality preserving projections (LPP) [6], neighborhood preserving embedding (NPE) [7], unsupervised discriminant projection (UDP) [8], and sparsity preserving projections (SPP) [9]. On one hand, these algorithms are linear in nature, thus avoiding the "out-of-sample" problem involved in nonlinear manifold learning. On the other hand, they model the local neighborhood structure in data, and generally achieve better performance than typical global linear methods such as PCA. According to Ref. [10], almost all existing locality-oriented methods essentially share a similar objective function and differ mainly in their well-constructed neighborhood graphs. Therefore, without loss of generality, LPP is chosen here, by virtue of its popularity and typicality, to develop the algorithm and demonstrate the idea, though the idea in this paper can be easily and naturally extended to other locality-oriented DR methods.

As a representative locality-oriented DR algorithm, LPP has been widely used in many practical problems, such as face recognition [11]. Despite its unsupervised nature, LPP has potential discriminative power by preserving the local geometry of data. However, the neighborhood graph underlying LPP is defined (Eq. (2)) based on the original data points, and must stay fixed once constructed. As a result, the performance of LPP generally relies heavily on how well the nearest neighbor criterion works in the original space [12]. In addition, the so-defined neighborhood graph suffers seriously from the difficulty of parameter selection, i.e., the neighborhood size k and the Gaussian kernel width ε.

To address these problems, this paper proposes a novel DR algorithm, called the self-dependent LPP (sdLPP), which is based on the observation that the nearest neighbor criterion usually works well in the LPP transformed space. Firstly, LPP is performed based on the typical neighborhood graph. Then, a new neighborhood graph is constructed in the LPP transformed space and LPP is repeated. Furthermore, a new criterion called the improved Laplacian score is developed as an empirical reference for discriminative power and as the iteration stop condition. Finally, experiments on several publicly available UCI and face data sets verify the feasibility and the effectiveness of the proposed algorithm with promising results.

    1 BRIEF REVIEW OF LOCALITY PRESERVING PROJECTIONS

Given a set of data points X = [x_1, x_2, …, x_n], x_i ∈ R^d, LPP aims at seeking a set of projection directions that preserve the local geometric structure of the data. The objective function of LPP is defined as follows

min_W Σ_{i,j} ||W^T x_i − W^T x_j||^2 S_ij    (1)

where W ∈ R^{d×d′} (d′ < d) is the projection matrix and S = (S_ij)_{n×n} is the adjacency weight matrix defined as

S_ij = exp(−||x_i − x_j||^2 / ε), if x_i ∈ N_k(x_j) or x_j ∈ N_k(x_i); S_ij = 0, otherwise    (2)

where N_k(x_j) denotes the k nearest neighbors of x_j and ε is the Gaussian (heat) kernel width.
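As a concrete illustration of Eq. (2), the NumPy sketch below builds such a heat-kernel k-NN adjacency matrix. It is a minimal sketch, not the authors' code: the function name build_adjacency is hypothetical, and the default of setting ε to the mean sample norm follows the heuristic cited later in the experiments [17].

```python
import numpy as np

def build_adjacency(X, k, eps=None):
    """Heat-kernel k-NN adjacency matrix S of Eq. (2).

    X   : (n, d) data matrix, one sample per row.
    k   : neighborhood size.
    eps : Gaussian (heat) kernel width; defaults to the mean sample norm [17].
    """
    n = X.shape[0]
    if eps is None:
        eps = np.linalg.norm(X, axis=1).mean()
    # pairwise squared Euclidean distances
    sq = np.sum(X ** 2, axis=1)
    D2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    np.fill_diagonal(D2, np.inf)                 # exclude self-neighbors
    # mark x_i in N_k(x_j) or x_j in N_k(x_i)
    knn = np.argsort(D2, axis=1)[:, :k]
    mask = np.zeros((n, n), dtype=bool)
    mask[np.repeat(np.arange(n), k), knn.ravel()] = True
    mask |= mask.T
    return np.where(mask, np.exp(-D2 / eps), 0.0)
```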

To avoid a degenerate solution, the constraint W^T X D X^T W = I is imposed, where D = diag(d_11, d_22, …, d_nn) is a diagonal matrix whose entry d_ii is the row sum of S (or the column sum, since S is symmetric), and L = D − S is the graph Laplacian [6]. Then, with a simple reformulation, LPP can be rewritten in a compact trace ratio form [6]

min_W tr(W^T X L X^T W) / tr(W^T X D X^T W)    (3)

Eq. (3) is a typical non-convex trace ratio problem [13], and it is often solved approximately by the generalized eigenvalue problem X L X^T w = λ X D X^T w, where the eigenvectors w associated with the smallest eigenvalues are used to construct W.
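The generalized eigenvalue problem above can be handled by a standard symmetric-definite solver. The sketch below, again only an illustration, keeps the d′ eigenvectors with the smallest eigenvalues to form W; note that the code stores samples as rows, so X.T @ L @ X corresponds to X L X^T in the paper's column-sample notation.

```python
import numpy as np
from scipy.linalg import eigh

def lpp_projection(X, S, d_prime):
    """Solve X L X^T w = lambda X D X^T w and return W = [w_1, ..., w_d'].

    X : (n, d) data matrix, one sample per row; S : (n, n) adjacency matrix.
    """
    D = np.diag(S.sum(axis=1))       # degree matrix, d_ii = row sum of S
    L = D - S                        # graph Laplacian
    A = X.T @ L @ X                  # X L X^T in the paper's notation
    B = X.T @ D @ X                  # X D X^T
    B = B + 1e-8 * np.eye(B.shape[0])  # tiny ridge keeps B positive definite
    evals, evecs = eigh(A, B)        # generalized eigenvalues in ascending order
    return evecs[:, :d_prime]        # eigenvectors of the smallest eigenvalues
```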

    2 SELF-DEPENDENT LOCALITY PRESERVING PROJECTIONS

    2.1 Motivation

According to Refs. [6,14], the locality preserving power of LPP is potentially related to its discriminating power. As a result, the 1-NN classifier in the LPP transformed space generally achieves better performance than the baseline scenario (i.e., performing the 1-NN classifier in the original space without any transformation). In fact, this is also the observation from the experiments in many research works [6,9,14] related to LPP.

Since the neighborhood graph and the 1-NN classifier are both closely related to the nearest neighbor criterion, a natural idea is that a neighborhood graph constructed in the LPP transformed space includes more discriminative information than one constructed in the original space. Therefore, this paper updates the neighborhood graph in the previous LPP transformed space and then repeats LPP. The corresponding algorithm is called the self-dependent LPP (sdLPP), since it only depends on LPP itself instead of resorting to extra tricks or tools. This paper further develops an improved Laplacian score (LS) to serve as the iteration stop condition, and proposes a specific algorithm.

    2.2 Improved Laplacian score

The original Laplacian score (LS) was introduced to measure the importance of features (or variables) [15]. Different from the classical Fisher score [16], which can only work under the supervised scenario, LS can work under supervised, unsupervised and even semi-supervised scenarios. Although LS aims at feature selection, it can naturally be extended to the feature extraction or dimensionality reduction field. However, typical LS is based on an artificially predefined neighborhood graph, and it becomes a constant once the specific projection directions are given. So the reliability of LS relies heavily on the single pre-fixed neighborhood graph, and LS cannot directly be used as the iteration termination condition in the proposed algorithm.

An improved Laplacian score (iLS) is defined as follows

The iLS shares a similar mathematical expression with the objective function of Eq. (3) and the typical LS [15], but it has remarkable differences from them. In typical LS (or, more generally, in most existing locality-oriented DR algorithms), the adjacency weight matrix S of the graph is fixed in advance, while in the proposed iLS, S is variable; that is, the iLS is a joint function with respect to W and S.
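Because the exact iLS formula is not reproduced in this text, the following sketch only illustrates the idea of a trace-ratio score treated as a joint function of W and S. The particular form chosen here (similarity-weighted trace over degree-weighted trace, so that larger values indicate better locality preservation, consistent with the stopping rule in Section 2.3) is an assumption for illustration and may differ from the definition in the paper.

```python
import numpy as np

def improved_laplacian_score(X, W, S):
    """Illustrative trace-ratio score L(W, S); the exact definition is assumed.

    Larger values are taken here to mean better locality preservation, which
    matches the 'continue while the score increases' rule of the sdLPP loop.
    """
    D = np.diag(S.sum(axis=1))
    Y = X @ W                                # projected data, one sample per row
    return np.trace(Y.T @ S @ Y) / np.trace(Y.T @ D @ Y)
```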

    2.3 sdLPP Algorithm

Based on the above discussion, the sdLPP algorithm is given as follows (a code sketch of the complete loop is provided after the steps):

(1) As in LPP and other locality-oriented DR algorithms, the data points are first projected into a PCA transformed subspace to remove the null space of XX^T and avoid the possible singularity problem. Without loss of generality, X is still used to denote the data points in the PCA transformed subspace.

(2) Construct the initial neighborhood graph G(X, S) with appropriate neighborhood size k and Gaussian kernel width ε in Eq. (2); then calculate the projection matrix W by solving the generalized eigenvalue problem X L X^T w = λ X D X^T w, and finally calculate the iLS, L_old = L(W, S), with the current W and S.

(3) Update S in the previous LPP transformed space with the same parameters k and ε; then recompute the new projection matrix W and the iLS, L_new = L(W, S), with the new W and S.

(4) If L_new > L_old, let L_old = L_new and return to Step (3); otherwise, stop and return W.
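Putting the pieces together, the sketch below mirrors Steps (1)-(4). It reuses the illustrative helpers build_adjacency, lpp_projection and improved_laplacian_score introduced above (all hypothetical names, not from the paper) and scikit-learn's PCA for the preprocessing step.

```python
import numpy as np
from sklearn.decomposition import PCA

def sdlpp(X, k, d_prime, eps=None, max_iter=50):
    """Self-dependent LPP: alternate LPP and graph updating (Steps 1-4)."""
    # Step (1): PCA preprocessing to remove the null space of X X^T
    X = PCA(n_components=min(X.shape) - 1).fit_transform(X)
    if eps is None:
        eps = np.linalg.norm(X, axis=1).mean()   # fixed across iterations

    # Step (2): initial graph and first LPP in the original (PCA) space
    S = build_adjacency(X, k, eps)
    W = lpp_projection(X, S, d_prime)
    score_old = improved_laplacian_score(X, W, S)

    for _ in range(max_iter):
        # Step (3): rebuild the graph in the current transformed space, repeat LPP
        S = build_adjacency(X @ W, k, eps)
        W_new = lpp_projection(X, S, d_prime)
        score_new = improved_laplacian_score(X, W_new, S)
        # Step (4): keep iterating only while the (assumed) iLS increases
        if score_new <= score_old:
            break
        W, score_old = W_new, score_new
    return W
```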

It is easy to see that the sdLPP algorithm is simple. It needs to be pointed out that the iLS is not an accurate indicator of discriminative power, since the proposed algorithm works completely under the unsupervised scenario. Therefore, in the experiments, this paper first performs ten iterations and then judges, according to the iLS, whether another ten iterations should be performed. This trick is used to avoid unexpected stops due to possible fluctuations of the iLS. Of course, a well-designed heuristic strategy is worthy of deeper study in the future.

    3 EXPERIMENTS

In this section, the algorithm is compared with LPP through an illustrative example, clustering experiments, and face recognition experiments.

    3.1 Illustrative example

Firstly, this paper visually shows how and why the algorithm works on the widely used Wine data set from the UCI machine learning repository. Wine has 13 features, 3 classes and 178 instances. A main characteristic of this data set is that its last feature has a large range and variance relative to the other ones. As a result, such a dominant feature plays a key role in the data distribution. This generally challenges typical locality-oriented DR methods including LPP, since the neighborhood graph is fixed in advance and heavily depends on the last feature due to its large range and variance in the original space.

(1) Data visualization

In particular, Fig. 1(a) shows the 2-D projections of Wine by typical LPP, whose adjacency graph is constructed with neighborhood size k = min{n_1, n_2, n_3} − 1, where n_i is the sample number of the i-th class, and with heat kernel width ε set as the mean norm of the data [17] (the influence of these parameters on the ultimate performance is discussed later). Under the LPP transformation, the three classes overlap with each other. Then, the proposed sdLPP is performed. Fig. 1(b) shows the improved Laplacian score at each iteration. Generally speaking, it presents an increasing tendency with the iterations. Figs. 1(c-f) give the 2-D projections by sdLPP after 1, 5, 10, and 20 iterations, respectively, which show that the three classes in the subspace are gradually separated from each other. This illustrates that the graph updating strategy can potentially benefit the subsequent learning task.

    Fig.1 Data visualization results and iLS of sdLPP at each iteration on Wine data set

(2) Sensitivity to parameters k and ε

Model selection for unsupervised learning is one of the classical challenges in machine learning and pattern recognition. Fig. 2 shows the performances of LPP and sdLPP on the Wine data set with different parameter values. In the experiment, 25 samples per class are randomly selected for training and the remaining ones for testing. The classification accuracies over 20 training/test splits are averaged, and the best results at certain dimensions are plotted. In particular, Fig. 2(a) shows the classification accuracies using graphs with different neighborhood sizes k and a fixed heat kernel width ε_0, where k is traversed from 1 to 50 and ε_0 is set as the mean norm of the training samples [17]. Fig. 2(b) shows the accuracies using graphs with different kernel widths ε and fixed k, where ε is chosen from {2^{-10} ε_0, …, 2^0 ε_0, …, 2^{10} ε_0} and k is set to 24, i.e., 25 − 1.

    Fig.2 Classification accuracies on Wine using graphs

As shown in Fig. 2, with a pre-fixed graph, typical LPP generally suffers a serious performance drop when an improper parameter value is given. In contrast, sdLPP is not so sensitive to the setting of the initial parameters, because the graph becomes better during the subsequent updating process.
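For concreteness, the snippet below sketches the parameter sweep described above on the UCI Wine data, reusing the illustrative LPP helpers from Section 1. It uses a single stratified split of roughly 25 training samples per class and a fixed 2-D projection instead of the 20 averaged random splits and dimension search of the paper, so its numbers are only indicative.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# reuses the illustrative build_adjacency / lpp_projection helpers from Section 1
def lpp_accuracy(X_tr, y_tr, X_te, y_te, k, eps, d_prime=2):
    S = build_adjacency(X_tr, k, eps)
    W = lpp_projection(X_tr, S, d_prime)
    clf = KNeighborsClassifier(n_neighbors=1).fit(X_tr @ W, y_tr)
    return clf.score(X_te @ W, y_te)

X, y = load_wine(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=75, stratify=y, random_state=0)

eps_0 = np.linalg.norm(X_tr, axis=1).mean()              # mean norm of training samples [17]
acc_vs_k = {k: lpp_accuracy(X_tr, y_tr, X_te, y_te, k, eps_0) for k in range(1, 51)}
eps_grid = [2.0 ** p * eps_0 for p in range(-10, 11)]    # {2^-10 eps_0, ..., 2^10 eps_0}
acc_vs_eps = {e: lpp_accuracy(X_tr, y_tr, X_te, y_te, 24, e) for e in eps_grid}
```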

    3.2 Clustering

In what follows, this paper performs clustering experiments using ten widely used UCI data sets, including Iris, Wine, Wdbc, and so on. The statistical parameters of all the data sets are summarized in Table 1.

Table 1 Statistical parameters of the data sets used for clustering

In the experiments, the normalized mutual information (NMI) [17] is adopted to evaluate the clustering performance. The NMI measurement is defined as

where A and B are the random variables of the cluster memberships from the ground truth and from the output of the clustering algorithm, respectively; I(A, B) is the mutual information between A and B; H(A) and H(B) are the entropies of A and B, respectively.

On all the data sets, k-means clustering is performed in the original space (baseline), the LPP transformed space and the sdLPP transformed space, respectively. For LPP and sdLPP, the neighborhood size k is set to min{n_1, n_2, …, n_c}, where n_i is the training sample number of the i-th class; the heat kernel parameter ε is set as the mean norm of the training samples according to the scheme used in Ref. [17]. This paper repeats the experiments 50 times and reports the mean performances and the corresponding subspace dimensions in Table 2.

Table 2 Performance (NMI) comparisons for the clustering task

The results show that: (1) the performance of k-means can generally be improved after DR, i.e., on data sets ①—⑤⑧⑨, which illustrates that the locality preserving power is potentially related to discriminative power; (2) on very few data sets (sets ⑥⑦), the DR algorithms do not help the clustering, which is due to serious overlap of the data or inappropriately assigned parameter values for the neighborhood graph; (3) the sdLPP algorithm remarkably outperforms typical LPP on most of the data sets. This illustrates that the neighborhood graph updating can potentially benefit the subsequent discriminant task.
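A minimal sketch of the clustering evaluation protocol above, assuming a projection matrix W has already been obtained from LPP or from the sdLPP sketch in Section 2; note that scikit-learn's default NMI normalization may differ slightly from the variant used in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import normalized_mutual_info_score

def clustering_nmi(X, y_true, W=None, seed=0):
    """k-means in the original space (W is None) or in the transformed space X @ W,
    scored with normalized mutual information against the ground-truth labels."""
    Z = X if W is None else X @ W
    km = KMeans(n_clusters=len(np.unique(y_true)), n_init=10, random_state=seed)
    return normalized_mutual_info_score(y_true, km.fit_predict(Z))

# e.g., baseline vs. LPP vs. sdLPP embeddings of the same data set:
# nmi_base  = clustering_nmi(X, y)
# nmi_lpp   = clustering_nmi(X, y, W_lpp)
# nmi_sdlpp = clustering_nmi(X, y, W_sdlpp)
```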

    3.3 Face recognition

Typical LPP has been successfully used in face recognition, where it appears as the popular Laplacianfaces [11]. This paper experimentally compares the proposed algorithm with Laplacianfaces on two publicly available face databases: AR and extended Yale B.

(1) Database description

In the experiments, a subset of the AR face database provided and preprocessed by Martinez [18] is used. This subset contains 1 400 face images corresponding to 100 persons (50 men and 50 women), where each person has 14 different images with illumination changes and expressions. The original resolution of these face images is 165×120. For computational convenience, this paper resizes them to 33×24. Fig. 3(a) gives 14 face images of one person taken in two sessions with different illuminations and expressions. The extended Yale B database [19] contains 2 414 front-view face images of 38 individuals. For each individual, about 64 pictures are taken under various laboratory-controlled lighting conditions. In the experiments, this paper uses cropped images with a resolution of 32×32. Fig. 3(b) gives some face images of one person from this database.

(2) Experimental setting

For the AR database, the face images taken in the first session are used for training, and the images taken in the second session are used for testing; for extended Yale B, this paper randomly selects l (l = 10, 20 and 30, respectively) samples of each individual for training and the remaining ones for testing. The ultimate performance is the mean over 10 training/test splits.

Firstly, the data are projected into a PCA transformed subspace, which is computed from the training samples and keeps 98% of the energy; then, LPP and sdLPP are performed in this subspace, and the 1-NN classifier, chosen for its simplicity, effectiveness and efficiency, is used to evaluate the recognition rates on the test data. As a baseline, the recognition rates of the 1-NN classifier on the raw data are also given.
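A minimal sketch of this recognition pipeline (PCA keeping 98% of the energy, an LPP-type projection, then 1-NN), again reusing the illustrative helpers from the earlier sections; data loading and the sdLPP variant are omitted, and the function name is hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def face_recognition_rate(X_train, y_train, X_test, y_test, k, d_prime, eps=None):
    """PCA (98% energy) -> LPP-type projection -> 1-NN recognition rate."""
    pca = PCA(n_components=0.98).fit(X_train)      # keep 98% of the energy
    Xtr, Xte = pca.transform(X_train), pca.transform(X_test)

    S = build_adjacency(Xtr, k, eps)                # illustrative helpers from Secs. 1-2
    W = lpp_projection(Xtr, S, d_prime)

    clf = KNeighborsClassifier(n_neighbors=1).fit(Xtr @ W, y_train)
    return clf.score(Xte @ W, y_test)               # fraction of correctly recognized faces

# baseline: 1-NN directly on the raw pixels
# KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train).score(X_test, y_test)
```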

    Fig.3 Face images of one person

(3) Parameter selection

For LPP and sdLPP, the model parameters include the neighborhood size k and the kernel width ε. In our experiments, we empirically set ε as the mean norm of the training samples [17], and determine k by searching over a large range of candidate values and reporting the best results.

(4) Experimental results and discussion

Fig. 4 shows the recognition rate curves of the different methods on the AR and extended Yale B (l = 10) databases. For AR, the best recognition rates of Baseline, LPP and sdLPP are 74.57%, 79.00% and 81.71%, respectively, where the best neighborhood size is k = 1. For extended Yale B, the best performances and the corresponding dimensions are reported in Table 3. For l = 10, 20 and 30, the best neighborhood size k is 1, 1 and 2, respectively.

From the experimental results, the following observations are obtained:

Table 3 Performance comparisons on the extended Yale B database

    Fig.4 Recognition rate curves based on different methods

(1) Typical LPP and the proposed sdLPP achieve better performance than the baseline method. This further illustrates that locality preserving DR algorithms can encode potential discriminating information, even under the unsupervised scenario.

(2) The proposed sdLPP consistently outperforms LPP on both face databases. This illustrates that sdLPP actually benefits from the graph updating process.

    4 CONCLUSIONS

This paper develops a novel LPP algorithm with an adjustable neighborhood graph. As a summary, several favorable properties of the algorithm are enumerated below.

(1) sdLPP is self-dependent. It does not require extra tools or incidental costs, but works directly on the off-the-shelf LPP. So sdLPP is very simple and analytically tractable, and it naturally inherits some good characteristics from the original LPP. For example, it avoids the "out-of-sample" problem involved in manifold learning.

(2) sdLPP is not as sensitive to the neighborhood size k and the Gaussian kernel width ε as typical LPP, because the neighborhood graph becomes better and better during the subsequent updating process.

(3) sdLPP can potentially use the discriminative information lying in both the original space and the transformed space, since the graph in sdLPP is adjustable instead of being fixed beforehand as in LPP. This can potentially help the subsequent learning task.

(4) The idea behind sdLPP is quite general. It can easily and naturally be applied to many other graph-based DR methods through slight modifications.

It is worthwhile to point out that the proposed algorithm, including the improved Laplacian score, is completely unsupervised. Although unsupervised DR methods do not require the efforts of human annotators, reliable supervised information generally helps to achieve better discriminative power. In the future, we expect to further improve the proposed algorithm by absorbing available label information and to extend it to the semi-supervised scenario.

[1] Hastie T. The elements of statistical learning: data mining, inference, and prediction [M]. 2nd ed. New York: Springer, 2009.

[2] Roweis S T, Saul L K. Nonlinear dimensionality reduction by locally linear embedding [J]. Science, 2000, 290(5500): 2323-2326.

[3] Tenenbaum J B. Mapping a manifold of perceptual observations [C]//Neural Information Processing Systems (NIPS). Cambridge, MA, USA: MIT Press, 1998: 682-688.

[4] Belkin M, Niyogi P. Laplacian eigenmaps for dimensionality reduction and data representation [J]. Neural Computation, 2003, 15(6): 1373-1396.

[5] Van der Maaten L J P, Postma E O, Van den Herik H J. Dimensionality reduction: a comparative review [EB/OL]. http://ict.ewi.tudelft.nl/~lvandermaaten/Publications-files/JMLR-Paper.pdf, 2009-10.

[6] He X F, Niyogi P. Locality preserving projections [C]//Neural Information Processing Systems (NIPS). Cambridge, MA, USA: MIT Press, 2003: 153-160.

[7] He X F, Cai D, Yan S C, et al. Neighborhood preserving embedding [C]//IEEE International Conference on Computer Vision (ICCV). Washington, DC, USA: IEEE Computer Society, 2005: 1208-1213.

[8] Yang J, Zhang D, Yang J Y, et al. Globally maximizing, locally minimizing: unsupervised discriminant projection with applications to face and palm biometrics [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(4): 650-664.

[9] Qiao L, Chen S, Tan X. Sparsity preserving projections with applications to face recognition [J]. Pattern Recognition, 2010, 43(1): 331-341.

[10] Yan S C, Xu D, Zhang B Y, et al. Graph embedding and extensions: a general framework for dimensionality reduction [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(1): 40-51.

[11] He X F, Yan S C, Hu Y X, et al. Face recognition using Laplacianfaces [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(3): 328-340.

[12] Chen H T, Chang H W, Liu T L. Local discriminant embedding and its variants [C]//IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Washington, DC, USA: IEEE Computer Society, 2005: 846-853.

[13] Wang H, Yan S C, Xu D, et al. Trace ratio vs. ratio trace for dimensionality reduction [C]//IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Washington, DC, USA: IEEE Computer Society, 2007: 1-8.

[14] Cai D, He X F, Han J W, et al. Orthogonal Laplacianfaces for face recognition [J]. IEEE Transactions on Image Processing, 2006, 15(11): 3608-3614.

[15] He X, Cai D, Niyogi P. Laplacian score for feature selection [C]//Neural Information Processing Systems (NIPS). Cambridge, MA, USA: MIT Press, 2005: 507-514.

[16] Bishop C M. Pattern recognition and machine learning [M]. New York: Springer, 2006.

[17] Wu M, Scholkopf B. A local learning approach for clustering [C]//Neural Information Processing Systems (NIPS). Cambridge, MA, USA: MIT Press, 2006: 1529-1536.

[18] Martinez A M, Kak A C. PCA versus LDA [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2001, 23(2): 228-233.

[19] Lee K C, Ho J, Kriegman D J. Acquiring linear subspaces for face recognition under variable lighting [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(5): 684-698.
