Pages: 124
Genres: 12:TGB: Mechanical engineering; 12:UYQS: Speech recognition
"Sinopsis" puede pertenecer a otra edición de este libro.
In this book, we introduce the background and mainstream methods of probabilistic modeling and discriminative parameter optimization for speech recognition. The specific models treated in depth include the widely used exponential-family distributions and the hidden Markov model. A detailed study is presented on unifying the common objective functions for discriminative learning in speech recognition, namely maximum mutual information (MMI), minimum classification error, and minimum phone/word error. The unification is presented, with rigorous mathematical analysis, in a common rational-function form. This common form enables the use of the growth-transformation (or extended Baum–Welch) optimization framework in discriminative learning of model parameters. In addition to the necessary background and tutorial material on the subject, we also include technical details on the derivation of the parameter optimization formulas for exponential-family distributions, discrete hidden Markov models (HMMs), and continuous-density HMMs in discriminative learning. Selected experimental results obtained firsthand by the authors are presented to show that discriminative learning can lead to superior speech recognition performance over conventional parameter learning. Details on major algorithmic implementation issues of practical significance are provided to enable practitioners to translate the theory in the earlier part of the book directly into engineering practice.
"Sobre este título" puede pertenecer a otra edición de este libro.
EUR 3.34 shipping within the United States of America
Shipping destinations, costs, and delivery times
Seller: HPB-Red, Dallas, TX, United States of America
Paperback. Condition: Good. Connecting readers with great books since 1972! Used textbooks may not include companion materials such as access codes, etc. May have some wear or writing/highlighting. We ship orders daily and Customer Service is our top priority! Item reference number: S_358647312
Quantity available: 1
Seller: SecondSale, Montgomery, IL, United States of America
Condition: Very Good. Item in very good condition! Textbooks may not include supplemental items, i.e., CDs, access codes, etc. Item reference number: 00081017020
Quantity available: 1
Seller: BookOrders, Russell, IA, United States of America
Soft cover. Condition: Good. Ex-library copy with the usual features. Library label on front cover. The interior is clean and tight. Binding is good. Cover shows light wear. Ex-Library. Item reference number: 122061
Quantity available: 1