A New Translation Template Learning Based on Hidden Markov Modeling


NGUYEN MINH LE, AKIRA SHIMAZU, and SUSUMU HORIGUCHI
Graduate School of Information Science, JAIST
Ishikawa 923-1292, JAPAN

Abstract: This paper addresses a novel translation method based on a Hidden Markov Model, using template rules that are learned from a bilingual corpus. The method enhances translation accuracy and ensures low complexity in comparison with a previous template-learning translation method, and it presents a new perspective for applying statistical machine learning to the example-based translation domain.

Keywords: Machine translation, EBMT, Translation template learning, HMM.

1 Introduction

Example-based machine translation (EBMT), originally proposed by Nagao [1], is one of the main approaches to corpus-based machine translation. Following Nagao's original proposal, several methods were presented by Nagao and Sato [2], Sumita and Iida [3], Nirenburg et al. [4], and Brown [5]. The excellent review paper on EBMT [6] describes the main idea behind EBMT as follows. A given input sentence in the source language is compared with the example translations in a given bilingual parallel text to find the closest matching examples, so that these examples can be used in the translation of that input sentence. After the closest matches for the source-language sentence are found, parts of the corresponding target-language sentences are constructed using structural equivalences and deviances in those matches.

Cicekli and Güvenir [7][8] proposed learning translation templates, applying Nagao's approach to translation from English to Turkish. This is one of the more successful methods; it uses the similarities and differences between source and target sentences in a given bilingual corpus to build template rules for translation. Its advantage is that it needs no complex syntactic or semantic parsing and overcomes the imperfections of rule-based machine translation. One of its disadvantages is that many templates can match a particular input sentence. To overcome this problem, Öz [9] presented a method which assigns confidence factors to template rules so that the translation results can be sorted accordingly. However, in order to obtain the output results, this method needs to evaluate all matching rules for each input sentence, and many of these are redundant rules. An exponential-calculation problem arises when an input sentence is long and the number of template rules is large.
Here, we present a novel method based on an HMM that uses constraints to establish the set of matching rules for each input sentence. Thus, we can avoid the exponential-calculation problem and find the best translation results for an input sentence by running a dynamic-programming algorithm on the HMM. The remainder of this paper is organized as follows. The template learning algorithm is given in Section 2. Section 3 describes an HMM model for translation using template rules. Section 4 shows experiments on an English-Vietnamese translation system, and Section 5 presents our conclusions and discusses some outstanding problems to be solved in future work.

2 Translation Template Learning

The Translation Template Learning algorithm (TTL) infers translation templates using similarities and differences between two translation examples, E_a and E_b, taken from a bilingual parallel corpus. Formally, a translation example E_a is composed of a pair of sentences (E_a1, E_a2) that are translations of each other in English and Vietnamese, respectively. A similarity between two sentences of a language is a non-empty sequence of common items (root words or morphemes) occurring in both sentences. A difference between two sentences of a language is a pair of sequences (D1, D2) where D1 is a sub-sequence of the first sentence, D2 is a sub-sequence of the second sentence, and D1 and D2 do not contain any common items.

Given two translation examples (E_a, E_b), we try to find similarities between the constituents of E_a and E_b. A sentence is considered as a sequence of lexical items. If no similarities can be found, then no template is learned from these examples. If there are similar constituents, then a match sequence M_{a,b} of the following form is generated:

    S_1, D_1, S_2, D_2, ..., D_n, S_{n+1}  <=>  S'_1, D'_1, S'_2, D'_2, ..., D'_n, S'_{n+1},  for n >= 1

Here, S_i represents a similarity (a sequence of common items) between E_a and E_b. Similarly, D_i : (D_{i,a}, D_{i,b}) represents a difference between E_a and E_b, where D_{i,a} and D_{i,b} are the non-empty differing items between the two similar constituents S_i and S_{i+1}. For instance, let us assume that the following translation examples are given:

    I bought the book for John <-> Tôi đã mua một quyển sách cho John
    I bought the ring for John <-> Tôi đã mua một chiếc nhẫn cho John

For these translation examples, the matching algorithm obtains the following match sequence:

    I bought the (book, ring) for John <-> Tôi đã mua một (quyển sách, chiếc nhẫn) cho John

That is, S_1 = "I bought the", D_1 = (book, ring), S_2 = "for John", S'_1 = "Tôi đã mua một", D'_1 = (quyển sách, chiếc nhẫn), and S'_2 = "cho John".

After the match sequence is found for two translation examples, we use two different learning heuristics to infer translation templates [7][8] from that match sequence. These two learning heuristics try to locate corresponding differences or similarities in the match sequence. The first heuristic, the Similarity Translation Template Learning algorithm (STTL), tries to locate all corresponding differences and generates a new translation template by replacing all of the differences with variables. The second heuristic, the Difference Translation Template Learning algorithm (DTTL), infers translation templates by replacing similarities with variables, if it can locate corresponding similarities in the match sequence. STTL and DTTL are combined as the Translation Template Learning algorithm (TTL). From the corpus, the TTL algorithm tries to infer translation templates using these two algorithms.
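To make the matching step concrete, the alternating similarities and differences between two tokenized sentences can be computed with Python's standard difflib. This is a minimal illustrative sketch, not the authors' implementation; the function name match_sequence is ours:

```python
from difflib import SequenceMatcher

def match_sequence(sent_a, sent_b):
    """Return the match sequence between two token lists as an
    alternating list of similarities ('S', items) and differences
    ('D', (items_a, items_b))."""
    sm = SequenceMatcher(a=sent_a, b=sent_b, autojunk=False)
    out = []
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":  # a similarity: items common to both sentences
            out.append(("S", sent_a[i1:i2]))
        else:               # a difference pair (D_a, D_b)
            out.append(("D", (sent_a[i1:i2], sent_b[j1:j2])))
    return out

ex_a = "I bought the book for John".split()
ex_b = "I bought the ring for John".split()
print(match_sequence(ex_a, ex_b))
# [('S', ['I', 'bought', 'the']), ('D', (['book'], ['ring'])), ('S', ['for', 'John'])]
```

Replacing each ('D', ...) element with a fresh variable yields exactly the STTL template of the running example.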
After all of the translation templates have been learned, they are sorted according to their specificity. Given two templates, the one with the higher number of terminals is the more specific. In the following section, we address a new method to increase translation accuracy and reduce calculation complexity.

3 Translation Template Learning Based on HMM

To explain translation template learning based on an HMM, some notations are defined below, followed by the presentation of the translation-based HMM model.

3.1 Template Rules

Let SL and TL be the source and target languages, and let S_1 S_2 ... S_n <=> T_1 T_2 ... T_m be a template rule, in which each S_i is a sequence of words or a variable in SL and each T_j is a sequence of words (called a constant element) or a variable in TL. Each variable on the left side is aligned with a variable on the right side. A variable on the left side and the aligned variable on the right side of a template rule will be instantiated as a phrase or word in SL and TL, respectively. Figure 1 depicts an example of a template rule in which a sentence containing "give up" in English is translated to a sentence in Vietnamese containing "từ bỏ".

    S1 give S2 up S3  <=>  T1 từ bỏ T3

    Figure 1. Template rule example

Let a lexical rule be a template rule that has no variable inside. A lexical rule is thus a bilingual phrase in SL and TL.
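One concrete (hypothetical) way to encode these definitions is to store each side of a rule as a sequence mixing constant token tuples with integer variable ids, the same id marking an aligned variable pair; a lexical rule is then simply a rule containing no variable ids. A minimal sketch under that assumed encoding:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TemplateRule:
    # Constants are tuples of tokens; variables are integer ids, with the
    # same id on the left and right sides marking an aligned pair.
    left: tuple   # SL side
    right: tuple  # TL side

    def is_lexical(self) -> bool:
        # A lexical rule has no variables: it is just a bilingual phrase.
        return not any(isinstance(x, int) for x in self.left + self.right)

# The template of Figure 1: S1 give S2 up S3 <=> T1 "từ bỏ" T3
give_up = TemplateRule(left=(1, ("give",), 2, ("up",), 3),
                       right=(1, ("từ", "bỏ"), 3))
# A lexical rule: a plain bilingual phrase with no variables
book = TemplateRule(left=(("book",),), right=(("quyển", "sách"),))
print(give_up.is_lexical(), book.is_lexical())  # False True
```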

3.2 Translation Based on the HMM Model

3.2.1 The Model

The model we propose has two steps. First, we formulate template-learning translation as an equivalent problem that can be solved by using the HMM, based on a set of constraint rules which are observed from the characteristics of SL and TL and on a training corpus. Next, a dynamic-programming technique, a variant of the Viterbi algorithm, is used to find the best translation results.

    [Figure 2 here: an input sentence is decomposed into phrases and substrings, each covered by the left side of a lexical rule through the variables X and Y.]

Problem: Given an input sentence e_1 e_2 ... e_m (each e_i is a word) and a set of template rules r_1, r_2, ..., r_d, find the set of rules such that their translation results best explain that sentence. For convenience, we will use e[1:m] as shorthand for the input sentence e_1 e_2 ... e_m.

The problem is equivalent to finding all translation results for each rule r_i (i = 1, ..., d). Assuming that the rule r_i is defined as S_1 S_2 ... S_n <=> T_1 T_2 ... T_m, the original method [8] tries to find all possible ways to replace the variables with phrases in SL so that the input sentence e[1:m] can be produced from this rule. Next, it finds, within the set of lexical rules, each TL phrase corresponding to a matched SL phrase, in order to transform the input sentence into the target language. However, when the input sentence is long and the number of rules is large, with many variables inside, the original method has to cope with an exponential calculation. To overcome this problem, we propose an approach based on HMM modeling, as discussed below.

Figure 2 shows that an input sentence can be decomposed in many ways using the left sides of the template rules. Suppose that the variables X and Y each have elements whose substrings can be found in the input sentence on the left sides of the lexical rules. For each element of the variable X that ends at position i within the input sentence, we have to find all elements of the variable Y whose substrings start from position i+1 and are also on the left side of a lexical rule. Thus, we have to consider a multiplicative number of translation combinations, most of which must be discarded. From the example in Figure 2, each constant S_i can be associated with a phrase on the right side of the rule r_i, and each variable within the rule r_i can be associated with the set of lexical rules whose left side is a substring that could start from any possible position within the input sentence.
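The exhaustive search performed by the original method can be illustrated with a small recursive matcher that enumerates every way of binding a template's variables to non-empty spans of the input. This is a sketch with hypothetical names, using the template of Table 1:

```python
def decompositions(pattern, words):
    """Enumerate all variable bindings under which the pattern (a mix of
    constant token tuples and variable-name strings) produces exactly
    the given word list. This is the search space the original method
    must explore for each rule."""
    if not pattern:
        return [dict()] if not words else []
    head, rest = pattern[0], pattern[1:]
    results = []
    if isinstance(head, tuple):  # constant: must match literally
        n = len(head)
        if tuple(words[:n]) == head:
            results.extend(decompositions(rest, words[n:]))
    else:                        # variable: try every non-empty prefix
        for n in range(1, len(words) + 1):
            for binding in decompositions(rest, words[n:]):
                results.append({head: tuple(words[:n]), **binding})
    return results

sent = "I do not think it is necessary to launch a full inquiry at this time".split()
pattern = ("X", ("necessary", "to", "launch"), "Y", "Z")
print(len(decompositions(pattern, sent)))  # 5 ways to split Y and Z
```

Even with a single adjacent variable pair there are already several candidate splits; with n variables, each having l candidate values, the enumeration grows as l^n, which is exactly the blow-up the HMM formulation below avoids.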
In such a framework, we can assume that a lexical rule corresponds to a hidden state and a substring of the input sentence to an observed symbol produced from that state, so that the problem of translation is equivalent to finding a lexical rule for each variable.

    Figure 2. Example of translation based on HMM

Accordingly, the problem can be solved by using the variant of HMM modeling mentioned above. To find the most likely sequence of lexical rules, we must find the sequence that maximizes the probability P(r_i | e_1, e_2, ..., e_m). Since r_i : S_1 S_2 ... S_n <=> T_1 T_2 ... T_m, we obtain the following:

    P(r_i | e_1, e_2, ..., e_m) = P(S_1, S_2, ..., S_n | e_1, e_2, ..., e_m)    (1)

Since e_1 e_2 ... e_m is the sequence of input words and the probability P(e_1, e_2, ..., e_m) is given, we need to maximize the formula below:

    P(e_1, e_2, ..., e_m | S_1, S_2, ..., S_n) P(S_1, S_2, ..., S_n)    (2)

Using the bigram model, (2) can be approximated as

    ∏_i P(S_{i+1} | S_i) · ∏_i P(e_{j_i} ... e_{k_i} | S_i)    (3)

where e_{j_i} ... e_{k_i} is the substring matching the left side of the lexical rule matched with S_i.

To find the sequence of lexical rules that maximizes formula (3), a kind of dynamic programming, the Viterbi algorithm [10], can be used. If the rule r_i has n variables and each variable consists of l elements, then the complexity is n × l², while the exhaustive recursive way is l^n. In addition, each rule r_i can be assigned a translation score, the value of formula (3), and the output translations for the input sentence can be sorted according to this score over the whole set of template rules. Therefore, using HMM modeling enables us to avoid the exponential-calculation problem by means of the dynamic Viterbi algorithm. In addition, it can sort translation results with higher accuracy without the need for complex processing on the set of template rules. It also presents a new perspective for applying statistical machine
learning theories in the example-based translation domain.

3.2.2 Estimation of the HMM Model

The HMM model for translation is estimated by using Forward-Backward learning [11]. The corpus of source sentences and target sentences is used to generate observed sequences. Each source sentence is translated by using a sequence of lexical rules if the right-hand sides of the rules are the same as the target sentence within the corpus. After a sequence of lexical rules is obtained, the sequence of observed symbols is generated, because each observed symbol is the left-hand side of a lexical rule. Therefore, using the set of template rules and the corpus, we can generate training data of the following form:

    O_1, ..., O_k | l_1, ..., l_k
    O_{k+1}, ..., O_m | l_{k+1}, ..., l_m
    ...

Here O_i, ..., O_j is a sequence of observed symbols, l_i, ..., l_j is a sequence of lexical rules, and O_i, ..., O_j | l_i, ..., l_j means that the sequence of observed symbols is associated with that sequence of lexical rules. Suppose that c(l_i), c(l_i, l_j), and c(o, l_i) are the number of occurrences of lexical rule l_i, the number of occurrences of lexical rule l_j following lexical rule l_i, and the number of occurrences of an observed symbol o corresponding to lexical rule l_i, respectively. With these notations, the algorithm for initializing an HMM model before performing the Forward-Backward algorithm on the training data above is as follows:

    For all lexical rules l_i do
        For all lexical rules l_j do
            P(l_j | l_i) = c(l_i, l_j) / c(l_i)
    For all lexical rules l_i do
        For all observed symbols o do
            P(o | l_i) = c(o, l_i) / c(l_i)

    Figure 3. Algorithm for initializing the parameters of the HMM model for template rules

After the probabilities of observed symbols and lexical rules are initialized, Forward-Backward learning is used to estimate the HMM for translation.

3.2.3 Example

We describe a translation example using the original method and the HMM method for an input sentence with a template rule and a set of lexical rules, as shown in Table 1. There are three translation outputs when the original method is applied: (L1,L2,L3), (L1,L4,L5), and (L1,L6,L7). Suppose that the probabilities of the lexical rules in the example are estimated as follows: P(L2|L1)=0.2; P(L4|L1)=0.6; P(L6|L1)=0.2; P(L3|L2)=0.2; P(L5|L4)=0.5; P(L7|L6)=0.2. Using formula (3), we have P(L1,L2,L3)=0.04, P(L1,L4,L5)=0.3, and P(L1,L6,L7)=0.04. Thus, the translation result is the most likely sequence of lexical rules, (L1,L4,L5).

    Table 1. An example of translation using template translation learning

    Input: I do not think it is necessary to launch a full inquiry at this time

    Template rule: X necessary to launch Y Z <=> X cần thiết để bắt đầu Y Z

    Lexical rules:
    L1: I do not think it is <=> tôi không nghĩ nó là
    L2: full <=> sự đầy đủ
    L3: inquiry at this time <=> đòi hỏi ở thời điểm này
    L4: full inquiry <=> một cuộc điều tra đầy đủ
    L5: at this time <=> ở thời điểm này
    L6: full inquiry <=> một câu hỏi đầy đủ
    L7: at this time <=> ở thời gian này

    Human translation: Tôi không nghĩ là nó thực sự cần thiết để bắt đầu cuộc điều tra ở thời điểm này.

    EBMT (the original algorithm has to enumerate all translation results):
    (L1,L2,L3): tôi không nghĩ là cần thiết để bắt đầu sự đầy đủ đòi hỏi ở thời điểm này.
    (L1,L4,L5): tôi không nghĩ nó là cần thiết để bắt đầu một cuộc điều tra đầy đủ ở thời điểm này.
    (L1,L6,L7): tôi không nghĩ nó là cần thiết để bắt đầu một câu hỏi đầy đủ ở thời gian này.

    HMM (the proposed method obtains the best translation directly):
    (L1,L4,L5): tôi không nghĩ nó là cần thiết để bắt đầu một cuộc điều tra đầy đủ ở thời điểm này

Table 1 shows that the original method has to enumerate all translation results, while the proposed method obtains the best translation result by applying a dynamic algorithm.

4 Experiments and Discussion

In order to assess whether our method can enhance translation accuracy while ensuring that the complexity is low, we implemented an English-Vietnamese translation system and tested it on a corpus of bilingual sentences collected manually from some textbooks and newspapers. We experimented on the HMM model.

4.1 Template Translation Learning

Figure 4 plots the number of template translation rules against the number of sentences within the corpus. These results show how the number of template rules for a bilingual English-Vietnamese corpus increases with the number of sentences.

    [Figure 4. The relation of the number of lexical rules and the number of template rules to the number of sentences within the corpus. The solid line and the dotted line show, respectively, how the number of template rules and the number of lexical rules grow with the number of sentences.]

4.2 Constraints Application

When the corpus size is small, we can obtain an HMM model by using constraints application. We simplify the model by using constraints instead of the bigram probability, and we get translations using the Viterbi algorithm. Let A and B be two lexical rules. The constraints are as follows:

Constraint 1: A sequence of words translated by the lexical rules A and B is permitted if the right sides of A and B satisfy the morphological conditions of the target language and they occupy two consecutive positions in the output translation.

Constraint 2: A sequence of words translated by the lexical rules A and B is permitted if there exists a Vietnamese sentence within the corpus that contains the two right sides of A and B.

4.3 HMM Model

In our experiment, template translation learning produced the sets of template rules and lexical rules; the number of lexical rules is the number of hidden states in our HMM model. Using the template rules and the data corpus, we obtained the training data for estimating the HMM model as described in Section 3.2.2. The training data consists of observed sequences, each corresponding to a sequence of lexical rules. We used part of the observed sequences to initialize the parameters of the HMM model, using the algorithm in Figure 3. Afterward, the Forward-Backward algorithm was applied to the remaining sequences to train the model.

4.4 Experimental Results

After we generated the set of template rules on the corpus, we estimated the HMM model as described above. We tested the translation accuracy by using the sentences within the corpus.
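The count-based initialization of Figure 3 amounts to relative-frequency estimates over the aligned training sequences. A minimal sketch (the data layout and names are our assumptions, not the authors' code):

```python
from collections import Counter

def init_hmm_parameters(training_data):
    """Relative-frequency initialization of the HMM, as in Figure 3:
    P(l_j | l_i) = c(l_i, l_j) / c(l_i) and P(o | l_i) = c(o, l_i) / c(l_i).

    training_data is a list of (observations, rules) pairs, where
    observations[t] is the left-hand side matched at step t and rules[t]
    is the lexical rule that produced it."""
    c_rule, c_trans, c_emit = Counter(), Counter(), Counter()
    for observations, rules in training_data:
        for t, (o, l) in enumerate(zip(observations, rules)):
            c_rule[l] += 1
            c_emit[(o, l)] += 1
            if t + 1 < len(rules):
                c_trans[(l, rules[t + 1])] += 1
    transition = {(li, lj): c / c_rule[li] for (li, lj), c in c_trans.items()}
    emission = {(o, l): c / c_rule[l] for (o, l), c in c_emit.items()}
    return transition, emission
```

These estimates are only the starting point; Forward-Backward (Baum-Welch) re-estimation is then run over the remaining sequences, as described above.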
Using the Viterbi algorithm for each rule, we were able to obtain a list of output translations. We compared the translation results from our methods with those of the original method by calculating the proportion of correct translations among the total translation outputs, and we obtained the results shown in Table 2. The sentences within the corpus were selected randomly and used as inputs for both the original method and our methods. Table 2 shows that the constraints application and the translation based on HMM achieved better results in comparison with the original TTL algorithm. In addition, our method achieved a lower complexity, O(n × l²), in comparison with the original method's O(l^n), in which l is the number of lexical rules and n is the number of variables in a template rule. This was due to our use of a dynamic algorithm to avoid the exponential-calculation problem.

    Table 2. Performance results

    Correct results, original method:         48%
    Correct results, constraints application: ≈70%
    Correct results, HMM:                     ≈80%
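Decoding with formula (3) can be sketched as a Viterbi pass whose states at each step are the candidate lexical rules for that slot. The function below and the probability values (taken from the worked example of Section 3.2.3) are illustrative, not the authors' code:

```python
def viterbi_rules(candidates, trans, emit, obs):
    """Return the sequence of lexical rules maximizing formula (3):
    the product over steps of P(rule_t | rule_{t-1}) * P(obs_t | rule_t).
    candidates[t] lists the lexical rules whose left side matches obs[t]."""
    delta = {r: emit.get((obs[0], r), 0.0) for r in candidates[0]}
    back = [dict()]
    for t in range(1, len(obs)):
        new_delta, ptr = {}, {}
        for r in candidates[t]:
            prev = max(delta, key=lambda p: delta[p] * trans.get((p, r), 0.0))
            new_delta[r] = delta[prev] * trans.get((prev, r), 0.0) * emit.get((obs[t], r), 0.0)
            ptr[r] = prev
        delta, back = new_delta, back + [ptr]
    last = max(delta, key=delta.get)  # best final state
    path = [last]
    for ptr in reversed(back[1:]):    # trace back pointers
        path.append(ptr[path[-1]])
    return list(reversed(path)), delta[last]

# The worked example: one candidate set per slot, emissions taken as 1.0
trans = {("L1", "L2"): 0.2, ("L1", "L4"): 0.6, ("L1", "L6"): 0.2,
         ("L2", "L3"): 0.2, ("L4", "L5"): 0.5, ("L6", "L7"): 0.2}
emit = {(o, r): 1.0
        for o, rs in [("o1", ["L1"]), ("o2", ["L2", "L4", "L6"]),
                      ("o3", ["L3", "L5", "L7"])]
        for r in rs}
path, score = viterbi_rules([["L1"], ["L2", "L4", "L6"], ["L3", "L5", "L7"]],
                            trans, emit, ["o1", "o2", "o3"])
print(path, score)  # ['L1', 'L4', 'L5'] 0.3
```

Each step keeps only the best score per candidate rule, so the cost stays polynomial in the number of lexical rules instead of enumerating every rule combination.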

Some examples of our translation method are shown in Table 3. The second column of the table shows the best translation result achieved by our methods.

    Table 3. Some examples of our translation results

    Input sentence                                              | Translation output
    How long will you stay here?                                | Anh sẽ ở lại đây được bao lâu?
    My book is as interesting as yours                          | Quyển sách của tôi thì lý thú ngang với quyển sách của anh
    Several new proposals are being considered by the committee | Nhiều dự án mới đang được ủy ban cứu xét
    Before long rice seedlings were big enough to be planted in the field | Chẳng bao lâu sau các cây lúa đó đủ lớn để được cấy vào ruộng đó.
    Have you written your report yet?                           | Anh viết xong bản báo cáo chưa?
    If she had seen the movie, she would have told you          | Nếu cô đã nhìn thấy phim, cô đã nói với bạn

4.5 Discussion

When the corpus size is small, we are able to obtain an HMM model by using constraints application, since that method does not depend on the size of the corpus. When the corpus size is large enough, the HMM model can be estimated by using a common algorithm such as the Forward-Backward algorithm. In both cases, translation template learning using the HMM model significantly improved the accuracy and the computation cost in comparison with the original algorithm.

5 Conclusion

Our use of HMM modeling avoids the exponential-calculation problem by means of a dynamic algorithm. In addition, it can sort translation results by accuracy without any complex processing on the set of template rules and therefore ensures translation accuracy. Moreover, it opens a new perspective for applying statistical machine learning theory to the example-based translation domain. Merging our proposed method with a rule-based translation method is currently underway.

Acknowledgement

We would like to thank Judith Steeh for editing the paper. This research was supported in part by the JAIST international research project grant and a JSPS Grant-in-Aid for Scientific Research.

References

[1] M. Nagao, A framework of a mechanical translation between Japanese and English by analogy principle, in Artificial and Human Intelligence, edited by A. Elithorn and R. Banerji, NATO publication: North-Holland, Edinburgh, 1984, pp. 173-180.
[2] S. Sato and M. Nagao, Toward memory-based translation, in Proceedings of the 13th International Conference on Computational Linguistics, Helsinki, Finland, 1990 (3), pp. 247-252.
[3] E. Sumita and H. Iida, Experiments and prospects of example-based machine translation, in Proceedings of the 29th Annual Meeting of the Association for Computational Linguistics, pp. 185-192.
[4] S. Nirenburg, S. Beale, and C. Domashnev, A full-text experiment in Example-Based Machine Translation, in New Methods in Language Processing, Studies in Computational Linguistics, Manchester, England.
[5] R.D. Brown, Transfer-rule induction for example-based translation, in Proceedings of the Workshop on Example-Based Machine Translation, http://www.eamt.org/summitVIII/papers/-brown.pdf.
[6] H. Somers, Review Article: Example-Based Machine Translation, Machine Translation, 1999 (14), pp. 113-157.
[7] I. Cicekli and H.A. Güvenir, Learning Translation Rules From A Bilingual Corpus, in Proceedings of the 2nd International Conference on New Methods in Language Processing (NeMLaP-2), Ankara, Turkey, September 1996, pp. 90-97.
[8] H.A. Güvenir and I. Cicekli, Learning translation templates from examples, Information Systems, 1998 (6), pp. 353-363.
[9] Z. Öz and I. Cicekli, Ordering Translation Templates by Assigning Confidence Factors, in Proceedings of the 3rd Conference of the Association for Machine Translation in the Americas, Langhorne, PA, 1998, pp. 51-61.
[10] A.J. Viterbi, Error bounds for convolutional codes and an asymptotically optimum decoding algorithm, IEEE Transactions on Information Theory, 1967 (13), pp. 260-269.
[11] L.E. Baum and J.A. Eagon, An inequality with applications to statistical estimation for probabilistic functions of Markov processes and to a model for ecology, Bulletin of the American Mathematical Society, 1967 (73), pp. 360-363.