202503202103

Tags : Weighted Automata and Transducers

Normalizing a Transducer


Given a sequential transducer, consider the following situation:

  • After reading a word $w$ the transducer has produced the string $u$.
  • On reading a further letter $a$ the transducer continues with the string $v_a$.
  • If instead it read a letter $b$ it would have continued with $v_b$.
  • If the word had ended there, it would have appended another string $v_\epsilon$ (its final output) to the output.

We see that in all of these scenarios the longest common prefix of $v_a$, $v_b$ and $v_\epsilon$ is guaranteed to be produced by the transducer after reading $w$, so we might as well modify the automaton to output it right away. The process of doing this is called normalizing the transducer.
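As a concrete illustration (the transducer, state names and output strings below are invented for this note), a sequential transducer can be stored as a transition dict plus final outputs, and running it shows a common prefix that is emitted late:

```python
def run(q0, lam, delta, rho, word):
    """Run a sequential transducer on an accepted word and return its output."""
    q, out = q0, lam
    for a in word:
        q2, piece = delta[(q, a)]  # deterministic transition: (next state, output)
        out += piece
        q = q2
    return out + rho[q]  # append the final output of the state we stop in

# Hypothetical example: both inputs eventually produce "xyz",
# but the guaranteed prefix only appears piece by piece.
delta = {("q0", "a"): ("q1", "x"), ("q0", "b"): ("q2", "xy")}
rho = {"q1": "yz", "q2": "z"}

print(run("q0", "", delta, rho, "a"))  # -> xyz
print(run("q0", "", delta, rho, "b"))  # -> xyz
```

Whichever letter is read (or none at all from the intermediate states), the output from the start state is "xyz", so a normalized version could emit it immediately.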

Given a sequential transducer $T = (Q, \Sigma, \Delta, q_0, \delta, \sigma, \lambda, \rho)$ (with transition function $\delta$, output function $\sigma$, initial output $\lambda$ and final output function $\rho$), the following will be its normalized version:

    • $T' = (Q, \Sigma, \Delta, q_0, \delta, \sigma', \lambda', \rho')$ where
        • $\lambda' = \lambda \cdot P(q_0)$
        • $\sigma'(q, a) = P(q)^{-1} \cdot \sigma(q, a) \cdot P(\delta(q, a))$
        • $\rho'(q) = P(q)^{-1} \cdot \rho(q)$

In all of these constructions, $P(q)$ for a state $q$ is the string that one is guaranteed to output from $q$ irrespective of the input. It can be defined as the longest common prefix of all possible strings that can be generated from $q$; here $P(q)^{-1} x$ denotes the string $x$ with its prefix $P(q)$ removed.
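The map $P$ can be computed by a fixpoint iteration, in the spirit of shortest-distance computations over the string semiring with longest-common-prefix as the sum. A minimal sketch, assuming the transducer is trim (every state can reach acceptance); the example transducer and state names are invented for this note:

```python
def lcp(a, b):
    """Longest common prefix of two strings."""
    i = 0
    while i < min(len(a), len(b)) and a[i] == b[i]:
        i += 1
    return a[:i]

def guaranteed_outputs(states, delta, rho):
    """Compute P[q], the longest string guaranteed to be output from state q.

    delta: dict (state, letter) -> (next state, output string)
    rho:   dict state -> final output (only for final states)
    None marks a state not yet known to reach acceptance; skipping it only
    makes the estimate longer, and later rounds shrink it back down.
    """
    P = {q: None for q in states}
    changed = True
    while changed:
        changed = False
        for q in states:
            cands = [rho[q]] if q in rho else []
            for (q1, a), (q2, out) in delta.items():
                if q1 == q and P[q2] is not None:
                    cands.append(out + P[q2])
            if cands:
                new = cands[0]
                for c in cands[1:]:
                    new = lcp(new, c)
                if new != P[q]:
                    P[q] = new
                    changed = True
    return P

# Hypothetical example: from q0 every accepted word outputs "xyz".
delta = {("q0", "a"): ("q1", "x"), ("q0", "b"): ("q2", "xy")}
rho = {"q1": "yz", "q2": "z"}
print(guaranteed_outputs(["q0", "q1", "q2"], delta, rho))
# -> {'q0': 'xyz', 'q1': 'yz', 'q2': 'z'}
```

Each $P(q)$ only ever shrinks (prefix-wise) once defined, so the iteration terminates.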

The correctness for $\lambda'$ and $\rho'$ is fairly trivial. For the correctness of $\sigma'$, note first that since $P(q)$ is a longest common prefix, it is a prefix of $\sigma(q, a) \cdot P(\delta(q, a))$, so $\sigma'$ is well defined. Two cases:

  • If $P(q)$ is a prefix of $\sigma(q, a)$ then we are done: before this transition our normalized automaton has already output $P(q)$, so we can remove it from $\sigma(q, a)$. Also, after reading $a$ one can safely output $P(\delta(q, a))$, since it is guaranteed from the next state.
  • If instead $\sigma(q, a)$ is a proper prefix of $P(q)$, then whatever was going to be produced by the given transition was already covered by $P(q)$, which means part of $P(\delta(q, a))$ was also covered. So what we want in our normalized automaton is exactly the part of $\sigma(q, a) \cdot P(\delta(q, a))$ that has not yet been output, which is again $P(q)^{-1} \cdot \sigma(q, a) \cdot P(\delta(q, a))$.
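Putting the three formulas together, normalization is just prefix stripping. A sketch under the assumption that the transducer is trim and the map $P$ of guaranteed outputs has already been computed (here it is hard-coded for a small invented example):

```python
def normalize(q0, lam, delta, rho, P):
    """Normalize a trim sequential transducer, given P[q], the string
    guaranteed to be output from each state q.  Returns (lam', delta', rho')."""
    lam2 = lam + P[q0]                        # lambda' = lambda . P(q0)
    delta2, rho2 = {}, {}
    for (q, a), (q2, out) in delta.items():
        full = out + P[q2]                    # sigma(q,a) . P(delta(q,a))
        assert full.startswith(P[q])          # well defined: P(q) is an lcp
        delta2[(q, a)] = (q2, full[len(P[q]):])   # strip the prefix P(q)
    for q, fin in rho.items():
        assert fin.startswith(P[q])
        rho2[q] = fin[len(P[q]):]             # rho'(q) = P(q)^{-1} . rho(q)
    return lam2, delta2, rho2

# Invented example: every accepted word outputs "xyz"; after normalization
# the whole string is pushed into the initial output.
delta = {("q0", "a"): ("q1", "x"), ("q0", "b"): ("q2", "xy")}
rho = {"q1": "yz", "q2": "z"}
P = {"q0": "xyz", "q1": "yz", "q2": "z"}      # guaranteed outputs, precomputed

lam2, delta2, rho2 = normalize("q0", "", delta, rho, P)
print(lam2)    # -> xyz
print(delta2)  # every transition output is empty in this example
```

In this example the normalized transducer emits "xyz" before reading anything, and all transition and final outputs become empty, which is the earliest-possible-output behaviour normalization is after.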

References