
Krishna Kankipati

NLP with TensorFlow — Padding sentences

Alright, in the previous post we learned to tokenize sentences and turn the tokens into sequences. We can observe that the lengths of these sequences differ.
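For reference, here is a minimal sketch of that tokenization step (the sentences below are made-up examples, not necessarily the ones from the previous post):

```python
from tensorflow.keras.preprocessing.text import Tokenizer

sentences = [
    'I love my dog',
    'I love my cat',
    'Do you think my dog is amazing?'
]

tokenizer = Tokenizer(num_words=100, oov_token='<OOV>')
tokenizer.fit_on_texts(sentences)

sequences = tokenizer.texts_to_sequences(sentences)
print(sequences)
# The three lists have 4, 4 and 7 tokens respectively -- their lengths differ.
```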


We need to make sure that all inputs are of the same length. Padding solves this problem!
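A minimal sketch of the padding step, reusing the 'sequences' list from the snippet above ('pad_sequences' comes from the Keras preprocessing utilities):

```python
from tensorflow.keras.preprocessing.sequence import pad_sequences

padded = pad_sequences(sequences)
print(padded)
# Every row now has the length of the longest sequence (7 here);
# shorter rows are filled with 0s at the beginning, since the default is padding='pre'.
```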


‘pad_sequences’ pads the sequences to the same length. You can observe that 0s are added at the beginning of the lists that are shorter than the longest one.


‘pad_sequences’ can also pad a sequence at the end instead of the beginning (via the ‘padding’ argument), and it can pad or truncate sequences to a desired length (via the ‘maxlen’ and ‘truncating’ arguments).
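As a sketch of those options (the argument names follow the Keras API; the maxlen value here is only an example):

```python
# Pad at the end of each sequence instead of the beginning.
padded_post = pad_sequences(sequences, padding='post')

# Force a fixed length of 5: longer sequences are truncated.
# By default truncation removes tokens from the beginning;
# truncating='post' removes them from the end instead.
padded_fixed = pad_sequences(sequences, maxlen=5, padding='post', truncating='post')

print(padded_post)
print(padded_fixed)
```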


 

Okay, that’s enough! In the next post we will handle a real dataset by applying the techniques we have learned!
