# Explaining Text Generation With LSTM

This article was published as a part of the Data Science Blogathon.

An end-to-end guide on Text generation using LSTM

                                                                  Source: develop-paper

Hey Folks!

In this article, we are going to talk about text generation using LSTM with an end-to-end example. We will also revisit the key concepts behind LSTM as a quick revision.

In text generation, we predict the next word (or character) of a given sequence. Text data can be seen as a sequence of words or a sequence of individual characters. For sequence prediction, we use deep learning models like RNN/LSTM/GRU.

I have already written a very detailed article on the idea of RNN, where I discussed why vanilla RNN is not practical and explained RNN and GRU with examples; you can refer to this link.

Table of Contents

Introduction to LSTM

Why does RNN fail?

Understanding LSTM architecture and various Gates

The idea of Text Generation

Implementation of Text Generation using LSTM


Text generation can be considered a very important feature of AI-based tools. It is very useful in machines that are supposed to become more interactive with humans. Smart gadgets like smart speakers and home assistants use text generation in some form.

Use cases of Text-Generation

Search engines


Text summarization

Question answering

Why RNN isn’t Practical for Text Generation?

RNN has a big problem of vanishing and exploding gradients, so it cannot hold long sequential information. In the text-generation task, we particularly need a model that can memorize a long sequence of data; for this purpose, LSTM came into the picture.

LSTM (Long Short Term Memory)

As we know, RNN can't hold/memorize sequential data for a long time and begins to forget the earlier inputs as new inputs come in. To fix this problem, LSTM is designed with various gates.

LSTM solves the problem of short-term memory learning by using different types of gates.

When a new input comes in, an RNN modifies the existing information without deciding whether the incoming input is important, whereas in LSTM, gates allow only important inputs to modify the existing information.

In LSTM, gates decide what data is to be ignored and what is to be fed forward for training. There are 3 gates in LSTM:

Input Gate

Output Gate

Forget Gate


Forget Gate

This gate is responsible for selecting relevant information and discarding irrelevant information. After this selection, the retained information is passed on to the input gate.

First, the information from the current input and the previous hidden state is passed through an activation function, here the sigmoid. The sigmoid activation function returns values between 0 and 1; a value closer to 0 means the current information should be ignored, otherwise it should be passed through the input gate.

Input Gate

This gate is responsible for adding information to the cell state. The tanh activation function creates an array of candidate values ranging from -1 to 1, and a sigmoid function filters which of those values should be added to the model and which should be discarded.

Output Gate

The output gate is responsible for generating the next hidden state, which is carried over to the next time step along with the cell state. It creates the hidden state using the tanh activation function, so its values range from -1 to 1.
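The three gates described above can be sketched in a few lines of NumPy. This is an illustrative single time step, not Keras's internal implementation; the weight layout (the four gate pre-activations stacked in one matrix) is one common convention, and all sizes here are toy values.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM time step. W maps [h_prev; x_t] to the four stacked
    gate pre-activations; b is the matching bias."""
    z = np.concatenate([h_prev, x_t]) @ W + b
    n = h_prev.shape[0]
    f = sigmoid(z[:n])           # forget gate: what to discard from c_prev
    i = sigmoid(z[n:2*n])        # input gate: which new info to let in
    g = np.tanh(z[2*n:3*n])      # candidate values in [-1, 1]
    o = sigmoid(z[3*n:])         # output gate: what to expose as h_t
    c_t = f * c_prev + i * g     # new cell state
    h_t = o * np.tanh(c_t)       # new hidden state, in (-1, 1)
    return h_t, c_t

# Toy dimensions: hidden size 3, input size 2
rng = np.random.default_rng(0)
h, c = np.zeros(3), np.zeros(3)
W = rng.normal(size=(5, 12)) * 0.1
b = np.zeros(12)
h, c = lstm_step(rng.normal(size=2), h, c, W, b)
print(h.shape, c.shape)  # (3,) (3,)
```

Note how the forget gate f multiplies the old cell state, the input gate i scales the tanh candidates, and the output gate o decides how much of the cell state becomes the hidden state.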


The Idea of Text Generation

Text generation is nothing but a continuous series of next-word predictions. As we already know, text data is a sequence of words; using these sequences, we can predict the next word.

Implementing Text Generation

The various steps for text generation are listed below:

Load the necessary libraries

Load the textual- data

Perform text-cleaning if needed

Data preparation for training

Define and train the LSTM model


Loading necessary libraries

libraries for data handling

import pandas as pd
import numpy as np
import string, os
import warnings
warnings.filterwarnings("ignore")
warnings.simplefilter(action='ignore', category=FutureWarning)

loading deep learning libraries

# set seeds for reproducibility
from tensorflow import set_random_seed
from numpy.random import seed
set_random_seed(2)
seed(1)

# keras modules for building LSTM
from keras.preprocessing.sequence import pad_sequences
from keras.preprocessing.text import Tokenizer
from keras.layers import Embedding, LSTM, Dense, Dropout
from keras.callbacks import EarlyStopping
from keras.models import Sequential
import keras.utils as ku

Loading the Dataset

# Loading all the headlines as a list
curr_dir = '../input/'
all_headlines = []
for filename in os.listdir(curr_dir):
    if 'Articles' in filename:
        article_df = pd.read_csv(curr_dir + filename)
        all_headlines.extend(list(article_df.headline.values))
        break
all_headlines = [line for line in all_headlines if line != "Unknown"]
print(all_headlines[:10])

                                                                            Source: Author

We have a total of 829 headlines and we will use these headlines to generate text.

Dataset Preparation

For Dataset Preparation our first task will be to clean the text data which includes removing punctuations, lowercasing words, etc.

Data Cleaning

We define a function that takes a single headline at a time and returns the cleaned headline. Iterating over all headlines, we build a list that forms the cleaned corpus.

def clean_text(txt):
    txt = "".join(t for t in txt if t not in string.punctuation).lower()
    txt = txt.encode("utf8").decode("ascii", 'ignore')
    return txt

corpus = [clean_text(x) for x in all_headlines]
print(corpus[:10])

Source: Author

Generating n-gram Sequence for training

In NLP, a language model requires sequential input data, and input words/tokens must be numerical. Here we generate n-grams in order to train our model for next-word prediction.

tokenizer = Tokenizer()

def get_sequence_of_tokens(corpus):
    ## tokenization
    tokenizer.fit_on_texts(corpus)
    total_words = len(tokenizer.word_index) + 1
    ## convert data to token sequences
    input_sequences = []
    for line in corpus:
        token_list = tokenizer.texts_to_sequences([line])[0]
        for i in range(1, len(token_list)):
            n_gram_sequence = token_list[:i+1]
            input_sequences.append(n_gram_sequence)
    return input_sequences, total_words

inp_sequences, total_words = get_sequence_of_tokens(corpus)
print(inp_sequences[:10])

                                                                                      Source: Author

As you can see, inp_sequences is the set of n-gram sequences required for training next-word prediction. We had 829 headlines, and using the n-gram concept we now have 4544 rows.


You can relate inp_sequences to this picture, where you can clearly see that at every step we add one more token to the seed sequence for training.
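As a minimal illustration of that idea, here is how one headline's token list (the ids are hypothetical, just for this sketch) expands into n-gram training sequences:

```python
# Hypothetical token ids for a four-word headline,
# illustrating the n-gram expansion done inside get_sequence_of_tokens.
token_list = [12, 5, 7, 9]

input_sequences = []
for i in range(1, len(token_list)):
    # each prefix of length i+1 becomes one training row
    input_sequences.append(token_list[:i + 1])

print(input_sequences)
# [[12, 5], [12, 5, 7], [12, 5, 7, 9]]
```

So a headline of n tokens contributes n-1 training rows, which is why 829 headlines expand into thousands of sequences.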

Padding the Sequences

The inp_sequences we just made have variable lengths, which is not favorable for training; using padding, we make every sequence the same length.

                                                                              source: Kaggle

def generate_padded_sequences(input_sequences):
    max_sequence_len = max([len(x) for x in input_sequences])
    input_sequences = np.array(pad_sequences(input_sequences, maxlen=max_sequence_len, padding='pre'))
    predictors, label = input_sequences[:,:-1], input_sequences[:,-1]
    label = ku.to_categorical(label, num_classes=total_words)
    return predictors, label, max_sequence_len

predictors, label, max_sequence_len = generate_padded_sequences(inp_sequences)

predictors: the tokens that will be used as input for predicting the next word.

label: the next word to be predicted.

max_sequence_len: the maximum sequence length.

pad_sequences: provided by Keras, it pads an array of token sequences to a given length.

In this case, max_sequence_len is 17.
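To see concretely what pre-padding and the predictor/label split produce, here is a tiny hand-rolled sketch with hypothetical token ids (plain NumPy stands in for Keras's pad_sequences with padding='pre'):

```python
import numpy as np

# Toy n-gram sequences of unequal length (hypothetical token ids)
sequences = [[12, 5], [12, 5, 7], [12, 5, 7, 9]]
max_len = max(len(s) for s in sequences)

# 'pre' padding: zeros go in front, mimicking pad_sequences(padding='pre')
padded = np.array([[0] * (max_len - len(s)) + s for s in sequences])

# everything but the last token is input; the last token is the target
predictors, label = padded[:, :-1], padded[:, -1]

print(padded)
# [[ 0  0 12  5]
#  [ 0 12  5  7]
#  [12  5  7  9]]
print(label)  # [5 7 9]
```

The real pipeline then one-hot encodes label with to_categorical so the softmax output layer can be trained against it.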

Model Creation

So far, we have prepared the data for training. Now, in this step, we will create an LSTM model that takes the predictors as input X and the labels as target y.

A quick reminder on Layers in Keras:-

Input Layer: This is responsible for taking the input sequence.

LSTM Layer: It computes the output using LSTM units and returns hidden and cell states. In our case, we have added 100 units to the layer, which can be fine-tuned later.

Dropout Layer: This layer is responsible for regularisation, which means it prevents over-fitting. This is done by randomly turning off the activations of some neurons in the LSTM layer.

Output Layer: This computes the probability distribution of our prediction.

def create_model(max_sequence_len, total_words):
    input_len = max_sequence_len - 1
    model = Sequential()
    model.add(Embedding(total_words, 10, input_length=input_len))
    model.add(LSTM(100))
    model.add(Dropout(0.1))
    model.add(Dense(total_words, activation='softmax'))
    return model

model = create_model(max_sequence_len, total_words)
model.summary()

                                                                                   Source: Author

Training the model

After building the model architecture, we can train the model using our predictors (X) and labels (y). 100 epochs should be enough.

model.compile(loss='categorical_crossentropy', optimizer='adam')
model.fit(predictors, label, epochs=100, verbose=5)

Text Generation (Prediction)


We have trained our model architecture and now it’s ready to generate text. We need to write a function to predict the next word based on the input words. We also have to tokenize the sequence and pad it with the same sequence_length we provided for training, and then we will append each predicted word as a string.

def generate_text(seed_text, next_words, model, max_sequence_len):
    for _ in range(next_words):
        token_list = tokenizer.texts_to_sequences([seed_text])[0]
        token_list = pad_sequences([token_list], maxlen=max_sequence_len-1, padding='pre')
        predicted = model.predict_classes(token_list, verbose=0)
        output_word = ""
        for word, index in tokenizer.word_index.items():
            if index == predicted:
                output_word = word
                break
        seed_text += " " + output_word
    return seed_text.title()

seed_text: the initial words that will be passed in for text generation.

predict_classes: returns the token id for the predicted word.

predicted: the token id of the predicted word; it is converted back into a word using the dictionary tokenizer.word_index.items().

next_words: the number of next words we want to be predicted.


Calling the function generate_text will generate text. The generate_text function takes the initial words, the number of words to be predicted, the model, and the sequence length.

print(generate_text("india and pakistan", 3, model, max_sequence_len))
print(generate_text("president trump", 3, model, max_sequence_len))
print(generate_text("united states", 4, model, max_sequence_len))
print(generate_text("donald trump", 2, model, max_sequence_len))
print(generate_text("new york", 3, model, max_sequence_len))
print(generate_text("science and technology", 5, model, max_sequence_len))

                                                                        Source: Author


In this article, we discussed the LSTM model and its architecture, then the idea of text generation, and finally implemented text generation using an LSTM model.

Our trained model worked reasonably well, but you can improve it by:

Adding more data to train on

Fine-tuning the model architecture, i.e., the number of units, layers, etc.

Fine-tuning the training parameters, like epochs, units, learning rate, activation function, etc.

Thanks for Reading !!

Feel free to reach out to me on LinkedIn if you have any suggestions or questions.


The media shown in this article is not owned by Analytics Vidhya and is used at the Author's discretion.



How To Translate Text In Excel With Examples?

Introduction to Excel Translate Function

Excel Translate is a powerful in-built function to convert any word or sentence into several languages. This tool is ideal for professionals working across different native languages.

Moreover, it facilitates effective communication by sharing documents with colleagues and business partners in multiple languages, enhancing readability and preventing confusion.

Where is the Translator in Excel?

In Excel, the "Translate" feature is available under the "Language" section of the "Review" tab.

How to Translate in Excel?

Let us learn how to translate words and sentences from one language to another with the help of the following examples.

You can download this Translate Excel Template here – Translate Excel Template

Example #1

Step 2: Type the word you want to convert and select the "From" and "To" languages from the options. For example, type "Today" and select English in the From section and Hindi in the To section. The result will display automatically.

Note: The Swap option of the translator function can be used to interchange the To and From languages.

Note: You can also directly select the cell you want to translate. The selected text will display automatically in the upper box. Then you have to choose the target language, and the text is immediately translated, as shown below.

Example #2

Suppose you have the below data of some words in English, and you want to translate those words into another language like Chinese and French.

The result will be displayed in the menu as shown below.

Note: Remember that the translated words will only appear on the menu. You have to copy-paste the translated word into their respective cells.

Step 4: Repeat the same procedure for Chinese and French.

All the words are now translated into different languages, as shown above.

Example #3

You can also change a long sentence from one language to another. For example, there is a long sentence in English: "Hello, my name is David. I love cooking and listening to music." You want to translate this sentence into French.

Type the above sentence in the From section and select French in To. The output is shown in the image below.

How to Install Functions Translator Add-Ins

Follow the below procedure to install the Functions translator in Excel.

The Function Translator is installed successfully. It will appear at the bottom, as shown below.

The Functions Translator will appear in the Home tab. It will have two options: Reference and Translator.

A language settings dialog will open. This dialog box allows you to select the From and To languages.

Step 8: Select the From and To languages.

To is the language you know and are familiar with, and From is the language you want to find or translate. Note: You can change the languages anytime through the Preferences pane. The swap button, represented by an up-down arrow, allows you to switch the languages easily.

For example, if you know French and want to translate Excel formulas into English, you can select English in the From section and French in the To section, as shown below.

Step 10: Enter the formula you want to translate.

Things to Remember

The shortcut to open the Translate window is Alt + Shift + F7.

Excel's translator results may be prone to errors and inconsistencies.

Excel's translator is mainly suitable for personal understanding.

The translated text or sentence may not be ideal for professional use.

You should have an internet connection to access the Excel function translator.

You can search for the Excel function under the dictionary of Function Translator.

Recommended Articles

This article is a guide to Translate in Excel. Here we learn to translate text to different languages using Translate in Excel and how to add a translate option in Quick Access ToolBar. You can also go through our other suggested articles –

Steps For Effective Text Data Cleaning (With Case Study Using Python)


One of the first steps in working with text data is to pre-process it. It is an essential step before the data is ready for analysis. The majority of available text data is highly unstructured and noisy in nature; to achieve better insights or to build better algorithms, it is necessary to work with clean data. For example, social media data is highly unstructured; it is informal communication, and typos, bad grammar, slang, and unwanted content like URLs, stop-words, and expressions are the usual suspects.

In this blog, therefore, I discuss these possible noise elements and how you can clean them step by step. I provide ways to clean data using Python.

As a typical business problem, assume you are interested in finding which features of an iPhone are more popular among fans. You have extracted consumer opinions related to the iPhone, and here is a tweet you extracted:


Steps for data cleaning:


Here is what you do:

Escaping HTML characters: Data obtained from the web usually contains a lot of HTML entities like &lt; &gt; &amp; which get embedded in the original data. It is thus necessary to get rid of these entities. One approach is to remove them directly with specific regular expressions. Another approach is to use appropriate packages and modules (for example the html module of Python), which can convert these entities back to standard characters. For example: &lt; is converted to "<" and &amp; is converted to "&".
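For instance, Python's standard html module can decode these entities directly (the sample string below is made up for illustration):

```python
import html

# A made-up snippet containing common HTML entities
raw = "Price &lt; $500 &amp; free shipping &gt; expected"
clean = html.unescape(raw)  # converts &lt; &amp; &gt; back to < & >
print(clean)  # Price < $500 & free shipping > expected
```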

Decoding data: This is the process of transforming information from complex symbols to simple and easier-to-understand characters. Text data may be subject to different forms of encoding like "Latin", "UTF8", etc. Therefore, for better analysis, it is necessary to keep the complete data in a standard encoding format. UTF-8 encoding is widely accepted and is recommended to use.



tweet = original_tweet.decode("utf8").encode('ascii', 'ignore')



Apostrophe Lookup: To avoid any word-sense ambiguity in text, it is recommended to maintain proper structure in it and to abide by the rules of context-free grammar. When apostrophes are used, the chances of ambiguity increase.

For example, "it's" is a contraction for "it is" or "it has".

All the apostrophes should be expanded into standard lexicons. One can use a lookup table of all possible contractions to get rid of ambiguities.



APPOSTOPHES = {"'s": " is", "'re": " are", ...}  ## Need a huge dictionary
words = tweet.split()
reformed = [APPOSTOPHES[word] if word in APPOSTOPHES else word for word in words]
reformed = " ".join(reformed)



Removal of Stop-words: When data analysis needs to be data driven at the word level, the commonly occurring words (stop-words) should be removed. One can either create a long list of stop-words or one can use predefined language specific libraries.
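A minimal sketch of stop-word removal with a tiny hand-made list (in practice you would use a fuller, language-specific list such as NLTK's stopwords corpus):

```python
# Small illustrative stop-word set; real lists are much longer
STOP_WORDS = {"i", "is", "a", "the", "and", "of", "to"}

tweet = "the new iphone is a delight to use"
# keep only words that are not in the stop-word set
filtered = " ".join(w for w in tweet.split() if w not in STOP_WORDS)
print(filtered)  # new iphone delight use
```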

Removal of Punctuations: All the punctuation marks according to the priorities should be dealt with. For example: “.”, “,”,”?” are important punctuations that should be retained while others need to be removed.
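One way to keep only the high-priority punctuation marks while dropping the rest is str.translate; the keep-set below is just the example from the text:

```python
import string

# Keep sentence-level punctuation ('.', ',', '?'); drop everything else
KEEP = {".", ",", "?"}
drop = "".join(ch for ch in string.punctuation if ch not in KEEP)

text = "Wow!! The battery-life is great, isn't it?"
cleaned = text.translate(str.maketrans("", "", drop))
print(cleaned)  # Wow The batterylife is great, isnt it?
```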

Removal of Expressions: Textual data (usually speech transcripts) may contain human expressions like [laughing], [Crying], [Audience paused]. These expressions are usually not relevant to the content of the speech and hence need to be removed. A simple regular expression can be useful in this case.
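A simple regex that strips such bracketed expressions might look like this (the transcript string is invented for illustration):

```python
import re

transcript = "Thanks everyone [laughing] let's get started [Audience paused]"
# remove anything between square brackets, then collapse extra whitespace
cleaned = re.sub(r"\[[^\]]*\]", "", transcript)
cleaned = re.sub(r"\s+", " ", cleaned).strip()
print(cleaned)  # Thanks everyone let's get started
```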

Split Attached Words: We humans generate text data in social forums that is completely informal in nature. Most tweets are accompanied by multiple attached words like RainyDay, PlayingInTheCold, etc. These can be split into their normal forms using simple rules and regex.



cleaned = " ".join(re.findall('[A-Z][^A-Z]*', original_tweet))



Slang lookup: Again, social media comprises a majority of slang words. These words should be transformed into standard words to make free text. Words like luv will be converted to love, Helo to Hello. The same apostrophe-lookup approach can be used to convert slang to standard words. A number of sources available on the web provide lists of all possible slang terms; these would be your holy grail, and you can use them as lookup dictionaries for conversion.



tweet = _slang_lookup(tweet)



Standardizing words: Sometimes words are not in proper formats. For example: “I looooveee you” should be “I love you”. Simple rules and regular expressions can help solve these cases.



import itertools
tweet = ''.join(''.join(s)[:2] for _, s in itertools.groupby(tweet))




Final cleaned tweet:



Advanced data cleaning:

Grammar checking: Grammar checking is mostly learning-based: a huge amount of proper text data is learned, and models are created for the purpose of grammar correction. Many online tools are available for grammar correction purposes.

Spelling correction: In natural language, misspelled words are often encountered. Companies like Google and Microsoft have achieved a decent accuracy level in automated spell correction. One can use algorithms like Levenshtein distance, dictionary lookup, etc., or other modules and packages to fix these errors.
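As a sketch, the Levenshtein distance mentioned above can be computed with a small dynamic-programming routine; candidate corrections are then the dictionary words at the smallest distance:

```python
def levenshtein(a: str, b: str) -> int:
    """Minimum number of single-character edits (insertions,
    deletions, substitutions) needed to turn a into b."""
    prev = list(range(len(b) + 1))  # distance from "" to each prefix of b
    for i, ca in enumerate(a, 1):
        curr = [i]  # distance from a[:i] to ""
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution / match
        prev = curr
    return prev[-1]

print(levenshtein("helo", "hello"))  # 1
print(levenshtein("luv", "love"))    # 2
```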

End Notes:

Go Hack 🙂

If you like what you just read and want to continue your analytics learning, subscribe to our emails, follow us on Twitter, or like our Facebook page.


Customer Generation: Secrets To Outperform Your Competition

Need a new approach to demand generation? Are you looking for a different way of going to market?

On August 24, I moderated a sponsored Search Engine Journal webinar presented by Garrett Mehrguth, President and CEO at Directive.

Mehrguth shared insight on how marketers can use customer generation to deliver better results.

Here’s a recap of the webinar presentation.

Customer Generation: What It Is & How It Differs From Demand Gen

By simply creating MQLs as the starting point instead of the endpoint, you completely change the way you go about your start.

Mehrguth says it has improved Directive’s own product quality, transformed their company, and allowed them to grow exponentially.

Demand Generation vs. Customer Generation

There are areas where typical demand generation campaigns do not deliver. Focusing on customer generation can change all of that.

Here are five ways demand generation and customer generation differ.

| Demand Generation | Customer Generation |
| --- | --- |
| Third-party data for targeting. | First-party data for targeting. |
| Product-led approach for go-to-market. | Customer-led approach for go-to-market. |
| ROI is the goal. No tracking required. | LTV:CAC ratio is the goal. Tracking required. |
| Indicator of marketing success: MQL. | Indicator of marketing success: SQL. |
| "Don't stand out, we are B2B." | "Create emotional experiences, we are B2C." |

Customer Generation: The Process

At a high level, the customer generation process works this way.

First, map out your total addressable market (TAM), identify your best customers, and then enrich those accounts with first-party data along with search data.

Break the accounts into customer segments (or tiers) and identify the service lines they’re interested in.

Determine your financial model. How much can you pay to get each customer segment for each product?

Hone in on your value proposition. What would get someone from apathy to action? What jobs do they need to get done that you could position yourselves to be discoverable for?

How do we use emotion personalization and iteration to keep improving?

Ultimately, customer generation aligns your entire brand to a new philosophy:

“Who” you market to and “why” you exist for them matters more than “how” (a.k.a., your tactics).

This results in more personalized customer journeys across all touchpoints from acquisition to retention.

The Five Principles of Customer Generation

Customer generation can be treated as a methodology. Let’s unpack the specifics of each principle and how you can directly apply that to your marketing.

Principle 1: First-Party Data Unlocks Growth

You need to map your total addressable market (TAM), build account lists, and aggressively scale spend without using an account-based marketing (ABM) provider.

Choose Your Data Provider First

While ZoomInfo is still leading in the B2B contact data space, there are other players that are great at certain things, such as:

Clearbit (technologies that the companies use). (accuracy).

Crunchbase (funding data).

You should choose one (or more) of these providers depending on what triggers you think are most indicative of the data that makes an account a great fit for your product or service.

Use Employee Size, Technology, Funding, Etc.

Build lists based on headcount, technology, and funding rather than revenue data.

Do not use revenue data for privately held companies as it’s mostly inaccurate.

Headcount gives a far better understanding of the size of an organization. You can also scrape it from LinkedIn with a high level of accuracy, compared to privately held companies' revenue numbers.

The technology companies use, on the other hand, will help you identify the maturity of an organization and whether they are a fit for your business.

Meanwhile, funding data informs you whether these prospects can afford your product or service.

Unsure which indicator to focus on? Take your current client list and upload that data to one of these data providers.

You can then enrich your client list with your point of contact to see what title they have and what their experience level is.

This exercise will help you understand your niche and customer personas. Who exactly is the perfect fit for your product or service?

Mehrguth’s team at Directive also built their own database called Pulse to give them unique data insights. This helps them assign a score that’s more indicative if a person is an excellent fit for their business or not.

Manually Verify

It’s essential to manually verify your TAM because you can’t trust third-party data providers 100%.

Check every single solitary account to see if they fit your ideal customer profile (ICP). Cleaning this data is the only way to make sure you aren’t wasting a dime.

Integrate to Salesforce

When you integrate your first-party data and your TAM into Salesforce, you can change that.

Distribute With a Customer Data Platform (CDP) Like Segment

After getting everything integrated into Salesforce, set it up on Segment and you can start distributing all your data in real-time to all your channels.

Articulating the success of your campaigns according to market share penetration will allow you to become exponentially more aligned with your executive team.

Leveling up your reporting this way will ultimately get you more buy-in, budget, and support for your efforts.

Principle 2: Customer Led Over Product Led

Your product is not for the masses. A customer-led approach gives you the power to impact business KPIs such as:

Average contract value.

Trial conversion rate.

Lifetime value.

Customer acquisition cost.

And more.

By focusing on your most valuable customers, you truly are in control.

Most B2B SaaS companies start with what they’re selling, instead of focusing on who they’re selling to. We need to fix that.

Instead of structuring around the products you offer, consider structuring around your customer segments and customer needs.

This change in how you write your copy, position your value propositions, and articulate your value to your ICP is the difference between a poor conversion rate and a great conversion rate.

Here’s a sample worksheet you can use to enter your ICP’s “Jobs to be Done” (JTBD), needs, voice, challenges/pain points, etc. You can then map them to your products or solutions.

The most successful B2B SaaS companies are customer-led.

Case in point? Workable, a recruiting software and hiring platform.

Their core site is organized around customer segments and need states.

They have thousands of HR resources for their customers.

The result?

Workable drives 35 million visits per year and dominates their keywords.

Customer-led is the backbone of Directive’s strategy as an agency.

It is applied in the early stages of the project phase and influences all strategy and execution.

Principle 3: Financial Modeling Is a Need to Have

Scale without financial modeling is a pipe dream.

Marketers, in general, are not great at financial modeling. As such, we struggle to connect our efforts to revenue and get the respect, authority and budget we deserve as marketers.

Using the LTV:CAC financial model can help change that and improve your ability to be a valuable partner to the executive team and to the business’s vision.

Gather Your Numbers

You need to know your business’s financials, including Lifetime Customer Value (LTV) and Customer Acquisition Cost (CAC) metrics, in order to compare the value of a customer compared to the cost of acquiring them.

Customer generation is not all about any customer. It’s about finding your most profitable customer.

Make sure to put those numbers into a sheet and review your actual historical performance.
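As a rough sketch of that comparison, with entirely hypothetical numbers, an LTV:CAC calculation might look like this (a common SaaS approximation: LTV = monthly revenue per account x gross margin / monthly churn):

```python
# Hypothetical financials; plug in your own numbers.
arpa_monthly = 500.0   # average revenue per account per month
gross_margin = 0.80    # fraction of revenue kept after cost of goods sold
monthly_churn = 0.02   # fraction of customers lost per month

spend = 60_000.0       # sales + marketing spend for the period
customers_acquired = 30

ltv = arpa_monthly * gross_margin / monthly_churn  # lifetime value per customer
cac = spend / customers_acquired                   # cost to acquire one customer

print(f"LTV = ${ltv:,.0f}, CAC = ${cac:,.0f}, ratio = {ltv / cac:.1f}:1")
```

A ratio well above 3:1 is the conventional benchmark; the point of the model is to see which customer segments clear it.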

Track Your Channels

Gathering and tracking your numbers allow you to create benchmarks and inform channel strategy and tactics.

Tracking lets you know better where you should be spending more money, where you should be spending less, and what you need to do to hit your goals.

Leverage First-Touch Attribution

Many mid-market and publicly traded companies don’t have the maturity to pull off multi-touch attribution. They are still struggling across the board with marketing ops and attribution.

So for this type of model what we’re trying to look at is where should we spend another dollar to get another customer?

Oftentimes, it’s hard to fund indirect or complementary channels. We want to fund channels that have that first-touch attribution.

Use this LTV:CAC Tool

Directive’s free LTV-CAC tool can help you with your campaign planning.

Use it to level up your financial modeling, articulate value, increase your budget, and be more valuable to the organization.

Principle 4: SQLs Beat MQLs Every Time

Let’s address the elephant in the room. Google has intent but poor firmographics.

It’s really hard to monetize Google Ads when you’re in a niche or when you have a high average order value because the majority of the customers who search that query don’t fit.

And when you specify your query there are not enough people searching for you to take make it a substantial revenue channel.

Conversely, paid social allows you to target audiences with firmographics but they have no intent.

After years of dealing with this dilemma, Garrett and his team discovered a solution – monetizing paid social with a great offer.

They found an incentive so good that they were able to create intent from social. Gift cards are what worked for them.

Monetizing comes down to improving your sales organization and also improving your value proposition.

A few more tips:

Buy Sendoso for gift sending.

Choose a tool like Calendly or Chili Piper so you are driving appointments not form fills.

Train sales development reps and account executives on how to have gift card/incentivized intro calls. Use the BANT script:

Budget: What is their spending ability? Do they meet our minimums?

Authority: Are they the decision-maker? If not, who is the true decision-maker?


What immediate need or urgent problems are there?

What are their pain points within each of our product lines?


Paid Media


Marketing Ops


What gaps did we find in our initial research prior to the call?

Timeline: In what timeframe will they need a solution?
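The BANT script above boils down to a simple gate: a lead is sales-qualified only if it clears all four bars. As a toy illustration (the field names are mine, not Directive's):

```python
def qualify_lead(answers):
    """Toy BANT gate: a lead is sales-qualified only when budget,
    authority, need, and timeline are all confirmed. Field names
    here are illustrative, not part of any real playbook."""
    required = ("budget", "authority", "need", "timeline")
    return all(answers.get(field) for field in required)

# A lead with budget and need but no decision-maker on the call:
qualify_lead({"budget": True, "authority": False,
              "need": True, "timeline": True})  # → False
```

The point of running the script live on the intro call is that any single missing answer disqualifies the lead before it wastes an account executive's time.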

Principle 5: There’s No Such Thing as B2B

Where direct-to-consumer (D2C) marketing is all about building a brand and creating an emotional connection, business-to-business (B2B) marketers seem to be simply optimizing for safeness.

Your customers are people, not corporations. They hate boring marketing as much as you do.

It’s time to change the expectation of what B2B marketing is.

Marketing is emotional. Ask yourself: How can I motivate my potential customers from apathy to action?

We sell to individuals in corporations and we need to create an emotional connection with them.

And video creates a strong bridge from a gift to your value proposition. (Here’s an example from Directive.)

People make emotional decisions, not scientific ones, and operate accordingly.

When we put all these together, you have a differentiated go-to-market strategy that will drive impressive returns.

Key Takeaways

The biggest problem in demand generation today is that it could be 50% inaccurate before you launch your campaign. Customer generation fixes that.

Monetizing comes down to improving your sales organization and value proposition. Why do you exist, and who do you exist for?

Want to learn more about Customer Generation from one of Directive’s experts? Head over to their website to book an intro call.

[Slides] Customer Generation: Delivering on the Promise Demand Gen Forgot About

Customer Generation: Delivering on the Promise Demand Gen Forgot About from Search Engine Journal


Image Credits

All screenshots taken by author, August 2023

Bend The Rules: Sync Android With iTunes, Text On PC, Keep E-Books Longer

It’s not uncommon to have iOS and Android users living under the same roof. Indeed, perhaps you’ve been an iPod owner for many years, but recently decided to pick up an Android-powered smartphone or tablet. Now the question becomes, how do you sync your iTunes music and playlists with your Android device?

You can, of course, side-load music from your desktop to your device, meaning copying it over manually. But that’s not nearly as easy or convenient as true synchronization. It is, in a word, a hassle.

Enter iSyncr, which combines a Windows utility and an Android app to keep your phone/tablet in sync with iTunes. It’s dirt cheap, and it works like a charm. There’s even an optional Wi-Fi add-on for syncing your music wirelessly.

To get started, you simply buy the app from Android Market for a whopping $3. When you first run it, it will install a small Windows utility on your device and/or storage card. (If you’re going to be bringing over a lot of music, a storage card is definitely your best bet.) Then you just connect your device to your PC, enable USB storage, and run the iSyncr utility.

From there it’s a simple matter of choosing the playlists you want to sync. When the copy process is done, “safely remove” your phone/tablet, then fire up the music app. Presto! Your iTunes favorites have magically appeared. The app also supports syncing in the other direction, meaning songs purchased and playlists created on your device can be synced back to your PC.

This gets even better if you use the free iSyncr Wi-Fi Add-On, a desktop server program for Windows that enables synchronization without the USB cable.
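iSyncr's internals aren't public, but the core of any playlist sync is easy to picture: copy over whatever the device doesn't already have. A rough one-way sketch in Python (the folder paths and the .mp3 filter are placeholders):

```python
import shutil
from pathlib import Path

def sync_playlist(playlist_dir, device_dir):
    """One-way sketch of what a sync tool automates: copy any track
    from the playlist folder that the device doesn't already have,
    matching by file name. Returns the names of copied tracks."""
    src, dst = Path(playlist_dir), Path(device_dir)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for track in src.glob("*.mp3"):
        target = dst / track.name
        if not target.exists():
            shutil.copy2(track, target)  # copy2 preserves timestamps
            copied.append(track.name)
    return copied
```

A real tool also syncs in the other direction and reconciles play counts and ratings, which is where the hard work (and the $3) lives.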

Send and Receive Text Messages on Your PC

Text messaging is great–except when it isn’t. For one thing, it’s expensive (unless you have an unlimited messaging plan, which itself can be expensive). Plus, it forces you to type on your phone’s tiny keyboard–not always the fastest or most convenient method.

Indeed, when you’re sitting at your desk and want to text, say, your spouse, do you really have to pull out your phone, navigate to the messaging app, then mangle those cramped keys?

Actually, you don’t. Pinger Textfree Web brings free and easy text messaging to your browser. Using a large, attractive interface, you can compose a message to any mobile number and view the replies. It’s not unlike using an instant-messaging service like Meebo.

Registering for a Textfree Web account is free, and it includes a Textfree e-mail address. (If you’re an Android or iOS user, you might be familiar with the eponymous apps, which are great for messaging without paying your carrier for the privilege.)

In my tests, messages sent from Textfree Web arrived almost instantly, and the replies came just as quickly. And trust me: it’s so much nicer composing texts with a full-size keyboard. Cheaper, too. The service even lets you attach images to your messages, effectively recreating MMS. (If someone wants to send you back an image, it needs to go to your Textfree e-mail address.)

If there’s one downside, it’s that you can’t import contacts, so you’ll have to enter phone numbers manually–at least the first time. Textfree Web keeps a (brief) log of recently sent and received messages, so it’s easy to resume a conversation with people you’ve texted in the last 72 hours.

This is a great service. It lets you keep your phone in your pocket and text to your heart’s content, all at no charge.

Extend the Loan Period of a Borrowed E-Book

If you subscribe to the belief that rules are made to be broken, read on. No, literally, read on. I’m talking about e-books, which you can borrow from friends and family members to read on your Barnes & Noble Nook or Amazon Kindle–but only for two weeks. If that’s not enough time to finish your e-copy of, say, “The Help,” you’re pretty much out of luck; you’ll have to see if you can borrow it again from someone else (any purchased e-book can be loaned only once) or buy it outright.

Or maybe not. I’ve discovered a way to keep a borrowed e-book indefinitely–or at least until you finish reading it.

The back story: I was halfway through Jonathan Franzen’s Freedom when my e-book loan (by way of eBookFling) reached the two-week mark. I’d received e-mail notification that this was imminent, so I knew exactly when time was about to be up.

I’d been using the Kindle app to read the book on both my iPhone and my Nook Color (which I’d converted to an Android tablet in part so I could use it for Amazon e-books). The expiration date for “Freedom” arrived, and sure enough, when I returned to it on my iPhone, the book was expired and inaccessible.

Then I got to wondering: are the books “hardwired” to expire after two weeks, or is it only through communication with Amazon’s Whispernet sync service that they’re directed to time out? Turns out it’s the latter, because I then disabled Wi-Fi on my Nook, loaded the Amazon app, and discovered that “Freedom” was still there, still readable. Because I’d cut off the app’s ability to communicate with Whispernet, Whispernet couldn’t tell the app that the book’s loan period had expired. I’m now about five days past that period.
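In other words, the expiry check lives server-side: the reader app only marks a loan expired when a sync succeeds. A toy model of that behavior (the class and field names are mine, purely illustrative):

```python
from datetime import datetime, timedelta

class LoanedBook:
    """Toy model of the behavior described above: expiry is only
    enforced when the reader app manages to sync with the server."""
    def __init__(self, title, loan_days=14):
        self.title = title
        self.expires_at = datetime.now() + timedelta(days=loan_days)
        self.expired = False

    def sync(self, online):
        # With the radio off, the app never learns the loan ended.
        if online and datetime.now() > self.expires_at:
            self.expired = True

    def readable(self):
        return not self.expired

# A loan that ended yesterday stays readable until the app goes online:
book = LoanedBook("Freedom", loan_days=-1)
book.sync(online=False)   # Wi-Fi off: still readable
book.sync(online=True)    # first successful sync: game over
```

This is why the trick stops working the moment the app gets one chance to phone home.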

There are a few downsides to this approach. The big one is that I have to remember to turn off Wi-Fi every time I run the Amazon app–or just leave it off altogether, which limits my Nook’s capabilities. Also, it prohibits me from downloading any new books, at least until I’m done with the current one, because the moment the app gets a chance to sync, it’s game over for my expired loan.

Also, let me caveat this by saying I love being able to borrow e-books, but I think two weeks isn’t enough time. Heck, a brick-and-mortar library typically gives you three weeks, and lets you renew books if you want to keep them longer. Why should e-books be any different?

How To Place An Image In Text With Photoshop CC And CS6

Written by Steve Patterson.

In this tutorial, we learn how to place an image in text, one of Photoshop’s most popular and classic effects. As we’ll see, thanks to the power of clipping masks, placing an image inside text with Photoshop is simple and easy. I’ll be using Photoshop CS6 here but this tutorial is fully compatible with the latest version of Photoshop up to 2023. If you’re using an older version of Photoshop, be sure to check out my original Placing An Image In Text tutorial.

Here is the image I’m using (tropical beach sunset photo from Adobe Stock):

The original image.

And here’s what the same image will look like when placed inside text:

The final result.

Let’s get started!

How To Place An Image In Text With Photoshop

Step 1: Duplicate The Background Layer

Open the image you want to place inside your text. With the image newly opened, if you look in your Layers panel, you’ll see the image sitting on the Background layer, currently the only layer in the document:

The Layers panel showing the image on the Background layer.

We need to make a copy of this layer. Go up to the Layer menu in the Menu Bar along the top of the screen, choose New, then choose Layer via Copy. Or, you can select this same command from the keyboard by pressing Ctrl+J (Win) / Command+J (Mac):

Photoshop creates a copy of the layer, names it “Layer 1”, and places it directly above the Background layer:

A copy of the layer appears above the original.

Step 2: Add A White Solid Color Fill Layer

In the Layers panel, click the New Fill or Adjustment Layer icon and choose Solid Color from the top of the list that appears:

Selecting a Solid Color Fill layer.

Photoshop will pop open the Color Picker so we can choose the color we want to fill the layer with. I’m going to use white for my background color by entering a value of 255 into the R, G and B boxes:

A value of 255 for the R, G and B values gives us white.

The Layers panel showing the Solid Color Fill layer.

And because the Fill layer is sitting above both of our image layers, the document is now temporarily filled with white:

The image is temporarily hidden by the Fill Layer.

Step 3: Drag The Solid Color Fill Layer Below Layer 1

In the Layers panel, click the Solid Color Fill layer and drag it down until a highlight bar appears between the Background layer and Layer 1:

Dragging the Fill layer between the Background layer and Layer 1.

Release your mouse button when the highlight bar appears to drop the Fill layer into place between the two image layers. Your image will reappear in the document window:

The Fill layer now sits between the two image layers.

Step 4: Select Layer 1

Selecting Layer 1.

Step 5: Select The Type Tool

We’re ready to add our text. Select Photoshop’s Type Tool from the Tools panel along the left of the screen. You can also select the Type Tool simply by pressing the letter T on your keyboard:

Selecting the Type Tool.

Step 6: Choose Your Font

With the Type Tool selected, go up to the Options Bar along the top of the screen and choose your font. Since our goal is to place an image within the text, generally fonts with thick letters work best. I’m going to choose Arial Black, but of course you can choose any font you like. Don’t worry about the font size for now. We’ll resize the type manually later:

Selecting a font from the Options Bar.

Step 7: Set Your Type Color To White

Setting the R, G and B values to 255.

Step 8: Add Your Text

Adding my text.

Learn all about working with type in Photoshop with our Photoshop Type Essentials tutorial!

Step 9: Drag The Type Layer Below Layer 1

If we look in the Layers panel, we see our newly added Type layer sitting above Layer 1, which is why the text is appearing in front of the image in the document:

The Type layer currently sits above the image.

Dragging the Type layer below Layer 1.

Release your mouse button when the highlight bar appears to drop the Type layer into place:

The Type layer now sits below Layer 1.

Step 10: Select Layer 1 Again

Back in the Layers panel, click Layer 1 to select it again.

Step 11: Create A Clipping Mask

With Layer 1 selected, open the Layers panel menu and choose Create Clipping Mask from the menu that appears:

Choosing the Create Clipping Mask command.

This clips the image on Layer 1 to the text on the Type layer below it, meaning that only the area of the image that sits directly above the actual text on the Type layer remains visible, creating the illusion that the image is inside the text. The rest of the image is now hidden from view, and in its place, we see the solid white Fill layer:

Only the area of the image that sits directly above the text remains visible.

If we look again in the Layers panel, we see that Layer 1 has been indented to the right, with a small arrow pointing down at the Type layer below it. This is how Photoshop lets us know that the Type layer is being used as a clipping mask for Layer 1:

The Layers panel showing Layer 1 clipped to the Type layer.

Related tutorial: How Photoshop Clipping Masks Work
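Outside Photoshop, the same clipping-mask idea can be sketched programmatically. Here is a minimal illustration using Python's Pillow library (not part of the tutorial; the placeholder photo, text, and positions are mine): the type is rendered into a grayscale mask, and the photo is pasted through it onto a white "Fill layer."

```python
from PIL import Image, ImageDraw, ImageFont

def place_image_in_text(photo, text, bg=(255, 255, 255)):
    """Composite `photo` through a text-shaped mask onto a solid
    background, mimicking Photoshop's clipping-mask effect."""
    result = Image.new("RGB", photo.size, bg)   # the "Fill layer"
    mask = Image.new("L", photo.size, 0)        # the "Type layer"
    draw = ImageDraw.Draw(mask)
    draw.text((10, 10), text, fill=255, font=ImageFont.load_default())
    result.paste(photo, (0, 0), mask)           # the clipping-mask step
    return result

# Stand-in for the beach photo; any RGB image works here.
photo = Image.new("RGB", (200, 60), (30, 120, 200))
out = place_image_in_text(photo, "SUNSET")
```

Only the pixels under the letterforms show the photo; everywhere else the white background layer shows through, exactly as in Step 11 above.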

Step 12: Select The Type Layer

Selecting the Type layer.

Step 13: Resize And Reposition The Text

All that’s left to do now is to move and resize the type, and we can do both of those things using Photoshop’s Free Transform command. With the Type layer selected, go up to the Edit menu at the top of the screen and choose Free Transform. Or, press Ctrl+T (Win) / Command+T (Mac) on your keyboard to select Free Transform with the shortcut:

Moving and resizing the text with Free Transform.

When you’re done, press Enter (Win) / Return (Mac) to accept the transformation and exit out of Free Transform:

The effect after moving and resizing the type.

Step 14: Add A Drop Shadow (Optional)

Choose Drop Shadow from the bottom of the list that appears:

Selecting a Drop Shadow layer effect.

This opens the Layer Style dialog box set to the Drop Shadow options in the middle column. I’ll lower the Opacity of the drop shadow from its default value of 75% down to 50% to reduce its intensity, then I’ll set the Angle of the shadow to 120°. I’ll increase my Distance value to 30px and the Size to 40px, but these two values depend a lot on the size of your image, so you may need to play around with them on your own to find the settings that work best:

The Drop Shadow options.

The final effect.

And there we have it! In this tutorial, we learned how to place an image in a single word, or a single Type layer. In the next tutorial, learn the trick to placing an image in multiple text layers at once! Or visit our Text Effects or Photo Effects sections for more Photoshop effects tutorials!
