
I am trying to remove rows from a pandas df. Specifically, I want to keep only the rows whose value in Col 1 ends with the same suffix as the first row. So, for the df below, the first string ends with Mon, so I want to remove any rows that don't end with this value.

import pandas as pd

df = pd.DataFrame({
    'Col 1' : ['val1-Mon','val2-Mon','val3-Tues','val4-Tues','val5-Mon','val6-Mon','val7-Mon','val8-Mon'],
    'Col 2' : ['A','B','A','B','A','B','A','B'],
    })

This is easy enough for the df above by using the following.

df = df[~df['Col 1'].str.contains("Tues")]

But my input data changes every day. While I want to keep all Mon values today, I may want Tues values tomorrow. So I'd have to go in and manually update the day I didn't want.

The constant is that first value. So if the first row ends in Mon, I want to keep everything ending in Mon; if the first row ends in Tues, I want to keep everything ending in Tues, etc.

2 Answers

I think you need to extract the first value and then compare with str.contains or str.endswith:

first = df['Col 1'].iloc[0].split('-')[1]
#if you want to check against today's day instead
#first = pd.datetime.now().strftime('%a')
print(first)
Mon

df = df[df['Col 1'].str.contains(first)]
#Jon Clements suggestion, thank you
df = df[df['Col 1'].str.endswith(first)]
print(df)

      Col 1 Col 2
0  val1-Mon     A
1  val2-Mon     B
4  val5-Mon     A
5  val6-Mon     B
6  val7-Mon     A
7  val8-Mon     B
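
As an aside on why the str.endswith form was suggested: str.contains matches the day anywhere in the string (and treats the pattern as a regex by default), whereas str.endswith does a plain suffix check. A small illustration with made-up values:

s = pd.Series(['val1-Mon', 'Mon-val2'])
print(s.str.contains('Mon'))   # True, True  - matches anywhere in the string
print(s.str.endswith('Mon'))   # True, False - only a suffix match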

2 Comments

That Timestamp is redundant: pd.datetime.now().strftime('%a') is fine...
But it seems we need to check the first value, so the solution should be changed
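
For completeness, a minimal sketch of that today's-day idea, assuming the suffixes in 'Col 1' use the same abbreviations that strftime('%a') produces (note %a gives 'Tue', so a 'Tues' suffix would not match):

import pandas as pd

# abbreviated weekday name for today, e.g. 'Mon'
today = pd.Timestamp.now().strftime('%a')

# keep only the rows whose 'Col 1' value ends with today's abbreviation
df = df[df['Col 1'].str.endswith(today)]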

Create a day column by splitting on '-', and then filter using .iloc to ensure we match the first row:

df['Day'] = df['Col 1'].str.split('-').str[-1]

filtered = df[df['Day'] == df['Day'].iloc[0]]

print(filtered)

prints:

      Col 1 Col 2  Day
0  val1-Mon     A  Mon
1  val2-Mon     B  Mon
4  val5-Mon     A  Mon
5  val6-Mon     B  Mon
6  val7-Mon     A  Mon
7  val8-Mon     B  Mon
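
If you would rather not keep the helper column, the same comparison can be done inline (an equivalent sketch of the approach above):

# build the suffix Series on the fly and compare every row against the first row's suffix
day = df['Col 1'].str.split('-').str[-1]
filtered = df[day == day.iloc[0]]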
