# Delete a part of a file



## clinty (May 16, 2009)

Hello.

I have a file like:

```
VAR=&z&z
VAR2=dzdz
VAR5=23232323
etc...
etc...
VAR68676="hndj zjdbzjdb"
TOTO="hdzdz njndz"
**** CUSTOM VARS ****
TXT="Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum."
```
The part after "**** CUSTOM VARS ****" often changes. I must install a cron job which will get the new TXT var and put it in this file.
But... how do I delete, in this file, ALL data AFTER "**** CUSTOM VARS ****"? Which utility should I use?
Do you have any advice? I have no idea how to do this task.

Thanks a lot.

Best regards,


----------



## ale (May 16, 2009)

I'm not sure I've understood what you have to do.
Do you have to remove all the lines after "custom vars" before appending the new TXT variable?
Can't you keep a "master" somewhere else and copy it over the modified one before appending the new TXT?


----------



## clinty (May 16, 2009)

I have to copy this file, without all the data after "**** CUSTOM VARS ****".
Then, I'll append the new data after it.

The first point is the problem for me: how do I clean a file, removing all data after a certain line?


----------



## MG (May 16, 2009)

```
cat input.txt | grep -B 1000 "**** CUSTOM VARS ****" > output.txt
```

writes up to 1000 lines before the matching line, plus the match itself.


----------



## ale (May 16, 2009)

So can't you keep a master copy of the file which ends with "**** CUSTOM VARS ****" and copy it over the modified file when you have to clean it?

At t0 you have two identical files _/path/to/datafile_ and _/path/to/datafile.master_ both ending with "**** CUSTOM VARS ****".

Between t0 and tx data are appended to the end of /path/to/datafile.

At tx, cron "cleans" the file by doing a _cp /path/to/datafile.master /path/to/datafile_; new data are appended to it, and so on.
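A minimal runnable sketch of this master-copy scheme (the file names, sample contents, and the appended TXT value are illustrative, and the demo builds its own files):

```shell
#!/bin/sh
# Demo of the master-copy approach: the master ends at the marker, and
# the cron job copies it over the working file before appending new data.

# t0: build a demo master that ends with the marker line.
printf 'VAR=1\nTOTO="hdzdz njndz"\n**** CUSTOM VARS ****\n' > datafile.master
cp datafile.master datafile

# between t0 and tx: data accumulates in the working file.
echo 'TXT="old text"' >> datafile

# tx: cron resets the file from the master, then appends the new TXT.
cp datafile.master datafile
echo 'TXT="freshly fetched text"' >> datafile

cat datafile
```

No parsing is needed at all with this scheme; the only cost is keeping the master in sync whenever the fixed part of the file changes.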


----------



## clinty (May 16, 2009)

Yes, but HOW do I clean a file? How do I delete all lines after a given line in a text file? That's the problem in my case... All the other tasks of this problem are OK.


----------



## MG (May 16, 2009)

If you really want to delete lines without copying the file first, you would have to write a C++ program that opens the file with read/write access and truncates it there. But what's wrong with simply writing to a new file and swapping the old and new filenames?
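The swap idea can be sketched in plain sh, reusing the `grep -B` trick from earlier in the thread (the sample data is illustrative; `-F` makes grep treat the asterisks as literal text rather than regex repeats):

```shell
#!/bin/sh
# Write the trimmed copy to a temporary file, then rename it over the
# original. mv on the same filesystem is an atomic rename, so no reader
# ever sees a half-written file.

printf 'VAR=1\n**** CUSTOM VARS ****\nstale data\n' > datafile

# Keep everything up to and including the marker line.
grep -F -B 100000 '**** CUSTOM VARS ****' datafile > datafile.tmp
mv datafile.tmp datafile

cat datafile
```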


----------



## clinty (May 16, 2009)

MG said:

> cat input.txt | grep -B 1000 "**** CUSTOM VARS ****" > output.txt
> 
> writes up to 1000 lines before the matching line, plus the match itself.


Yes, it's working. The solution is not "good", though (if there are 1001 lines before the marker, for example).

I'm looking for a solution with sed: print the file and exclude the data after the marker.


----------



## ale (May 16, 2009)

Bah, if you prefer to waste CPU...

```
cat INPUT.txt | while IFS= read -r L; do
if [ -z "$STOP" ];
then
  echo "$L" >> OUTPUT.txt
  if [ "$L" = "**** CUSTOM VARS ****" ];
  then
    STOP=stop
  fi
fi
done
```


----------



## MG (May 16, 2009)

I don't know the sed solution, but here is one in sh:


```
line=$(grep -nF "**** CUSTOM VARS ****" input.txt | head -n 1 | cut -d : -f 1) && head -n "$line" input.txt > output.txt
```


----------



## keramida@ (May 16, 2009)

You can use sed if you want to cut everything after a specific pattern:


```
sed -e '/^\*\*\*\* CUSTOM VARS \*\*\*\*/q' < input > output
```

Since `q` quits right after printing the matching line, the marker itself is kept; then append your new TXT to the end of the "output" file, and you are done.
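Putting the whole cron step together, here is a small runnable sketch of this approach (the sample input and the new TXT string are made up):

```shell
#!/bin/sh
# Trim everything after the marker with sed, then append the new TXT.

printf 'VAR=1\n**** CUSTOM VARS ****\nTXT="stale"\n' > input

# q quits right after printing the matching line, so the marker is kept.
sed -e '/^\*\*\*\* CUSTOM VARS \*\*\*\*/q' < input > output
echo 'TXT="freshly fetched text"' >> output

cat output
```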


----------



## clinty (May 16, 2009)

It rocks! Thanks.


----------



## DutchDaemon (May 16, 2009)

Or just plain old ed(1).


----------



## dinoex@ (May 21, 2009)

A solution with ed(1) (note that this deletes the marker line itself as well):


```
ed input-file << 'EOF'
/CUSTOM VARS
.,$d
wq
EOF
```


----------

