a guest, Sep 22nd, 2017
awk
Created Saturday 11 May 2013

------------------------------------------------------------------------------------------------
Output Separators
------------------------------------------------------------------------------------------------
As mentioned previously, a print statement contains a list of items separated by commas. In the output, the items are normally separated by single spaces. However, this doesn’t need to be the case; a single space is simply the default. Any string of characters may be used as the output field separator by setting the predefined variable OFS. The initial value of this variable is the string " " (i.e., a single space).
The output from an entire print statement is called an output record. Each print statement outputs one output record, and then outputs a string called the output record separator (or ORS). The initial value of ORS is the string "\n" (i.e., a newline character). Thus, each print statement normally makes a separate line.
In order to change how output fields and records are separated, assign new values to the variables OFS and ORS. The usual place to do this is in the BEGIN rule (see BEGIN/END), so that it happens before any input is processed. It can also be done with assignments on the command line, before the names of the input files, or using the -v command-line option (see Options). The following example prints the first and second fields of each input record, separated by a semicolon, with a blank line added after each newline:
------------------------------------------------------------------------------------------------
EXAMPLE
$ awk 'BEGIN { OFS = ";"; ORS = "\n\n" }
> { print $1, $2 }' mail-list
-| Amelia;555-5553
-|
-| Anthony;555-3412
-|
-| Becky;555-7685
------------------------------------------------------------------------------------------------
USING GSUB TO SEARCH/REPLACE STRINGS
In this example I am removing double quotes and commas:
curl -s http://hgfix.net/paste/view/raw/708cd7c5 | awk '{ gsub(/"|,/, ""); print }'
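The same gsub() call can be checked without the remote paste; this sketch uses a made-up CSV-ish line:

```shell
# gsub(/"|,/, "") deletes every double quote and comma from the record.
echo '"one","two","three"' | awk '{ gsub(/"|,/, ""); print }'
# prints: onetwothree
```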
------------------------------------------------------------------------------------------------
CHECKING SHADOW HASHES AGAINST A SHORT PASSWORD LIST
for i in 123456 admin123 info123 test password ; do awk -F : -v i="$i" '{OFS="\t";split($0, f, "$");p="openssl passwd -1 -salt "f[3]" "i;if(p|getline o);split(o, q, "$");m=match($0,q[4]);}m!=0{print$1,i}' shadow ; done

The for loop passes several candidate passwords through the command, one at a time.
awk sets the delimiter to ":" and defines the variable i="$i"; then we set the output field separator to \t (tab) and split the entire line $0 into the array f with "$" as the delimiter.
The "p" variable builds an "openssl passwd -1 -salt" command from f[3] (example line: admin:$1$GK3i7sYW$bH1.QzR9izDa3eRqboRcL/:15916:::::: ).
Since the split of the entire line $0 uses "$" as the delimiter, f[3] is the salt string GK3i7sYW from the example above; then " " provides a space and finally i is the current candidate ( 123456 admin123 info123 test password ).
To better understand, here is the command that ends up being run, and its output:
openssl passwd -1 -salt GK3i7sYW admin123
$1$GK3i7sYW$bH1.QzR9izDa3eRqboRcL/
---------------------------------
Now "if (p | getline o)" runs that command and captures its output in the variable "o".
We do a split on "o", where o is the entire line ( $1$GK3i7sYW$bH1.QzR9izDa3eRqboRcL/ ), q is the array, and "$" is the delimiter.
Afterward we define "m" as match($0, q[4]), where $0 is the entire shadow line ( admin:$1$GK3i7sYW$bH1.QzR9izDa3eRqboRcL/:15916:::::: ) and q[4] is "bH1.QzR9izDa3eRqboRcL/".
If m does not equal 0 ( meaning a match was found ), we print $1 of the entire line, which would be "admin",
and i, one of the candidates set at the very beginning of the command ( 123456 admin123 info123 test password ); in this case a match was found on admin123.
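The salt-extraction step can be verified on its own; this sketch reuses the example shadow line from above:

```shell
# split($0, f, "$") on an MD5-crypt shadow line: f[2] is "1", f[3] is the
# salt, and f[4] holds the hash portion that match() is later run against.
echo 'admin:$1$GK3i7sYW$bH1.QzR9izDa3eRqboRcL/:15916::::::' |
  awk '{ split($0, f, "$"); print f[3] }'
# prints: GK3i7sYW
```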
------------------------------------------------------------------------------------------------
CONVERT UNIX TIMESTAMP TO HUMAN READABLE
awk '{$2=strftime("%Y-%m-%d %H:%M:%S", $2); print $0}'
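strftime() is a gawk extension, so this needs gawk; a quick check with a fixed timestamp and TZ pinned so the output is reproducible (the "login" field is made up):

```shell
# Rewrite field 2 from epoch seconds to a date string (gawk's strftime).
echo "login 1394663100" | TZ=UTC awk '{$2=strftime("%Y-%m-%d %H:%M:%S", $2); print $0}'
# prints: login 2014-03-12 22:25:00
```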
------------------------------------------------------------------------------------------------
CONVERT UNIX TIMESTAMPS IN A BASH HISTORY FILE
awk -F- '{x=$1;sub(/^\#/,"",$1);sub(/#.*/,strftime("#%Y-%m-%d %H:%M:%S",$1),x);$1=x;}1' ~/.bash_history
------------------------------------------------------------------------------------------------
EXAMPLE STRING MATCHING AND USING SUB
awk -F, '{x=$1;sub(/^\#/,"",$1);sub(/#.*/,strftime("#%Y-%m-%d %H:%M:%S",$1),x);$1=x;}1' time_log
------------------------------------------------------------------------------------------------
time_log
#1394663100
cd public_html/
#1394663104
nano wp-config.php
#1394663115
ll
#1394663162
exit
------------------------------------------------------------------------------------------------
#2014-03-12 17:25:00
cd public_html/
#2014-03-12 17:25:04
nano wp-config.php
#2014-03-12 17:25:15
ll
#2014-03-12 17:26:02
exit
---------------------------------
Example:
grep bhabin /var/log/mysql_queries.log | awk '{$2=strftime("%Y-%m-%d %H:%M:%S", $2); print $0}'
[ 2017-03-14 01:51:01 ] [ bhabin_wrdp31 ] [ 3570009 ] [ 134 ] [ bhabin_wrdp31 ] [ Sleep ] [ KILLED ] [ ] [ ]
[ 2017-03-25 02:00:01 ] [ bhabin_wrdp31 ] [ 7173763 ] [ 103 ] [ bhabin_wrdp31 ] [ Sleep ] [ KILLED ] [ ] [ ]
[ 2017-03-26 04:36:01 ] [ bhabin_wrdp26 ] [ 7530669 ] [ 109 ] [ bhabin_wrdp26 ] [ Sleep ] [ KILLED ] [ ] [ ]
[ 2017-03-26 04:51:01 ] [ bhabin_wrdp26 ] [ 7533318 ] [ 68 ] [ bhabin_wrdp26 ] [ Sleep ] [ KILLED ] [ ] [ ]
[ 2017-03-27 11:03:01 ] [ bhabin_wrdp38 ] [ 7987001 ] [ 81 ] [ bhabin_wrdp38 ] [ Sleep ] [ KILLED ] [ ] [ ]
[ 2017-03-28 02:27:01 ] [ bhabin_wrdp26 ] [ 8252842 ] [ 96 ] [ bhabin_wrdp26 ] [ Sleep ] [ KILLED ] [ 'cloud1028.hostgator.com' ] [ ]
[ 2017-03-28 07:30:01 ] [ bhabin_wrdp26 ] [ 8311080 ] [ 92 ] [ bhabin_wrdp26 ] [ Sleep ] [ KILLED ] [ ] [ ]

------------------------------------------------------------------------------------------------
jcook:~/Documents/QA/AuditsInProgress/2017-01-19$ curl -s http://hgfix.net/paste/view/raw/9e9b7145 |grep -oP '\d+\s.*(?=01\/\d{2}\/2017)' | sort -k1,1 -u | awk 'BEGIN { OFS = ","}$1 ~ /[0-9]{8}/{tid=$1}{if (match($0, /([0-9]{2}:){2}[0-9]{2}/,tit)) print tid,tit[0]}' > restore_contact_drivers.csv
------------------------------------------------------------------------------------------------
MATCH FIELD ONE "docroot": if NR is equal to 1, the second field is sent to printf on its own; otherwise a comma concatenated with the second field is sent to printf.
This solution uses awk's ternary operator ?: — the part before ? is the condition, the value after ? is the "then" branch, and the value after : is the "else" branch:
NR == 1 ? $0 : ","$0
When the first match is seen (NR == 1), only the value itself is printed; otherwise a comma is prepended.
curl -s http://hgfix.net/paste/view/raw/10673f0a | awk '$1=="docroot" {printf("%s", NR == 1 ? $2 : ","$2);} END {printf("\n");}'
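The ternary join can be seen in isolation; a sketch with made-up one-per-line input:

```shell
# Join one value per line into a single comma-separated line using ?:
# (first record prints bare, every later record gets a leading comma).
printf 'a\nb\nc\n' | awk '{ printf("%s", NR == 1 ? $0 : ","$0) } END { printf("\n") }'
# prints: a,b,c
```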
------------------------------------------------
EXAMPLE TEXT
docroot /home4/b0y4o3s3/public_html/basic.wwus.net
domain basic.wwus.net
docroot /home4/b0y4o3s3/public_html/basic.wwus.net
domain basicw.fun2fart.com
docroot /home4/b0y4o3s3/public_html/biz.wwus.net
------------------------------------------------
EXAMPLE OUTPUT
/home4/b0y4o3s3/public_html/basic.wwus.net,/home4/b0y4o3s3/public_html/basic.wwus.net,/home4/b0y4o3s3/public_html/biz.wwus.net,/home4/b0y4o3s3/public_html/biz.wwus.net,/home4/b0y4o3s3/public_html/hg.wwus.net,/home4/b0y4o3s3/public_html/hg1.wwus.net,/home4/b0y4o3s3/public_html/hg1.wwus.net,/home4/b0y4o3s3/public_html/hg.wwus.net,/home4/b0y4o3s3/public_html/pro.wwus.net,/home4/b0y4o3s3/public_html/pro.wwus.net,/home4/b0y4o3s3/public_html/test
------------------------------------------------------------------------------------------------
WILL MATCH FIELD ONE "ADDON_ID"; IF ANYTHING ON A RECORD MATCHES "Could not add zone record|CSR validation|No CSR defined", THE MATCH IS STORED IN THE err ARRAY, THEN PRINT id,err[0]
------------------------------------------------------------------------------------------------
curl -s http://hgfix.net/paste/view/raw/b3327a0f | awk '$1=="ADDON_ID"{id=$2}{ if (match($0,/Could not add zone record|CSR validation|No CSR defined/,err)) print id,err[0] }'
------------------------------------------------------------------------------------------------
FIND ALL LOGS MATCHING *Dec-2016.gz; USE zcat PIPED TO awk TO READ THE FILE CONTENT (PASSING THE FILENAME IN AS F), PRINT F WHERE COLUMN 4 MATCHES THE DATE AND COLUMN 7 MATCHES xmlrpc, THEN COUNT THE HITS AND PIPE BACK TO awk TO PRINT ONLY COUNTS GREATER THAN 1000
------------------------------------------------------------------------------------------------
find /home/*/logs -type f -name "*Dec-2016.gz" -print | while read FILE ; do zcat "$FILE" | awk -v F="$FILE" '$4 ~ /12\/Dec\/2016/ && $7 ~/xmlrpc/{ print F }' | uniq -c |awk '$1 > 1000 {print}';done
------------------------------------------------
EXAMPLE OUTPUT:
1449 /home/madredel/logs/madredellachiesa-settimo.it-Dec-2016.gz
6018 /home/webstari/logs/web-star.info-Dec-2016.gz
------------------------------------------------------------------------------------------------
LOCATING ALL EMAIL ACCOUNTS: USING DELIMITER ":" AND "@" AS THE OFS, THEN SPLITTING FILENAME INTO AN ARRAY WITH DELIMITER "/" AND OUTPUTTING FIELD 2
------------------------------------------------------------------------------------------------
awk -F : '{OFS="@";split(FILENAME, d, "/"); print$1,d[2]}' etc/*/shadow
info@propheticcenter.net
sandralugo@propheticcenter.net
sandra@propheticcenter.net
events@propheticcenter.net
prayer@propheticcenter.net
orderdept@propheticcenter.net
insync@propheticcenter.net
membership@propheticcenter.net
kingdomconnections@propheticcenter.net
------------------------------------------------------------------------------------------------
TRYING TO FIGURE OUT HELP HOURS FOR AN AGENT, SO I COPIED THE CHART RESULTS OF THE HH PAGE OUTPUT TO A TEXT FILE AND RAN THE FOLLOWING
https://inet.houston.hostgator.com:8443/?gnet_pid=29&p=1479708000&e=10700&m=0&uf=0&ns=0
SNIPPET : http://hgfix.net/paste/view/raw/498c9b6b
sum_min += $4 adds up all values in column 4 and assigns the total to sum_min
sum_hour=int(hours+(sum_min/60)) needs int here because, by default, awk numbers are always floating point
awk -F "[\t:]" '{OFS=":"}$1!~/[A-Za-z]+/{sum_min += $4;min=sum_min%60;hours+=$3;sum_hour=int(hours+(sum_min/60))}END{print sum_hour,min}' hh_checker
2:37
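The same minutes/hours arithmetic on a made-up two-line H:MM input (1:30 + 2:45 = 4:15):

```shell
# Sum durations: minutes accumulate in sum_min, whole hours in hours;
# int() truncates the fractional hours and % keeps the leftover minutes.
printf '1:30\n2:45\n' |
  awk -F: '{ sum_min += $2; hours += $1 } END { print int(hours + sum_min/60) ":" sum_min%60 }'
# prints: 4:15
```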
------------------------------------------------------------------------------------------------
STRING MATCH 0-9 WITH 3-5 CHARACTERS IN COLUMN 3 ( FILE OWNERSHIP ) IN MAIL
ls -lhc /home*/*/mail/ | awk '$3~/[0-9]{3,5}/{print$0}'
------------------------------------------------------------------------------------------------
AWK MATCH A STRING
for i in `find / -type f -name '*.c' -exec grep -l mysql {} \; | awk '/\/include\// {print}'`; do dirname $i; done
------------------------------------------------------------------------------------------------
DEFINE VARIABLES USING STRING MATCHES ON TWO SEPARATE LINES, PRINT ONLY WHEN BOTH HAVE A VALUE, THEN RESET THEM, PRINTING THE OUTPUT ON A SINGLE LINE
curl -s http://hgfix.net/paste/view/raw/75dfc19d | awk '$1=="docroot"{dr=$2}$1=="domain"{dom=$2}dr&&dom{print dr,dom;dr=dom=0}'|while read docroot domain; do echo "Docroot: $docroot :: Domain: $domain";done
------------------------------------------------------------------------------------------------
ADDING UP BANDWIDTH TOTAL FOR A USER FROM MAIL LOGS
grep vcshaeffer@kiskipby.org /var/log/maillog | grep -oP '(?<=bytes=)\S*' | awk 'BEGIN {FS = "/"} ; {sum+=$1; sum2+=$2} END {print "IN",sum/1024/1024,"MB","\n""OUT",sum2/1024/1024,"MB"}'
------------------------------------------------------------------------------------------------
USING PRINTF WITH ARGUMENTS
grep "Created\ Ticket" vps_dedi_empowerment_escalations.csv | awk -F, '{printf("%s,%s,%s\n",$1,$2,$3)}' | sort -k1
------------------------------------------------------------------------------------------------
FILTER OUTPUT USING GREATER THAN/LESS THAN
------------------------------------------------------------------------------------------------
FIND WHERE COLUMN 5 IS LESS THAN OR EQUAL TO 141 AND PRINT COLUMN 7
EXAMPLE: awk '$5 <= 141 { print $7}' /opt/hgmods/kill_imap.log | grep ttchildren.org | sort | uniq -c | sort -rn
------------------------------------------------
FIND WHERE COLUMN 9 IS LESS THAN 10 AND PRINT COLUMNS 1, 2 AND 9
EXAMPLE: awk '$9 < 10 {print $1,$2,$9}'
------------------------------------------------------------------------------------------------
STRING MATCH A FIELD
------------------------------------------------------------------------------------------------
MATCH FIELD 7 AND PRINT COLUMN 5
EXAMPLE: awk 'match($7, /200/){print $5}'
------------------------------------------------------------------------------------------------
STRING MATCH SEARCHING THE ENTIRE LINE, THEN PRINT COLUMN 1 AS WELL AS THE MATCHING STRING AND EVERYTHING FOLLOWING IT
------------------------------------------------------------------------------------------------
awk 'match($0, /Mozilla.*$/){print $1,substr($0,RSTART,RLENGTH)}'
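A self-contained check of the RSTART/RLENGTH technique; the access-log line here is invented:

```shell
# match() sets RSTART (match start) and RLENGTH (match length);
# substr() then pulls out the user-agent tail of the line.
echo '10.0.0.1 GET /index.html Mozilla/5.0 (X11; Linux)' |
  awk 'match($0, /Mozilla.*$/){ print $1, substr($0, RSTART, RLENGTH) }'
# prints: 10.0.0.1 Mozilla/5.0 (X11; Linux)
```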
------------------------------------------------------------------------------------------------
PRINT A COLUMN AND EVERYTHING AFTER IT (IN THE EXAMPLE PROVIDED, FROM COLUMN 12 ONWARD)
awk '{ s = ""; for (i = 12; i <= NF; i++) s = s $i " "; print s }'
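The same loop starting at column 3, so a short made-up line shows the effect:

```shell
# Concatenate fields 3..NF; note the trailing space the loop leaves behind.
echo 'one two three four five' | awk '{ s = ""; for (i = 3; i <= NF; i++) s = s $i " "; print s }'
# prints: three four five (with a trailing space)
```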
------------------------------------------------------------------------------------------------
PRINT THE LAST FIELD
tail -f file | grep A1 | awk '{print $NF}'
------------------------------------------------------------------------------------------------
USING A DELIMITER TO PRINT OUT EVERYTHING WITHIN IT
tail -100 access-logs/pennystocktweets.com | awk -F \" '{print$(NF-1)}'
------------------------------------------------------------------------------------------------
REMOVE DUPLICATE ENTRIES IN COLUMN 2 FOR LINES WHERE COLUMN 1 MATCHES
------------------------------------------------------------------------------------------------
awk '$1~/regextomatch/&&!_[$2]++{print$2}' filename
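How the dedupe idiom works, on an invented four-line input:

```shell
# !_[$2]++ is true only the first time a given $2 is seen (the array entry
# is 0/absent, then incremented), so repeats of column 2 are suppressed
# for lines whose column 1 matches /x/.
printf 'x 1\nx 1\nx 2\ny 9\n' | awk '$1~/x/ && !_[$2]++ { print $2 }'
# prints: 1 then 2, each on its own line
```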
------------------------------------------------------------------------------------------------
SET MULTIPLE DELIMITERS ":" AND ".", THEN MATCH /malek/ IN THE LAST FIELD WHERE FIELD NF-2 IS NOT EQUAL TO "coderhall", GSUB ": malek" WITH NOTHING AND PRINT
------------------------------------------------------------------------------------------------
awk -F "[:.]" '$NF~/malek/&&$(NF-2)!="coderhall"{ gsub (": malek",""); print }' /etc/userdomains
akaind.com
siavash.rocks
hamidlighting.com
iranianeyeclinic.com
akagroup.org
isecho.org
akhgarelectric.com
smartgym.akafitness.co
scsir.org
akafitness.co
faradidafzar.com
akafitness.net
----------------------------------------------------
FULL OUTPUT OF DOMAINS MATCHING malek
------------------------------------------------------------------------------------------------
grep malek /etc/userdomains
akaind.com: malek
akhgarelectric.coderhall.com: malek
akagroup.coderhall.com: malek
coderhall.com: malek
siavash.coderhall.com: malek
siavash.rocks: malek
iranianeyeclinic.coderhall.com: malek
persiangallery.coderhall.com: malek
hamidlighting.com: malek
isecho.coderhall.com: malek
tornado.coderhall.com: malek
------------------------------------------------------------------------------------------------
PULLING ADDON DOMAINS AND PATH
------------------------------------------------------------------------------------------------
addon_domains () { awk -v user="$1" -F "[:=]" '$6 ~ /addon/ && $2 ~ " "user"$" {print $10,$1}' /etc/userdatadomains;}; addon_domains sami
/home3/sami/public_html/berraquerapaisa.com berraquerapaisa.com
/home3/sami/public_html/criaderoaristogatos.com criaderoaristogatos.com
/home3/sami/public_html/esperanzagomez.online esperanzagomez.online
/home3/sami/public_html/fincalacascada.com fincalacascada.com
/home3/sami/public_html/limpiezadelser.com limpiezadelser.com
/home3/sami/public_html/lonchiseda.com lonchiseda.com
/home3/sami/public_html/marketingypromociones.com marketingypromociones.com
/home3/sami/public_html/orgullosamentepaisa.com orgullosamentepaisa.com
/home3/sami/public_html/orgullosamentepaisas.com orgullosamentepaisas.com
/home3/sami/public_html/transportesdya.com transportesdya.com
------------------------------------------------------------------------------------------------
To pull addon domains and output only the docroot and domain on a single line
------------------------------------------------------------------------------------------------
hal cpanel_api server_id 163484 username hgdesign function listaddondomains module AddonDomain | awk '$1=="dir"{dr=$2}$1=="domain"{dom=$2}dr&&dom{print dr,dom;dr=dom=0}' | while read docroot domain; do echo "$docroot $domain";done
/home1/hgdesign/public_html/iheartmontrose.com iheartmontrose.com
/home1/hgdesign/public_html/sandwichboard.net sandwichboard.net
------------------------------------------------------------------------------------------------
for ip in 216.172.184.9{0..9} ; do echo $ip ; whois $ip | grep -oP "(?<=network:Organization;I:)[a-z]\S+" | while read dom ; do dig A $dom | awk '{OFS="\t"}BEGIN{nores=1;}{gsub(".\t","\t");if ($1~/^'$dom'/){print$1,$NF; nores=0}}END{if (nores) print "'$dom'";}'; done ; done
------------------------------------------------------------------------------------------------
awk -F "[ :]" '$9=="teaevent"&&$8=="CREATE"&&$7=="2016"&&!_[$NF]++{system("if [ -e /var/cpanel/users/"$NF" ] ; then echo -e \""$NF"\t"$(NF-2)"\" ; fi")}' /var/cpanel/accounting.log
------------------------------------------------------------------------------------------------
PULL ALL USERS CREATED IN 2016 THAT ARE STILL PRESENT ON THE SERVER
------------------------------------------------------------------------------------------------
awk -F "[ :]" '$8=="CREATE"&&$7=="2016"&&!_[$NF]++{system("if [ -e /var/cpanel/users/"$NF" ] ; then echo -e \""$NF"\t"$(NF-2)"\" ; fi")}' /var/cpanel/accounting.log
------------------------------------------------------------------------------------------------
ALTERNATE METHOD W/O SYSTEM CALL
------------------------------------------------------------------------------------------------
awk -F "[ :]" '{OFS="\t"}$8=="CREATE"&&$7=="2016"&&!_[$NF]++{ xxx = " ls /var/cpanel/users/"$NF " 2>/dev/null";if (xxx | getline yyy);else yyy=0 ;close (xxx); if (yyy) print$NF,$(NF-2);close (yyy)}' /var/cpanel/accounting.log
------------------------------------------------------------------------------------------------
AWK USING GETLINE
------------------------------------------------------------------------------------------------
[root@gator3314 /home2/ibmperu]# awk -F : '{OFS=":"}$(NF-1)=="ibmperu"{ cmd = "date -d @"$1 ; if (cmd | getline t) $1=t; print$0; close (cmd)}' /var/log/abusetool.log
Fri Jul 15 05:05:52 CDT 2016:http:disable:ibmperu:NBA-45676225
Fri Jul 15 15:38:33 CDT 2016:http:enable:ibmperu:NBA-45676225
Thu Aug 25 02:09:09 CDT 2016:http:disable:ibmperu:GKM-508-47251
Thu Aug 25 09:54:39 CDT 2016:http:enable:ibmperu:GKM-508-47251
Mon Sep 5 08:53:43 CDT 2016:http:disable:ibmperu:HZW-968-32594
Mon Sep 5 11:23:38 CDT 2016:http:enable:ibmperu:HZW-968-32594
Mon Sep 5 15:58:00 CDT 2016:http:disable:ibmperu:FBO-312-70888
Mon Sep 5 23:23:25 CDT 2016:http:enable:ibmperu:FBO-312-70888
Sat Sep 10 02:54:20 CDT 2016:http:disable:ibmperu:CDT-46695619
Sat Sep 10 16:27:44 CDT 2016:http:enable:ibmperu:CDT-46695619
Sat Sep 10 16:34:56 CDT 2016:http:disable:ibmperu:CDT-46695619
Sat Sep 10 16:35:08 CDT 2016:http:enable:ibmperu:CDT-46695619
Sun Oct 23 09:09:51 CDT 2016:http:disable:ibmperu:DLK-445-69830
Normally that file has the epoch time at the beginning; I just used awk and "cmd | getline" to replace the epoch time with the output of date -d.
Note: close() must be given the same lowercase cmd variable; the original close (CMD) referred to an unset variable, which awk silently ignores.
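A minimal sketch of the "cmd | getline" pattern, using a harmless echo instead of date:

```shell
# Run a command, read the first line of its output into a variable, and
# close() the pipe so the same command string could be re-run later.
awk 'BEGIN { cmd = "echo hello"; if ((cmd | getline line) > 0) print line; close(cmd) }'
# prints: hello
```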
------------------------------------------------------------------------------------------------
File Output
------------------------------------------------------------------------------------------------
1456606799:http:enable:adesanya:BHU-42368795
1457721176:http:enable:dinhdoan:LES-42371168
1457938626:http:disable:altopode:FKN-42911813
1458287053:http:disable:delta12:CFK-43040585
1458345475:http:enable:delta12:CFK-43040585
1458345566:http:disable:delta12:CFK-43040585
1458505315:http:enable:delta12:CFK-43040585
1461223348:http:disable:medwards1965:IAX-800-64427
1462354690:http:disable:mija:TLG-957-30790
1462505493:http:disable:onebrady:QLA-521-54355
1464183707:http:disable:booker10:ZGU-376-21306
1464310054:http:enable:booker10:ZGU-376-21306
1464398105:http:disable:imrand:BTD-628-48445
1464793551:http:enable:imrand:BTD-628-48445
1464954518:http:disable:imrand:YYF-698-47816
1465337702:http:enable:imrand:YYF-698-47816
1468697876:http:enable:mija:TLG-957-30790
1469009543:http:disable:imrand:XRA-635-30977
1469220697:http:disable:jmakhoul:IJS-476-61292
1469640446:http:enable:imrand:XRA-635-30977
1469640740:http:disable:imrand:XRA-635-30977
1469640819:http:enable:imrand:XRA-635-30977
1471273992:http:enable:jmakhoul:IJS-476-61292
1472649407:http:disable:abuzuluf:UIS-685-57379
1472691443:http:disable:kiddcuzzclo:BEU-256-20074
1473148395:http:disable:resttemp:SFP-893-85816
1473149742:http:disable:alliance:SNH-766-33920
1473168119:http:enable:resttemp:SFP-893-85816
1474496232:http:disable:jmakhoul:OSK-128-23639
1474665094:http:enable:alliance:SNH-766-33920
1475611369:http:enable:jmakhoul:OSK-128-23639
1476295452:http:disable:jmakhoul:FMW-723-46783
1476299671:http:enable:jmakhoul:FMW-723-46783
1476383235:http:disable:jmakhoul:WHQ-843-26992
1476469890:http:disable:khosrov:DXE-903-74816
1476530736:http:disable:james948:ZJM-394-63813
1476554214:http:enable:james948:ZJM-394-63813
1476752907:http:enable:jmakhoul:WHQ-843-26992
1477207923:http:disable:jolib:IET-300-45720
------------------------------------------------------------------------------------------------
TO OUTPUT INTO CSV
------------------------------------------------------------------------------------------------
awk -F"\t" '{OFS=","}{print $1,$2,$3}' cheri-final1.csv > zzz.csv
# Add comma as separator (as original comma)
awk -F"," '{OFS=","}{print $1,$2,$3}' AviationData.csv > Filtered_AviationData_threefields.csv
# Add vertical bar as separator (as original vertical bar)
awk -F"|" '{OFS="|"}{print $1,$2,$3}' AviationData.vsv > Filtered_AviationData_threefields.vsv
# Add tab as separator (changed from ,)
awk -F"," '{OFS="\t"}{print $1,$2,$3}' AviationData.csv > Filtered_AviationData_threefields.tsv
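The FS/OFS swap can be demonstrated inline; the input line here is made up:

```shell
# Read comma-separated fields, print them tab-separated: the commas in
# print insert OFS between the output fields.
echo 'a,b,c' | awk -F"," '{OFS="\t"}{print $1,$2,$3}'
# prints: a<TAB>b<TAB>c
```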
------------------------------------------------------------------------------------------------
Using awk
for i in $(cat blah) ; do echo /home/hgbackupdir/restore.pl $(echo $i |awk -F_ '{print $1}') mysql $i ; done
------------------------------------------------------------------------------------------------
cat xxx | while read i ; do awk '/$i/{system("touch ~"$2"/.skip_nextdb,hg_backup")}' /etc/userdomains;done
(broken: $i sits inside single quotes, so the shell never expands it)
cat xxx | while read i ; do awk '/'$i'/{system("touch ~"$2"/.skip_nextdb,hg_backup")}' /etc/userdomains; done
(fixed: the quotes are closed around $i so the shell expands it into the pattern)
awk '$7~/hostgator.com/{gsub(".hostgator.com.*$","");print$7}' exim.alerts
br58
gator4093
gator4112
br60
!_[$2]++
------------------------------------------------------------------------------------------------
awk '/Updated:/{flag=1;next}/\#\t+\#/{flag=0}{gsub("^#","*");gsub("\t#","")}flag' amc_0.3.py
Replaced checkown function with
pwd.getpwuid.
Replaced %s with .format
Removed unecessary logic in logfile.
This is actually pulling from the top of the file output below.

#!/usr/bin/python

#########################################
# #
# Script: Archive Malicious Content #
# Author: Andrew Narunsky #
# Version: 0.3 #
# #
#########################################
# #
# Updated: #
# Replaced checkown function with #
# pwd.getpwuid. #
# Replaced %s with .format #
# Removed unecessary logic in logfile. #
# #
#########################################
# #
# To Do: #
# Get hash for files. #
# Add flag for more hash checking. #
# Add flag for timestamp checking. #
# #
#########################################
------------------------------------------------------------------------------------------------
cpmp() { egrep -sc "${1?Please specify a string.}" /usr/local/apache/domlogs/* | awk -F':' '{if ($NF > 0) print $NF,$1}' | sort -n ; } ; cpmp 187.158.5.15
------------------------------------------------------------------------------------------------
4.5 Specifying How Fields Are Separated

Default Field Splitting: How fields are normally separated.
Regexp Field Splitting: Using regexps as the field separator.
Single Character Fields: Making each character a separate field.
Command Line Field Separator: Setting FS from the command line.
Full Line Fields: Making the full line be a single field.
Field Splitting Summary: Some final points and a summary table.
The field separator, which is either a single character or a regular expression, controls the way awk splits an input record into fields. awk scans the input record for character sequences that match the separator; the fields themselves are the text between the matches.
In the examples that follow, we use the bullet symbol (•) to represent spaces in the output. If the field separator is ‘oo’, then the following line:

moo goo gai pan
is split into three fields: ‘m’, ‘•g’, and ‘•gai•pan’. Note the leading spaces in the values of the second and third fields.
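The 'oo' splitting can be verified directly:

```shell
# FS="oo": "moo goo gai pan" splits into "m", " g", " gai pan"
# (note the leading spaces kept in fields two and three).
echo 'moo goo gai pan' | awk -F 'oo' '{ print NF; print $2 }'
# prints: 3 then " g"
```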

The field separator is represented by the predefined variable FS. Shell programmers take note: awk does not use the name IFS that is used by the POSIX-compliant shells (such as the Unix Bourne shell, sh, or Bash).

The value of FS can be changed in the awk program with the assignment operator, ‘=’ (see Assignment Ops). Often, the right time to do this is at the beginning of execution before any input has been processed, so that the very first record is read with the proper separator. To do this, use the special BEGIN pattern (see BEGIN/END). For example, here we set the value of FS to the string ",":

awk 'BEGIN { FS = "," } ; { print $2 }'
Given the input line:

John Q. Smith, 29 Oak St., Walamazoo, MI 42139
this awk program extracts and prints the string ‘•29•Oak•St.’.

Sometimes the input data contains separator characters that don’t separate fields the way you thought they would. For instance, the person’s name in the example we just used might have a title or suffix attached, such as:

John Q. Smith, LXIX, 29 Oak St., Walamazoo, MI 42139
------------------------------------------------------------------------------------------------
awk -F : '{split($0, d, " |:"); m=(match("JanFebMarAprMayJunJulAugSepOctNovDec",d[2])+2)/3; ; if (d[3]=="") t=mktime(d[8]" "m" "d[4]" "d[5]" "d[6]" "d[7]);else t=mktime(d[7]" "m" "d[3]" "d[4]" "d[5]" "d[6]);c=(strftime("%s") - 5184000);}t>c{checksus = "grep -L SUSPEND /var/cpanel/users/"$NF" 2>/dev/null";checkoff = "ls -1 /home{,1,2,3,4}/"$NF"/public_html/{,*/}{OFERT,geoip,*pay*pal,combo-familia,barclays,santander,identificacao}* 2>/dev/null | head -1"; if (checksus|getline notsus) ; else notsus = 0 ; if (checkoff | getline off) ; else off = 0 ; if (notsus && off) print"\n"$0"\n"off ; close(checksus);close(checkoff)}END{print""}' /var/cpanel/accounting.log

Fri Dec 23 15:28:27 2016:CREATE:root:root:retroman.us:192.185.128.171:retroma3
/home2/retroma3/public_html/includes/paypalconfig.php

Sat Dec 24 21:28:30 2016:CREATE:root:root:kosakatadesign.com:192.185.128.171:kosakata
/home2/kosakata/public_html/tools/geoip:

Sun Dec 25 18:53:54 2016:CREATE:root:root:wikeshop.net:192.185.128.171:wikeshop
/home1/wikeshop/public_html/tools/geoip:

Mon Dec 26 20:23:34 2016:CREATE:root:root:murddyeffy.com:192.185.128.172:murddyef
/home1/murddyef/public_html/includes/paypalconfig.php

Tue Dec 27 11:23:03 2016:CREATE:root:root:casasbahia-sslblindados.com:192.185.128.171:casasbb8
/home1/casasbb8/public_html/home/OFERTAS.zip

Wed Dec 28 12:12:08 2016:CREATE:root:root:casas-bahias.com:192.185.128.171:casasbc0
/home1/casasbc0/public_html/images/santander.jpg
------------------------------------------------------------------------------------------------
  414. awk -F \" 'BEGIN{OFS=","}/^[0-9]/{udata=$0}$2=="status"{status=$4}{if($2=="reason"&&$4!="")print udata,status,$4;else if($2=="reason"&&$4=="")print udata,status}' /home/${user}/.cpanel/backup_status
  415. ------------------------------------------------------------------------------------------------
  416. Example list
  417. ------------------------------------------------------------------------------------------------
  418. [root@gator2007 ~]# cat list2
  419. 1708371 1684080 platinum gator2007.hostgator.com
  420. 1635639 1684091 mrpierre gator2007.hostgator.com
  421. 1708386 1684095 ldoogs1 gator2007.hostgator.com
  422. 1708442 1684145 kraegerb gator2007.hostgator.com
  423. 1708478 1684182 nongtae gator2007.hostgator.com
  424. 1708567 1684275 siammap gator2007.hostgator.com
  425. ------------------------------------------------------------------------------------------------
  426. Example Output
  427. ------------------------------------------------------------------------------------------------
  428. cat list2 | while read bid pkgid user hname; do nas=$(awk -F \" 'BEGIN{OFS=","}/^[0-9]/{udata=$0}$2=="status"{status=$4}{if($2=="reason"&&$4!="")print udata,status,$4;else if($2=="reason"&&$4=="")print udata,status}' /home/${user}/.cpanel/backup_status) ; echo "$bid,$pkgid,$user$nas"; done
  429. 1708371,1684080,platinum,failed,disk
  430. 1635639,1684091,mrpierre,finished
  431. 1708386,1684095,ldoogs1,finished
  432. 1708442,1684145,kraegerb,finished
  433. 1708478,1684182,nongtae,failed,inode
  434. 1708567,1684275,siammap,finished