Socoder -> Blitz -> File Manipulation

Mon, 06 Dec 2010, 03:34
mole
Hi,

Long time no see. I still lurk occasionally, but I haven't posted for a while and I certainly haven't coded for a while. Hence, I am rusty.

Therein lies my dilemma: I have a task I wish to complete, and I believe that a little Blitz program will make it easier... if only I could properly remember how to program, that is.

Anyway, I'll start off with my first problem. I am trying to read a column of data from a text file. The first line of the column is a text string and the lines beneath are floats. I am trying to store this data in two arrays: 'FirmName$(a)' for the string, and the rest of the floats in 'DataSet1#(b)'.

Here is my code so far. Unfortunately, I have got stuck quite early on: my debug reporting suggests that it can't see any data in the file...
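[The code block from this post did not survive the archive. As a rough sketch of the logic described above — first line of the column is the firm's name, every later line a float — here it is in Python rather than Blitz; the sample data is invented for illustration:]

```python
def read_column(lines):
    """First line is the firm name; every later non-blank line is a float."""
    firm_name = lines[0].strip()
    data = [float(line) for line in lines[1:] if line.strip()]
    return firm_name, data

# A small invented sample standing in for the real data file:
sample = ["Acme Ltd\n", "1.50\n", "2.25\n", "3.10\n"]
name, values = read_column(sample)
print(name, values)   # Acme Ltd [1.5, 2.25, 3.1]
```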



Any help, links, tips would be super cool.

I will no doubt return quickly after this problem is solved with another problem, as my program develops in size and sophistication.

Thanks in advance!


Mon, 06 Dec 2010, 03:55
Jayenkai
Is the "Float" stored AS a float, or is it a line of text?
(as in, if you open the .dat in notepad, is it readable, or gobbledegook)

If it's readable: ReadTxt$ = ReadLine(CorrelData) : Read1# = Float(ReadTxt$)
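[In Python terms, the two-step suggestion above — read the line as plain text, then convert it to a float — amounts to:]

```python
# Read the line as text, then convert; mirrors the Blitz snippet above.
read_txt = "3.14\n"          # what a ReadLine-style call would hand back
read1 = float(read_txt)      # Float(ReadTxt$) in Blitz
print(read1)                 # 3.14
```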

-=-=-
''Load, Next List!''
Mon, 06 Dec 2010, 08:11
mole
The data in the file is readable. I will upload it for reference.

I managed to get the program to output the data as I expect it to; the data is being stored as I want.

However, that means we now move on to the next step. By the end, the file will have anything up to 100 columns of data. I need to be able to store all of that data in arrays (I can't quite remember, but can you have arrays such as 'Dim ArrayName(5,5)'?).

When all the data is stored, I hope to be able to calculate the correlation coefficients between each of the columns of data. That is something I will consider more deeply when I get to it; at the moment I am just trying to get all the data into an array ready for processing.

What I can't work out is how to parse the file of data and get each chunk into its appropriate place in the array. That is where I need help; I never was good at file manipulation.
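[As a sketch of that parsing step in Python: assuming the file holds tab-separated columns, with the first row carrying the firm names and each later row one float per firm. The real file's delimiter isn't shown in the archive, so adjust the separator as needed.]

```python
def parse_table(lines, sep="\t"):
    """Split a multi-column file into (firm_names, data[firm][row])."""
    rows = [line.strip() for line in lines if line.strip()]
    firm_names = rows[0].split(sep)
    # One list of floats per firm, pre-sized to the number of data rows:
    data = [[0.0] * (len(rows) - 1) for _ in firm_names]
    for r, row in enumerate(rows[1:]):
        for c, cell in enumerate(row.split(sep)):
            data[c][r] = float(cell)
    return firm_names, data

# Invented two-firm sample for illustration:
sample = ["AcmeLtd\tBetaCorp\n", "1.0\t4.0\n", "2.0\t5.0\n"]
names, data = parse_table(sample)
print(names)   # ['AcmeLtd', 'BetaCorp']
print(data)    # [[1.0, 2.0], [4.0, 5.0]]
```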

Here is the latest code:


Correlation Data File
Note that I had to change the file extension to the less cool .txt in order to upload it.
Mon, 06 Dec 2010, 17:57
Andy_A
Here's what it sounds like you want.


Wed, 08 Dec 2010, 06:19
mole
I have spent the last day and a half working on your code, Andy, fine-tuning and modifying it to my needs. Your structure has proved invaluable in allowing me to progress, so a big thanks for that.

So far I have made it super easy to change the number of firms in the data file without the program having a wobbly, introduced a correlation coefficient calculator function, and implemented it so that it calculates the coefficients between all the firms in the data file.
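[mole's actual Blitz function isn't shown in the archive; a standard Pearson correlation coefficient, which is presumably what's being computed here, looks like this as a generic Python sketch:]

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linear data gives a value within floating-point error of 1.0:
print(pearson([1, 2, 3], [2, 4, 6]))
```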

Calculating the correlation coefficient between the data from the file is exactly what I wanted - and it works! This makes me happy.

So now that the program is robust enough for my needs, all I need to do now is get data on another 95 firms or so!!

I should be able to make any minor modifications to the program if the need arises, but I'll be back if there is something that stumps me.

Many thanks for your help, Jay and Andy. It is most appreciated.
Wed, 08 Dec 2010, 06:35
Jayenkai
Sorry, I completely forgot to get back to this topic! Glad you sorta sorted it!

-=-=-
''Load, Next List!''
Wed, 08 Dec 2010, 08:44
Andy_A
Glad it all worked out for you.

Getting the data into arrays was the toughest part (and it wasn't all that hard).