Forum Archive

"Connecting" Two Files

procryon

Hi all, I have another basic question here. I was wondering if it's possible (I'm sure it is, I just don't know how it works in Python) to have a main file with all the code and another file with only arrays to hold data.

I wanted to know how I would go about "connecting" the two in the sense that the main file with all the code can look through the arrays in the other "data-storing" file and use them. Thanks for the help!

ccc

You can make both files valid Python modules, e.g. big_data.py and main_program.py:

# big_data.py

abc = [x for x in 'abcdefghijklmnopqrstuvwxyz']

# ===============

# main_program.py

import big_data

print(big_data.abc)

dgelessus

It's not possible to have two files and pretend that they are one. If you want that, you can simply use one file. :) As @ccc said, you can put the second file in the same folder as your main script and then import it, but this means you always have to use the second file's name when you want something from it.

Although you can do something like this, I would not recommend it. In most cases it doesn't make sense to split the code from the data that it uses; all it does is spread your code across multiple files without really helping much. Python's import and module system is meant for separating things that are independent, not for splitting up one thing that should be in one piece.
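On dgelessus's point about always needing the second file's name: Python's `from ... import` form binds a name directly, so the prefix isn't required. Here is a small sketch; it recreates ccc's big_data.py at runtime purely so the snippet runs on its own.

```python
import pathlib
import sys

# Recreate ccc's big_data.py on disk so this sketch is self-contained.
pathlib.Path('big_data.py').write_text(
    "abc = [x for x in 'abcdefghijklmnopqrstuvwxyz']\n")
sys.path.insert(0, '.')  # make sure the current directory is importable

import big_data           # plain import: names need the big_data. prefix
from big_data import abc  # from-import: binds abc directly into this namespace

print(abc == big_data.abc)  # True -- both names refer to the same list
```

Both forms load the module only once; `from ... import` just saves typing the prefix, at the cost of making it less obvious where the name came from.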

abcabc

If you have just data in a separate file, maybe you can use ast.literal_eval or json.

For example, the data1.txt file would contain the following:

["aa", "bb", "cc", "dd"]

and main.py would contain the following:

import ast
import json

my_list = ast.literal_eval(open('data1.txt').read())
print(my_list)

my_list1 = json.loads(open('data1.txt').read())
print(my_list1)
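A quick note on why ast.literal_eval is used here rather than plain eval: it only accepts Python literals (strings, numbers, lists, tuples, dicts, and the like), so a malformed or malicious data file can't execute arbitrary code. A small sketch:

```python
import ast

# A literal parses fine, just like the data1.txt contents above.
print(ast.literal_eval('["aa", "bb", "cc", "dd"]'))

# ...but anything containing names or function calls is rejected.
try:
    ast.literal_eval('__import__("os").getcwd()')
except ValueError as exc:
    print('rejected:', exc)
```

This is what makes literal_eval a reasonable choice for reading data files that you didn't necessarily write yourself.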

ccc

@abcabc, you are not closing the files you open.

json.load(open('data1.txt'))

abcabc

OK, modified main.py:

import ast
import json

with open('data1.txt') as fp:
    my_list = ast.literal_eval(fp.read())
    print(my_list)

with open('data1.txt') as fp:
    my_list1 = json.load(fp)
    print(my_list1)
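For completeness, json can also produce the data file in the first place, so the whole round trip stays in Python. A sketch, reusing the same data1.txt name from the posts above:

```python
import json

data = ["aa", "bb", "cc", "dd"]

# Write the list out as JSON...
with open('data1.txt', 'w') as fp:
    json.dump(data, fp)

# ...and read it back later, e.g. from a separate main.py.
with open('data1.txt') as fp:
    loaded = json.load(fp)

print(loaded == data)  # True -- the list survives the round trip
```

Unlike importing a .py module, this keeps the data in a format that other tools and languages can read too.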

Phuket2

Sorry to say, but I would love to see the reemergence of the resource/data fork. I understand it was just an illusion provided by the filesystem and OS that linked the files, but it worked well for many years. I'm sure it's gone for a reason, but it's also possible that reason has been made obsolete by newer hardware. Just saying.

If you are too young to know about the Mac's resource/data fork file system, here is one wiki listing. It was really nice.

Yeah, there are some cross-platform/NAS-type issues as well, which are key for Python on Windows.

Really, if the resource/data fork concept were alive today and cross-platform, it would fit Python perfectly. Well, that's a bold statement, but I can't see how it wouldn't.

OK, that's my 2 cents' worth. I know a lot of people don't agree. I think transmission is another issue, but these can be overcome.